Big Data Consulting Architect

NYC, NY | Direct

Post Date: 12/27/2017 Job ID: atc-437 Industry: Consulting Services

Big Data Consulting Architect

 

Join a fast-growing, substantially funded startup in the Big Data / Hadoop / container space (BDaaS), enabling enterprises to extract more sophisticated insight from data and information to drive the advanced decisions and innovations of tomorrow.

 

The solution enables enterprises to implement a Big-Data-as-a-Service platform running on their own data center infrastructure or in a hybrid architecture.

 

Customers can provide on-demand agility and elasticity to their analytics and data science teams, saving upwards of 75 percent on infrastructure and operational costs by spinning up instant clusters for Hadoop, Spark, and other Big Data tools in a secure multi-tenant environment.

 

Your focus will be on architecting and developing analytics solutions to deploy and operate Big Data-as-a-Service platforms. 

 

 

Key Drivers Are:

 

 

Design, architect, and deploy Big-Data-as-a-Service solutions using the BigData platform.

Lead discussions on how to enable useful insights, predictive success, and actionable interpretations from the Big Data applications running on top of the BigData platform.

Work across different business units to translate their business requirements into technical functional specifications.

Work across different platform/system owners and vendors to craft the best deliverable solutions, balancing scope, cost, and timeline. Coordinate, supervise, and quality-control the solution work throughout the project period.

Act as lead project manager and technical architect throughout the entire engagement.

Assist customers in evaluating and recommending potential Big Data use case technologies and solutions based on technical capability, functionality, cost versus benefit, and risk criteria.

Interact with customers' Data Scientists and Business Intelligence teams to develop the most effective analytics and advanced algorithmic models, leading to optimal value extraction from the data.

Lead the training and education activities in all deployments.

Develop solution requirements, solution architecture, deployment plans, and test plans for all projects.

Participate in pre-sales activities when needed.

Work with product management and engineering to ensure project delivery success.

Develop and evolve the Engagement Methodology.

Assist in the creation of materials for new professional services, educational services, or managed services offerings.

 
To Succeed You Must Have:
Deep understanding of the rich data sources and Big Data applications from a variety of industries, in order to define the appropriate analytics architecture to satisfy client needs.

Ability to manage and oversee large-scale deployments for key customers globally.

Extensive experience in Big Data deployments and technologies (e.g. Docker containers, virtualization, Machine Learning, Deep Learning, Big Data applications, security, access controls, networking), as well as a proven track record in managing and leading complex projects.

Collaboration with internal and external stakeholders, teamwork, and problem-solving are a must for this role.

10 or more years of experience consulting on, delivering, and managing Big Data / Data Analytics projects in a Professional Services or Systems Integration organization.

Deep understanding of complete end-to-end Data Warehouse / Business Intelligence / Data Analytics solutions.

Advanced experience in requirements gathering, data modeling (ER, dimensional), logical/physical design, ETL technology, programming, and performance tuning.

Extensive architecture and delivery hands-on experience in more than one of the following solution areas:

Data Ingestion: Flume, Sqoop, Kafka, Talend, Informatica

Data Storage: HDFS, HBase

Data Processing: MapReduce, Spark, Hive Server, Impala, Tez, Spark SQL, H2O, TensorFlow, Intel BigDL, SAS Viya, MSR

Data User Tools: Hive CLI, Spark SQL, Jupyter, Zeppelin, Tableau, RStudio, Eclipse, Pentaho

Hadoop Platforms: Cloudera, Hortonworks, MapR

Security/Access Control: LDAP, Kerberos

Competent in Internet and related technologies: HTML, Perl/PHP, CGI scripting, Python, Java, JavaScript, Java servlets, XML, XSL, and JSON a plus.

Ability to describe ideas concisely, use collaboration to drive to a final solution, and make commitments and follow through on them.

Experience in leading cross-departmental virtual teams. Familiarity with working across multiple locations and virtual settings.

Master's or Bachelor's degree in Computer Science, Information Management, Informatics, or Engineering, with experience in a related field and at least 3 years of experience in the analytics domain.

Knowledge and experience in Data Mining, Machine Learning, Statistics, Operations Research, or a related field would be an advantage.

Self-driven, with the ability to work in a team setting. Good communication, interpersonal, and negotiation skills are essential.

Willing to travel up to 30% globally.
