Big Data Delivery Consultant
Santa Clara, CA 95054
Join a fast-growing, substantially funded startup in the Big Data/Hadoop/container space (BDaaS). The company enables enterprises to extract more sophisticated insights from data and information to drive the advanced decisions and innovations of tomorrow.
The solution enables enterprises to implement a Big-Data-as-a-Service platform running on their own data center infrastructure or in a hybrid architecture.
Customers gain on-demand agility and elasticity for their analytics and data science teams, saving upwards of 75 percent on infrastructure and operational costs by spinning up instant clusters for Hadoop, Spark, and other Big Data tools in a secure multi-tenant environment.
Your focus will be on architecting and developing analytics solutions to deploy and operate Big Data-as-a-Service platforms.
Key Drivers Are:
Assist in the design, architecture, and deployment of Big-Data-as-a-Service solutions using the BigData platform.
Assist in discussions on how to enable useful insights, predictions, and actionable interpretations from the Big Data applications running on top of the BigData platform.
Assist the lead project manager and technical architect throughout the entire engagement
Interact with customers' data science and business intelligence teams to develop the most effective analytics and advanced algorithmic models, leading to optimal value extraction from the data.
Lead/assist in the training/education activities required in all deployments
Lead BigData platform deployment and testing activities, such as network and security integration, platform implementation, and application image rollout.
Lead the development and testing of application images
Assist in the development of the engagement's Solution Requirements, Solution Architecture, deployment plans, and test plans.
Participate in pre-sales activities when needed
Work with product management and engineering to ensure project delivery success.
Assist in the evolution of the BigData PRECISION Delivery Methodology.
Assist in the creation of materials for new professional services, educational services, or managed services offerings.
Your focus will be on the deployment and development of the Big Data-as-a-Service platform for customers. You will develop and test custom application images on top of the BigData platform.
To Succeed You Must Have:
Key expertise: Cloudera administration, orchestration, and automation.
Expertise in programming (Python, Java, SDLC).
Must have worked in a customer service, professional services, or customer support role (i.e., strong customer engagement skills). Containers and/or Kubernetes knowledge (expert level desirable).
Deep understanding of rich data sources and Big Data applications across a variety of industries, in order to define the appropriate analytics architecture to satisfy client needs.
Ability to design Big Data solutions and develop code, scripts, and pipelines that leverage the integration of structured and unstructured data from multiple sources.
Extensive hands-on experience with Big Data deployments and technologies (e.g., Docker containers, virtualization, machine learning, deep learning, Big Data applications, security, access controls, networking), as well as a proven track record in consulting engagements spanning design through implementation in all phases of Big Data projects.
Collaboration with internal/external stakeholders, teamwork, and problem solving are a must for this role.
10 or more years of experience working in a Professional Services and/or Systems Integration organization.
5 or more years of experience consulting on and delivering Big Data / Data Analytics projects.
Good understanding of complete end-to-end Data Warehouse / Business Intelligence / Data Analytics solutions.
Good experience in requirements gathering, data modeling (ER, dimensional), logical/physical design, ETL technology, programming, and performance tuning.
Extensive architecture and delivery hands-on experience in more than one of the following solutions:
o Data Ingestion: Flume, Sqoop, Kafka, Talend, Informatica
o Data Storage: HDFS, HBase
o Data Processing: MapReduce, Spark, HiveServer, Impala, Tez, Spark SQL, H2O, TensorFlow, Intel BigDL, SAS Viya, MSR
o Data User Tools: Hive, CLI, Spark SQL, Jupyter, Zeppelin, Tableau, RStudio, Eclipse, Pentaho
o Hadoop Platforms: Cloudera, Hortonworks, MapR
o Security/Access Control: LDAP, Kerberos
Competence in Internet and related technologies (HTML, Perl/PHP, CGI scripting, Python, Java, JavaScript, Java servlets, XML, XSL, JSON) a plus.
Ability to describe ideas concisely and to use collaboration to drive to a final solution.
Familiarity with working across multiple locations and virtual settings.
Master's or Bachelor's degree in Computer Science, Information Management, Informatics, or Engineering, with at least 3 years of experience in the analytics domain.
Knowledge of and experience in data mining, machine learning, statistics, operations research, or a related field would be an advantage.
Self-driven, with the ability to work in a team setting. Good communication, interpersonal, and negotiation skills are essential.
Willing to travel up to 30% globally.