Big Data Consulting Architect - Bay Area
Santa Clara, CA 95054
Join a fast-growing, substantially funded startup in the Big Data/Hadoop/container space (BDaaS), enabling enterprises to extract more sophisticated insights from data and information to drive the advanced decisions and innovations of tomorrow.
The solution enables enterprises to implement a Big-Data-as-a-Service platform running on their own data center infrastructure or in a hybrid architecture.
Your focus will be on architecting and developing analytics solutions to deploy and operate Big-Data-as-a-Service platforms, with an emphasis on DevOps, cloud, containers, and Kubernetes.
Key Responsibilities:
Design, architect, and deploy Big-Data-as-a-Service solutions on the Big Data platform.
Work across different business units to translate their business requirements into technical functional specifications.
Work across different platform/system owners and vendors to craft the best deliverable solutions, balancing scope, cost, and timeline. Coordinate, supervise, and quality-control the solution work throughout the project period.
Interact with customers’ Data Scientists and Business Intelligence teams to develop the most effective analytics and advanced algorithmic models, leading to optimal value extraction from the data.
Develop solution requirements, solution architecture, and deployment and test plans for all projects.
Excellent coding/development/testing skills primarily in Python.
In-depth knowledge of Hadoop distributions (primarily Cloudera and Hortonworks) as well as open-source Big Data technologies (Spark, Kafka, Hive, Flume, Storm, Sentry).
Exposure to Machine Learning and AI technologies (TensorFlow, BigDL, H2O, Jupyter).
Five or more years of experience in DevOps, cloud, containers, and Kubernetes. Deep understanding of rich data sources and Big Data applications across a variety of industries, in order to define the appropriate analytics architecture to satisfy client needs.
Ability to manage and oversee large-scale deployments to key customers globally.
Extensive experience in Big Data deployments and technologies (e.g., Docker containers, virtualization, Machine Learning, Deep Learning, Big Data applications, security, access controls, networking), as well as a proven track record of managing and leading complex projects.
Collaboration with internal/external stakeholders, teamwork, and problem-solving are a MUST for this role.
Extensive hands-on architecture and delivery experience with Big Data/cloud solutions on Hadoop platforms: Cloudera, Hortonworks, MapR.
Willingness to travel up to 30% globally.