Big Data DevOps Lead
Zeta Global
Job ID: ESf8j4UKF1mPinAjAAAAAA==
Location: United States
Description:
Summary:
As a Senior or Lead Big Data DevOps Engineer, you will be working with a team responsible for setting up, scaling, and maintaining Big Data infrastructure and tools in private and public cloud environments.
Main Responsibilities:
• Driving improvements in the efficiency of the Big Data infrastructure.
• Coordinating cross-team infrastructure and Big Data initiatives.
• Leading Big Data-related architecture and design efforts.
• Ensuring availability, efficiency, and reliability of the Big Data infrastructure.
• Building and supporting tools for operational tasks.
• Evaluating, designing, and deploying monitoring tools.
• Designing and implementing DR/BC practices and procedures.
• Providing on-call support for production systems.
Requirements:
• 7+ years of experience working with Hadoop, preferably open-source distributions.
• 3+ years leading a Big Data, DevOps, SRE, DBA, or development team.
• Experience setting up and running Hadoop clusters of 1000+ nodes.
• Solid knowledge of NoSQL databases, preferably Cassandra or ScyllaDB.
• Experience running and troubleshooting Kafka.
• Working knowledge of at least one of: Terraform, Ansible, SaltStack, Puppet.
• Proficiency in shell scripting.
Nice to have:
• Experience with Prometheus.
• Experience managing Snowflake.
• Solid knowledge of Graphite and Grafana.
• Python or Perl scripting skills.
• Experience installing and managing Aerospike.
• DBA experience with one of: PostgreSQL, MySQL, MariaDB.
Remote: False
Posted Date: 10 days ago