Capstone is assisting our client in hiring a highly motivated, top-performing Hadoop Administrator to play an integral part in analyzing, monitoring, and tuning our client's Hadoop ecosystem, including Hive/LLAP, Spark, Kafka, and HDFS. This position will design, develop, implement, configure, and support our client's Hadoop infrastructure, allowing them to leverage data effectively for data streaming and ingestion, advanced analytics, and client-impacting applications. The ideal candidate will have a passion for the Hadoop ecosystem.
This individual will have a proven track record of working with leading-edge data technologies and be a dependable, hard-working, and creative problem-solver. Our client is a catalyst for change, providing actionable customer, patient, and employee insights that boost loyalty and drive business outcomes. Their unique model puts a dual focus on platform technology and professional services, making it easier to collect, analyze, and share feedback and behavioral data across the enterprise.
Responsibilities
Responsible for implementation and ongoing administration and configuration of the Hadoop data platform and Linux environments. Analyzes, monitors, and tunes Hadoop, Spark, Hive/LLAP, Kafka, and HDFS data systems. Contributes to the production support role via an on-call rotation.
Responsible for regular patch maintenance and Java version upgrades when required. Develops Bash and/or Python scripts to monitor and manage the Hadoop cluster. Manages configuration of high availability for all components in the Hadoop stack.
Responsible for implementation of Hadoop security, including Kerberos, Ranger, and Knox. Performs administration of PostgreSQL/MySQL/MariaDB databases used for the Hive metastore. Assists in the administration of Linux-based hardware; responsibilities include identification and replacement of failed drives, OS-level configuration tuning, etc.
Performs Hadoop user management (new accounts, permissions, and access). Contributes to resolution of open support tickets, working with vendors as needed. Provides documentation and ongoing training to the engineering teams for the Hadoop platform.
Minimum Qualifications
Bachelor's degree in Computer Science, or the equivalent through a combination of education and related work experience in Computer Engineering, Computer Science, Data Sciences, or a related science or engineering field. Preferred experience managing and maintaining Hadoop clusters, including Hortonworks Data Platform or Cloudera Data Platform. Experience with programming and scripting languages, including Python, Bash, and sed/awk.
Experience with managing and tuning CentOS or Red Hat operating systems. Strong knowledge of Linux, Linux system internals, and optimization. 3+ years of experience with advanced relational and Big Data database systems, including Hive/LLAP, Impala, MySQL, MariaDB, PostgreSQL, Oracle, Microsoft SQL Server, etc.
Experience managing commodity hardware as well as cloud-based infrastructure such as Amazon Web Services. Working knowledge of Kafka and Spark. Demonstrated research, analytical, critical-thinking, decision-making, consulting, and problem-solving skills.
Ability to work with limited direct supervision. Effective oral, written, and interpersonal communication skills.
Capstone IT, Inc.