• Administer and maintain Big Data applications, including SAS, Qlik, and Hortonworks Hadoop clusters, across all environments
• Proactively monitor cluster health and perform performance tuning activities
• Perform capacity planning and expansion activities
• Perform cluster maintenance, including patching, user provisioning, automation of routine tasks, troubleshooting of failed jobs, and configuration and maintenance of security policies
• Work with application teams and provide deployment support
• Provide input to the development of project initiation documents, including objectives, scope, approach, estimates, and deliverables
• Define and validate non-functional (technical) requirements and establish traceability between requirements and application architecture/design
• Review 3rd-party vendor products and provide support during research and development, prototyping, and vendor evaluations
• Provide support and guidance to development teams during the analysis, development, and testing processes
• Participate in the collaborative development of data management, business intelligence, and analytics architecture standards and best practices, and facilitate understanding of and adherence to those standards
• Have strong verbal and written communication and interpersonal skills, with a demonstrated ability to work effectively in teams
• Be able to work independently and manage components of large engagements
• Be able to work in a fast-paced and demanding environment
• Be a self-learner and a good team player
• Have strong problem-solving skills and the ability to exercise mature judgment
A large government agency is implementing Big Data to consolidate applications and databases across the enterprise into a single storage solution. The client has chosen the Hortonworks Data Platform (HDP) for managing its high-cost in-memory appliances. The client is also expanding its business intelligence toolset to include the SAS, QlikSense, and Lumira data visualization tools to provide advanced analytical data exploration capabilities.
The client's Technology Management team is looking for a Senior Big Data Engineer to support the Big Data team. The Senior Big Data Engineer will be responsible for providing technical and architectural support for the Big Data architecture, which consists of the HDP, SAS Grid Enterprise, and QlikSense applications. Responsibilities will include:
· Administering Hadoop clusters, QlikSense and SAS in T&D and Production environments
· Performing application installs, upgrades, patching and troubleshooting efforts
· Assisting with the analysis and development of processes for data ingestion, modeling, and mining
An undergraduate degree is required. The engineer will be responsible for engineering big data solutions and multi-tiered data environments; experience with large-scale big data deployments in government and large commercial environments is preferred. He or she will lead initiatives that apply big data solutions to deliver actionable insights for strategic and tactical mission objectives.