Essential Functions
- Provide technical guidance for our Cloudera CDP technology, with a willingness to learn and support Kafka and open-source DBMSs
- Upgrade, patch, monitor, optimize and resolve technical issues related to the technology managed by this team
- Understand the functional components of CDP and provide sound guidance to developers to meet business needs
- Work closely with architects and developers to ensure optimal availability, performance, and stability of the technology managed by this team
- Participate in building and updating technical standards and ensure compliance
- Apply strong technical problem-solving skills and teach others to do the same
- Provide guidance and set standards for teams to write data pipelines and build data repositories using current technologies
- Research and sustain competency in current technologies to maintain and improve the organization's Hadoop applications
Required Experience
- Bachelor’s degree in Engineering, Computer Science, Information Technology or related discipline, or equivalent work or military experience.
- Minimum of 2 years of experience administering a Hadoop cluster
- 2 years of experience with Hadoop-related principles, practices, and procedures
Preferred Qualifications
- Experience working with application developers to understand their needs and help them access Big Data platforms
- Experience with Agile methodologies, including Scrum
- Experience managing Big Data products (Hadoop ecosystem: HDFS, Impala, Hive, HBase, Spark, YARN, Hue, KTS, Sentry, ZooKeeper, etc.)
- Experience with partitioning within Hadoop
- Experience with RESTful APIs and services
- Experience with Linux systems and scripting languages
- Experience with functional programming (Scala, Python)