Role: Hadoop Administrator
Location: Central Singapore
JD:
Responsibilities:
- Design, deploy, manage, and support a big data platform (on-premise or cloud-based).
- Design, manage, and maintain tools and scripts to automate system installation and configuration.
- Configure system security and implement regular patch updates.
- Integrate different components within the big data system, including analyzing source code for optimal integration strategies.
- Explore and implement new technologies like Kubernetes to automate system setup and operations.
- Collaborate with data scientists, analysts, and stakeholders to assist with data-related technical issues and support data pipeline infrastructure and data preparation needs.
- Participate in the design, development, and testing of a highly scalable and resilient data analytics platform for various domain applications.
- Conduct root cause analysis, implement proactive measures, and monitor their effectiveness.
Requirements:
- Bachelor's or Master's degree in Computer Science, Software Engineering, Information Systems, or a related field.
- 5+ years of hands-on experience in the Hadoop ecosystem and software development.
- At least 3 years of experience as a Hadoop Administrator, ideally managing multiple clusters.
- Experience installing, configuring, troubleshooting, and securing big data systems and clusters.
- Proficiency in Python and shell scripting.
- Strong desire to learn and adapt to new technologies.
- Familiarity with Linux/UNIX system administration and automation tools (e.g., Ansible).
- Experience in DevOps and DataOps methodologies.
- Deep understanding of both relational and NoSQL databases (e.g., PostgreSQL, Oracle Database, Cassandra, MongoDB, Neo4j).
- Experience providing operational support for Big Data solutions in production.
- Excellent oral and written communication skills with strong analytical, problem-solving, and multitasking abilities.