Responsibilities
• Perform installation, operation, and monitoring of big data platforms
• Ensure the reliability and normal operation of multiple core big data and online computing systems, resolving issues within the agreed SLA
• Evaluate infrastructure requirements for deployments and upgrade activities
• Develop and maintain technical documentation related to design, configuration, troubleshooting, backup, DR, etc.
• Ensure that deployment and operations of the data platform comply with prevailing security, regulatory, and audit requirements
• Provide recommendations on data governance practices and in-depth optimization best practices
Requirements
• Primary skillset: DevOps and Unix systems engineering
• Solid grounding in computer software fundamentals; understanding of the Linux operating system, storage, network I/O, and other related principles
• Familiarity with one or more programming languages or automation tools, such as Python/Go/Java/Shell/Ansible
• Experience with Cloudera, Informatica, or Denodo and their runtime components would be highly advantageous
• Relevant computing/distributed/big data system experience (Hadoop/HDFS/Spark/Flink/Kubernetes/Docker/OpenStack, etc.)
• Good data structure and system design skills
• Good knowledge of information security
Next Step
- Prepare your updated resume and expected salary package.
- Simply click on 'Apply here' or email your resume to [email protected]
- All shortlisted candidates will be contacted.
Tamanna Bilandi
EA Licence No. 91C2918
Personnel Registration No. R2096241