Job Purpose
We are looking for an experienced Hadoop administrator to join the Group Data function and manage the Hadoop-based big data platform.
The role is responsible for the day-to-day operations of the big data platform, including monitoring, system upgrades, incident troubleshooting and service restoration, security patching, deployment, and capacity management.
The Job
Platform Upgrade and Maintenance
· Develop and implement upgrades and maintenance of the data platform
· Apply patches in a timely manner based on vulnerability findings
· Monitor the platform and identify any performance gaps
· Optimize the platform infrastructure to meet scalability, performance, and security requirements
Capacity Management
· Monitor the utilization of the data platform and plan for any expansion if required
Issue/Incident Management
· Troubleshoot and resolve issues based on pre-defined SLAs
· Restore services in the event of system/service downtime based on pre-defined SLAs
· Provide timely updates to users on the situation
· Escalate to management based on the escalation matrix
User Access Management
· Onboard and offboard platform users based on the approvals obtained
· Conduct regular reviews of user accounts and take the necessary actions to ensure accounts remain valid and active
· Provision user access based on the approvals obtained
Data Security and Privacy
· Implement robust data security measures to safeguard sensitive information within the data platform
· Ensure compliance with data privacy regulations and industry standards
Deployment
· Deploy data pipelines developed by data engineers
Process
· Ensure operational efficiency
· Participate in the creation of standards, guidelines, and procedures related to the data platform
· Monitor the performance, efficiency, and effectiveness of operational processes and identify areas for improvement to drive process optimization and efficiency
· Take accountability for considering business and regulatory compliance risks and take appropriate steps to mitigate them
· Maintain awareness of industry trends in regulatory compliance, emerging threats, and technologies in order to understand the risks and better safeguard the company
· Highlight potential concerns/risks and proactively share risk management best practices
Our Requirements
· Bachelor’s degree in IT, Computer Science, Software Engineering, Business Analytics, or an equivalent field, with 5+ years of experience in distributed systems such as Hadoop.
· Proficiency in Hadoop: Proven experience in Hadoop technologies (Spark, HDFS, Hive, Sqoop, Airflow, Docker)
· Managing data infrastructure: Experience in monitoring data platform infrastructure and handling incidents.
· Knowledge of SQL: Experience with MariaDB or another relational database management system.
· Experience with ETL: Experience in managing and operating enterprise ETL tools
· Programming skills: Experience in one or more programming languages such as Python, Java, or Scala, with the ability to develop custom shell scripts and applications for data operations and monitoring.
· Data warehousing: Good understanding of data warehousing concepts, including data architecture, dimensional modelling, and ETL processes.
· Experience with DevOps: Experience in DevOps tools and environment (BitBucket, Jira, Bamboo)
· Experience in Agile Methodology
· Collaboration: Work closely with data analysts, data engineers, data scientists, and business stakeholders to understand data needs and provide solutions to complex data problems. Strong communication and collaboration skills are required.
· Critical thinking: Ability to analyze and understand complex problems. Strong in user requirements gathering, troubleshooting, maintenance, and support.
· High level of integrity, accountability for own work, and a positive attitude towards teamwork.
· Takes initiative to improve the current state of things and adapts readily to change.
About Great Eastern
Established in 1908, Great Eastern places customers at the heart of everything we do. Our legacy extends beyond our products and services to our culture, which is defined by our core values and how we work. As champions of Integrity, Initiative and Involvement, our core values act as a compass, guiding and inspiring us to embrace the behaviours associated with each value, upholding our promise to our customers - to continue doing our best for them in a sustainable manner.
We work collaboratively with our stakeholders to look for candidates who exhibit or have the potential to embrace our core values and associated behaviours, as these are the key traits that we expect from our employees as they develop their careers with us.
We embrace inclusivity, giving all employees an equal opportunity to shine and play their role in exploring possibilities to deliver innovative insurance solutions.
Since 2018, Great Eastern has been a signatory to the United Nations (UN) Principles for Sustainable Insurance. Our sustainability approach around environmental, social, and governance (ESG) considerations plays a key role in every business decision we make. We are committed to being a sustainability-driven company and to achieving a low-carbon economy by managing the environmental footprint of our operations and incorporating ESG considerations in our investment portfolios; improving people’s lives by actively helping customers live healthier, better and longer; and driving responsible business practices through material ESG risk management.
To all recruitment agencies: Great Eastern does not accept unsolicited agency resumes. Please do not forward resumes to our email or our employees. We will not be responsible for any fees related to unsolicited resumes.