1. Responsibilities
· Analyse the Authority’s data needs and document the requirements.
· Streamline data collection and consumption by migrating to more efficient channels.
· Plan, design and implement data engineering jobs and reporting solutions to meet analytical needs.
· Develop test plans and scripts for system testing, and support user acceptance testing.
· Work with the Authority’s technical teams to ensure smooth deployment and adoption of new solutions.
· Ensure smooth operations and service levels of IT solutions.
· Support and resolve production issues.
2. What we are looking for
· Proven experience delivering projects using Waterfall or Agile methodologies.
· Strong SQL, data modelling and data analysis skills are a must.
· Hands-on experience with DevOps deployment and data virtualisation tools such as Denodo is preferred.
· Understanding of reporting or visualisation tools such as SAP BO and Tableau is important.
· Track record of implementing systems using Hive, Impala and Cloudera Data Platform is preferred.
· Hands-on experience in big data engineering jobs using Python, PySpark, Linux, and ETL tools such as Informatica.
· Good understanding of analytics and data warehouse implementations.
· Ability to troubleshoot complex issues, from system resource constraints to application stack traces.
· Track record of implementing highly available, high-performance and secure systems hosted in data centres or hybrid cloud environments is an added advantage.