• Analyze the Authority's data needs and document the requirements.
• Refine data collection and consumption by migrating collection to more efficient channels.
• Plan, design and implement data engineering jobs and reporting solutions to meet the Authority's analytical needs.
• Develop test plans and scripts for system testing, and support user acceptance testing.
• Work with the Authority's technical teams to ensure smooth deployment and adoption of the new solution.
• Ensure the smooth operation and service levels of IT solutions.
• Support and resolve production issues.
• Track record in implementing systems with high availability, high performance and high security.
• Good understanding of, and track record of completing, projects using Waterfall/Agile methodologies.
• Good understanding of analytics and data warehouse implementations.
• Ability to troubleshoot complex issues, ranging from system resource problems to application stack traces.
• Strong SQL, data modelling and data analysis skills are a must.
• Hands-on experience building big data engineering jobs with Python, PySpark, Linux and ETL tools such as Informatica.
• Track record of implementing systems using Hive, Impala and Cloudera Data Platform is preferred.
• Hands-on experience with DevOps deployment and data virtualization tools such as Denodo is preferred.
• Understanding of, or hands-on experience with, reporting and visualization tools such as SAP BO and Tableau is beneficial but not required.
• Passion for automation, standardization and best practices.
• Good written and verbal communication and interpersonal skills, with the ability to understand business requirements and communicate confidently with stakeholders.