Data Engineer
Job Purpose
This role is responsible for the design, development, testing, implementation and maintenance of data integration and data management solutions.
Job Requirements
- At least 10 years of IT work experience, including 6 years of strong experience in Informatica as an ETL Architect/Lead and 2 years leading remote teams
- Solid SQL skills, including experience querying and tuning large, complex data sets and performance analysis
- Knowledge of Python
- Experience with Azure Data Factory, Azure Synapse Analytics, or Fabric Data Pipelines
- Knowledge of M Query or Dataflow Gen2
- Strong knowledge of ETL implementation from source APIs to target data stores such as Azure storage, Amazon S3 and on-premises storage
- Good knowledge of networking and/or operating systems
- Microsoft Certified: Fabric Analytics Engineer Associate certification preferred
- Data security, protection or privacy-related certification is a bonus
Responsibilities:
- Drive requirements-gathering workshops to study as-is systems and processes, and identify gaps between the proposed design and gathered requirements
- Responsible for high-level and detailed technical solution design, in adherence to data management standards and the organization's IT policies
- Responsible for the design, development, testing and implementation of data integration and data management solutions
- Responsible for data profiling and data quality improvement recommendations
- Lead technical members from different teams to deliver end-to-end solutions that meet time, scope and quality requirements
- Work with data engineers, data analysts and data scientists in the implementation of analytics solutions
- Work with OLTP systems and Enterprise Integration teams in the implementation of end-to-end data integration solutions.
Thanks, and Best Regards
Karanam Vijaya Kiran
(EA Registration no: R1443178)
Recruitment Manager
Hand Phone: +65 92333815
Helius Technologies Pte Ltd (EA Licence No: 11C3373)