
ETL Engineer

Flintex Consulting Pte. Ltd.


Objectives of this position:

The objective of this position is to manage the extract/transform/load (ETL) processes and ensure data availability.

Responsibilities:


The holder of the position is mainly responsible for the following areas, in coordination with his or her superior:

• Design, create, and modify extract/transform/load (ETL) pipelines in Azure Data Factory, ensuring efficient data flow from source to destination.

• Ensure data accuracy and integrity throughout the ETL processes via data validation, cleansing, deduplication, and error handling, so that reliable, usable data is ingested.

• Monitor ETL processes and optimize ETL pipelines for speed and efficiency, addressing bottlenecks and ensuring the ETL system can handle the volume, velocity, and variety of data.

• Participate in data modeling, designing data structures and schemas in the data warehouse to optimize query performance and align with business needs.

• Work closely with different departments and the IT team to understand data requirements and deliver the data infrastructure that supports business goals.

• Provide technical support for ETL systems, troubleshooting issues and ensuring the continuous availability and reliability of data flows.

• Ensure proper documentation of data sources, ETL processes, and data architecture.
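For illustration only (not part of the role's requirements), the validation, cleansing, and deduplication step described above might be sketched as follows; the record fields ("order_id", "amount") are hypothetical examples, not fields from any actual pipeline:

```python
def cleanse_records(records):
    """Validate, cleanse, and deduplicate raw records before loading.

    Returns (clean, rejected): clean records ready to load, and rejected
    records paired with the reason they failed validation.
    """
    seen_ids = set()
    clean, rejected = [], []
    for rec in records:
        # Validation: required fields must be present.
        order_id = rec.get("order_id")
        amount = rec.get("amount")
        if order_id is None or amount is None:
            rejected.append((rec, "missing required field"))
            continue
        # Cleansing: normalize the amount to a numeric type.
        try:
            amount = float(amount)
        except (TypeError, ValueError):
            rejected.append((rec, "non-numeric amount"))
            continue
        # Deduplication: keep only the first record per order_id.
        if order_id in seen_ids:
            continue
        seen_ids.add(order_id)
        clean.append({"order_id": order_id, "amount": amount})
    return clean, rejected
```

In a managed service such as Azure Data Factory, equivalent logic would typically live in a data flow or downstream transformation rather than hand-written code, but the validate/cleanse/deduplicate sequence is the same.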

Requirements:


3 to 5 years of data engineering experience with Snowflake

3 to 5 years in the upstream/downstream Retail industry and/or the Supply Chain / Manufacturing domain

Sound understanding of data quality principles and data governance best practices

Proficiency in data analytics languages such as Python, Java, or Scala

Knowledge of big data technologies such as Hadoop, Spark, and distributed computing frameworks for managing large-scale data processing

Proficient in using version control systems like Git for managing code and configurations.

SnowPro Core Certification and SnowPro Advanced Certification will be an advantage
