(Senior) Data Engineer

John Ethans International Pte. Ltd.


Our client is a reputable large ICT organisation. We are sourcing on behalf of our client for a Data Engineer with experience in cloud-based data engineering. The seniority level will depend on your depth of experience.


Responsibilities:

  • Translate data requirements from business users into technical specifications.
  • Architect and build ingestion pipelines to collect, clean, merge, and harmonize data from different source systems.
  • Monitor databases and ETL systems, including database capacity planning and maintenance.
  • Construct, test, and update useful and reusable data models based on the data needs of end users.
  • Design and build secure mechanisms for end users and systems to access data in the data warehouse.
  • Research, propose, and develop new technologies and processes to improve the agency's data infrastructure.
  • Maintain the data catalogue to document data assets, metadata, and lineage.
  • Implement data quality checks and validation processes to ensure data accuracy and consistency.
  • Implement and enforce data security best practices, e.g. access control, encryption, and data masking, to safeguard sensitive data.

Requirements:

  • A Bachelor’s Degree in Computer Science, Software Engineering, Information Technology, or related disciplines.
  • Deep understanding of system design, data structure and algorithms, data modelling, data access, and data storage.
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud.
  • Experience in architecting data and IT systems.
  • Experience with orchestration frameworks such as Airflow or Azure Data Factory.
  • Experience with distributed data technologies such as Spark or Hadoop.
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Proficiency in writing SQL for databases.
  • Familiarity with building and using CI/CD pipelines.
  • Familiarity with DevOps tools such as Docker, Git, and Terraform.
  • Experience in designing, building, and maintaining batch and real-time data pipelines.
  • Experience with Databricks.
  • Experience with implementing technical processes to enforce data security, data quality, and data governance.
