Cloud Data Engineer (AWS, Databricks, and Informatica IDMC)

Helius Technologies Pte. Ltd.

Do you love healthtech and Singapore put together? Then you just might love this company.

The company is a multiple award-winning Healthcare IT Leader that digitises, connects, and analyses Singapore's health ecosystem. Its ultimate aim is to improve the Singapore population's health and health administration by integrating intelligent, highly resilient, and cost-effective technologies with processes and people.

A Cloud Data Engineer specialising in AWS, Databricks, and Informatica IDMC is responsible for building and maintaining a robust, integrated, and governed data infrastructure that leverages the strengths of these three platforms to extract valuable insights from data while ensuring data security, compliance, and high-quality data management.


Roles and Responsibilities:

• Design and architect data storage solutions, including databases, data lakes, and warehouses, using AWS services such as Amazon S3, Amazon RDS, Amazon Redshift, and Amazon DynamoDB, along with Databricks' Delta Lake. Integrate Informatica IDMC for metadata management and data cataloging.

• Create, manage, and optimize data pipelines for ingesting, processing, and transforming data, using AWS services such as AWS Glue, AWS Data Pipeline, and AWS Lambda; Databricks for advanced data processing; and Informatica IDMC for data integration and quality.
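The ingest → process → transform staging that this responsibility describes can be illustrated with a minimal sketch. This is plain standard-library Python, not the actual AWS Glue, Lambda, or Databricks APIs, and all function and field names (`ingest`, `patient_id`, `ward`) are hypothetical placeholders:

```python
import json
from typing import Iterable, Iterator


def ingest(raw_lines: Iterable[str]) -> Iterator[dict]:
    """Parse raw JSON lines, skipping records that fail to parse."""
    for line in raw_lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue  # a real pipeline would route these to a dead-letter queue


def transform(records: Iterable[dict]) -> Iterator[dict]:
    """Keep only records with a patient_id and normalise the ward field."""
    for rec in records:
        if "patient_id" not in rec:
            continue
        rec["ward"] = rec.get("ward", "unknown").strip().lower()
        yield rec


def load(records: Iterable[dict]) -> list:
    """Stand-in for writing to a warehouse table; here we just collect."""
    return list(records)


raw = ['{"patient_id": 1, "ward": " ICU "}', 'not json', '{"ward": "a"}']
result = load(transform(ingest(raw)))
# Only the first record survives: the parse failure and the record
# missing patient_id are both dropped.
```

In a managed service the same stages map onto, for example, an ingestion trigger, a transformation job, and a warehouse sink; chaining generators keeps the sketch lazy, which mirrors how streaming pipelines process records incrementally.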

• Integrate data from various sources, both internal and external, into AWS and Databricks environments, ensuring data consistency and quality, while leveraging Informatica IDMC for data integration, transformation, and governance.

• Develop ETL (Extract, Transform, Load) processes to cleanse, transform, and enrich data, making it suitable for analytical purposes using Databricks' Spark capabilities and Informatica IDMC for data transformation and quality.
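The cleansing and enrichment step above can be shown at record level. This is a hedged, standard-library sketch rather than Spark or Informatica IDMC code; the field names (`record_id`, `birth_date`) and the fixed as-of date are illustrative assumptions:

```python
from datetime import date


def cleanse_and_enrich(rows: list) -> list:
    """Deduplicate on record_id, drop rows missing a birth_date,
    and derive a simple age field for analytical use."""
    seen = set()
    out = []
    for row in rows:
        rid = row.get("record_id")
        if rid is None or rid in seen or "birth_date" not in row:
            continue
        seen.add(rid)
        born = date.fromisoformat(row["birth_date"])
        # Year-difference age, as of a fixed date to keep the example
        # deterministic; a production rule would account for month/day.
        out.append({**row, "age": date(2024, 1, 1).year - born.year})
    return out


rows = [
    {"record_id": "a", "birth_date": "1990-05-01"},
    {"record_id": "a", "birth_date": "1990-05-01"},  # duplicate
    {"record_id": "b"},                              # missing birth_date
]
clean = cleanse_and_enrich(rows)
```

In a Spark-based ETL job the same rules would typically be expressed as DataFrame operations (dropDuplicates, filters, and derived columns) so they scale across partitions.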

• Monitor and optimize data processing and query performance in both AWS and Databricks environments, making necessary adjustments to meet performance and scalability requirements. Utilize Informatica IDMC for optimizing data workflows.

• Implement security best practices and data encryption methods to protect sensitive data in both AWS and Databricks, while ensuring compliance with applicable data protection requirements.
