- Singaporeans only, due to the nature of the role.
- Required skills:
  - Minimum 1 year of experience designing and implementing ETL solutions and real-time data streams.
  - Experience in Python, Scala, and Java.
  - Experience with big data technologies (Apache Spark).
  - Experience with AWS data storage solutions (S3, Redshift, Iceberg, Aurora).
  - AWS data services certification and/or hands-on experience preferred.
- Responsibilities:
  - Design and develop data ingestion solutions for big data.
  - Build efficient and reliable data processing solutions.
  - Design and implement data storage solutions.
  - Develop scalable data pipelines for ingesting, transforming, and storing large datasets.
  - Optimize data pipelines for both real-time and batch processing.
  - Ensure data quality and integrity throughout the pipeline by implementing effective data validation and monitoring strategies.