Python AWS Data Engineer

Argyll Scott Consulting Pte. Ltd.

We are seeking a highly skilled Python / AWS Data Engineer on a 12-month contract basis to support our project in Singapore.


Responsibilities

  • Design, develop, and maintain complex data pipelines using Python for efficient data processing and orchestration.
  • Collaborate with cross-functional teams to understand data requirements and architect robust solutions within the AWS environment.
  • Implement data integration and transformation processes to ensure optimal performance and reliability of data pipelines.
  • Optimize and fine-tune existing data pipelines and AWS resources to improve efficiency, scalability, and maintainability.
  • Troubleshoot and resolve issues related to data pipelines, ensuring smooth operation and minimal downtime.
  • Collaborate with Data Scientists and Analysts to enable seamless data access and processing for analytics and reporting purposes.
  • Work closely with AWS services like S3, Glue, EMR, Redshift, and other related technologies to design and optimize data infrastructure (see the sketch after this list).
  • Develop and maintain documentation for data pipelines, processes, and system architecture.
  • Stay updated with the latest industry trends and best practices related to data engineering and AWS services.
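
The pipeline work above typically combines Python orchestration code with calls to the AWS SDK. A minimal sketch of such a step, using boto3 to list newly arrived objects in S3 and start an AWS Glue job, might look like the following (the bucket name, Glue job name, and prefix are placeholders, not part of this role's actual environment):

```python
import boto3

# Placeholder names -- replace with the project's actual bucket and Glue job.
RAW_BUCKET = "example-raw-data"
GLUE_JOB_NAME = "example-transform-job"


def list_new_objects(prefix: str) -> list[str]:
    """List raw object keys under a prefix in S3, handling pagination."""
    s3 = boto3.client("s3")
    keys = []
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=RAW_BUCKET, Prefix=prefix):
        keys.extend(obj["Key"] for obj in page.get("Contents", []))
    return keys


def trigger_glue_job(input_prefix: str) -> str:
    """Start an AWS Glue job run for the given input prefix and return its run ID."""
    glue = boto3.client("glue")
    response = glue.start_job_run(
        JobName=GLUE_JOB_NAME,
        Arguments={"--input_prefix": input_prefix},
    )
    return response["JobRunId"]


if __name__ == "__main__":
    keys = list_new_objects("incoming/2024/")
    print(f"Found {len(keys)} raw objects")
    if keys:
        run_id = trigger_glue_job("incoming/2024/")
        print(f"Started Glue job run {run_id}")
```

In practice, a step like this would usually run inside an orchestrator (for example, a scheduled workflow or event trigger) rather than as a standalone script, with monitoring and retries around the Glue job run.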

Requirements

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • Proficiency in Python and SQL for data processing and manipulation.
  • Strong knowledge of AWS services, particularly S3, Glue, EMR, Redshift, and AWS Lambda.
  • Experience with optimizing and scaling data pipelines for performance and efficiency.
  • Excellent problem-solving skills and ability to work in a fast-paced, collaborative environment.
  • Good understanding of data modeling, ETL processes, and data warehousing concepts.
  • Effective communication skills and the ability to articulate technical concepts to non-technical stakeholders.

Preferred Qualifications

  • AWS certification(s) related to data engineering or big data.
  • Experience working with big data technologies like Spark, Hadoop, or related frameworks (see the example after this list).
  • Knowledge of version control systems like Git.
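
For context on the Spark experience mentioned above, a short PySpark sketch of a typical batch aggregation, reading raw data from S3 and writing a curated output back, could look like this (the paths, column names, and filter value are hypothetical):

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical paths and columns, for illustration only.
spark = SparkSession.builder.appName("daily-orders-rollup").getOrCreate()

# Read raw order events from S3.
orders = spark.read.parquet("s3://example-raw-data/orders/")

# Keep completed orders and roll them up by date and region.
daily_totals = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .groupBy("order_date", "region")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("amount").alias("total_amount"),
    )
)

# Write partitioned output back to S3 for downstream analytics.
daily_totals.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-data/daily_order_totals/"
)
```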
