Data Engineer - AWS, Snowflake, Python

HCL Singapore Pte. Ltd.

Responsibilities

  • Provide solutioning for and drive the implementation of DOPS features, including building IaC infrastructure, GitLab restructuring, GitLab upgrades, and monitoring (see the IaC sketch after this list).
  • Own the vision for the data strategy and be able to deliver on it.
  • Lead the design and implementation of data management processes, including data sourcing, integration, and transformation.
  • Manage and lead a team of data professionals, providing guidance and mentoring, and fostering a collaborative, innovative team culture focused on continuous improvement.
  • Evaluate and recommend data-related technologies, tools, and platforms.
  • Collaborate with IT teams to ensure seamless integration of data solutions.
  • Implement and enforce data security protocols and ensure compliance with relevant regulations.
  • Work effectively in a team within a diverse, multi-stakeholder environment.
  • Communicate complex technology solutions to diverse audiences, namely technical, business, and management teams.
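
To illustrate the IaC responsibility above, here is a minimal sketch of an AWS CDK (Python) stack defining an S3 data-lake bucket and a Glue catalog database as code. It is a hedged sketch under stated assumptions, not HCL's actual stack: the stack name `DataLakeStack`, the bucket `RawZoneBucket`, and the database `raw_zone` are all hypothetical.

```python
# A minimal IaC sketch using AWS CDK v2 (Python). All resource names
# here are hypothetical, chosen only for illustration.
from aws_cdk import App, RemovalPolicy, Stack
from aws_cdk import aws_glue as glue
from aws_cdk import aws_s3 as s3
from constructs import Construct


class DataLakeStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Raw-zone bucket; versioned so bad loads can be rolled back.
        s3.Bucket(
            self,
            "RawZoneBucket",
            versioned=True,
            removal_policy=RemovalPolicy.RETAIN,
        )

        # Glue catalog database to hold crawled raw-zone tables.
        glue.CfnDatabase(
            self,
            "RawDatabase",
            catalog_id=self.account,
            database_input=glue.CfnDatabase.DatabaseInputProperty(
                name="raw_zone"
            ),
        )


app = App()
DataLakeStack(app, "DataLakeStack")
app.synth()
```

With the CDK CLI installed, `cdk synth` renders this stack to CloudFormation and `cdk deploy DataLakeStack` provisions it.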

Required Qualifications:

  • Bachelor’s degree in computer science, a STEM field (science, technology, engineering, or mathematics), or a related field.
  • 8+ years of strong data warehousing experience using RDBMS and non-RDBMS databases.
  • At least 5 years of recent hands-on professional experience (actively coding) working as a data engineer (back-end software engineering experience considered).
  • Strong AWS knowledge, covering both the design of new architectures and the optimization of existing ones.
  • Professional experience working in an agile, dynamic, customer-facing environment is required.
  • Understanding of distributed systems and cloud technologies (AWS) is highly preferred.
  • Understanding of data streaming and scalable data processing is preferred.
  • Experience with large-scale datasets and with data lake and data warehouse technologies such as Amazon Redshift, Google BigQuery, and Snowflake; in-depth knowledge of Snowflake and its architecture is preferred.
  • 2+ years of experience with ETL (AWS Glue), Amazon S3, Amazon RDS, Amazon Kinesis, AWS Lambda, Apache Airflow, and AWS Step Functions (a pipeline sketch follows this list).
  • Strong knowledge of scripting languages such as Python and UNIX shell, and of Spark, is required.
  • Understanding of RDBMS, data ingestion, data flows, data integration, etc.
  • Technical expertise with data models, data mining, and segmentation techniques.
  • Experience with the full SDLC and with Lean or Agile development methodologies.
  • Knowledge of CI/CD and Git deployments.
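
As a companion to the ETL bullet above, here is a minimal sketch of an S3-to-Snowflake load orchestrated with Apache Airflow, assuming Airflow 2.4+, boto3, and the Snowflake Python connector are installed. The bucket, prefix, stage, table, and credentials are hypothetical placeholders, not details from this posting.

```python
# A minimal orchestration sketch: verify that raw files landed in S3,
# then COPY them into Snowflake. All names and credentials below are
# hypothetical; in practice they would come from an Airflow connection
# or a secrets backend, not literals.
from datetime import datetime

import boto3
import snowflake.connector
from airflow import DAG
from airflow.operators.python import PythonOperator


def check_raw_files(**_):
    # Fail fast if today's raw extract has not landed in S3 yet.
    s3_client = boto3.client("s3")
    resp = s3_client.list_objects_v2(Bucket="raw-zone-bucket", Prefix="orders/")
    if resp.get("KeyCount", 0) == 0:
        raise ValueError("No raw files found under orders/")


def load_into_snowflake(**_):
    # COPY the staged files into a Snowflake table via an external stage.
    conn = snowflake.connector.connect(
        account="my_account",  # hypothetical account and credentials
        user="etl_user",
        password="***",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        conn.cursor().execute(
            "COPY INTO raw.orders FROM @raw_stage/orders/ "
            "FILE_FORMAT = (TYPE = PARQUET)"
        )
    finally:
        conn.close()


with DAG(
    dag_id="s3_to_snowflake_orders",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    check = PythonOperator(task_id="check_raw_files", python_callable=check_raw_files)
    load = PythonOperator(task_id="load_into_snowflake", python_callable=load_into_snowflake)
    check >> load
```

Splitting the check and the load into separate tasks keeps retries cheap: a late-arriving extract only re-runs the inexpensive S3 check, not the Snowflake COPY.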