Data Engineer
Responsibilities
- Design, develop, and implement high-performance, at-scale data solutions and pipelines (e.g. batch, real-time) to process large volumes of varied data formats from disparate sources
- Analyse, design and develop data elements/flows, dependencies and relationships, and physical/logical data models
- Maintain, monitor, and tune the performance of data pipelines and processes; diagnose and troubleshoot issues in pipelines and in data exchanges with other applications
- Prepare progress and activity reports on the platform's status and health (e.g. SLAs, data quality)
- Collaborate with data scientists/analysts to develop and implement end-to-end data solutions
- Work with Agile/DevOps practices, microservices, and APIs
Requirements
- Bachelor's degree in Computer Science/Engineering, Information Systems, or a related field
- 2-3 years of relevant work experience
- Experience designing and developing data models, integrating multiple data sources, and building ETL/ELT pipelines
- Experience with the Snowflake cloud data warehouse and relational SQL databases
- Familiarity with Microsoft Azure cloud services (ADF, Databricks, ADLS, Azure Functions, Azure Stream Analytics, etc.)
- Experience with Python, R, and Linux/Unix/Windows scripting would be an added advantage
- Experience with development and collaboration tools (e.g. GitLab, Jira, Confluence)
- Effective verbal and written communication skills, with the ability to write thorough and clear documentation