Job Overview
We are seeking a highly skilled and motivated Data Operations Engineer to join our Data Solutions team, with a focus on commodity trading, logistics, risk management, and storage. The ideal candidate will be responsible for developing, deploying, and maintaining our data pipelines and infrastructure, ensuring seamless data ingestion, transformation, and ongoing operations. You will work closely with data engineers, data scientists, and business stakeholders to ensure data quality, reliability, and performance.
Duties and Responsibilities
1. Data Ingestion and Transformation:
- Design, develop, and maintain data ingestion processes using Azure Data Factory, Azure Functions, and Logic Apps in close alignment with the DevOps team.
- Implement and manage data transformations using Databricks and Delta Live Tables, including the DLT framework.
- Develop and optimize ETL processes to ensure data is ingested, transformed, and stored effectively.
2. Maintenance and Operations:
- Monitor and maintain data pipelines to ensure continuous and reliable data flow.
- Troubleshoot and resolve issues related to data ingestion, transformation, and storage.
- Implement and manage automation tools to improve operational efficiency and reduce manual intervention.
- Ensure the scalability and performance of the data infrastructure.
3. Automation and Monitoring:
- Automate data workflows and processes to improve efficiency and reduce manual intervention.
- Implement and operate CI/CD pipelines.
- Implement monitoring and alerting systems to ensure data pipeline reliability and performance.
4. Collaboration and Communication:
- Work closely with data engineers, data scientists, and business stakeholders to understand data requirements and to deliver and operate solutions.
- Provide technical support and guidance to team members and stakeholders.
5. Data Quality and Governance:
- Implement data quality checks and validation processes to ensure data integrity.
- Adhere to data governance policies and best practices.
Qualifications
Bachelor’s degree in Computer Science, Artificial Intelligence, or a related field.
Experience
At least 4 years of proven experience as a Data Operations Engineer or in a similar role, ideally within an agile delivery environment.
Skills and Competencies
- Strong technical skills in a cloud environment, including:
  - Azure Data Factory, Azure Functions, and Logic Apps
  - Databricks and Delta Live Tables, including the DLT framework
  - Programming languages such as Python, SQL, and Scala
  - CI/CD pipelines and DevOps best practices
- Familiarity with data governance and data quality best practices
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills in a hybrid and remote team setup.
- Previous experience in commodity trading or banking, with a focus on (credit) risk management, product control, or trading itself, is a strong plus.
If you meet the qualifications and are interested in this opportunity, please submit your application via "Apply Now". We look forward to reviewing your credentials.