- Collaborate with data engineers to design and implement cloud-based and on-prem architectures, managing infrastructure across AWS and OpenShift Container Platform (OCP) with automation tools such as Jenkins and Ansible.
- Establish and manage CI/CD pipelines and data orchestration for testing, deployment, and monitoring of data platform components across hybrid environments.
- Maintain comprehensive documentation of data infrastructure and processes. Offer training and support to internal and external team members on DataOps practices.
Requirements:
- Degree in IT, Computer Science, Data Analytics, or a related field.
- 2 to 4 years of experience in Data Engineering, DevOps, or related fields.
- Proven experience working in a mature, DevOps-enabled environment with well-established cloud practices.
- Familiarity with data platforms such as Databricks, Snowflake, or IBM watsonx, and experience managing infrastructure across public cloud and on-prem environments, particularly with OpenShift Container Platform (OCP).
- Knowledge of automation tools such as Ansible and Terraform, and of CLI tooling across hybrid cloud environments.
- Competence in designing and implementing data ingestion, ETL frameworks, dashboard ecosystems, and data orchestration, with hands-on coding skills in Python, Spark, and SQL.
- Working knowledge of CI/CD best practices, with experience in setting up and managing CI/CD pipelines for continuous integration, testing, and deployment.
- Ability to maintain clear documentation of data infrastructure and processes, with experience in providing training on DataOps practices.
Preferred:
- Certifications in cloud technology platforms (such as cloud architecture, container platforms, or systems and/or network virtualization).
- Knowledge of telecom networks, including mobile and fixed networks, will be an added advantage.
- Familiarity with data fabric and data mesh concepts, including their implementation and benefits in distributed data environments, is a bonus.