Responsibilities:
- Design and implement scalable data platforms
- Develop and maintain ETL pipelines
- Maintain and review database schema designs and perform database performance tuning
- Ensure high availability and disaster recovery for databases and data platforms, including backup and restore
- Classify and label sensitive data elements, and collaborate with security and compliance teams to enforce data protection policies
- Implement and monitor access controls to ensure data security and confidentiality
- Monitor and assess data quality through processes such as profiling, validation, and cleansing
- Collaborate with data owners and stewards to define and enforce data quality standards
- Establish clear policies, standards, and procedures to ensure consistent, high-quality data across the organization
- Work closely with internal stakeholders from cross-functional teams to understand data requirements and business processes
- Act as a liaison between business and technical teams to promote a collaborative approach to data governance
- Ensure compliance with relevant data protection regulations, industry standards, and internal policies
- Conduct regular audits to assess adherence to data governance policies and recommend improvements as needed
Requirements:
- Bachelor's degree or equivalent in Computer Science or a related field
- Proficiency with ETL tools and data analytics platforms
- Minimum 5 years of experience in data engineering
- Experience with data warehousing and data modeling
- Experience with DBA tasks such as performance tuning, backup, and recovery
- Experience with scripting (Python/Scala) for automation
- Experience with change data capture and data replication
- Familiarity with cloud-based infrastructure (AWS/Azure/GCP)