Qualifications:
- 3-5 years of experience in data engineering or a related field.
- Strong knowledge of data warehouse concepts, ETL processes, and data modeling techniques.
- Experience with cloud-based data platforms (e.g., AWS, Snowflake).
- Proficiency in SQL and experience with NoSQL databases.
- Experience with big data technologies such as Hadoop, Spark, or Kafka.
- Knowledge of data governance principles and data privacy regulations.
Job-Specific Technical Skills:
- Proficiency in Python or Scala for data processing and automation.
- Experience with ETL tools (e.g., Apache NiFi, Talend, Informatica).
- Knowledge of data visualization tools (e.g., Tableau, Power BI) to support data quality checks and pipeline monitoring.
- Familiarity with version control systems (e.g., Git) and CI/CD practices.
- Experience with container technologies (e.g., Docker) and orchestration tools (e.g., Kubernetes).
- Understanding of data security best practices and their implementation.