Job description and responsibilities:
Understand requirements and create documentation for business specifications, technical specifications, and SIT test cases for data pipelines and internal processes, following industry best practices.
Set up data pipeline infrastructure in strict accordance with the organization's technical guidelines and best practices.
Implement data pipeline enhancements that reduce data quality and operational issues and improve the user experience.
Perform data pipeline monitoring and job recovery according to the team's arrangements.
Skillset Requirements:
• Bachelor's degree in Computer Science, Computer Engineering, or an equivalent field.
• At least 3 years' experience as a developer/analyst in the data field.
• Experience with the Systems Development Life Cycle (SDLC) implementation methodology.
• Hands-on experience creating business specifications, technical specifications, SIT test documentation, and the UAT sign-off process.
• Good knowledge of implementing ETL pipelines using Informatica BDM on data warehouses and data platforms such as RDBMS and Snowflake.
• Exposure to and knowledge of the following technologies is advantageous:
  - SDLC Process – Confluence, JIRA, change requests, SIT/UAT
  - Big Data Platforms – Snowflake
  - OS, Programming, and Scripting – Linux, Python, shell scripting
  - SQL Databases – Oracle, MS SQL
• Understand and apply good industry practices for code versioning, testing, CI/CD workflows, and code documentation.
• Good communication skills, required for interacting with data stewards, data engineers, and business users to understand requirements.
• Detail-oriented and meticulous in operational work.