Mandatory Skills
- Degree in Computer Science, Information Technology, or a related field.
- At least 3 years of experience in a role focused on developing and supporting data ingestion pipelines.
- Experience building on data platforms, particularly Snowflake.
- Proficient in SQL and Python.
- Experience with continuous integration and continuous deployment (CI/CD).
- Experience with the Software Development Life Cycle (SDLC) methodology.
- Knowledge of data warehousing concepts.
- Strong problem-solving and troubleshooting skills.
- Strong communication and collaboration skills.
- Able to design and implement solutions and to perform code reviews.
- Agile, a fast learner, and able to adapt to change.
Skillsets (Good to have)
- Experience with cloud environments (e.g. AWS).
- Experience in Java.
- Experience in ETL tools (e.g. Informatica).
Brief Job Description
Responsibilities:
- Work closely with data stewards, data analysts, and business end-users to implement and support data solutions.
- Design and build robust and scalable data ingestion and data management solutions for batch loading and streaming from data sources using Python.
- Follow the SDLC process: requirements gathering, design and development, SIT testing, UAT support, and CI/CD deployment for enhancements and new ingestion pipelines.
- Ensure compliance with IT security standards, policies, and procedures.
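To illustrate the kind of batch-ingestion work described above, here is a minimal Python sketch. It is not the employer's actual pipeline: the table name, column layout, and sample data are hypothetical, and Python's built-in sqlite3 stands in for a Snowflake connection so the example is self-contained.

```python
import csv
import io
import sqlite3


def ingest_batch(conn, csv_text, table="sales"):
    """Load one CSV batch into a table, creating the table if needed.

    In a real pipeline the connection would come from the Snowflake
    Python connector; sqlite3 is used here only as a stand-in.
    """
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    cur = conn.cursor()
    # Hypothetical target schema for this sketch.
    cur.execute(f"CREATE TABLE IF NOT EXISTS {table} (id INTEGER, amount REAL)")
    cur.executemany(
        f"INSERT INTO {table} (id, amount) VALUES (:id, :amount)",
        [{"id": int(r["id"]), "amount": float(r["amount"])} for r in rows],
    )
    conn.commit()
    return len(rows)


conn = sqlite3.connect(":memory:")
batch = "id,amount\n1,9.99\n2,4.50\n"
loaded = ingest_batch(conn, batch)
print(loaded)  # prints 2
```

A production version would add the concerns the posting lists: parameterized configuration per source, error handling and retries, SIT/UAT test coverage, and deployment through a CI/CD pipeline.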