Job description & Responsibilities:
- Work closely with data stewards, data analysts, and business end users to implement and support data solutions.
- Design and build robust and scalable data ingestion and data management solutions for batch loading and streaming data sources using Python.
- Follow the SDLC process: requirement gathering, design and development, SIT testing, UAT support, and CI/CD deployment for enhancements and new ingestion pipelines.
- Ensure compliance with IT security standards, policies, and procedures.
- Provide BAU support, including production job monitoring, issue resolution, and bug fixes.
- Enable ingestion checks and data quality checks for all data sets in the data platform, ensuring data issues are actively detected, tracked, and fixed without breaching SLAs.
Requirements:
- At least 3 years of experience in a role focused on the development and support of data ingestion pipelines.
- Experience with building on data platforms using Snowflake.
- Proficient in SQL and Python.
- Experience with continuous integration and continuous deployment (CI/CD).
- Experience with Software Development Life Cycle (SDLC) methodology.
- Experience with data warehousing concepts.
- Strong problem solving and troubleshooting skills.
- Strong communication and collaboration skills.
- Able to design and implement solutions and perform code reviews.
- Agile, a fast learner, and able to adapt to change.
SKILLS: Python, SQL, data warehousing, Snowflake
Please refer to U3’s Privacy Notice for Job Applicants/Seekers at https://u3infotech.com/privacy-notice-job-applicants/. When you apply, you voluntarily consent to the collection, use and disclosure of your personal data for recruitment/employment and related purposes.