Responsibilities & Requirements:
· Work closely with data stewards, data analysts and business end-users to implement and support data solutions.
· Design and build robust, scalable data ingestion and data management solutions in Python for batch loading and streaming from data sources.
· Follow the full SDLC process: requirements gathering, design and development, system integration testing (SIT), UAT support, and CI/CD deployment for enhancements and new ingestion pipelines.
· Ensure compliance with IT security standards, policies, and procedures.
Must-Have Skillset:
· At least 3 years of experience in a role focused on developing and supporting data ingestion pipelines.
· Experience building on data platforms, particularly Snowflake.
· Proficient in SQL and Python.
· Experience in continuous integration and continuous deployment (CI/CD).
· Experience in Software Development Life Cycle (SDLC) methodology.
· Experience in data warehousing concepts.
· Strong problem-solving and troubleshooting skills.
· Strong communication and collaboration skills.
· Able to design and implement solutions and perform code reviews.
· Agile, fast learner, and able to adapt to changes.
Good-to-Have (Skillset):
· Experience with cloud environments (e.g. AWS).
· Experience in Java.
· Experience with ETL tools (e.g. Informatica).
SKILLS: Python, SQL, Snowflake, SDLC, SIT, UAT, ETL, AWS, Informatica.
Please refer to U3’s Privacy Notice for Job Applicants/Seekers at https://u3infotech.com/privacy-notice-job-applicants/. When you apply, you voluntarily consent to the collection, use and disclosure of your personal data for recruitment/employment and related purposes.