Responsibilities
• Develop, maintain, and tune highly complex scripts using Python and BigQuery.
• Manage and design solutions related to data engineering, data analysis, data modelling, data warehousing, data security, and ETL.
• Implement CI/CD for cloud solutions.
• Automate infrastructure deployment using Terraform and GitHub Actions.
• Develop analytical functionality and complex transformations for deployment to production data platforms.
Skills/Requirements
• Able to work independently as an individual contributor on projects across the organization in which on-premises applications are migrated to Google Cloud Platform.
• Sound knowledge of the financial services logical data model (Teradata FSLDM), Qlik Sense/Discovery, QuerySurge, Power BI, Aldon, Control-M, Bitbucket, JIRA, and Jenkins.
• Manage communication with the client; independently drive and deliver application modules.
• Good experience with Google Cloud Platform (Cloud Storage, BigQuery, Dataflow, Dataproc, Composer), Python, PySpark, Terraform, GitHub, and data pipeline development.
• Hands-on experience ingesting source files (CSV/JSON/XML) into GCP.
• Experience with real-time GCP implementations (migration project exposure).
• Good understanding of the data landscape and the ability to grasp end-to-end architecture.
• Exposure to the banking domain is a plus.
• Skills required: ETL, Python, cloud, Terraform, PySpark, GitHub, FSLDM, and database querying using SQL/PL/SQL.