Your Role and Responsibilities
▪ Strong AWS knowledge, covering both the design of new architectures and the optimization of existing ones.
▪ In-depth knowledge of Snowflake and its architecture.
▪ Provide solutions for, and drive the implementation of, DOPS features, including building IaC infrastructure, GitLab restructuring, GitLab upgrades, monitoring, etc.
▪ Should have a vision for data strategy and the ability to deliver on it.
▪ Should be able to lead the design and implementation of data management processes, including data sourcing, integration, and transformation.
▪ Able to manage and lead a team of data professionals, providing guidance and mentoring, and fostering a collaborative, innovative team culture focused on continuous improvement.
▪ Evaluate and recommend data-related technologies, tools, and platforms.
▪ Collaborate with IT teams to ensure seamless integration of data solutions.
▪ Should have experience implementing and enforcing data security protocols and ensuring compliance with relevant regulations.
Required Technical and Professional Expertise
▪ Experience as a Lead: 5+ years, with demonstrated strength of character in leading a small team.
▪ Bachelor's degree in computer science or another STEM (science, technology, engineering, or mathematics) field.
▪ 8+ years of strong data warehousing experience with RDBMS and non-RDBMS databases.
▪ At least 5 years of recent hands-on professional experience (actively coding) as a Lead handling support and production issues.
▪ Professional experience working in an agile, dynamic, customer-facing environment is required.
▪ Understanding of distributed systems and cloud technologies (AWS) is highly preferred.
▪ Understanding of data streaming and scalable data processing is preferred.
▪ Experience with large-scale datasets and data lake and data warehouse technologies such as Amazon Redshift, Google BigQuery, and Snowflake; Snowflake is highly preferred.
▪ 2+ years of experience with ETL (AWS Glue), Amazon S3, Amazon RDS, Amazon Kinesis, AWS Lambda, Apache Airflow, and AWS Step Functions (an illustrative sketch of this kind of work follows this list).
▪ Strong knowledge of scripting in Python and UNIX shell, plus Spark, is required.
▪ Understanding of RDBMS concepts, data ingestion, data flows, data integration, etc.
▪ Technical expertise with data models, data mining and segmentation techniques.
▪ Experience with the full SDLC and Lean or Agile development methodologies.
▪ Knowledge of CI/CD and Git deployments.
▪ Ability to work in a team in a diverse, multi-stakeholder environment.
▪ Ability to communicate complex technology solutions to diverse audiences, namely technical, business, and management teams.
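
For candidates gauging the hands-on ETL expectation above, here is a minimal Python sketch of the kind of AWS Glue and S3 work involved. It is illustrative only and assumes boto3 is installed and configured with credentials; the job name, bucket, and key used are hypothetical placeholders, not part of this posting.

    import boto3

    # Illustrative sketch only: "nightly-sales-etl", "example-data-lake", and
    # "raw/sales.csv" below are hypothetical placeholders.
    glue = boto3.client("glue")
    s3 = boto3.client("s3")

    def run_etl(job_name: str, bucket: str, key: str) -> str:
        """Start an AWS Glue job against an S3 object and return the run id."""
        s3.head_object(Bucket=bucket, Key=key)  # fail fast if the source object is missing
        run = glue.start_job_run(
            JobName=job_name,
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        return run["JobRunId"]

    def job_state(job_name: str, run_id: str) -> str:
        """Return the Glue run's current state (e.g. RUNNING, SUCCEEDED, FAILED)."""
        return glue.get_job_run(JobName=job_name, RunId=run_id)["JobRun"]["JobRunState"]

    # Example usage:
    # run_id = run_etl("nightly-sales-etl", "example-data-lake", "raw/sales.csv")
    # print(job_state("nightly-sales-etl", run_id))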