Strong AWS knowledge, covering the design of new architectures and the optimization of existing ones.
▪ In-depth knowledge of Snowflake and its architecture.
▪ Provide solutions and drive the implementation of DOPS features, including building IaC infrastructure, GitLab restructuring, and monitoring GitLab upgrades.
▪ Should have a clear vision for data strategy and the ability to deliver on it.
▪ Should be able to lead the design and implementation of data management processes, including data sourcing, integration, and transformation.
▪ Able to manage and lead a team of data professionals, providing guidance and mentoring, and fostering a collaborative and innovative team culture focused on continuous improvement.
▪ Evaluate and recommend data-related technologies, tools, and platforms.
▪ Collaborate with IT teams to ensure seamless integration of data solutions.
▪ Experience implementing and enforcing data security protocols and ensuring compliance with relevant regulations.
Required Professional and Technical Expertise:
Experience as a Lead: 5+ years, with the strength of character to lead a small team.
▪ Bachelor's degree in computer science or a STEM (science, technology, engineering, or mathematics) related field.
▪ 8+ years of strong data warehousing experience using RDBMS and non-RDBMS databases.
▪ At least 5 years of recent hands-on professional experience (actively coding), working as a Lead handling support and production issues.
▪ Professional experience working in an agile, dynamic, and customer-facing environment is required.
▪ Understanding of distributed systems and cloud technologies (AWS) is highly preferred.
▪ Understanding of data streaming and scalable data processing is preferred.
▪ Experience with large-scale datasets and with data lake and data warehouse technologies such as Amazon Redshift, Google BigQuery, and Snowflake; Snowflake is highly preferred.
▪ 2+ years of experience with ETL (AWS Glue), Amazon S3, Amazon RDS, Amazon Kinesis, AWS Lambda, Apache Airflow, and AWS Step Functions.
▪ Strong knowledge of scripting languages such as Python and UNIX shell, along with Spark, is required.
▪ Understanding of RDBMS, data ingestion, data flows, data integration, etc.
▪ Technical expertise with data models, data mining, and segmentation techniques.
▪ Experience with the full SDLC and Lean or Agile development methodologies.
▪ Knowledge of CI/CD and Git-based deployments.
▪ Ability to work in a team within a diverse, multi-stakeholder environment.
▪ Ability to communicate complex technology solutions to diverse audiences, including technical, business, and management teams.