SUMMARY
We are looking for people with a strong background in data science and cloud architecture to join our AI/ML Workload Services team and create exciting new offerings and capabilities for our customers! This team within the Professional Services group works with customers to expand their use of the Snowflake Data Cloud, taking data science pipelines from ideation to deployment and beyond using Snowflake's features and its extensive partner ecosystem. The role is highly technical and hands-on: you will design solutions based on customer requirements and coordinate with customer teams and, where needed, Systems Integrators.
AS A SENIOR SOLUTIONS ARCHITECT - AI/ML AT SNOWFLAKE, YOU WILL:
- Be a technical expert on all aspects of Snowflake in relation to the AI/ML workload.
- Build and deploy ML pipelines using Snowflake features and/or Snowflake ecosystem partner tools based on customer requirements.
- Work hands-on where needed using SQL, Python, Java and/or Scala to build POCs that demonstrate implementation techniques and best practices on Snowflake technology within the Data Science workload.
- Follow best practices, including ensuring knowledge transfer so that customers are properly enabled and able to extend the capabilities of Snowflake on their own.
- Maintain a deep understanding of competitive and complementary technologies and vendors within the AI/ML space, and how to position Snowflake in relation to them.
- Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments.
- Provide guidance on how to resolve customer-specific technical challenges.
- Support other members of the Professional Services team in developing their expertise.
- Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing.
OUR IDEAL SENIOR SOLUTIONS ARCHITECT - AI/ML WILL HAVE:
- Minimum 10 years of experience working with customers in a pre-sales or post-sales technical role.
- Outstanding skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos.
- Thorough understanding of the complete Data Science life-cycle including feature engineering, model development, model deployment and model management.
- Strong understanding of MLOps, coupled with technologies and methodologies for deploying and monitoring models.
- Experience with, and understanding of, at least one public cloud platform (AWS, Azure, or GCP).
- Experience with at least one Data Science tool, such as AWS SageMaker, Azure ML, Dataiku, DataRobot, H2O, or Jupyter Notebooks.
- Hands-on scripting experience with SQL and at least one of the following: Python, Java, or Scala.
- Experience with libraries such as Pandas, PyTorch, TensorFlow, scikit-learn, or similar.
- University degree in computer science, engineering, mathematics, or a related field, or equivalent experience.
BONUS POINTS FOR HAVING:
- Experience with Databricks/Apache Spark
- Experience implementing data pipelines using ETL tools
- Experience working in a Data Science role
- Proven record of success in enterprise software
- Domain expertise in a core vertical such as FSI, Retail, or Manufacturing