Job Responsibilities:
- Data Architecture & Analytics Platform Development: Spearhead the design and implementation of data architectures and analytics platforms across on-premises, cloud, and hybrid environments.
- Information Delivery & Analytics Expertise: Lead efforts in data preparation, visualization, and advanced analytics using BI tools and cutting-edge AI/ML techniques for insights and predictive analytics.
- AI/ML Operations: Oversee the integration, deployment, and continuous monitoring of AI/ML solutions and products.
- Data Management: Ensure the ethical use and proper control of analytics products by applying expert-level knowledge in data management.
- Strategic Partnering: Collaborate with various business units at Nomura to shape and drive their information and analytics strategies, guiding their adoption roadmaps.
Core Skills Requirements:
- Scalable Data Pipelines: Design and develop robust data pipelines to efficiently process and integrate large datasets from diverse sources.
- Data Modeling & ETL: Construct and manage physical data models and ETL processes to guarantee data quality, integrity, and accessibility.
- Microservices Development: Build and maintain scalable, resilient microservices, including the development of efficient server-side APIs.
- Deployment & DevOps: Expertise in CI/CD, Jenkins, Ansible, and DevOps processes, along with enterprise integration patterns.
- Programming & Orchestration: Proficiency in programming languages such as Python and Java, with hands-on experience in orchestration tools like Airflow.
- Cloud Technologies: Skilled in cloud platforms such as AWS EC2, EMR, and Snowflake, with the ability to lead discussions on hybrid data architecture and design.
- REST Services & Scripting: Proficient in RESTful services, JSON data handling, Python 3, and Linux/Unix shell scripting.
- Modern Data Management: Expertise in contemporary data management methodologies and architecture, including the development of data products and the implementation of data mesh.
- Machine Learning Expertise: Experience with ML frameworks and libraries such as LangChain, TruLens, MLflow, TensorFlow, scikit-learn, or PyTorch.
- Model Deployment & Monitoring: Deploy machine learning models in production environments and monitor their ongoing performance.
- Data Collection & Analysis: Skilled in collecting, cleaning, and analyzing large datasets to train and evaluate machine learning models.
- Cultural Awareness & Team Collaboration: Ability to work effectively with diverse, cross-cultural, and globally dispersed teams, adapting to cultural differences.
- Adaptability: Flexible in managing multiple demands, shifting priorities, ambiguity, and rapid change.
- Stakeholder Management: Experience in managing senior stakeholders is a plus.
- Communication & Problem-Solving: Strong communication, presentation, and interpersonal skills, with the ability to analyze complex situations and propose actionable solutions.
- Constructive Challenge: Capable of constructively challenging current practices and requirements to maximize value to the firm.
Education and Experience:
- Bachelor's or Master's degree in a quantitative field such as Computer Science, Statistics, or a related discipline.
- At least 5 years of relevant experience in data engineering, MLOps, or full-stack engineering, ideally within financial organizations.
- Experience working with multicultural, multidisciplinary, and globally dispersed teams.
If you are keen on this role and would like to discuss the opportunity further, please click "Apply Now" or email Kin Long at [email protected] with your updated CV.
Only shortlisted candidates will be contacted; if you do not receive a response within 14 days, please accept this as notification that you have not been shortlisted.
Morgan McKinley Pte Ltd
EA Licence No: 11C5502 | EAP Registration No: R2095054