- At least 8 years of data-related working experience, including at least 5 years of hands-on experience in data management and data analytics in the finance, investment, banking, or asset management industries.
- Good understanding of the principles, key controls and processes relating to data management.
- Detail-oriented and meticulous in operations and tasks.
- Good understanding of data modelling.
- Good understanding of OLTP, data lake, and lakehouse technologies, which may include knowledge of Snowflake, S3, AWS Glue, Delta Lake, and Databricks.
- Good understanding of CI/CD pipelines, DevOps, DataOps, and their applicability in financial services.
- Experience working with database technologies (e.g. MS SQL, Snowflake, Redshift, and PostgreSQL) and data integration products (e.g. Informatica, Data Factory, Bash).
- Extensive data-related programming experience in languages such as Python, SQL, Scala, and JavaScript.
- Experience working with business intelligence and data visualization tools (e.g. Tableau, Power BI).
- Experience with SDLC and/or agile methodologies such as Scrum and Kanban, and with related applications (e.g. Confluence and JIRA).
- A good team player with strong analytical skills who enjoys complex problem solving.
- Experience with data engineering platforms and data virtualization tools (e.g. Denodo, Jupyter, O365).
- Excellent written and spoken English and strong presentation skills.
- Strong interpersonal and people leadership skills for interacting with diverse stakeholders.
Responsibilities:
- Work closely with data analysts and business end-users to implement and support data platforms using best-of-breed technologies and methodologies.
- Design and implement robust, scalable solutions that meet business needs while taking operational considerations into account. Demonstrate technical expertise in the assigned areas.
- Manage operations and maintain SLAs. Implement automation in data management. Collaborate with data engineering, architecture, and governance teams.
- Manage the whole data lifecycle, covering data and metadata quality, modeling, lineage, cleansing, onboarding, registration, discoverability, access controls, migration, optimization, and cataloging.
- Conduct requirements workshops with stakeholders; analyze and holistically translate business and other requirements into data strategies, plans, and actions.
- Cover all aspects of data distribution and data services.
- Design and implement scalable, robust data pipelines and data service modules to support growing demand from business users.
- Analyze and resolve day-to-day operational incidents and provide advisory support to business users.
- Monitor upstream changes, perform impact analysis, and mitigate, modify, and implement accordingly.
- Perform impact analysis and communicate changes to downstream users and applications.
- Analyze systems operations data (SLAs, customer satisfaction, delivery quality, team efficiency, etc.) to identify actionable trends for continual improvement.
- Perform data, application, and system checking and testing (e.g. UAT), CI/CD, and go-live activities.