Group Technology and Operations (GTO) provides software and system development, information technology support services, and banking operations. We have centralized and standardized our technology components in Singapore, creating a global footprint that supports our regional subsidiaries and branches around the world. This architecture lets us operate in and support 19 countries with a secure and flexible banking infrastructure. Our Operations divisions provide transactional customer services for our businesses while also driving cost efficiency through process improvements, automation, and straight-through processing.
Job Responsibilities
This role will lead the platform engineering aspects of the Data Analytics domain, which supports big data engineering, data integration, data quality, analytics, and ML model development and operationalisation.
- Work closely with technology partners to design and build technology solutions that meet business requirements and the capability roadmap; participate in defining the capability and application roadmaps
- Develop standardized frameworks, playbooks, practice guides, and other artefacts that enable the engineering team to work in a structured, consistent manner
- Create and maintain the principles, practices, and standards that underpin these frameworks
- Collaborate across the Data Analytics technology domain to ensure execution plans are aligned with the application frameworks, chosen tools, and engineering practices
- Drive change across the data engineering team through personal credibility and experience
- Create awareness and buy-in among stakeholders for architecture, design patterns, frameworks, and new ways of working; act as a mentor or hands-on executor when required
- Participate in cross-bank technology initiatives
- Participate in definition and selection of data platforms
- Guide the future direction of data strategy and processes, including data intake, sources, database design and structure, data integrity, and database tools
Job Requirements
- 10+ years of experience in services, product development, infrastructure, and security as an architect or in a similar hands-on technology leadership role
- Deep knowledge of modern data technology stacks, software development tools, application and system performance monitoring, patterns and practices required to build highly available and scalable services
- Strong understanding of infrastructure, integration, virtualization, storage, networks and security technologies
- Strong understanding of industry data models, reporting data models, and other standard data models required to support the build-out of static and dynamic analytical capabilities
- Expertise in large databases such as Oracle and Teradata
- Expertise in data integration tools/platforms such as Informatica and IBM DataStage
- Experience with visualization tools/platforms such as Qlik and Power BI
- Expertise in big data engineering tools/platforms such as Cloudera, Hortonworks, and the Apache ecosystem
- Deep expertise in HDFS, Hive, Impala, Kafka, Spark, and YARN is a plus
- Has operated in a mixed portfolio of cloud-native and legacy technologies
- Experience with large-scale technology modernizations and platform migrations of data warehouses, data marts, and reporting tools
- Recent experience with technology transformations, integrating effective Agile planning and development practices across large organizations
- Experience designing, building, and operating high-volume, client-facing SaaS products
- Understanding of the trade-offs of various technology decisions, with the ability to model their short- and long-term implications
- Operational experience with infrastructure management and administrative tools and skill sets (e.g., Linux shells, Apache Ambari, YARN) to build scalable and resilient data platforms
- Data modelling and architecting skills, including a strong foundation in data warehousing concepts, data normalisation, and dimensional data modelling (e.g., for OLAP)
- Experience with other aspects of data management, such as data governance, metadata management, archival, and data lifecycle management