Cloud Data Architect (AIR Lab)

Thales Solutions Asia Pte. Ltd.

Location: Singapore, Singapore

Thales people architect solutions that are relied upon to deliver operational advantage at every decisive moment throughout the mission. Defence and armed forces customers rely on us to deliver the full range of defensive systems for land, sea, and air. From early warning to threat neutralisation, our platforms cover every level, from very short-range systems to extended protection across the entire battle-space, including Airspace Mobility Solutions, Vehicles and Tactical Systems, Missile Defence, Optronics, and Radar.

Thales established its presence in Singapore in 1973 to support the expansion of aerospace-related activities in the Asia-Pacific region. Throughout the last four decades, the company has grown from strength to strength and is today involved in the primary businesses of Aerospace (including Air Traffic Management), Defence & Security, Ground Transportation and Digital Identity & Security. Thales today employs over 2,100 people in Singapore across all its business areas.

The Aviation Innovation Research (AIR) Lab

The joint CAAS-Thales innovation lab, known as the ‘AIR Lab’, began operating on 1 November 2019 with the objective of developing Proofs of Concept (POCs) or Minimum Viable Products (MVPs) of advanced and open technologies for future Air Traffic Management (ATM). The joint lab currently employs a team of 40 Thales engineers and 12 CAAS engineers and air traffic controllers, operating in a vibrant ecosystem involving a number of Singaporean SMEs and start-ups, as well as key research institutes.

The Thales engineers include 10 domain experts, each with more than 10 years of experience, coming from Thales ATM centres of competence in France and Australia. These experts coach both the local Thales and CAAS engineers.

The POCs and MVPs are co-developed with the CAAS engineers, who have access to the same development tools and environment used by the Thales engineers (provided by the Singapore branch of Thales Digital Factory). The POCs and MVPs are defined in collaboration with CAAS air traffic controllers through iterative workshops.

AIR Lab research outcomes will feed the next generation of products, including clearly disruptive outcomes addressing Safety and Security for Open Architecture, the data-driven ATM Twin, Green Aviation, and Trajectory Based Operations.

As the AIR Lab was recently extended for another three years, four work streams are now under way: Regional Experimental Platform; FF-ICE/TBO; Green Aviation; and Future-proof, Interoperable, Platform-Agnostic, Safe and Secure Platforms. These work streams are supported by the AIR Lab DataLake platform, which provides data transformation and serving services in the cloud.

AIR Lab 2.0 continues to benefit from research conducted in Europe through new architecture research which, among other objectives, aims at meeting the much stricter safety standards for ground ATM systems being developed by the European Union Aviation Safety Agency (EASA). This breakthrough architecture study is co-funded by the French government (CASSIOPEA).

The Regional Experimental Platform (REP) was initiated in AIR Lab 1.0 with a view to addressing regional needs in coordination with the SESAR 3 activities conducted in AMS France.

Who We Are Looking For

As a Cloud Data Architect in the AIR Lab, the members of your Data team look to you not only as a pillar of strength but also as a source of motivation and inspiration. You should be someone who enjoys designing and discussing processing patterns such as data quality control, streaming SQL, data sources/sinks and data synchronization, streaming backfill, and stream-to-stream joins. You should care as much about the quality of the technical implementation as about the quality of the delivery. You should enjoy working in a team of diverse people with multiple ethnic and cultural backgrounds, and you should enjoy diving into the technical details of a problem and communicating the solution back to the team so that the members can learn from it. Finally, you should love learning new technologies, find innovative ways to apply newfound knowledge, have the courage to encourage fellow team members to do the same, and enjoy participating in all aspects of engineering activities in the AIR Lab.

Responsibilities:

  • Plan, maintain and improve the DataLake cybersecurity posture with regard to data governance and cybersecurity standards by working with other stakeholders (e.g., Data Assessment Office, Cybersecurity Office).
  • Plan, maintain and improve the DataLake service levels for reliable data flow and for the health of the infrastructure (i.e., compute and storage).
  • Plan, maintain and improve the total cost of ownership of the DataLake; this activity includes raising efficiency around FinOps and CloudOps.
  • Maintain and improve the architecture transforming data between the DataLake and a distributed search and analytics engine (e.g., Elasticsearch); a minimal sketch of such a transformation follows this list.
  • Lead the evolution of the DataLake by, among other activities, exploring new methods, techniques and algorithms (e.g., data meshes, AI/MLOps infrastructure).
  • Plan and maintain the data model and data catalogue (e.g., event data, batched data, persisted and ephemeral data).
  • Mentor and coach the data team.
  • Implement features by first defining tests, then developing the feature and its associated automated tests; where appropriate, implement security tests and load tests.
  • Write and review the necessary technical and functional documentation in documentation repositories (e.g., backstage.io, JIRA, READMEs).
  • Work in an agile, cross-functional multinational team, actively engaging to support the success of the team.
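
To illustrate the DataLake-to-search-engine responsibility above, the following is a minimal sketch, in Python, of streaming JSON-lines objects out of a MinIO-style object store and bulk-indexing them into Elasticsearch. The endpoints, credentials, bucket, prefix, index and field names are hypothetical placeholders rather than AIR Lab specifics.

    # Minimal sketch: push curated DataLake records into Elasticsearch for search and analytics.
    # Endpoints, credentials, bucket, index and field names are hypothetical placeholders.
    import json

    from elasticsearch import Elasticsearch
    from elasticsearch.helpers import bulk
    from minio import Minio

    datalake = Minio("minio.example.internal:9000", access_key="...", secret_key="...", secure=True)
    es = Elasticsearch("https://elastic.example.internal:9200")

    def actions(bucket: str, prefix: str):
        """Stream JSON-lines objects from the object store and yield bulk-index actions."""
        for obj in datalake.list_objects(bucket, prefix=prefix, recursive=True):
            response = datalake.get_object(bucket, obj.object_name)
            try:
                for line in response.read().decode("utf-8").splitlines():
                    doc = json.loads(line)
                    yield {"_index": "flight-events", "_id": doc["event_id"], "_source": doc}
            finally:
                response.close()
                response.release_conn()

    # Bulk-index the curated zone; failures raise so the surrounding pipeline can retry or alert.
    bulk(es, actions("datalake-curated", "flight-events/"))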

Requirements:

Education

  • Bachelor's degree in Computer Science or Information Technology
  • Master's degree in Computer Science or Data Science, where applicable

Essential Skills/Experience

  • Proficiency in designing and implementing ETL data pipelines (with structured or unstructured data) using frameworks such as Apache Beam / Dataflow or Apache Flink; proficient in deploying ETL pipelines into a Kubernetes cluster in the Azure cloud, either as virtual machines or as containerized workloads (a minimal Beam sketch follows this list).
  • Proficiency in designing and implementing data lifecycle management using scalable object-storage systems like MinIO (e.g., tiering, object expiration, multi-node deployment); a lifecycle sketch also follows this list
  • Demonstrated experience working with Continuous Integration and/or Continuous Delivery models; you are expected to be familiar with Linux (e.g., shell commands)
  • Proficiency in distributed source code management tools like GitLab and GitHub
  • With respect to ETL pipelines, you are expected to demonstrate proficiency in the following:
    • pipeline configuration using GitLab
    • environment management using GitLab; it is a bonus if you have demonstrated experience in deployment management (e.g., canary, blue/green rollouts) using GitLab
  • Familiarity with deployment strategies for public clouds (e.g., Azure Cloud, AWS, GCP) and Kubernetes, using virtualized and containerized workloads (e.g., Kaniko, Docker, virtual machines)
  • Good communication skills in English
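
As a hedged illustration of the ETL proficiency listed above (and not a description of the actual AIR Lab pipelines), a small batch step written with the Apache Beam Python SDK might look like the sketch below; the paths, record schema and quality rule are assumptions made purely for the example.

    # Minimal sketch of a batch ETL step using the Apache Beam Python SDK.
    # Paths, the record schema and the quality rule are illustrative assumptions only.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def is_valid(record: dict) -> bool:
        """Toy data-quality gate: drop records without an aircraft identifier."""
        return bool(record.get("callsign"))

    with beam.Pipeline(options=PipelineOptions()) as pipeline:
        (
            pipeline
            | "ReadRaw" >> beam.io.ReadFromText("/datalake/raw/adsb/*.jsonl")
            | "Parse" >> beam.Map(json.loads)
            | "QualityFilter" >> beam.Filter(is_valid)
            | "KeyByCallsign" >> beam.Map(lambda r: (r["callsign"], r))
            | "CountPerAircraft" >> beam.combiners.Count.PerKey()
            | "Format" >> beam.MapTuple(lambda callsign, n: json.dumps({"callsign": callsign, "count": n}))
            | "WriteCurated" >> beam.io.WriteToText("/datalake/curated/adsb-counts")
        )

When deployed to a Kubernetes cluster, the same pipeline would typically be submitted to a runner such as Flink packaged as a containerized workload.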

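Likewise, for the object-storage lifecycle point above, a minimal sketch using the MinIO Python SDK is shown below; the endpoint, credentials, bucket name, prefixes and remote tier name are placeholders chosen for illustration.

    # Minimal sketch: tiering and expiration rules with the MinIO Python SDK.
    # Endpoint, credentials, bucket, prefixes and the remote tier name are placeholders.
    from minio import Minio
    from minio.commonconfig import ENABLED, Filter
    from minio.lifecycleconfig import Expiration, LifecycleConfig, Rule, Transition

    client = Minio("minio.example.internal:9000", access_key="...", secret_key="...", secure=True)

    config = LifecycleConfig(
        [
            # Move cold raw data to a cheaper remote tier after 30 days.
            Rule(
                ENABLED,
                rule_filter=Filter(prefix="raw/"),
                rule_id="tier-raw-after-30d",
                transition=Transition(days=30, storage_class="COLDTIER"),
            ),
            # Expire ephemeral scratch data after 7 days.
            Rule(
                ENABLED,
                rule_filter=Filter(prefix="scratch/"),
                rule_id="expire-scratch-after-7d",
                expiration=Expiration(days=7),
            ),
        ],
    )
    client.set_bucket_lifecycle("datalake", config)
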
Desirable Skills/Experience

  • Working knowledge of designing applications with a “shift-left” cybersecurity approach.
  • Working knowledge of other languages (e.g., Python, Scala, Go, TypeScript, C, C++, Java)
  • Experience implementing event-driven processing pipelines using frameworks like Apache Kafka or Apache Samza (a minimal sketch follows this list)
  • Familiar with the cloud deployment models (e.g., public, private, community and hybrid)
  • Familiar with the main cloud service models: Software as a Service, Platform as a Service and Infrastructure as a Service.
  • Familiar with designing and/or implementing AI/MLOps pipelines in public cloud (e.g., Azure, AWS, GCP)
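
For the event-driven item above, a minimal sketch of a consume-transform-produce loop with the kafka-python client is given below; the broker address, topic names and enrichment rule are illustrative assumptions, not AIR Lab details.

    # Minimal sketch of an event-driven enrichment step using kafka-python.
    # Broker address, topic names and the enrichment rule are illustrative assumptions.
    import json

    from kafka import KafkaConsumer, KafkaProducer

    consumer = KafkaConsumer(
        "adsb-raw",
        bootstrap_servers="kafka.example.internal:9092",
        group_id="enricher",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    producer = KafkaProducer(
        bootstrap_servers="kafka.example.internal:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    for message in consumer:
        event = message.value
        # Toy enrichment: tag each position report with a coarse altitude band.
        event["altitude_band"] = "high" if event.get("altitude_ft", 0) > 29000 else "low"
        producer.send("adsb-enriched", value=event)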

Essential / Desirable Traits

  • Possess learning agility, flexibility and proactivity
  • Comfortable with agile teamwork and user engagement

At Thales we provide CAREERS and not only jobs. With Thales employing 80,000 people in 68 countries, our mobility policy enables thousands of employees each year to develop their careers at home and abroad, in their existing areas of expertise or by branching out into new fields. Together we believe that embracing flexibility is a smarter way of working. Great journeys start here; apply now!