ABOUT AIR LAB
The Aviation Innovation Research (AIR) Lab
The joint CAAS-Thales innovation lab, known as the ‘AIR Lab’, began operations on 1 November 2019 with the objective of developing Proofs of Concept (POCs) and Minimum Viable Products (MVPs) of advanced and open technologies for future Air Traffic Management (ATM). The joint lab currently employs a team of 40 Thales engineers and 12 CAAS engineers and air traffic controllers, operating in a vibrant ecosystem that involves several Singaporean SMEs and start-ups as well as key research institutes.
The Thales engineers include 10 domain experts, each with more than 10 years of experience, from the Thales ATM centers of competence in France and Australia. These experts coach both the local Thales and CAAS engineers.
ROLE DESCRIPTION SUMMARY
As a Backend Software Engineer at AIR Lab, you will thrive if you have a passion for coding, designing, and engaging in discussions about APIs and data processing, including areas like data modeling, stream processing, and data quality control. You take pride in both the quality of your technical implementations and the overall delivery. You enjoy collaborating with a diverse team, bringing together various ethnic and cultural perspectives.
You are eager to dive deep into technical challenges, identify solutions, and effectively communicate your findings, helping the team learn and grow. You are enthusiastic about exploring new technologies and finding innovative ways to apply your knowledge. You also inspire and motivate your teammates to embrace continuous learning and take an active role in all aspects of engineering activities at AIR Lab.
KEY ACTIVITIES AND RESPONSIBILITIES
As a Backend Software Engineer, you are accountable for:
• Design and build APIs to provide data for the stream-aligned squads.
• Implement features and the associated automated unit and integration tests; where appropriate, implement security and load tests as well.
• Improve and maintain the DataLake cybersecurity posture with respect to data governance and cybersecurity standards by working with other stakeholders (e.g., Data Architect, Data Assessment Office, Cybersecurity Office).
• Improve and maintain the DataLake service levels for reliable data flow, health of infrastructure (i.e., compute and storage) and security.
• Improve and maintain the total cost of ownership of data; this includes improving efficiencies around FinOps and CloudOps.
• Improve and maintain the architecture for transforming data between the DataLake and a distributed search and analytics engine (e.g., Elasticsearch).
• Lead the technical evolution of the DataLake by exploring new methods, techniques, algorithms (e.g., data meshes, AI/MLOps infrastructure).
• Work with the Data Architect to bring best practices to the engineering organization.
• Write and review the necessary technical and functional documentation in documentation repositories (e.g., backstage.io, JIRA, READMEs).
• Work in an agile, cross-functional multinational team, actively supporting the team's success.
KEY KNOWLEDGE AND EXPERIENCE
To be successful in your role, you will have demonstrated and/or acquired the following knowledge and experience:
Education
• Bachelor's degree in Computer Science or Information Technology with a minimum of 3 years of working experience.
• A Master's degree in Computer Science or Data Science is a plus, where applicable.
• Mid-career switchers are encouraged to apply if they can demonstrate at least 3 years of relevant working experience.
Essential Skills/Experience
• Working knowledge of designing and building software applications using Java and/or Kotlin as the main programming language; experience designing and deploying RESTful and/or GraphQL APIs is a bonus.
• Familiar with cloud development and deployment technologies (e.g., Azure, AWS, GCP), and with Kubernetes and container workloads (e.g., Docker, Kaniko).
• Proficient in designing and implementing data processing and business logic.
• Demonstrated experience working with Continuous Integration and/or Continuous Delivery models; familiarity with Linux (e.g., shell commands) is expected.
• Proficiency with distributed source code management tools such as GitLab and GitHub, and experience practicing GitOps.
• Effective communication skills in English
Desirable Skills/Experience
• Working knowledge of designing applications with a “shift-left” cybersecurity approach.
• Working knowledge of other languages (e.g., Python3, Scala2 or Scala3, Go, TypeScript, C, C++17, Java17)
• Experience in designing ETL or event-driven processing pipelines (with structured or unstructured data) using frameworks such as Spark or Apache Beam, Kafka Streams, and Flink; proficiency in deploying these pipelines to a Kubernetes cluster on Azure, either as virtual machines or as containerized workloads.
• Familiar with cloud deployment models (e.g., public, private, community and hybrid) and service models (Software as a Service, Platform as a Service and Infrastructure as a Service).
• Familiar with designing and/or implementing AI/MLOps pipelines in public cloud (e.g., Azure, AWS, GCP)
Essential / Desirable Traits
• Possess learning agility, flexibility and proactivity
• Comfortable with agile teamwork and user engagement
YOUR CAREER AT THALES
Future opportunities will allow you to discover other domains or sites. You will be able to evolve and grow your competencies in different areas:
• Room and attention for personal development
• Build your talents in another domain of the Thales Group, discovering new products, new customers and new countries, or moving to a more complex solution.
• Choose between a technical expertise path and a leadership path
• Build an international career within a leading Engineering Group