As a Senior Data Engineer with deep expertise in software development and a keen passion for building data-driven solutions, you will be at the forefront of our evolving Big Data and Data Warehouse technologies. Avallis Financial is committed to establishing industry-leading capabilities in analytics, information management, and decision-making. Our key technology platforms include Cloudera Hadoop Big Data, the Teradata Group Data Warehouse, and robust data management & governance practices.
At Avallis Financial, you will join a dynamic team of engineers dedicated to elevating the standards of our Insurance Sales enablement, enhancing Omnichannel Customer Experience, and refining KYC and Financial Needs & Analysis practices. You will leverage the latest technologies to tackle some of the most complex data-centric challenges faced by our financial advisors and customers.
For us, data is not just an asset; it is the cornerstone of innovation. It drives the development of our cutting-edge features and enables us to offer seamless experiences to our Financial Advisory Units and Insurance Partners.
Requirements:
In this role, you will get the chance to:
- Build next-generation data platforms and data pipeline solutions across multiple lines of business (Life, ILP & General Insurance).
- Contribute to and learn from the wider engineering talent in the team.
- Apply state-of-the-art coding practices, driving high-quality outcomes that meet core business objectives and minimise risk.
- Create both technology blueprints and engineering roadmaps for the data transformation journey.
- Lead and drive a culture where quality, excellence, and openness are championed.
- Think outside the box and push boundaries to solve complex data problems.
We are also interested in hearing from people who:
- Have 5 to 7 years' experience in a data engineering role within a fast-paced environment.
- Hold an IT / Computer Science degree or other tertiary qualification related to software development and/or development best practices.
- Are experienced in delivering data-driven solutions that source data from various enterprise data platforms into a Cloudera Hadoop Big Data environment using technologies such as Spark, MapReduce, Hive, Sqoop, and Kafka; transform and process the source data to produce data assets; and egress the results to other data platforms such as Teradata (see the sketch after this list).
- Are experienced in building effective and efficient Big Data and Data Warehouse frameworks, capabilities, and features, using a common programming language (Scala, Golang, Java, or Python), with proper data quality assurance and security controls.
- Are experienced in designing, building, and delivering optimised enterprise-wide data ingestion, data integration, and data pipeline solutions for Big Data and Data Warehouse platforms.
- Are confident building group data products or data assets from scratch by integrating large data sets sourced from internal and external data partners.
- Can collaborate, co-create, and contribute to the team's existing Data Engineering practices.
- Have experience with, and take responsibility for, data security and data management.
- Have a natural drive to educate, communicate and coordinate with different internal stakeholders.
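The batch flow referenced above, ingesting data into Hadoop, transforming it into a data asset, and egressing it to Teradata, can be illustrated with a minimal Spark sketch in Scala. The table names, warehouse host, and schema here are hypothetical placeholders rather than Avallis systems, and the job assumes the Teradata JDBC driver (terajdbc4) is on the classpath:

```scala
import org.apache.spark.sql.{SparkSession, functions => F}

object PolicyIngestJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("policy-ingest")
      .enableHiveSupport()
      .getOrCreate()

    // Ingest: read a raw policy extract landed in Hive (e.g. via Sqoop).
    val raw = spark.table("staging.policy_extract") // hypothetical table

    // Transform: standardise types and derive a data asset at policy grain.
    val asset = raw
      .filter(F.col("status").isNotNull)
      .withColumn("premium_amt", F.col("premium_amt").cast("decimal(18,2)"))
      .groupBy("policy_id", "line_of_business")
      .agg(F.sum("premium_amt").as("total_premium"))

    // Egress: publish the asset to the downstream warehouse over JDBC.
    asset.write
      .format("jdbc")
      .option("url", "jdbc:teradata://warehouse-host/DATABASE=analytics") // hypothetical host
      .option("dbtable", "analytics.policy_premium")
      .option("driver", "com.teradata.jdbc.TeraDriver")
      .mode("overwrite")
      .save()

    spark.stop()
  }
}
```

A production pipeline would add the data quality assurance and security controls mentioned above; this sketch only traces the ingest-transform-egress shape of the flow.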
Technical Skills:
- Experience in designing, building, and delivering enterprise-wide data ingestion, data integration, and data pipeline solutions using a common programming language (Scala, Java, or Python) on Big Data and Data Warehouse platforms.
- Experience working with Microsoft Azure and cloud frameworks for big data projects.
- Experience building data solutions on the Hadoop platform, using Spark, MapReduce, Kafka, and various ETL frameworks for distributed data storage and processing (a streaming sketch follows this list).
- Strong Unix / Linux shell scripting and programming skills in Java, Golang, Scala, or Python.
- Experience working in Agile teams, including working closely with internal business stakeholders.
- Familiarity with data warehousing and experience building data marts in Teradata, Oracle, or another RDBMS is a plus.
- Certification in Cloudera CDP, Hadoop, Spark, or Teradata is a plus.
- Experience with source version control and CI/CD tooling such as Git, Bitbucket, and Jenkins, and with DevSecOps practices.
- Ability to identify gaps in existing customer environments and provide apt, extensible solutions across various Azure services, mainly Azure Purview, Function Apps, and Web Services.
- Advanced ability to communicate a story through data (visualisations, storytelling).
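As an illustration of the Kafka-based processing mentioned above, here is a minimal Spark Structured Streaming sketch, again in Scala. The broker address, topic name, and HDFS paths are hypothetical, and the job assumes the spark-sql-kafka connector is available:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object QuoteEventsStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("quote-events-stream")
      .getOrCreate()

    // Consume quote events from a Kafka topic as they arrive.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092") // hypothetical broker
      .option("subscribe", "quote-events")               // hypothetical topic
      .load()
      .select(
        col("key").cast("string"),
        col("value").cast("string"),
        col("timestamp"))

    // Land the raw events as Parquet on HDFS for downstream batch processing.
    val query = events.writeStream
      .format("parquet")
      .option("path", "hdfs:///data/raw/quote_events")            // hypothetical path
      .option("checkpointLocation", "hdfs:///checkpoints/quote_events")
      .outputMode("append")
      .start()

    query.awaitTermination()
  }
}
```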
Why Join Avallis Financial?
- Supportive, approachable, and empowered leadership team.
- A compact, resourceful team environment where outcomes are achieved and recognised.
- A chance to advance and develop your career within a Financial Services organisation.
- Competitive salaries.
- Awards and incentives.
- Paid certifications.
- Supportive and approachable management.
- Fun, friendship, and family.
One of our core values at Avallis is “Harmony”: We are a team of diverse individuals who value inclusivity and create meaningful connections so we can win together.
At Avallis, we make hiring decisions based on your passion, skills, and experience.