The Opportunity
- Adecco is partnering with one of the most well-established and respected financial institutions, with a strong track record of success.
- They have a culture of innovation and continuous improvement, constantly looking for ways to improve their services and offerings.
Job Responsibilities
MASAI currently runs on an on-premises cluster, but various constraints (flexibility, infrastructure costs, etc.) are pushing a move to a cloud-native implementation. You will ideally have good familiarity with both the core Hadoop platform (HDFS, Spark, Kafka, HBase, etc.) and experience working with big data or big-data-adjacent technologies on the cloud (for example EMRFS on AWS, Dataproc on GCP, or Azure HDInsight). Leveraging this skillset, you will adapt the existing MASAI implementation (authored primarily in Java) to run on the cloud, as part of a small team of very senior developers responsible for this critical and time-sensitive effort.
Job Qualifications
- At least 8 years of software development experience
- At least 5 years' experience working with Java
- A strong understanding of recent Java language features, such as lambdas, streams, and futures
- Good knowledge of algorithms and data structures, with strong fundamentals in complexity analysis
- Strong ability to analyze code – understand execution flow & debug even without access to a debugger
- Experience with Maven, Git, writing and maintaining integration tests
- Strong familiarity with Linux and bash
- Good knowledge of SQL or an SQL-inspired dialect such as HQL
- Research, design, and develop software
- Analyse user needs and develop software solutions
- Update software, enhance existing software capabilities, and develop and direct software testing and validation procedures
- Work with other engineers to integrate hardware and/or software systems
An ideal candidate will also have expertise in some or all of the following:
- Hadoop big-data clusters and technologies: Spark, Kafka, HDFS, ORC, Hive, HBase, YARN, Parquet, ZooKeeper
- The implementation of these on various cloud providers (EMRFS, Dataproc, HDInsight, etc)
- Experience working with cloud providers and moving complex on-prem software to the cloud
- Jenkins and Ansible
- Jira or a similar issue-tracking system
- The Spring framework and IoC, particularly Spring Boot and Swagger for RESTful web services
- Docker/Kubernetes and other container-adjacent technologies
Additionally, knowledge of the following would be helpful although it is not required:
- Python and PySpark
- Web development fundamentals (HTML, JavaScript, jQuery, ReactJS, etc.)
- Protobuf, gRPC, Kryo, Avro, Snappy
- OLAP cubes: ActivePivot, Druid, ClickHouse
- Understanding of and interest in finance and financial markets, particularly interest rate derivatives in their many forms (forwards, futures, swaps, swaptions, etc.), knowledge of the Greeks (delta, gamma, vega, vanna, volga, cega, etc.), risk management concepts (VaR, ES, etc.), and market data (discount and forecast curves, SABR volatility matrices, correlation and hybrid correlation cubes, etc.)
Besides having greater development experience and more raw technical skill than the average candidate, as an expert developer you must be particularly self-motivated and driven. You will frequently be driving changes and balancing trade-offs that your management is not well equipped to understand, and you will need to be able to explain and advocate for these. This means having a clear understanding of the goals of the project, and pushing the team to achieve those goals – not merely passively completing tickets assigned to you, but truly leading and participating in the process.
Education Requirements
At least a Bachelor’s degree in one of the following fields:
- Computer Science
- Information Technology
- Programming & Systems Analysis
- Science (Computer Studies)
Next Step
Click “apply” or send resume to: Tamanna Bilandi [email protected]
EA Licence No.91C2918 | Personnel Registration No. R2096241