Key Responsibilities
- Build complex, enterprise-transforming applications in high-energy, collaborative teams.
- Work hands-on with the latest tools and techniques, often through pair programming.
- Write high-quality, maintainable code.
- Demonstrate a solid understanding of model implementation, data structures, data manipulation, distributed processing, application development, and automation.
- Possess knowledge of consumer financial products, data systems, environments, and processes critical for the implementation of Risk and Finance models.
Essential Skills & Prerequisites
- Bachelor’s or Master’s degree in Computer Science, Engineering, Sciences, Mathematics, or another numerate subject, plus 6 years of relevant experience.
- Hands-on development experience with Spark (PySpark and Scala) and distributed computing.
- 4–6 years of experience in:
  - Designing and developing applications using Python.
  - Working on the Hadoop platform (Hive, HDFS, and Spark).
- 3–5 years of experience in Unix shell scripting and SQL.
- 2–3 years of experience in Spark programming.
- Knowledge of microservices architecture and cloud computing is an advantage.
- Familiarity with Java and Scala is beneficial.
Desired Skills
- Additional experience in developing service-based applications.
- Excellent analytical skills, proficiency in MS Office, and the ability to produce board-level documentation.
- Fluency in written and spoken English, with strong communication and interpersonal skills.
- Self-starter able to meet challenging personal targets, combining a big-picture outlook with attention to detail.
- Understanding of current technologies used by Tier 1 Investment Banking Institutions.
- Team-oriented mindset.
- Advantageous experience includes:
- Working in the financial or banking industry.
- Knowledge of Avaloq, Windows Batch Scripting, T-SQL, and Python.
Mandatory Skills Summary
- Hands-on development experience with Spark (PySpark and Scala) and distributed computing.
- 4–6 years of experience in:
  - Python development.
  - Hadoop platform (Hive, HDFS, Spark).
- 3–5 years of experience in:
  - Unix shell scripting.
  - SQL development.
- 2–3 years of experience in Spark programming.