Key Responsibilities:
- Creating complex, enterprise-transforming applications on diverse, high-energy teams.
- Working with the latest tools and techniques.
- Hands-on coding, usually in a pair programming environment.
- Working in highly collaborative teams and building quality code.
- Working on model implementation, data structures, data manipulation, distributed processing, application development, and automation.
- Working with consumer financial products, data systems and data environments, and the processes necessary to implement Risk and Finance models.
Essential Skills & Prerequisites:
- Degree in computer science or a numerate subject (e.g. engineering, sciences, or mathematics), or a Bachelor's/Master's degree with 4 to 8 years of experience.
- Hands-on development experience with PySpark, Spark with Scala, and distributed computing.
- 4 to 6 years of application development and implementation experience.
- 4 to 6 years' experience designing and developing in Python.
- 4 to 6 years' experience with the Hadoop platform (Hive, HDFS, and Spark).
- 3 to 5 years' experience with Unix shell scripting.
- 3 to 5 years' experience with SQL.
- 2 to 3 years' experience with Spark programming.
- Knowledge of microservices architecture and cloud platforms is an added advantage.
- Knowledge of Java and Scala is an added advantage.
Desired:
- Additional experience developing service-based applications.
- Excellent analytical skills; proficient in MS Office and able to produce board-level documentation.
- Fluency in written and spoken English; good communication and interpersonal skills.
- Self-starter who sets and meets challenging personal targets; detail-oriented, with a big-picture outlook.
- Understanding of current technologies employed by Tier 1 Investment Banking Institutions.
- Must be a team player.