Mandatory Skills
Key Responsibilities
· Creating complex, enterprise-transforming applications on diverse, high-energy teams
· Working with the latest tools and techniques
· Hands-on coding, usually in a pair-programming environment
· Working in highly collaborative teams and building quality code
· The candidate must exhibit a good understanding of model implementation, data structures, data manipulation, distributed processing, application development, and automation
· The candidate must have a good understanding of consumer financial products, data systems and data environments, and the processes necessary for implementing Risk and Finance models
Essential Skills & Prerequisites
· Degree in computer science or a numerate subject (e.g. engineering, sciences, or mathematics), or a Bachelor's/Master's degree with 6 years of experience
· Hands-on development experience with PySpark, Scala Spark, and distributed computing
· 4 to 6 years' experience developing and implementing applications
· 4 to 6 years' experience designing and developing in Python
· 4 to 6 years' experience with the Hadoop platform (Hive, HDFS, and Spark)
· 3 to 5 years' experience with Unix shell scripting
· 3 to 5 years' experience with SQL
· 2 to 3 years' experience with Spark programming
· Knowledge of microservices architecture and cloud platforms is an added advantage
· Knowledge of Java and Scala is an added advantage
Desired
· A Bachelor's degree or higher, preferably in Computer Science or IT
· Additional experience developing service-based applications
· Excellent analytical skills; proficient in MS Office and able to produce board-level documentation
· Fluency in written and spoken English; good communication and interpersonal skills
· Self-starter who sets and meets challenging personal targets; detail-oriented, with a big-picture outlook
· Understanding of current technologies employed by Tier 1 investment banking institutions
· Must be a team player