Mandatory Skills
Key Responsibilities
· Creating complex, enterprise-transforming applications on diverse, high-energy teams
· Working with the latest tools and techniques
· Hands-on coding, usually in a pair programming environment
· Working in highly collaborative teams and building quality code
· The candidate must exhibit a good understanding of model implementation, data structures,
data manipulation, distributed processing, application development, and automation.
· The candidate must have a good understanding of consumer financial products,
data systems and data environments, and processes that are necessary for the
implementation of Risk and Finance models
Essential Skills & Prerequisites
· Degree in computer science or a numerate subject (e.g. engineering, sciences, or mathematics),
or a Bachelor's/Master's degree with 6 years of experience
· Hands-on development experience with PySpark, Scala Spark, and distributed computing
· 4 to 6 years of application development and implementation experience
· 4 to 6 years' experience designing and developing in Python
· 4 to 6 years' experience with the Hadoop platform (Hive, HDFS, and Spark)
· 3 to 5 years' experience with Unix shell scripting
· 3 to 5 years' experience with SQL
· 2 to 3 years' experience with Spark programming
· Knowledge of microservices architecture and cloud platforms would be an added advantage
· Knowledge of Java and Scala would be an added advantage
Desired
· A Bachelor’s degree or higher preferably in Computer Science or IT
· Additional experience developing service-based applications
· Excellent analytical skills; proficient in MS Office and able to produce board-level
documentation
· Fluency in written and spoken English; good communication and interpersonal skills
· Self-starter who sets and meets challenging personal targets; detail-oriented, with a
big-picture outlook
· Understanding of current technologies employed by Tier 1 Investment Banking Institutions
· Must be a team player