Experience: 10-15 Years
Role: Enterprise / Application Architect
Key Responsibilities:
· Provide technical vision and create roadmaps to align with the long-term technology strategy
· Create the target architecture for an application / set of applications with emphasis on platforms, reusability, scalability and security
· Work with stakeholders to define capabilities and principles for an application
· Work on governance aspects of Data Analytics applications such as documentation, design reviews, metadata, etc.
· Document design ideas in well-crafted presentations and engage with stakeholders for their concurrence
· Review and help streamline the design for big data applications and ensure that the right tools are used for the relevant use cases
· Engage users to achieve concurrence on the proposed technology solution. Review solution documents along with the Functional Business Analyst and Business Unit for sign-off
· Create technical documents (functional/non-functional specification, design specification, training manual) for the solutions. Review interface design specifications created by development team
· Participate in selection of product/tools via RFP/POC.
· Provide inputs to help with the detailed estimation of projects and change requests
· Execute continuous service improvement and process improvement plans
Key Requirements:
· 10-15 years of experience with Data Architecture in the banking domain, including implementation of Data Lakes, Data Warehouses, Data Marts, Lakehouses, etc.
· Experience in data modeling for large-scale data warehouses and business marts on Hadoop-based databases, Teradata, Oracle, etc., for a bank
· Expertise in the Big Data ecosystem, such as Cloudera (Hive, Impala, HBase, Ozone, Iceberg), Spark, Presto, Kafka
· Experience with a metadata management tool such as IDMC, Axon, Watson Knowledge Catalog, Collibra, etc.
· Expertise in designing frameworks using Java, Scala, or Python, and in building applications and utilities with these languages
· Expertise in operationalizing machine learning models including optimizing feature pipelines and deployment using batch/API, model monitoring, implementation of feedback loops
· Knowledge of building reports/dashboards using a reporting tool such as Qlik Sense or Power BI
· Expertise in integrating applications with DevOps tools
· Knowledge of building applications on MPP appliances such as Teradata, Greenplum, Netezza is a plus
· Domain knowledge of the banking industry, including subject areas such as Customer, Products, CASA, Cards, Loans, Trade, Treasury, General Ledger, Origination, Channels, Limits, Collaterals, Campaigns, etc.
Technical skillsets:
· Data Modeling (Erwin)
· Big Data ecosystem, preferably Cloudera (Spark, Presto, Hive, HBase, Kafka, Impala, Apache Kudu, Ozone, Tez, Iceberg)
· Teradata
· Cloudera CDSW/CML, H2O.ai
At least 2-3 technical certifications in any of the below technologies:
· Cloudera Hadoop distribution – Hive, Impala, Spark, Kudu, Kafka, Flume
· Data modeling tools (Erwin)
· Languages – SQL, Java, Python, Scala
· Automation / scripting – Control-M, Shell scripting, Groovy
Experience or certification in the below technologies will be an added advantage:
· CI/CD and Testing Tools - Jenkins, SonarQube
· Data Analytics Tools - Qlik Sense
· Version Control Tools - Aldon LMe, CA Endevor
· Deployment Toolkit - Jenkins
· Service or Incident Management (IcM) Tools - Remedy
· Source Code Repository Tool - Bitbucket
· Scheduling Tool - Control-M
· Defect Management Tool - JIRA
· Application Testing Tool - QuerySurge
· Platforms provided by FICO, Experian, SAS or ACTICO for credit and portfolio management
EDUCATION
- Bachelor’s degree/University degree in Computer Science, Engineering, or equivalent experience