Role: Software Consultant
Job level: 10+ Years
Technical Skills:
ETL Tools: Informatica PowerCenter 9.1/10.5, IBM WebSphere DataStage 8.0.1/9.1.2 PX
Programming Languages: SQL, PL/SQL
Big Data Tools: HDFS, Hive, Hue, Sqoop, Spark
Operating Systems: Windows, UNIX, Linux
Scripting Languages: UNIX Shell Scripting
Databases: Oracle, Netezza, AS400
Key Responsibilities:
· Experience working across multiple domains: Banking, Telecom, and Healthcare, with good knowledge of the banking domain, especially the bank's collections process.
· Strong experience across the entire life cycle of data warehouse projects, from requirement gathering through deployment to production.
· Interacted with users for requirement gathering and prepared functional specifications.
· Strong understanding of data warehouse concepts; experienced in translating client requirements into logical and physical data models using IBM InfoSphere Data Architect.
· Strong knowledge of Informatica PowerCenter, with experience working with different source systems such as Oracle, SQL Server, XML, and flat files (COBOL, CSV, etc.).
· Strong knowledge of Informatica administration activities such as code migration using deployment groups, creating users and folders, purging objects, and restarting servers.
· Strong knowledge of Informatica performance tuning concepts.
· Analyzing threads to identify bottlenecks in mappings and increasing transformation buffer sizes according to the incoming data volumes.
· Strong knowledge of SQL scripting using SQL Server and Oracle.
· Strong knowledge of SQL query performance improvement techniques (a brief sketch follows this list).
· Helped the client maintain the business glossary through data governance activities: creating business terms, associating them with other glossary terms, defining data quality rules to monitor the data, creating assets, and establishing lineage using IBM Information Governance Catalog.
· Comprehensive knowledge of dimensional modelling concepts such as star schema and snowflake schema, and of data warehousing concepts including warehouse architecture, OLTP, OLAP, data marts, ODS, and dimension and fact tables.
· Good experience in performance tuning, monitoring, and supporting production environments.
· Analyzing the impact of changes on existing models.
· Analyzing business functions and process descriptions.
· Creating product and application mappings.
· Analyzing the Citi Data Dictionary for unsecured products.
· Generating reconciliation reports.
· Working experience with SAS Viya (SAS Studio & Intelligent Decisioning).
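As a brief illustration of the SQL query tuning mentioned above, the following is a minimal sketch using hypothetical accounts and collections tables (table and column names are illustrative only, not from any specific engagement). The rewrite replaces a per-row correlated subquery with a set-based join and removes the TRUNC() call on the filtered column so an index on created_date can be used:

-- Before: the scalar subquery runs once per account row, and TRUNC() on the
-- indexed column prevents an index range scan (Oracle syntax).
SELECT a.account_id,
       (SELECT SUM(c.amount_due)
          FROM collections c
         WHERE c.account_id = a.account_id) AS total_due
  FROM accounts a
 WHERE TRUNC(a.created_date) >= DATE '2024-01-01';

-- After: aggregate once, join once, and use a sargable date predicate
-- (equivalent here because the literal has no time component).
SELECT a.account_id,
       c.total_due
  FROM accounts a
  LEFT JOIN (SELECT account_id, SUM(amount_due) AS total_due
               FROM collections
              GROUP BY account_id) c
    ON c.account_id = a.account_id
 WHERE a.created_date >= DATE '2024-01-01';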
Key Requirements:
· Bachelor's degree in computer science, information technology, or a related field
· At least 8 years of experience and knowledge in Control-M, UNIX shell scripting, and SQL, covering the full cycle from source code repository and version control through code merge, deployment, and housekeeping
· At least 6 years of experience in Hadoop, Hive, HBase, HDFS, and data engineering (see the brief sketch after this list)
· Ability to multitask effectively and handle large amounts of data
· Highly driven, proactive, and a strong team player
· Excellent interpersonal skills and written and verbal communication skills in English
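As a brief illustration of the Hadoop/Hive data engineering work referenced above, the following is a minimal HiveQL sketch; the stg_transactions table, its columns, and the HDFS path are hypothetical:

-- Hypothetical external table over HDFS files, partitioned by load date
-- so queries prune to a single day's data.
CREATE EXTERNAL TABLE IF NOT EXISTS stg_transactions (
  txn_id     BIGINT,
  account_id BIGINT,
  amount     DECIMAL(18,2)
)
PARTITIONED BY (load_dt STRING)
STORED AS PARQUET
LOCATION '/data/staging/transactions';

-- Register a newly landed partition, then aggregate only that partition.
ALTER TABLE stg_transactions ADD IF NOT EXISTS PARTITION (load_dt = '2024-01-01');

SELECT account_id, SUM(amount) AS total_amount
  FROM stg_transactions
 WHERE load_dt = '2024-01-01'
 GROUP BY account_id;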