Responsibilities
- Create development standards based on Talend best practices and experience, and lead other developers in implementing these standards.
- Mentor other developers who have less experience in Talend / Big Data.
- Support the solution architect by leading ELT architecture design based on Talend and Big Data best practices and experience.
- Coordinate with BAs / Mappers and Project Management in planning change requests and defect fixes during project implementation and BAU phases.
- Advise BAs / Mappers on how mappings can be efficiently implemented in code to ensure optimal design.
- Develop and deliver production-grade Talend Big Data jobs that meet the IT organization’s architecture standards.
- Work closely with data modelers / system analysts on the required data interface and requirement specifications.
- Conduct technical sessions / clarifications with data modelers and sub-system teams prior to solution design, coding, and unit testing.
- Document and review the design of Talend Big Data jobs and interface specifications.
- Implement error and exception handling.
- Import jobs into Talend Data Catalog.
- Integrate Talend jobs with AutoSys and restart failed jobs from AutoSys.
- Perform technical impact assessments, source code releases, and deployment checklists.
- Validate build conformance to the required specifications.
- Support SIT and UAT activities, e.g., defect analysis, troubleshooting, and fixes.
- Coordinate and support Performance and Security Testing activities, e.g., environment setup and test scope.
- Coordinate with the infrastructure team on deployment and related activities.
- Provide enhancement and production support after project go-live.
- Take accountability for considering business and regulatory compliance risks, and take appropriate steps to mitigate them.
- Able to self-learn and pick up application setup and support from the vendor.
- Maintain awareness of industry trends in regulatory compliance, emerging threats, and technologies in order to understand the risks and better safeguard the company.
- Highlight any potential concerns / risks and proactively share risk management best practices.
Skills Requirements
- Bachelor’s degree in Information Technology / Computer Science / Programming & Systems Analysis or a related field.
- Talend Big Data version 7+: 4–7 years of relevant experience.
- Advanced knowledge of Oracle SQL and PL/SQL, with experience in Oracle-supplied packages, dynamic SQL, collections, records, and PL/SQL tables.
- Experience in developing complex database objects such as stored procedures, functions, packages, and triggers using SQL and PL/SQL.
- Advanced knowledge of Talend Big Data and experience with Talend components such as tHive*, tHDFS*, tOracle*, tFilterRow, tAggregateRow, tMap, tNormalize, tDenormalize, tSortRow, tSampleRow, tFileList, tReplicate, tOracleInput, tOracleOutput, tOracleConnection, tOracleOutputBulkExec, etc.
- Good technical skills in UNIX shell scripting and scheduling.
- Experience with Hadoop (Hortonworks) and development exposure to Spark and Hive, especially Spark 2.
- Exposure to Data Warehousing.
- Experience designing Talend job orchestration through an enterprise workload automation tool such as Control-M, preferably AutoSys.
- Working knowledge of the Hortonworks Data Platform for data ingestion frameworks from multiple source systems, e.g., AS/400, Oracle Finance, MS SQL, etc.
- Development experience using Java, PL/SQL, SQL, Python, and Scala, with good knowledge of data models and data flows and a strong understanding of dimensional and relational databases, including stored procedures, constraints, normalization, indexes, and security.
- Insurance and financial reporting domain knowledge will be an advantage.
- Talend Big Data Integration and Big Data (Hadoop) developer certifications.
- Knowledge of computing, software design, parameterization, and database and security concepts.
- Knowledge of best practices in data-related and coding disciplines.
- Familiarity with regulatory, security, and industry guidelines.