Qualifications
· Bachelor's degree in Computer Science or equivalent;
· Minimum 5 years of experience in data engineering or software engineering;
· A solid foundation in computer science concepts such as cloud computing architecture, distributed computing, high-velocity data processing, and the Lambda Architecture;
· Proficiency in data modeling and in managing distributed computing platforms for efficient data processing.
Skills/Requirements
· Minimum 4 years of data warehousing experience;
· 3+ years of Hadoop (Hortonworks) developer experience with Spark and Hive, especially Spark 2.x;
· 3-5 years of Talend Big Data (version 7+) developer experience, including deploying code to production;
· 2+ years of experience with Talend Big Data version 7 on Hadoop.