· Build data pipelines (ETL) to enable scalable processing of large volumes of data in an AWS environment (Informatica, Pentaho)
· Organize, build and maintain data stores (e.g., RedShift, RDS) of structured and unstructured data (e.g., text, customer transactional data, workflow/ticket/log data)
· Co-evaluate business needs and objectives
· Identify opportunities for data acquisition
· Conduct data analysis, working independently or with data analysts, and report on results
· Analyze and organize raw data
· Interpret trends and patterns
· Prepare data for prescriptive and predictive modeling
· Combine raw information from different sources
· Explore ways to enhance data quality and reliability
· Work with stakeholders to identify and solve business needs related to advanced data processing
Job Requirements
· At least a Degree in Computer Science, IT, or a similar field
· At least 1 year of experience working with SQL, or as a data engineer or in a similar role.
· Proven work experience with ETL tools such as Pentaho Data Integration (KETTLE) or Informatica.
· Excellent SQL knowledge and proven experience with Data Analysis, Data Profiling and Data Processing.
· Good working knowledge of databases such as Oracle, PostgreSQL and RedShift.
· Experience working with Tableau or similar BI tools.
· Technical expertise with data models, data mining, and segmentation techniques
· Ability to integrate data/data streams from different sources
· Great numerical and analytical skills
· Detail-oriented, with excellent organizational skills
Interested candidates, please click "APPLY" to begin your job search journey.
We regret to inform you that only shortlisted candidates will be notified.
EVO Outsourcing Solutions Pte. Ltd
• RCB No. 202233837K