Education / Experience:
· Diploma/Degree in Computer Engineering/Computer Science/Information Technology or related technical discipline
· Preferably 1 - 3 years of working experience in related fields.
Job Description:
· Be part of a team to build and maintain data systems or pipelines.
· Perform setup, installation, configuration, troubleshooting and/or upgrade for COTS products.
· Develop or implement ways to improve data warehouses, data lakes or equivalent platforms.
· Be involved in the creation of documentation, e.g. design documents, troubleshooting guides, etc.
Skill Sets:
· Knowledge and/or experience in data management or data engineering
· Experience with Linux commands and shell script
· Knowledge and/or experience in relational (SQL) or NoSQL databases (e.g. document, graph)
· Knowledge and/or experience in one or more of the following will be an advantage:
o Data, search, or automation platforms such as Hadoop, Elasticsearch, and Ansible respectively
o Data integration tools such as Talend, DataStage, Denodo
o Programming languages such as Python, Spark
o Microsoft Azure Cloud services such as Azure Data Factory, Azure Synapse Analytics
o Analytics platforms such as Databricks, Dataiku, Data Robot
· Good problem-solving skills
· Able to work independently and as part of a team
(EA Reg No: 20C0312)
Please email a copy of your detailed resume to [email protected] for immediate processing.
Only shortlisted candidates will be notified.