Key Responsibilities:
- Develop and maintain large-scale distributed systems using Hadoop ecosystem technologies on the Cloudera platform, such as HDFS, MapReduce, Hive, Sqoop, and Impala.
- Implement and optimize data pipelines using Apache Spark to handle large data sets efficiently.
- Perform ETL processes to move data between systems using tools like Sqoop and custom shell scripts (an illustrative sketch follows this list).
- Collaborate with data scientists and analysts to integrate business intelligence tools such as Tableau for data visualization and reporting.
- Manage MySQL and Oracle databases and Cloudera-based data warehouses to ensure data integrity and availability.
- Utilize version control systems like Git and Bitbucket for code management and collaboration.
- Automate deployment and CI/CD pipelines with tools like Jenkins, Maven, and Autosys.
- Work closely with the development teams using Eclipse and IntelliJ for coding and debugging.
- Collaborate with cross-functional teams using tools such as Jira, Confluence, and ServiceNow to manage projects and incident reporting.
- Ensure data governance, security, and compliance with company policies and regulations.
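
For context, the sketch below illustrates the kind of Spark-based ETL work described in the responsibilities above: reading raw records landed in HDFS (for example, by a Sqoop import), applying basic transformations, and writing the result to a Hive table for downstream reporting. It is a minimal, hypothetical example; the paths, database names, and column names are placeholders, not details from this role.

    # Illustrative PySpark ETL sketch; all names and paths are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("orders-daily-etl")      # hypothetical job name
        .enableHiveSupport()              # enable writing to Hive-managed tables
        .getOrCreate()
    )

    # Read raw records previously landed in HDFS (e.g. by a Sqoop import).
    raw = spark.read.parquet("hdfs:///data/raw/orders/")   # hypothetical path

    # Basic cleansing and aggregation before loading the warehouse layer.
    daily_totals = (
        raw.filter(F.col("status") == "COMPLETED")
           .withColumn("order_date", F.to_date("order_ts"))
           .groupBy("order_date", "region")
           .agg(F.sum("amount").alias("total_amount"),
                F.count("*").alias("order_count"))
    )

    # Write a partitioned Hive table for downstream Tableau reporting.
    (daily_totals.write
        .mode("overwrite")
        .partitionBy("order_date")
        .saveAsTable("analytics.daily_order_totals"))      # hypothetical table

    spark.stop()
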
Requirements:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of experience in the IT field.
- Proven experience with HDFS, MapReduce, Hive, Spark, Sqoop, and other Hadoop ecosystem tools.
- Proficiency in Shell Scripting and automation of data processing tasks.
- Strong experience in database management systems like MySQL and Oracle.
- Familiarity with Tableau for creating interactive dashboards and reports.
- Experience with Git, Bitbucket, Jenkins, Maven, and Autosys for source control and continuous integration.
- Hands-on experience with ticketing and project management tools like Jira, Confluence, and ServiceNow.
- Experience working with IntelliJ and Eclipse for development.
- Strong analytical and problem-solving skills with a focus on delivering scalable data solutions.
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Knowledge of DevOps practices and CI/CD pipeline automation.
- Familiarity with ServiceNow for IT service management.
Disclaimer: The company is committed to ensuring the privacy and security of your information. By submitting this form, you consent to the collection, processing, and retention of the information you provide. The data collected (which may include your contact details, educational background, work experience, and skills) will be used solely for the purpose of evaluating your qualifications for the position you are applying for. Your data will be stored securely and retained for the duration necessary to fulfill our hiring process. If you are not selected for the position, your data will be kept on file for a limited period in case future opportunities arise. You have the right to access, correct, or delete your data at any time by contacting us at Quess Singapore (quesscorp.sg).