Staff Skills:
- Data Engineering & Architecture:
- Strong understanding of data engineering principles, data pipelines, and architectures.
- Expertise in building scalable, high-performance data systems.
- ETL/ELT Processes:
- Proficiency in designing, developing, and maintaining ETL/ELT pipelines.
- Experience with ETL tools such as Databricks Delta Lake, Google BigQuery, Azure Data Factory, SSIS.
- Programming & Scripting:
- Advanced skills in programming languages like Python, R, Java, Scala, or SQL.
- Ability to write clean, efficient, and reusable code.
- Database Technologies:
- Extensive experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server).
- Expertise in NoSQL databases (e.g., MongoDB, Cassandra).
- Familiarity with data storage technologies like Hadoop, HBase, Redshift, BigQuery, Snowflake.
- Data Warehousing & Data Lakes:
- Experience with data warehousing solutions (e.g., Azure Synapse Analytics, Snowflake, Amazon Redshift).
- Expertise in working with data lakes (e.g., Azure Data Lake Storage (ADLS), Hadoop, Databricks).
- Cloud Platforms:
- Hands-on experience with cloud platforms such as AWS, Azure, or GCP.
- Knowledge of cloud-native data processing tools (e.g., AWS Glue, Azure Data Factory, Databricks).
- Data Integration & API:
- Ability to integrate data from multiple sources including relational, NoSQL, and external APIs.
- Experience with RESTful APIs and Web Services.
- Big Data Technologies:
- Knowledge of big data frameworks (e.g., Apache Spark, Hadoop).
- Familiarity with distributed data processing and storage systems.
- Data Modeling & Design:
- Proficient in data modeling concepts (e.g., star schema, snowflake schema, dimensional modeling).
- Experience in defining and implementing data models for analytical and operational needs.
- Microsoft Power BI & Reporting:
- Expertise in Power BI design, integration, and implementation for building interactive dashboards and reports.
- Experience with data source integration and visualization techniques.
- SharePoint Integration:
- Experience in integrating data solutions with SharePoint, enabling seamless data flow and document collaboration, including SharePoint Server management, customization and development, and permissions and security.
- Version Control & DevOps:
- Proficiency in version control systems (e.g., Git).
- Experience with CI/CD pipelines and DevOps practices to automate data pipeline deployment.
- Performance Optimization:
- Ability to optimize queries, data processes, and storage solutions for performance and cost-efficiency.
- Security & Data Governance:
- Knowledge of data security practices, encryption, and compliance requirements.
- Experience with data governance frameworks and tools.
- Problem-Solving & Analytical Thinking:
- Strong problem-solving skills and the ability to troubleshoot complex data issues.
- Ability to analyze large datasets and derive actionable insights.
- Communication & Collaboration:
- Excellent written and verbal communication skills for interacting with stakeholders and cross-functional teams.
- Ability to mentor junior engineers and work collaboratively in a team environment.
- Education: Bachelor’s degree in Computer Science, Engineering, or a related field.
- Experience: 5 years as a Data Engineer or in a similar role.
Staff's current job responsibilities (Detailed JD):
- Data Pipeline Development:
- Design, develop, and maintain scalable, reliable, and efficient data pipelines for processing structured and unstructured data.
- ETL/ELT Implementation:
- Create, optimize, and maintain ETL/ELT workflows to transform raw data into usable formats for business and analytical purposes.
- Data Warehousing & Data Lakes:
- Design and implement data storage solutions using data warehousing and data lake technologies such as Azure Synapse Analytics, Snowflake, or Databricks.
- Data Integration:
- Integrate data from multiple internal and external sources, ensuring accuracy, consistency, and timeliness of data delivery.
- Cloud-Based Data Solutions:
- Architect and implement data solutions on cloud platforms like AWS, Azure, or GCP, leveraging native cloud tools and services.
- Data Modeling:
- Develop and maintain logical and physical data models for transactional, operational, and analytical systems.
- Performance Optimization:
- Optimize database queries, data storage, and processing pipelines to ensure high performance and low latency.
- Microsoft Power BI Implementation:
- Develop and implement Power BI dashboards and reports, integrating various data sources and ensuring actionable insights for stakeholders.
- SharePoint Integration:
- Collaborate with teams to integrate SharePoint with data solutions, ensuring seamless access and collaboration across platforms.
- Data Security & Governance:
- Ensure compliance with data security policies, implement best practices for data governance, and maintain data quality standards.
- Collaboration & Communication:
- Work closely with data analysts, data scientists, and business teams to understand data needs and deliver solutions that align with business objectives.
- Mentorship & Leadership:
- Mentor junior engineers, provide technical guidance, and promote knowledge sharing within the team.
- Innovation & Continuous Improvement:
- Stay updated with emerging technologies, recommend new tools and techniques, and drive continuous improvement in data engineering practices.
- Documentation & Reporting:
- Prepare comprehensive technical documentation for data pipelines, architecture, and processes, ensuring knowledge transfer and clarity.
- Monitoring & Troubleshooting:
- Monitor data workflows, troubleshoot issues, and implement solutions to ensure seamless data operations.
Interested candidates, please click "APPLY" to begin your job search journey and submit your CV directly through the official PERSOLKELLY job application platform - GO Mobile. (https://sg.go.persolkelly.com/job/apply/10696)
We regret to inform you that only shortlisted candidates will be notified.
Gelangre Reyanelle Gelario | REG No: R1870995
PERSOLKELLY SINGAPORE PTE LTD | EA License No: 01C4394
This is in partnership with Employment and Employability Institute Pte Ltd (“e2i”). e2i is the empowering network for workers and employers seeking employment and employability solutions. e2i serves as a bridge between workers and employers, connecting with workers to offer job security through job-matching, career guidance and skills upgrading services, and partnering employers to address their manpower needs through recruitment, training and job redesign solutions. e2i is a tripartite initiative of the National Trades Union Congress set up to support nation-wide manpower and skills upgrading initiatives. By applying for this role, you consent to e2i’s PDPA policy.
By sending us your personal data and curriculum vitae (CV), you are deemed to consent to PERSOLKELLY Singapore Pte Ltd and its affiliates to collect, use and disclose your personal data for the purposes set out in the Privacy Policy available at https://www.persolkelly.com.sg/policies. You acknowledge that you have read, understood, and agree with the Privacy Policy.