Unlock Your Potential with a Growing and Dynamic Company.
At Ascentis, we are proud to be a leading CRM, digital, and social company with a track record of excellence. With over 50 prestigious awards under our belt, we have established ourselves as a trusted partner in servicing regional accounts across 13 countries in Southeast Asia.
Our success is rooted in our core values, PACT: professionalism, accountability, consistency, and teamwork. These values drive our dynamic and fast-paced environment, where results matter. We believe in fostering a culture of open communication, where every voice is valued and ideas are encouraged to flourish.

One of our key differentiators is our flat organizational structure, which promotes a collaborative and inclusive work environment. Our management team is hands-on and approachable, working side by side with our talented employees to drive innovation and achieve remarkable results.

As a growing company, we offer tremendous potential for personal and professional growth. We embrace a meritocratic approach, recognizing and rewarding the contributions of talented individuals who thrive in a team-oriented setting.
We believe in providing ample opportunities for our employees to take on new challenges, learn, and advance their careers. If you are a passionate and driven individual seeking a rewarding career in a company that values your skills and potential, Ascentis is the place for you.
Join us on our journey of success and be part of a team that is shaping the future of our industry. We look forward to welcoming exceptional talent who share our values and are ready to make an impact. Together, we can ascend to new heights.
Responsibilities:
· Develop data pipelines (ETL / ELT flows) to collect and integrate data from diverse sources into the data lake and data warehouse.
· Construct a data quality service to uphold the integrity of data models.
· Create reports using Power BI or other Business Intelligence tools.
· Implement monitoring tools / scripts to detect and address performance bottlenecks.
· Optimize data extraction operations to avoid impacting the performance of source systems and enhance overall ETL / ELT job efficiency.
· Generate high-quality code to support data processing tasks.
· Collaborate closely with engineering and product management teams.
· Potentially engage with external partners or clients to collaborate on building data pipelines.
Requirements:
· 3-5 years' experience designing and implementing scalable data processing solutions.
· BSc in Computer Science or equivalent practical experience.
· Hands-on experience designing and implementing solutions with AWS analytics services, with a focus on Glue, Athena, Lambda, Airflow, SageMaker, and QuickSight.
· Familiarity with AWS cloud platform, including S3 and EC2.
· Knowledgeable in Data Lake and Data Warehouse design, incorporating best practices in data modelling and data partitioning.
· Practical experience constructing ETL / ELT data pipelines, following best practices in data cleansing, tokenization, and data quality assurance.
· Proficiency with data visualization tools, including Microsoft Power BI and Amazon QuickSight.
· Familiarity with databases such as SQL Server and PostgreSQL is advantageous.
· Practical working experience with Python.
*Interested candidates, kindly submit your updated CV to [email protected]