You'll join the Data Intelligence team to build and maintain our data warehouse.
Responsibilities:
- Improve payment performance and drive business growth;
- Develop data pipelines and help define data development standards;
- Build the data quality system and establish data monitoring and verification processes;
- Deliver on data-related product requirements: independently understand requirements, design solutions, and implement them.
Qualifications:
- Bachelor's degree or above in Computer Science, Statistics, Mathematics or other related majors;
- At least 3 years of data engineering experience developing data pipelines;
- Proficient in at least one programming language such as Python, Java, Scala, or Go, with a strong engineering background and a keen interest in data;
- Prior experience writing and debugging data pipelines with a distributed data framework (Hadoop/Spark/Flink/Storm, etc.);
- Familiar with OLAP engines (Hive/ES/ClickHouse/Druid/Kylin/Doris, etc.);
- Familiar with data warehouse architecture, data modelling methods, and data governance; enthusiastic about data mining, with strong business understanding and abstraction skills;
- Proficient with databases, with strong SQL/ETL development skills;
- Experience in real-time data warehouse development is preferred.