Data Engineering Lead

  • Term: Permanent
  • Location: Shenzhen, China
  • Salary: Negotiable
  • Job Reference: 5984

Our client is a global fintech company with offices in more than 10 countries. Their product lines include e-wallet and payment systems. The company offers a great working environment, strong employee benefits, and competitive salaries.

They are looking for an experienced data engineering leader to join their fast-growing team in Nanshan, Shenzhen!


Responsibilities

+ Work with teams to build and continually evolve data models and data flows that enable data-driven decision-making

+ Design and implement alerting and testing mechanisms to ensure the accuracy and timeliness of these pipelines (e.g., improved instrumentation, optimized logging)

+ Create user-friendly libraries that make distributed batch computation easy to write and test for all users across the company

+ Identify shared data needs across the company, understand their specific requirements, and build efficient, scalable data pipelines that enable data-driven decisions

+ Create a unified user data model that gives a complete view of our users across a varied set of products

+ Reduce latency and bridge the gap between our source systems and our enterprise data warehouse by refactoring and optimizing our core data pipeline jobs

+ Pair with user teams to optimize and rewrite business-critical batch processing jobs in Airflow

+ Create robust, easy-to-use unit testing infrastructure for batch processing pipelines

+ Build a framework and tools to re-architect data pipelines to run more incrementally

+ Estimate and control capacity and utilization of computing and storage resources in our data infrastructure


Requirements

+ 3+ years of programming experience; familiarity with scripting and programming languages such as Python, Scala, or Java

+ Experience with Kubernetes and Helm

+ Experience with big data frameworks and ecosystems (Hadoop, Spark)

+ Familiarity with designing and implementing infrastructure on AWS or Google Cloud

+ Familiarity with database and data warehouse technologies (Postgres, MySQL, Hive, Athena, BigQuery, Snowflake, etc.)

+ Experience with Tableau

+ Experience with Airflow or similar scheduling tools

+ Experience with real-time data processing

+ Experience in the machine learning domain

Please send your CV to Shuyan.Huang@cogsagency.com
