Expert in Distributed Data Analysis Frameworks

Job Responsibilities

  • Design and implement the architecture of the next-generation serverless distributed data analysis framework, and make breakthroughs in key technologies such as DAG optimization, unified batch and stream processing, and execution acceleration for batch processing, stream processing, and AI training and inference scenarios.
  • Deliver key technologies that support Huawei's internal and external industry applications, such as the AI training and inference platform and the digital intelligence convergence platform; explore new services and business models, and achieve business monetization.
  • Gain insight into and track the latest progress in academia and industry, lead the evolution of data analysis technologies, and build Huawei's competitiveness in this field.

Job Requirements

  • Master's degree or above in computer science or a related field, with 8+ years of relevant work experience (3-5 years for a PhD)
  • Proficient in the theory and engineering practice of distributed systems, with an in-depth understanding of the implementation of data analysis frameworks such as Spark, Flink, Presto, and Plato
  • Rich experience in software architecture design, including leading the architecture design and software development delivery of large-scale data analysis systems.
  • Preference will be given to those who have led or deeply participated in the development of relevant open source software.
  • End-to-end capability with big data frameworks, from requirement analysis through architecture design.
  • Mastery of the key technologies of mainstream big data frameworks, with an in-depth understanding of emerging technologies in the industry.