Openings: front end, full stack, big data
Multiple positions are open, on a brand-new project and team, with room for career growth. 3-5+ years of work experience preferred. Please send your resume to [email protected]. Thank you!
More positions:
https://angel.co/huami/jobs
-----------
Frontend Engineer
The big data team in Huami's US office is looking for an experienced front-end
engineer to build state-of-the-art data visualization.
Responsibilities:
Build the first version of a customer-facing web UI that supports both mobile
and PC. Work closely with backend engineers to define service interfaces, and
with UX to keep improving the user experience.
Requirements:
At least 3 years of experience in front-end engineering.
Development experience with HTML, CSS, and JavaScript on web and mobile
platforms.
Familiarity with Angular or React is a big plus.
Full-stack engineering experience is a big plus.
Mobile app development experience is a big plus.
-----------
Full Stack Engineer
The big data team in Huami's US office is looking for an experienced full-stack
engineer with a passion for working across areas such as the front end, web
services, and data processing.
Responsibilities:
Build backend systems that integrate with the big data platform and provide
services to the front end and mobile apps.
Requirements:
(Our technology stack is AWS + open source.)
5+ years of experience with backend systems, including web services, database
design, performance tuning, etc.
Programming languages: Java (preferred), Python, or Go. Knowledge of a
scripting language is a plus.
Experience with AWS services such as EC2, Kinesis, DynamoDB, RDS, and S3 is a
big plus.
Experience with front-end technologies such as HTML, CSS, JavaScript, and
AngularJS is a big plus.
Mobile development experience is a big plus.
-----------
Senior Data Engineer
The big data team in Huami's US office is looking for an experienced engineer
with a big data background to join the team.
Responsibilities:
Work closely with internal teams and partners to identify key requirements and
come up with the best solutions to address their needs. Since the team is
small, you may work on different areas of the data processing pipeline, such as
platform setup, system tuning and troubleshooting, data ingestion, ETL, data
mining, machine learning, visualization, and reporting.
Requirements:
(Our technology stack is AWS + open source.)
5+ years of overall experience is desired; Hadoop, Spark, and AWS experience is
a big plus.
Programming languages: Java (preferred) or Python. A scripting language is a
plus.
Hands-on experience with Hadoop or Spark, including HDFS, YARN, Hive, HBase,
Parquet, Spark Streaming, and MLlib.
Hands-on experience with cloud infrastructure, such as AWS EC2, Kinesis,
DynamoDB, RDS, S3, and Redshift, is a big plus.
Experience setting up, monitoring, and tuning applications and big data systems
is a big plus.