Python Data Engineer

Lucky Hunter


Job Description

The company is a dynamic, rapidly expanding high-frequency trading (HFT) firm specializing in cryptocurrency market-making and arbitrage strategies. Despite being a compact organization, it has quickly grown to trade on numerous cryptocurrency exchanges worldwide. Its activity provides crucial liquidity, fostering more efficient financial markets. With proprietary algorithms, the firm commands a notable market share, reaching up to 5 percent of trading volume on these platforms, solidifying its position as a significant market maker.

About the Role:

This role encompasses the management of data from a variety of sources including cryptocurrency exchanges and our internal trading systems, focusing on high-volume data extraction, processing, and statistical analysis. You will also play a pivotal role in both developing and maintaining sophisticated data integration systems and infrastructure services that complement our trading platform.

Required Skills & Qualifications:

  • Ability to design and optimize database schemas in ClickHouse (or another column-oriented database) for high-performance storage and rapid retrieval of vast datasets.
  • Proficiency in Python, with experience developing infrastructure and data processing services and a strong command of its data-related libraries, such as pandas and NumPy.
  • Ability to write clean, efficient, and well-structured code.
  • Exceptional problem-solving skills and meticulous attention to detail.
  • Strong prioritization skills.
  • Knowledge of financial markets and trading principles.
  • Familiarity with Apache Kafka.
  • Excellent SQL skills.
  • Analytical and detail-oriented, with a strong ability to troubleshoot and solve complex problems.
  • Proactive and self-motivated, with a strong ability to work independently and in a team environment.
  • Excellent communication skills, capable of explaining complex data issues in a clear and understandable manner.
  • Adaptable and open to learning new technologies and frameworks as needed.

Experience with any of the following technologies will be a plus: Kafka Connect (~ClickPipes), Kafka Schema Registry, AWS Certified Data Engineer Associate certification, business intelligence tools (e.g. Metabase), Parquet, Apache Airflow / Argo Workflows, time-series data, Docker, Kubernetes, Grafana, Prometheus, Ansible, Terraform.

All candidates need to have a good level of English. Interviews are conducted in English only!

Key Performance Indicators:

  • Accuracy of data extraction and integration as measured by error rates and data validation checks.
  • Efficiency of data processing workflows, targeting improvements in processing time.
  • Adherence to project deadlines and milestones for development tasks.

What's in it for you?

  • Remote-first team;
  • Flexible working hours and a healthy work-life balance (we only consider candidates outside of the Russian Federation, or those interested in and willing to relocate quickly);
  • Payment in dollars under a B2B contract, an annual bonus, and the opportunity to invest in a corporate fund;
  • The opportunity to work in a thriving, multicultural, fun environment in one of the world’s fastest-growing industries;
  • Corporate workations: the team regularly travels to unique locations around the world to work, explore the local culture, and get to know each other better.

Skills
  • Python
  • ClickHouse
  • Django
  • FastAPI
  • Kafka
