The Open Platform (TOP) is the leading technology company developing Web3 innovations inside Telegram. TOP fuels the Telegram economy by both building and investing in foundational infrastructure and consumer-facing apps. By integrating blockchain technology, TOP is building scalable solutions designed for a billion users, accelerating the mass adoption of crypto.
TOP provides a powerful toolkit of funding, expertise, and technology resources, streamlining access to critical tools like wallets, developer resources, SDKs, APIs, and marketplaces. TOP also develops and supports leading ecosystem products including the Wallet in Telegram, Tonkeeper, STON.fi, Getgems, Tribute, and more.
We are looking for a Junior Data Engineer to join our data platform team. You will help build and maintain data pipelines that power analytics for multiple products and teams, working closely with senior engineers.
This role suits individuals early in their careers who aspire to become strong Data Engineers, offering hands-on experience with modern data workflows, building data systems, and writing production-quality code.
Responsibilities:
- Participate in the development and maintenance of data pipelines and data-related services.
- Contribute to the existing codebase for shared tools and libraries.
- Assist with upgrading data platform components and services.
- Communicate with analysts to understand their data needs.
Example tasks for this role:
- Extend an existing SQL-based pipeline with a new transformation needed by analytics.
- Add a new data source to an existing ETL process under the supervision of a senior engineer.
- Refactor a small Python script into a clearer, modular structure and add logging.
- Help configure CI steps for linting and tests for a data repository.
- Update documentation for a pipeline after changes in logic or schema.
Requirements:
- Confident in communication and proactive in seeking clarification when needed.
- Demonstrates responsibility and ownership, and proactively communicates about any challenges.
- Comfortable working with IDEs and version control systems like Git.
- Basic understanding of clean code principles and software delivery workflows.
- Solid grasp of Python essentials, covering language fundamentals and data structures.
- Confident with SQL basics.
- Regular and thoughtful use of AI tools.
- Strong motivation to learn and grow in data engineering.
Nice to have:
- Knowledge of data engineering fundamentals (ETL, data modeling, data quality, and various storage systems).
- Experience using Apache Airflow.
- Familiarity with containerization and associated tools (e.g., Docker).
- Basic experience with any cloud platform (GCP, AWS, or Azure).
- Experience with BI tools (Superset, Metabase, Power BI, etc.).
- Any pet projects related to data: ETL scripts, dashboards, analytics for personal projects or hackathons.
Why it is a fantastic opportunity:
- Our business is growing at an exponential rate.
- Non-bureaucratic management that focuses on results.
- Regular performance reviews to assess your progress.
- Remote setup with access to our hubs in Dubai, Yerevan, London, and Belgrade.
- Compensation for medical expenses.
- Provision of necessary equipment.
- 20 working days of paid vacation annually.
- 11 days off per year.
- 14 days of paid sick leave to support your health and recovery when needed.
- Access to internal conferences, English courses, and corporate events.
The Open Platform is an equal opportunity employer.