Data Engineer
The Open Platform
Hey, this is The Open Platform! We create open platforms that support builders in developing user-friendly applications with Toncoin as a native asset. We believe that the unique combination of TON Blockchain and Telegram Messenger provides an unparalleled foundation for new ecosystem growth.
The companies we invest in and create are centered in the areas of NFT, DeFi, GameFi, and SocialFi.
We are seeking a Data Engineer to join our venture builder. We've built an analytics system, established processes, and share our expertise with startups. We've now reached a point where keeping the data pipeline architecture clean and the data consistent is crucial, and we can't manage it without your help. This position will appeal to those who want to build a reliable data transmission and processing system that scales effortlessly.
We have a vibrant analytics community and a large number of developers and product owners: in this role, you'll immerse yourself in the intricacies of multiple products and watch startups grow and develop.
Responsibilities:
- Designing and maintaining ETL processes between multiple data sources, including PostgreSQL and ClickHouse databases as well as external APIs (see the minimal ETL sketch after this list).
- Configuring watchdogs to monitor data flows and processing jobs.
- Setting up automatic data quality control with user-friendly reporting.
- Assisting in the setup and maintenance of BI tools, including Superset.
- Working with the primary data warehouse in Google BigQuery and configuring services on Google Cloud Platform.
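To illustrate the kind of work involved, here is a minimal sketch of one such ETL step: copying a PostgreSQL table into BigQuery with a basic row-count check on arrival. The connection string, table names, and function below are hypothetical placeholders, not our actual pipeline.

```python
# Hypothetical sketch: extract a PostgreSQL table and load it into BigQuery,
# then verify the row count as a basic data quality check.
import pandas as pd
from sqlalchemy import create_engine
from google.cloud import bigquery


def copy_table(pg_uri: str, source_table: str, bq_table: str) -> None:
    """Extract a table from PostgreSQL and load it into BigQuery."""
    engine = create_engine(pg_uri)
    df = pd.read_sql_table(source_table, engine)          # extract
    client = bigquery.Client()
    job = client.load_table_from_dataframe(df, bq_table)  # load
    job.result()                                          # wait for completion

    # Basic quality check: make sure every extracted row actually arrived.
    loaded = client.get_table(bq_table).num_rows
    if loaded < len(df):
        raise RuntimeError(f"Expected {len(df)} rows, found {loaded} in {bq_table}")


if __name__ == "__main__":
    copy_table(
        "postgresql://user:pass@host:5432/app",  # hypothetical source database
        "events",                                # hypothetical source table
        "analytics.events_raw",                  # hypothetical BigQuery table
    )
```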
Requirements:
- Ability to effectively orchestrate cloud services and applications from the command line (GCP, AWS, ...).
- Basic knowledge of containerization and orchestration with Docker and Kubernetes.
- Experience in setting up ETL processes from scratch.
- Experience with Airflow/Google Cloud Composer, both in writing pipelines and in deploying and maintaining its infrastructure.
- Proficiency in SQL and experience working with databases such as PostgreSQL, ClickHouse, and BigQuery.
- Knowledge of Python.
- Understanding of data monitoring principles and quality control.
- Ability to work with BI tools, preferably Superset.
Example Tasks:
- Currently, ETL runs as Python scripts scheduled via cron. While this setup is convenient, we'd like to move to a more reliable tool such as Airflow (a minimal DAG sketch follows this list).
- Product analytics for startups must scale quickly and deliver reporting across many products; ensuring all data arrives correctly and on time requires properly configured dependencies and query execution timing.
- Our Superset instance is functional, but there's room for improvement. Assistance in fine-tuning it would be appreciated.
- Our data is consumed by many clients in various formats; a review of all endpoints would be beneficial.
- Overall, we need to document the current pipeline schema and think through architectural adjustments so it can expand easily and scalably without confusion.
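As an illustration of the cron-to-Airflow task above, here is a minimal DAG sketch (assuming a recent Airflow 2.x, as used by Google Cloud Composer). The DAG id, schedule, and task functions are hypothetical placeholders, not the actual scripts.

```python
# Hypothetical sketch: wrapping an existing cron-scheduled ETL script
# in an Airflow DAG with two dependent tasks.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Placeholder for an existing extraction script."""
    print("extracting from source databases")


def load():
    """Placeholder for an existing load step into the warehouse."""
    print("loading into BigQuery")


with DAG(
    dag_id="daily_product_analytics",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # replaces the old cron entry
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task          # explicit dependency between steps
```

Unlike cron, Airflow makes dependencies between steps explicit, retries failed tasks, and exposes run history in a UI, which is exactly the reliability gap described above.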
What we offer:
- Plenty of various tasks and ideas.
- Non-bureaucratic management that respects business processes.
- Work in a well-funded startup environment with strong growth opportunities and a chance to join a rapidly growing company with unique products.
- Be based remotely.
- Compensation for medical expenses.
- 20 working days of paid vacation annually.
- 7 days off per year.
- 14 days of paid sick leave to support your health and recovery when needed.
- Access to internal English courses for continuous learning and improvement in language skills.