About Lambda:
Lambda is the DeFi Intelligence Platform. We provide a single source of truth for on-chain financial data, enabling investors and institutions to:
- Track token balances and DeFi positions across multiple chains
- Analyze historical and real-time rewards
- Accurately calculate PnL and uncover hidden costs (e.g., slippage, rebalancing, fees)
- Compare strategies and pools across protocols with confidence
Our mission is to make crypto data transparent, reliable, and actionable, reducing the time to generate accurate performance reports from weeks to just a few hours. We’re a fast-moving startup with a strong technical culture, building the backbone of crypto data infrastructure.
Responsibilities:
- Design, maintain, and scale streaming ETL pipelines for blockchain data.
- Build and optimize ClickHouse data models and materialized views for high-performance analytics.
- Develop and maintain data exporters using orchestration tools.
- Implement data transformations and decoding logic.
- Combine multiple data sources, such as third-party indexers and Kafka topics, into aggregated tables that back our API.
- Establish and improve testing, monitoring, automation, and migration processes for pipelines.
- Ensure timely delivery of new data features in alignment with product goals.
- Build automation tools that keep analyst-maintained inputs, such as data dictionaries, up to date.
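To make the decode-and-aggregate responsibility above concrete, here is a minimal sketch of folding decoded transfer events into per-address token balances. The record shape, field names, and values are illustrative assumptions, not Lambda's actual schema.

```python
from collections import defaultdict

def aggregate_balances(events):
    """Fold decoded transfer events into per-address token balances.

    Each event is a dict with chain, token, sender, recipient, and value.
    This shape is a hypothetical example, not a real production schema.
    """
    balances = defaultdict(int)
    for ev in events:
        # Debit the sender and credit the recipient for each transfer.
        balances[(ev["chain"], ev["token"], ev["sender"])] -= ev["value"]
        balances[(ev["chain"], ev["token"], ev["recipient"])] += ev["value"]
    return dict(balances)

# Illustrative input: two USDC transfers on one chain.
events = [
    {"chain": "ethereum", "token": "USDC", "sender": "0xa", "recipient": "0xb", "value": 100},
    {"chain": "ethereum", "token": "USDC", "sender": "0xb", "recipient": "0xc", "value": 40},
]
print(aggregate_balances(events))
```

In production this fold would run incrementally over a Kafka stream or inside a ClickHouse materialized view rather than over an in-memory list, but the aggregation logic is the same.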
Requirements:
- Strong SQL skills with columnar databases (ClickHouse, Druid, BigQuery, etc.).
- Hands-on experience with streaming frameworks (Flink, Kafka, or similar).
- Solid Python skills for data engineering and backend services.
- Proven track record of delivering pipelines and features to production on schedule.
- Strong focus on automation, reliability, maintainability, and documentation.
- Startup mindset: the ability to balance speed with quality.
Nice to Have:
- Experience operating ClickHouse at scale (performance tuning, partitioning, materialized views).
- Experience with CI/CD and automated testing for data pipelines (e.g., GitHub Actions, dbt).
- Knowledge of multi-chain ecosystems (EVM and non-EVM).
- Familiarity with blockchain/crypto data structures (transactions, logs, ABI decoding).
- Contributions to open-source or blockchain data infrastructure projects.
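For a sense of the ABI decoding mentioned above, here is a minimal, dependency-free sketch that decodes a standard ERC-20 `Transfer` event log. The topic hash is the well-known Keccak-256 of the event signature; the example log itself is fabricated for illustration.

```python
# Keccak-256 of "Transfer(address,address,uint256)": identifies the event type.
TRANSFER_TOPIC = "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"

def decode_transfer_log(log):
    """Decode a raw ERC-20 Transfer log into a flat record.

    `log` is a dict with "topics" (list of 0x-hex strings) and "data"
    (0x-hex string), the shape typically returned by EVM nodes/indexers.
    """
    if log["topics"][0] != TRANSFER_TOPIC:
        raise ValueError("not an ERC-20 Transfer event")
    # Indexed address params are left-padded to 32 bytes in topics 1 and 2;
    # the address is the last 20 bytes (40 hex chars).
    sender = "0x" + log["topics"][1][-40:]
    recipient = "0x" + log["topics"][2][-40:]
    # The single non-indexed uint256 value lives in the data field.
    value = int(log["data"], 16)
    return {"from": sender, "to": recipient, "value": value}

# Fabricated example log: transfer of 10**18 base units between two addresses.
example_log = {
    "topics": [
        TRANSFER_TOPIC,
        "0x" + "00" * 12 + "aa" * 20,
        "0x" + "00" * 12 + "bb" * 20,
    ],
    "data": "0x" + hex(10**18)[2:].rjust(64, "0"),
}
print(decode_transfer_log(example_log))
```

Real pipelines generalize this with full ABI definitions and libraries rather than hand-sliced hex, but the topic/data layout shown is what the decoding stage works with.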