Stats

Key Numbers

  • 27B+ raw on-chain events indexed to date (across 10 chains)
  • 117 TB of processed DeFi datasets under management (uncompressed)
  • <140 ms median time from block message arrival to processed record (7-day median, Ethereum)

Features

Engineered for Accuracy.
Designed for Speed.

Proof‑of‑Derivation

Every row includes a salted hash linking back to the exact raw on-chain records for auditability.

Real‑time & resilient

Reorg‑aware pipeline; fast propagation; independent gateways for failover.

Highest granularity and coverage

Tick‑level changes; never aggregated away. Coverage across the top EVM chains, ranked by DeFi protocol activity.

Configurable & efficient

Choose chains, protocols and delivery format; pay only for what you use.

Use Cases

From market insight to models
and production apps.

Cryptocurrency Market Analysis

Discover signals faster with granular, auditable on-chain datasets.

  • Liquidity and flow analysis across chains and DeFi protocols
  • Regime shift detection (DEX volume, token volatility, market depth); see the sketch below
  • Strategy research with traceable, backtest-ready data
Every record includes proof-of-derivation.
Request a tailored dataset
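
For illustration only, a minimal pandas sketch of the kind of regime-shift check this data supports. It assumes a local swaps extract with "ts" and "volume_usd" columns (hypothetical names) and flags hours whose volume deviates sharply from a trailing 7-day baseline; it is a sketch, not a recommended strategy.

```python
import pandas as pd

# Hypothetical local extract with one row per swap; "ts" and "volume_usd" are assumed columns.
swaps = pd.read_parquet("uniswap_v3_swaps.parquet")

hourly = (
    swaps.assign(ts=pd.to_datetime(swaps["ts"], utc=True))
         .set_index("ts")["volume_usd"]
         .resample("1h")
         .sum()
)

window = 24 * 7                                  # trailing 7 days of hourly bars
zscore = (hourly - hourly.rolling(window).mean()) / hourly.rolling(window).std()
regime_shifts = zscore[zscore.abs() > 3]         # crude threshold; tune per strategy
print(regime_shifts.tail())
```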

Machine Learning / AI Training 

Feature-store-ready tables for supervised and time-series models.

  • MCP-compatible tools to fetch dataset slices and metadata directly into AI workflows, with provenance (see the sketch below)
  • Consistent schemas (volume, liquidity, sentiment indicators) with stable IDs 
  • Deterministic reference hashes for audit 
Billions of swaps, transfers, and liquidity snapshots - normalized and prompt-ready.
Request MCP quickstart
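
As a rough sketch of the MCP workflow, the snippet below uses the open-source MCP Python SDK to connect to a local MCP server, list its tools, and request a dataset slice. The server command, tool name, and arguments are placeholder assumptions, not BlockDB's documented interface; the MCP quickstart provides the actual names.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def fetch_slice() -> None:
    # Hypothetical local MCP server exposing BlockDB datasets (placeholder command).
    server = StdioServerParameters(
        command="blockdb-mcp",
        args=["--api-key-env", "BLOCKDB_API_KEY"],
    )

    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover what the server exposes (dataset slices, metadata, lineage).
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Request a bounded slice; tool name and arguments are assumptions.
            result = await session.call_tool(
                "get_dataset_slice",
                arguments={
                    "dataset": "uniswap_v3_swaps",
                    "chain": "ethereum",
                    "from": "2024-01-01T00:00:00Z",
                    "to": "2024-01-02T00:00:00Z",
                },
            )
            for item in result.content:
                print(item)


if __name__ == "__main__":
    asyncio.run(fetch_slice())
```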

Accelerated Web3 Development

Real-time events and schema-stable data to ship features quickly. 

  • Live WebSocket streams for swaps, reserves, prices (see the sketch below)
  • Schema-stable entities (tokens, pools, dexes) that won’t break your app 
  • Historical snapshots to bootstrap state instantly 
Latency <140 ms median on ETH (7-day); 99.95% uptime with SLA. 
Request WebSocket (WSS) access
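
A minimal sketch of consuming the live stream, assuming a hypothetical WSS endpoint, subscription message, and event fields; the real endpoint, auth scheme, and message schema come with your access credentials.

```python
import asyncio
import json

import websockets  # pip install websockets


async def stream_swaps(api_key: str) -> None:
    # Placeholder endpoint and query-string auth; the real scheme may differ.
    url = f"wss://stream.blockdb.io/v1?api_key={api_key}"
    async with websockets.connect(url) as ws:
        # Hypothetical subscription message: one chain, one protocol family.
        await ws.send(json.dumps({
            "action": "subscribe",
            "channel": "swaps",
            "chain": "ethereum",
            "protocols": ["uniswap_v3"],
        }))
        async for raw in ws:
            event = json.loads(raw)
            # Assumed fields: block number, pool address, reference hash for lineage.
            print(event.get("block_number"), event.get("pool"), event.get("ref_hash"))


if __name__ == "__main__":
    asyncio.run(stream_swaps("YOUR_API_KEY"))
```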

Who we are

We’re a Poland-based team of data engineers and quants. We transform raw on-chain events into lineage-verified, schema-stable datasets that your team can trust and use immediately - reducing data prep time and accelerating decisions across research, trading, and product.

FAQs

What exactly does BlockDB® deliver?

Clean, normalized DeFi tables (e.g., swaps, transfers, pools, liquidity depth metrics, prices, arbitrage opportunities) designed for analysis, ML features, and app backends. Each dataset has a stable schema and documentation. See each data product page in the data catalog for current details.

What differentiates you from running our own nodes or using public APIs?

We eliminate the burden of node operations, ensure reorg awareness, provide lineage proofs, and deliver structured datasets that are not available through public APIs.

How do you prove data lineage?

Every row carries a reference hash linking back to the specific raw on-chain events it was derived from. This enables reproducible research, audits, and precise tracing.
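
A minimal verification sketch under stated assumptions: recompute a salted hash over the raw events a row claims as its sources and compare it to the row's reference hash. Field names, canonicalization, and salt handling here are illustrative only; the documented derivation spec is authoritative.

```python
import hashlib
import json


def reference_hash(raw_events: list[dict], salt: str) -> str:
    # Canonicalize each raw event (sorted keys, compact separators) so the hash
    # is deterministic regardless of how the JSON was originally serialized.
    canonical = sorted(
        json.dumps(event, sort_keys=True, separators=(",", ":")) for event in raw_events
    )
    digest = hashlib.sha256()
    digest.update(salt.encode())
    for item in canonical:
        digest.update(item.encode())
    return digest.hexdigest()


def verify(row: dict, raw_events: list[dict], salt: str) -> bool:
    # "ref_hash" is an assumed column name for the row's reference hash.
    return row["ref_hash"] == reference_hash(raw_events, salt)
```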

What about freshness and reliability?

Live feeds operate in near real-time, with typical latency under 150 ms on Ethereum Mainnet and similar performance across other EVM chains. The entire ingestion pipeline is reorg-aware, meaning it automatically reconciles short blockchain reorganizations to maintain data accuracy. Our infrastructure is redundant and monitored 24/7, with a 99.95% uptime SLA and guaranteed continuity of real-time and End-of-Day (EOD) feeds.
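
For readers unfamiliar with the term, a generic sketch of what reorg awareness means in practice: before deriving records from a new block, confirm it extends the stored tip; if not, roll back to the common ancestor and re-ingest the canonical branch. The store and chain interfaces are hypothetical, and this illustrates the concept only, not BlockDB's internal pipeline.

```python
def ingest(store, chain, block):
    # store and chain are hypothetical interfaces: store holds processed blocks
    # and their derived rows; chain answers canonical-hash and block queries.
    tip = store.latest_block()
    if tip is not None and block["parent_hash"] != tip["hash"]:
        # Reorg detected: drop stored blocks that are no longer canonical.
        while tip is not None and chain.canonical_hash(tip["number"]) != tip["hash"]:
            store.rollback(tip["number"])   # also removes rows derived from that block
            tip = store.latest_block()
        # Re-derive the canonical blocks between the common ancestor and the new block.
        start = tip["number"] + 1 if tip is not None else 0
        for height in range(start, block["number"]):
            store.append(chain.get_block(height))
    store.append(block)
```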

Do you provide historical backfills?

Yes. We maintain full history from chain genesis on every supported chain and can provide backfills for specific EVM chains, protocols, or time windows on request.

Which chains and protocols do you cover?

Our coverage is constantly growing, focused on the top EVM chains and the most active DeFi protocols. See each data product page in our catalog for current details. We can also support specific EVM chains, protocols, or time windows on request.

How is the data delivered? In what formats?

By default, we support SFTP for scheduled batch drops and WebSocket (WSS) for low-latency live streams. For AI workflows, we provide Model Context Protocol (MCP) tools so assistants can pull dataset slices, metadata, and lineage on demand. Optional integrations into Amazon S3, Azure Blob Storage, Snowflake, and other enterprise data platforms are available upon request. Files are delivered as Parquet, CSV, XLS, or JSON - time-partitioned in UTC, compressed, and schema-/version-tagged for stability and reproducibility.
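
A minimal sketch of loading one day of a time-partitioned Parquet drop (for example, files synced from the SFTP endpoint) into pandas via pyarrow. The directory layout and column names are assumptions for illustration; each dataset's documentation defines the actual schema and partitioning.

```python
import pyarrow.dataset as ds

# Assumed layout: swaps/date=2024-01-01/part-0000.parquet, partitioned by UTC date.
day = ds.dataset("swaps/date=2024-01-01/", format="parquet")

df = day.to_table(
    columns=["block_number", "pool", "amount0", "amount1", "ref_hash"],  # assumed columns
).to_pandas()
print(df.head())
```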

Can we customize datasets or request new metrics?

Yes. Custom metrics, indicators, and schema extensions can be supported. All changes are versioned and backward-compatible.

Can we try a sample and how is pricing structured?

You can preview data in the catalog and request a 1,000-row sample. Pricing depends on chains, datasets, and delivery options; we’ll scope it quickly after a short call.

Tell us what you need data for.

For any company-related topics:

support@blockdb.io
