Key Features
Silica Protocol
The Silica Protocol defines how message data is propagated, validated, and made available across the chain. Instead of embedding message bytes in blocks, data is organized into erasure-coded sidecars that are anchored to the block via cryptographic commitments.
Silica is responsible for:
Routing messages into respective lanes
Erasure coding each message batch into redundant chunks
Peer-to-peer lane committee gossip
Verifying availability through lane committee voting
Serving data to requesting nodes
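The erasure-coding step above can be sketched with the simplest possible code: split a batch into k data chunks plus one XOR parity chunk, so any k of the k+1 chunks reconstruct the batch. This is an illustration only; Silica's actual code parameters are not specified here, and production systems typically use Reed-Solomon codes that tolerate more than one missing chunk.

```typescript
// Minimal erasure-coding sketch: k data chunks + 1 XOR parity chunk.
// Any k of the k+1 chunks suffice to reconstruct the original batch.
// (Illustrative only -- real deployments use Reed-Solomon-style codes.)
function encode(batch: Uint8Array, k: number): Uint8Array[] {
  const chunkLen = Math.ceil(batch.length / k);
  const chunks: Uint8Array[] = [];
  for (let i = 0; i < k; i++) {
    const c = new Uint8Array(chunkLen); // zero-padded final chunk
    c.set(batch.subarray(i * chunkLen, (i + 1) * chunkLen));
    chunks.push(c);
  }
  const parity = new Uint8Array(chunkLen);
  for (const c of chunks) {
    for (let j = 0; j < chunkLen; j++) parity[j] ^= c[j];
  }
  return [...chunks, parity]; // k data chunks followed by the parity chunk
}

function recoverChunk(present: Uint8Array[], chunkLen: number): Uint8Array {
  // XOR of any k surviving chunks (data or parity) yields the missing one.
  const missing = new Uint8Array(chunkLen);
  for (const c of present) {
    for (let j = 0; j < chunkLen; j++) missing[j] ^= c[j];
  }
  return missing;
}
```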
Silica operates alongside consensus but does not block it. A block can be finalized even while data propagation is still in progress.
Lane-Based Parallelism
The network is divided into parallel lanes. Each lane has a rotating committee of validators assigned to it. When you submit a message:
It gets routed to a specific lane (based on your address)
The lane's committee validators store chunks of your data
Each validator holds only a piece; no single validator has the full message
This distribution model prevents any single validator from becoming a bandwidth bottleneck.
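The address-based routing above can be sketched as a deterministic hash-and-reduce. The lane count and the hash function here are illustrative assumptions; the source only says routing is "based on your address".

```typescript
// Sketch: deterministic address -> lane routing.
// NUM_LANES and the FNV-1a hash are hypothetical choices, not Obsidian's
// actual routing rule.
const NUM_LANES = 16;

function laneFor(address: string): number {
  let h = 0x811c9dc5; // FNV-1a offset basis
  for (const ch of address.toLowerCase()) {
    h ^= ch.charCodeAt(0);
    h = Math.imul(h, 0x01000193) >>> 0; // FNV prime, kept in uint32 range
  }
  return h % NUM_LANES;
}
```

Because the mapping is a pure function of the address, every node independently agrees on which lane (and therefore which committee) handles a given sender's messages.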
Messages fall into two classes:
Priority (PM): bid-based admission (Signed Debit); for time-sensitive data that needs guaranteed fast inclusion
Standard (SM): admission via VDF proof (compute cost); for regular messages with fair access
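The two admission paths can be modeled as a tagged union; the field names below are hypothetical, chosen only to mirror the descriptions above.

```typescript
// Hypothetical model of the two admission paths (field names are
// illustrative, not Obsidian's wire format).
type Admission =
  | { kind: "priority"; bidWei: bigint; signedDebit: string } // PM: bid-based
  | { kind: "standard"; vdfProof: string };                   // SM: VDF proof

function isPriority(a: Admission): boolean {
  return a.kind === "priority";
}
```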
VDF Anti-Spam Protection
Standard Messages require a Verifiable Delay Function proof. VDFs are computations that:
Take a minimum amount of sequential time to compute
Can be verified quickly
Cannot be parallelized or accelerated
This creates a natural rate limit: users must expend real-world time to submit messages, preventing spam floods without requiring monetary fees.
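The "sequential time" property can be illustrated with an iterated hash, where each step depends on the previous output and so cannot be parallelized. Note this toy is NOT a true VDF: verifying it means redoing the work, whereas real VDF constructions (e.g. Wesolowski's or Pietrzak's) allow fast verification, which is the property the protocol relies on.

```typescript
import { createHash } from "node:crypto";

// Toy sequential-work sketch: each hash depends on the previous output,
// so the loop cannot be parallelized or skipped.
// NOT a real VDF -- verification here would repeat the full computation;
// real VDFs (Wesolowski, Pietrzak) admit fast verification.
function delay(seed: string, iterations: number): string {
  let state: Buffer = Buffer.from(seed);
  for (let i = 0; i < iterations; i++) {
    state = createHash("sha256").update(state).digest();
  }
  return state.toString("hex");
}
```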
Data Availability & Permanence
Obsidian's approach: the acceptance criterion for messages is committee attestation. Non-committee nodes trust that attestation, or can optionally sample the data for additional confidence.
Availability Certificates: A valid certificate proves that a supermajority of the lane committee possessed the data at the time of signing.
Erasure Coding: because batches are erasure coded, the full data remains reconstructable from only a subset of the chunks.
Retention Window: Committee members are obligated to serve data for a defined retention window after inclusion. After this window, data transitions to archival nodes.
Full EVM Compatibility
Obsidian runs standard Ethereum tooling:
MetaMask, Rainbow, and all EVM wallets
ethers.js, web3.js, viem
Hardhat, Foundry, Remix
Data Indexers (The Graph, Ponder, etc.)
Smart Contracts (Solidity, Vyper, Huff)
Your Ethereum skills transfer directly.
Sustainable Archive Economics
Obsidian is designed to support sustainable long-term data availability through dedicated archive incentives. This enables:
Sustainable incentives for data preservation
More messages → more fees → more archive nodes
Decentralized historical data availability
Sharded Archive Support
High message throughput creates a storage scaling challenge. At maximum capacity, yearly data growth can reach ~84 TB—far beyond what traditional "store everything" archive nodes can handle.
Obsidian solves this with sharded archives: instead of every archive storing all history, nodes store specific epoch ranges:
Accessible participation: Run an archive with consumer hardware by storing a subset of history
Horizontal scaling: More epoch ranges served by adding shard groups
Redundancy: Multiple nodes per shard group ensure availability
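The epoch-range assignment above can be sketched as a fixed-size partition of history. The range size is a hypothetical parameter; the source does not specify how epochs map to shard groups.

```typescript
// Sketch: map an epoch to the shard group that stores it, assuming
// fixed-size epoch ranges (EPOCHS_PER_SHARD is a hypothetical parameter).
const EPOCHS_PER_SHARD = 1024;

function shardGroupFor(epoch: number): number {
  return Math.floor(epoch / EPOCHS_PER_SHARD);
}

// An archive node advertises the contiguous epoch range it serves:
function rangeOf(group: number): { start: number; end: number } {
  return {
    start: group * EPOCHS_PER_SHARD,
    end: (group + 1) * EPOCHS_PER_SHARD - 1,
  };
}
```

Under this scheme an archive operator picks one or more shard groups to serve, and adding operators for new groups scales total storage horizontally.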
Cryptographic Security
Every message includes:
Signature: Proves sender authenticity
Chain ID: Prevents cross-chain replay
Nonce: Prevents same-chain replay
Payload Commitment: Anchored in the canonical block
All verified at multiple layers (RPC, P2P, consensus).
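The chain-ID and nonce checks above can be sketched as an admission gate. Field names are hypothetical, and signature verification is abstracted behind a callback rather than implemented.

```typescript
// Sketch of the replay-protection checks (field names are illustrative;
// signature verification is abstracted behind `verifySig`).
interface SignedMessage {
  chainId: number;
  nonce: number;
  payloadCommitment: string;
  sender: string;
  signature: string;
}

function admit(
  msg: SignedMessage,
  localChainId: number,
  lastNonce: Map<string, number>,
  verifySig: (m: SignedMessage) => boolean,
): boolean {
  if (msg.chainId !== localChainId) return false; // blocks cross-chain replay
  const prev = lastNonce.get(msg.sender) ?? -1;
  if (msg.nonce <= prev) return false;            // blocks same-chain replay
  if (!verifySig(msg)) return false;              // proves sender authenticity
  lastNonce.set(msg.sender, msg.nonce);
  return true;
}
```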
Availability Certificates
An Availability Certificate contains:
Reference to the lane batch (slot, lane, sequence)
Data commitment (Merkle root of chunks)
Aggregated committee signatures
Signer bitmap (which validators attested)
A valid certificate proves that a supermajority of the lane committee possessed the data at the time of signing. Combined with erasure coding, this guarantees reconstructability.
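The signer-bitmap check can be sketched directly. The 2/3 threshold is an assumption for illustration; the source only says "supermajority", and aggregate-signature verification itself is out of scope here.

```typescript
// Sketch: check a certificate's signer bitmap against a supermajority
// threshold. The 2/3 threshold is an assumption (source says only
// "supermajority"); verifying the aggregated signature is out of scope.
function hasSupermajority(signerBitmap: boolean[], committeeSize: number): boolean {
  const signers = signerBitmap.filter(Boolean).length;
  // Integer comparison avoids floating-point: signers / committeeSize >= 2/3
  return 3 * signers >= 2 * committeeSize;
}
```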
See: Silica Protocol.
These features combine to create a blockchain purpose-built for permanent, accessible, decentralized data.