
Caching Overlay

A caching overlay accelerates access to relatively static or slowly changing content (e.g., videos, images, software downloads). Unlike a traditional CDN, which is operated by a single provider, a decentralized caching overlay harnesses globally distributed nodes, often run by independent participants, to store and serve content closer to users.


Caching Layer Concepts

Local Proximity

The overlay uses proximity-aware discovery (for example, latency or network-distance measurements) to route each request to the nearest cache node, reducing latency and improving load times.
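A minimal sketch of proximity-based selection, assuming the client has already probed a set of candidate nodes for round-trip time; the node names and RTT values are illustrative, not part of any real overlay API:

```python
def nearest_node(rtt_ms: dict[str, float]) -> str:
    """Return the candidate cache node with the lowest measured RTT."""
    return min(rtt_ms, key=rtt_ms.get)

# Hypothetical measurements from a client in Europe:
measurements = {"node-eu": 24.0, "node-us": 95.0, "node-apac": 180.0}
print(nearest_node(measurements))  # node-eu
```

Real overlays typically combine such measurements with geo-IP hints and node load, but the core idea is the same: pick the cheapest reachable replica.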

Distributed Hash Tables (DHTs)

A DHT stores index pointers for each cached resource, allowing clients to quickly discover which peers have the desired content.
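One common way to map content IDs to peers is consistent hashing on a ring: each peer and each content ID hashes to a point, and the content is indexed at the first peer clockwise from its point. This is a simplified sketch (real DHTs such as Kademlia use XOR distance and replicate pointers across several peers):

```python
import hashlib
from bisect import bisect_right

def ring_pos(key: str) -> int:
    """Map a peer ID or content ID to a position on a 32-bit ring."""
    return int(hashlib.sha256(key.encode()).hexdigest(), 16) % 2**32

class Ring:
    def __init__(self, peers: list[str]):
        self.points = sorted((ring_pos(p), p) for p in peers)

    def owner(self, content_id: str) -> str:
        """Return the peer responsible for indexing this content ID."""
        positions = [pos for pos, _ in self.points]
        idx = bisect_right(positions, ring_pos(content_id)) % len(self.points)
        return self.points[idx][1]

ring = Ring(["peer-a", "peer-b", "peer-c"])
print(ring.owner("video-123"))  # deterministic: always the same peer
```

Because the mapping is deterministic, any client can compute which peer to ask without a central directory; adding or removing a peer only remaps the keys adjacent to it on the ring.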

Adaptive Replication

Popular content is replicated more widely; less frequently accessed data has fewer copies. This balances resource use against availability: hot content stays close to demand while cold content consumes little storage.
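A simple policy that captures this idea is to scale the replica count logarithmically with request rate, so popularity spikes widen replication with diminishing returns. The base, cap, and log scaling here are illustrative assumptions, not a prescribed algorithm:

```python
import math

def replica_count(requests_per_hour: float, base: int = 1, cap: int = 32) -> int:
    """Replicas grow logarithmically with popularity, bounded by a cap."""
    if requests_per_hour <= 0:
        return base  # cold content keeps the minimum number of copies
    return min(cap, base + int(math.log2(1 + requests_per_hour)))

print(replica_count(0))     # 1  (cold)
print(replica_count(10))    # 4
print(replica_count(1000))  # 10 (hot, but far below linear growth)
```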

Cache Hierarchies

An edge node forwards a cache miss to a parent cache, and from there to the origin if needed. This design minimizes origin server load while avoiding excessive replication.
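The miss-escalation path can be sketched as a chain of caches, each populating itself on the way back so later requests stop at the edge. Class and function names here are illustrative:

```python
class Cache:
    """A cache node that forwards misses to an optional parent cache."""

    def __init__(self, name: str, parent: "Cache | None" = None):
        self.name, self.parent, self.store = name, parent, {}

    def get(self, key, fetch_origin):
        if key in self.store:
            return self.store[key]                      # hit at this tier
        if self.parent is not None:
            value = self.parent.get(key, fetch_origin)  # escalate the miss
        else:
            value = fetch_origin(key)                   # last resort: origin
        self.store[key] = value                         # fill on the way back
        return value

origin_fetches = []
def fetch_origin(key):
    origin_fetches.append(key)
    return f"data:{key}"

parent = Cache("parent")
edge = Cache("edge", parent=parent)
edge.get("video.mp4", fetch_origin)        # miss at edge and parent -> origin
edge.get("video.mp4", fetch_origin)        # served directly from the edge
print(len(origin_fetches))                 # 1
```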

On-Chain Reputation & Rewards

Nodes earn token incentives proportional to actual content delivered—confirmed via cryptographic proofs or random spot-checks.
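The reward calculation itself can be as simple as a rate applied to verified delivered bytes; the per-GiB rate below is a made-up placeholder, and real systems would only count traffic that passed the cryptographic checks:

```python
def reward(verified_bytes_delivered: int, tokens_per_gib: float = 0.5) -> float:
    """Tokens earned, assuming a flat (hypothetical) rate per GiB verified."""
    return verified_bytes_delivered / 2**30 * tokens_per_gib

print(reward(2**30))      # 0.5 tokens for 1 GiB
print(reward(10 * 2**30)) # 5.0 tokens for 10 GiB
```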

Proof-of-Random-Access

Each cache node can be periodically challenged to prove it holds certain data blocks via Merkle proofs, ensuring data integrity and discouraging free-riding.
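The Merkle-proof check works as follows: the challenger knows only the root hash of the content's blocks, the node responds with the challenged block plus the sibling hashes along its path, and the challenger rehashes up to the root. A minimal sketch, assuming SHA-256 and a binary tree (details such as leaf encoding vary by system):

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(blocks: list[bytes]) -> bytes:
    """Compute the root hash over the content blocks."""
    level = [h(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:                # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def verify(block: bytes, proof: list[tuple[bytes, bool]], root: bytes) -> bool:
    """Rehash from the challenged block up the proof path to the root.

    Each proof step is (sibling_hash, sibling_is_left).
    """
    node = h(block)
    for sibling, is_left in proof:
        node = h(sibling + node) if is_left else h(node + sibling)
    return node == root

# Two-block example: the proof for block 0 is just the hash of block 1.
blocks = [b"block-0", b"block-1"]
root = merkle_root(blocks)
print(verify(b"block-0", [(h(b"block-1"), False)], root))  # True
print(verify(b"forged", [(h(b"block-1"), False)], root))   # False
```

A node that discarded the data cannot answer a fresh random challenge, so periodic spot-checks make free-riding detectable without re-downloading the content.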

Caching Layer Benefits

Content Speedup

Content is served from a node close to the end user, accelerating web experiences. Speedup is the ratio of the time to download the file directly from the origin to the time to download the same file using the overlay.
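The ratio defined above is trivial to compute; the timings here are made-up example numbers:

```python
def speedup(origin_seconds: float, overlay_seconds: float) -> float:
    """Speedup = download time direct from origin / time via the overlay."""
    return origin_seconds / overlay_seconds

# Example: a file that takes 8 s from the origin but 2 s from a nearby cache.
print(speedup(8.0, 2.0))  # 4.0x
```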

Origin Offload

Reduces bandwidth usage and server strain at the origin, making it easier and cheaper to scale.
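Offload is commonly reported as the fraction of bytes the origin did not have to serve; the traffic figures below are illustrative:

```python
def origin_offload(cache_bytes: int, origin_bytes: int) -> float:
    """Fraction of total traffic served by caches instead of the origin."""
    total = cache_bytes + origin_bytes
    return cache_bytes / total if total else 0.0

# Example: caches served 90 GB of a 100 GB total -> 90% offload.
print(origin_offload(90, 10))  # 0.9
```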

Scalability & Resilience

With many small caches distributed globally, a network can handle high traffic volumes without centralized bottlenecks.