
Distributed Data Storage and Synchronization: Evolutions in the Blockchain



This architecture reduces the replication load for most users by orders of magnitude, while preserving the critical guarantees of security and verifiability.


The DAS Breakthrough: How It Works in Practice

The real breakthrough is Data Availability Sampling (DAS), introduced by Mustafa Al-Bassam in the LazyLedger paper and now deployed in Celestia. DAS lets light clients verify that a block's data is available without downloading the full block. The typical flow is:

- The block producer erasure-codes the block data (with 2D Reed-Solomon encoding) and commits to the extended shares via Merkle roots in the block header.
- A light client requests a small number of shares at uniformly random coordinates.
- Each returned share is checked against the committed root via a Merkle inclusion proof.
- If every sample succeeds, the client concludes, with probability approaching 1 as the sample count grows, that enough shares are available to reconstruct the full block.
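At the heart of the verification step is an ordinary Merkle inclusion proof. Here is a minimal sketch in Python, assuming a plain binary SHA-256 Merkle tree rather than Celestia's namespaced variant; `verify_share` and the proof layout are illustrative, not Celestia's actual API:

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def verify_share(share: bytes, index: int, proof: list[bytes], root: bytes) -> bool:
    """Check that `share` sits at position `index` in the tree committed
    to by `root`, using the sibling hashes in `proof` (leaf to root)."""
    node = sha256(share)
    for sibling in proof:
        if index % 2 == 0:            # current node is a left child
            node = sha256(node + sibling)
        else:                         # current node is a right child
            node = sha256(sibling + node)
        index //= 2
    return node == root
```

A light client runs this check once per sampled share; a single failed proof is enough to reject the block.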

Celestia, launched in 2023, already produces typical blocks of ~10–20 MiB protected by 2D Reed-Solomon encoding and DAS. Light clients need sustained bandwidth of only around 1–2 Mbps, low enough to run on a smartphone.
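The security of this scheme is purely probabilistic, and the numbers work out heavily in the light client's favor. With a 2D Reed-Solomon extension, a block that cannot be reconstructed must have at least ~25% of its extended shares withheld, so each uniformly random sample hits a withheld share with probability at least 0.25. A back-of-the-envelope sketch (the 25% threshold is the standard 2D-RS parameterization; i.i.d. sampling with replacement is assumed for simplicity):

```python
import math

def detection_probability(withheld_fraction: float, samples: int) -> float:
    """Chance that at least one of `samples` random queries lands on a
    withheld share (i.i.d. sampling with replacement)."""
    return 1.0 - (1.0 - withheld_fraction) ** samples

def samples_needed(withheld_fraction: float, confidence: float) -> int:
    """Smallest sample count whose detection probability meets `confidence`."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - withheld_fraction))

# Detecting an unreconstructable block (>= 25% withheld) with 99.99%
# confidence takes only a few dozen constant-size samples.
print(samples_needed(0.25, 0.9999))  # 33
```

This is why light-client bandwidth stays in the low-Mbps range even as blocks grow: the sample count needed for a fixed confidence level does not depend on block size.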


State Synchronization: Three Main Approaches

Even with data availability solved, nodes must still converge on the same state. Three main strategies prevail today:


Debate: Pros and Cons of Different Approaches

There is lively debate in the ecosystem over which model offers the best balance of security, scalability, and decentralization. Monolithic replication remains the simplest to reason about and offers maximum resilience against censorship and attack, but it demands enormous hardware and bandwidth. Modular designs with DAS promise real scalability and lower operating costs for ordinary participants, but they introduce new attack surfaces and a dependence on data remaining continuously available. Some also fear that separating data availability from execution creates a technical supply chain in which control by data providers becomes a point of friction or, worse, a systemic weakness for overall resilience.

In this debate, transparency and verifiability remain central: execution nodes alone cannot vouch for the accuracy of the entire history without a reliable, available data layer beneath them. Yet this is precisely where modularity earns its keep, balancing security with accessibility: less hardware per participant, more specialization, less data stored permanently by each node, but strong, verifiable proofs of correctness. Adopting DAS also lets developers build flexible execution environments optimized for different use cases, from IoT to enterprise, without compromising the integrity of the ecosystem as a whole.

Other perspectives argue that cohesion across layers may be preferable to ensure regulatory coherence and risk management, especially in a European context where data governance has significant weight. In any case, the evolution is moving quickly: hybrid models that combine foundational security with operational scalability are becoming the norm, with interfaces between consensus nodes, data availability nodes, and transaction executors growing ever clearer.


Conclusion: A New Architecture for Innovation

The era of “immutable history and everyone stores everything” is giving way to an approach to scale that is more sustainable, modular, and verifiable. The shift toward modularity with DAS is not only about performance: it changes how we design, verify, and commercialize blockchain applications and dApps. Distributed architectures that separate consensus, data availability, and execution give startups and developers new opportunities to innovate on robust infrastructure, reducing costs and opening new application possibilities. For those building the future, the modular approach acts as a compass for resilient, scalable projects.

