Data Availability Sampling, often called DAS, is a way to check that the data for a new block was actually published, without every participant downloading the whole block. Light clients pull a few random pieces and use probability to decide if the full dataset is available across the network. This keeps verification fast while still giving strong confidence that nothing is being hidden.
Blockchains need a simple way to know that a block producer released all of the block’s transaction data. If some of it is withheld, invalid transactions could slip through undetected, because no one can examine data they never received. DAS addresses this “data availability problem” by letting the network verify that data was published with far lower bandwidth and storage than a full download.
With DAS, block data is first expanded using erasure coding, which adds redundant chunks that let the original data be reconstructed from only a portion of those chunks. Light clients then request a few small, random samples from that expanded set. If the samples arrive and verify, the chance that any data is missing drops quickly as more samples succeed, giving high confidence that the full block data is available.
DAS is especially helpful for light clients that run on laptops and phones. They can check data availability on their own instead of trusting a nearby full node. The same idea also supports rollups, which post block data to a base chain. DAS helps the base chain confirm that the rollup’s data was made public without forcing every node to fetch the entire dataset.
Because nodes only sample tiny pieces, DAS cuts bandwidth and storage needs. More people can run verifying nodes, which spreads participation and reduces reliance on heavyweight infrastructure. Networks stay inclusive while still spotting missing or corrupted data through many independent random checks.
DAS provides probabilistic guarantees, not absolute certainty. Confidence grows with each successful sample and with the number of independent nodes that perform sampling. If a block producer tries to make any part of a block unrecoverable, the erasure coding forces it to withhold a large fraction of the encoded pieces, so random checks run into the gap quickly. The approach trades a small amount of verification work for very high statistical assurance that the full data is out there.
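The statistics behind that assurance are straightforward to sketch. Assuming a 2x erasure-coded extension, the data only becomes unrecoverable if more than half of the extended chunks are withheld, so each uniform random sample then fails with probability at least 1/2, and the chance that k samples all succeed anyway is at most (1/2)^k. The chunk counts and function names below are illustrative, not from any particular protocol.

```python
import random

def false_pass_bound(k: int) -> float:
    """Upper bound on P(all k samples succeed even though more than
    half of the extended chunks were withheld)."""
    return 0.5 ** k

def simulate(total: int = 512, withheld: int = 256, k: int = 30,
             trials: int = 10_000, seed: int = 0) -> float:
    """Empirically estimate one light client's false-pass rate when a
    producer hides `withheld` of `total` extended chunks."""
    rng = random.Random(seed)
    missing = set(range(withheld))  # producer hides the first half
    passes = 0
    for _ in range(trials):
        samples = (rng.randrange(total) for _ in range(k))
        if not any(s in missing for s in samples):
            passes += 1  # every sample happened to land on a published chunk
    return passes / trials
```

With k = 30 samples the bound is below one in a billion, and the simulation essentially never sees a false pass; across many independently sampling clients, the odds of withholding going unnoticed vanish even faster.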
Traditional verification means downloading every byte of a block, which gets harder as blocks grow. DAS avoids that by sampling and still lets light clients verify availability on their own. This changes the trust model from “trust a full node that downloaded everything” to “check enough random pieces to be confident the data was published.”