NIST Develops BB84 Software
The NIST BB84 software (reconciliation and privacy amplification) was developed around 2002, when improvements of three orders of magnitude in quantum channel capacity were in sight, while the existing algorithms ran about six times slower than the existing channels (and hence would run some 6000 times slower than the anticipated ones). Unwilling to field 6000 PCs to process the data, or to follow a regime in which one second of communication would be followed by 1 hour 40 minutes of processing, we sought software and hardware improvements such that the software processing capacity would match or exceed the channel capacity. To that end we modified the existing interactive algorithms (chiefly Cascade) as follows:
1) We would operate on tens of thousands to hundreds of thousands of bits at a time (not the few hundred commonly used). Our experiments showed that big chunks of data tended to be transmitted faster than minuscule ones.
2) We abandoned the idea that the maximum number of privacy-amplified bits should be extracted from each load. We would measure performance as "privacy-amplified bits per second" and would (a) disregard algorithms parsimonious in bits but profligate in time, and (b) discard bits and bit segments that would require valuable time to deliver paltry results.
3) We abandoned the practice of monitoring the error rate and segmenting the data accordingly. Instead, we relied on past data to produce an initial segmentation, tested its goodness (the percentage of non-matching parities), and, if the original segmentation was not good enough, bisected the segments. Once an acceptable segmentation was found, the error rate could be estimated with greater confidence than in previous algorithms.
4) We recognized that, as the data were processed, recognizable subpopulations would form, and that advantages would accrue if such populations were treated separately.
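The bisection in item 3 follows the Cascade-style binary parity search: when a segment's parities disagree, the two parties halve it repeatedly, comparing parities of each half, until the offending bit is found. A minimal sketch (not the NIST code; function names are illustrative) for a segment containing a single error:

```python
def parity(bits):
    """Parity (mod-2 sum) of a list of 0/1 bits."""
    return sum(bits) % 2

def bisect_error(alice, bob):
    """Locate the single differing bit in two equal-length bit lists
    whose overall parities disagree, by Cascade-style bisection.
    Each loop iteration costs one exchanged parity bit."""
    lo, hi = 0, len(alice)
    while hi - lo > 1:
        mid = (lo + hi) // 2
        # Alice announces parity(alice[lo:mid]); Bob compares it
        # with his own half-segment parity.
        if parity(alice[lo:mid]) != parity(bob[lo:mid]):
            hi = mid            # error is in the left half
        else:
            lo = mid            # error is in the right half
    return lo

alice = [0, 1, 1, 0, 1, 0, 1, 1]
bob   = [0, 1, 1, 0, 0, 0, 1, 1]   # differs at index 4
assert bisect_error(alice, bob) == 4
```

Note that each announced parity leaks one bit to an eavesdropper, which is the cost discussed at the end of this article.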
These changes, and a few other, less significant modifications, combined with parallelism and hardware improvements, ensured that our BB84 software would match the speed of the quantum link.
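The privacy-amplification step mentioned above is commonly implemented with 2-universal hashing, for example multiplication by a random Toeplitz matrix that compresses the reconciled key to a shorter final key. The sketch below is a generic illustration of that standard technique, not the NIST implementation; all names and sizes are hypothetical:

```python
import numpy as np

def toeplitz_hash(key_bits, first_col, first_row):
    """Compress an n-bit reconciled key to an m-bit final key by
    multiplying with an m-by-n Toeplitz matrix over GF(2)."""
    n = len(key_bits)           # input (reconciled) key length
    m = len(first_col)          # output (final) key length, m < n
    assert len(first_row) == n and first_row[0] == first_col[0]
    # A Toeplitz matrix is fully determined by its first column and row.
    T = np.empty((m, n), dtype=np.uint8)
    for i in range(m):
        for j in range(n):
            T[i, j] = first_row[j - i] if j >= i else first_col[i - j]
    return (T @ np.asarray(key_bits, dtype=np.uint8)) % 2

rng = np.random.default_rng(0)
sifted = rng.integers(0, 2, 16, dtype=np.uint8)   # toy reconciled key
col = rng.integers(0, 2, 8, dtype=np.uint8)       # random Toeplitz seed
row = rng.integers(0, 2, 16, dtype=np.uint8)
row[0] = col[0]
final = toeplitz_hash(sifted, col, row)
assert len(final) == 8
```

In practice the matrix-vector product over GF(2) is the throughput bottleneck, which is why "privacy-amplified bits per second" is the natural performance measure.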
It should be noted that interactive correction tends to pinpoint the locations of errors and therefore (when probabilistic attacks are launched) leaks more information to Eve than the parity values alone convey. It follows that high-performance non-interactive algorithms are preferable. Such algorithms, based on Low-Density Parity-Check (LDPC) codes, have been tested by BBN.
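The non-interactive approach can be sketched as syndrome decoding: Alice sends a single syndrome H·x of her key, Bob computes H·y of his, and the difference depends only on the error pattern, so no error locations are revealed beyond the syndrome itself. A real LDPC decoder runs belief propagation on a large sparse H; the toy example below (a small dense matrix and single-error correction, both simplifications) only illustrates why one transmission suffices:

```python
import numpy as np

# Small dense parity-check matrix standing in for a sparse LDPC code.
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]], dtype=np.uint8)

def syndrome(H, x):
    """Syndrome H.x over GF(2)."""
    return (H @ x) % 2

alice = np.array([1, 0, 1, 1, 0, 0], dtype=np.uint8)
bob = alice.copy()
bob[2] ^= 1                     # one channel error

# Alice sends syndrome(H, alice) once; Bob XORs it with his own.
# The result equals H @ e (mod 2), where e is the error pattern.
s = syndrome(H, alice) ^ syndrome(H, bob)
# For a single error, s matches the corresponding column of H.
err = next(j for j in range(H.shape[1]) if np.array_equal(H[:, j], s))
bob[err] ^= 1
assert np.array_equal(alice, bob)
```

Unlike Cascade's many small parity exchanges, all disclosed information here is contained in the one syndrome, which makes the leakage easier to account for in privacy amplification.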