Thanks to @[email protected] for the links!
Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior
Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0
Here’s a link to a preprint: https://arxiv.org/abs/2408.10234
In some contexts, a bit can refer to a boolean variable or flag. In other contexts, it may refer to the voltage at a certain point, or any number of other things. But when you are talking about bits/s, it’s a measure of information in Shannon’s sense.
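To make that sense of “bit” concrete, here’s a minimal sketch (the function name is mine): the information rate of a source is its Shannon entropy in bits per symbol.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss...
print(entropy([0.5, 0.5]))   # → 1.0

# ...while a heavily biased coin carries far less, even though
# both are stored as one boolean "bit" per outcome.
print(entropy([0.99, 0.01]))
```

This is why the storage sense and the information sense come apart: a predictable stream occupies bits without carrying them.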
Yes, but as you know, this implies that the information is already available. You can use that knowledge to create a compression algorithm, or to define a less redundant file format. That’s very practical.
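A quick illustration of that practical point (zlib here is just my example of a compressor that exploits known redundancy): redundant data shrinks dramatically, while random bytes leave the compressor nothing to work with.

```python
import os
import zlib

# Highly redundant input: one sentence repeated a thousand times.
redundant = b"the quick brown fox jumps over the lazy dog. " * 1000

# The same length of incompressible random bytes, for comparison.
noise = os.urandom(len(redundant))

# zlib exploits the repetition; the noise gives it nothing to exploit.
print(len(redundant), "->", len(zlib.compress(redundant, 9)))
print(len(noise), "->", len(zlib.compress(noise, 9)))
```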
We can also be a bit philosophical and ask: How much information does a backup contain? The answer could be: By definition, 0 bits, since given the original, the backup is completely predictable. That’s not a useful answer, which suggests a problem with how the definition is being applied.
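That “zero bits given the original” intuition can actually be demonstrated, my sketch using zlib’s preset-dictionary feature: on its own the backup costs real bits, but conditioned on the original it deflates to almost nothing.

```python
import os
import zlib

original = os.urandom(20_000)   # kept under zlib's 32 KB dictionary window
backup = original               # a byte-identical backup

# On its own, random data is incompressible: roughly 20,000 bytes either way.
alone = zlib.compress(backup, 9)

# With the original supplied as a preset dictionary (side information),
# the backup reduces to a short run of back-references into the dictionary.
comp = zlib.compressobj(level=9, zdict=original)
given_original = comp.compress(backup) + comp.flush()

print(len(alone), len(given_original))

# A receiver who also holds the original recovers the backup exactly.
decomp = zlib.decompressobj(zdict=original)
assert decomp.decompress(given_original) == backup
```

So the “0 bits” answer is really a statement about conditional information; the backup is still very useful, because the condition (having the original) is exactly what a disk failure takes away.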
A more interesting question might be: How much information does a file contain if it stores the first 1 million digits of the number π?
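One way to answer that is the Kolmogorov-complexity view: the file’s information content is at most the length of the shortest program that prints it, and for π that program is tiny. A sketch using Gibbons’ unbounded spigot algorithm (a few hundred bytes of code that, given enough time, emits as many digits as you like):

```python
def pi_digits(n):
    """First n decimal digits of pi via Gibbons' unbounded spigot algorithm."""
    q, r, t, k, m, x = 1, 0, 1, 1, 3, 3
    out = []
    while len(out) < n:
        if 4 * q + r - t < m * t:
            out.append(m)  # the next digit is settled
            q, r, m = 10 * q, 10 * (r - m * t), (10 * (3 * q + r)) // t - 10 * m
        else:
            q, r, t, k, m, x = (q * k, (2 * q + r) * x, t * x, k + 1,
                                (q * (7 * k + 2) + r * x) // (t * x), x + 2)
    return out

print(pi_digits(10))  # → [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]
```

Since such a generator fits in well under a kilobyte, the million-digit file’s algorithmic information content is tiny, even though the digits look statistically random and a generic compressor like gzip barely shrinks them at all.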