Thanks to @General_Effort@lemmy.world for the links!
Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior
Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0
Here’s a link to a preprint: https://arxiv.org/abs/2408.10234
Because it’s a Techspot article, of course they deliberately confuse you as to what “bit” means to get views. https://en.wikipedia.org/wiki/Entropy_(information_theory) seems like a good introduction to what “bit” actually means.
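For anyone who doesn’t want to wade through the Wikipedia article: a bit, in the information-theory sense, is the amount of information in a fair coin flip. A minimal sketch of my own (not from any of the linked articles) showing how entropy in bits is computed:

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy_bits([0.5, 0.5]))   # fair coin flip: 1.0 bit
print(entropy_bits([0.9, 0.1]))   # biased coin: ~0.47 bits
print(entropy_bits([0.25] * 4))   # one of four equally likely symbols: 2.0 bits
```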
Which is exactly what bit means.
Which is not bits, but the equivalent of 1 digit in base 10.
I have no idea how you think this changes anything about what a bit is?
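For what it’s worth, digits and bits measure the same thing up to a constant factor. A quick back-of-the-envelope sketch (my own numbers, not from either article):

```python
import math

# One decimal digit can take 10 equally likely values, so it carries log2(10) bits.
bits_per_decimal_digit = math.log2(10)
print(f"1 decimal digit ≈ {bits_per_decimal_digit:.2f} bits")

# Going the other way: 10 bits of information is roughly 3 decimal digits.
print(f"10 bits ≈ {10 / bits_per_decimal_digit:.2f} decimal digits")
```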
I read ‘tits’ and about died laughing 😭
The external storage data and shannon are both called bits, exactly because they’re both base 2. That does not mean they’re the same. As the article explains it, a shannon is like a question in 20 Questions.
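To make the 20 Questions analogy concrete, here’s a toy guessing-game sketch of my own (not anything from the paper): each well-chosen yes/no answer halves the remaining possibilities, so it is worth one shannon, and 20 of them can pin down one item out of 2^20 ≈ 1,000,000.

```python
# Binary search as "20 Questions": each yes/no answer is worth one shannon,
# because it cuts the remaining possibilities in half.
def questions_needed(n_possibilities):
    questions = 0
    while n_possibilities > 1:
        n_possibilities = (n_possibilities + 1) // 2  # one yes/no answer halves the set
        questions += 1
    return questions

print(questions_needed(2))          # 1 question  -> 1 shannon
print(questions_needed(1_000_000))  # 20 questions, since 2**20 > 1,000,000
```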
Wrong. They are called the same because they are fundamentally the same. That’s how you measure information.
In some contexts, one wants to draw a distinction between the theoretical information content and what is actually stored on a technical device. But that’s a fairly subtle distinction.
I don’t see how that can be a subtle difference. How is a bit of external storage data only subtly different from the information content of an event whose probability of occurring is ½?
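For context on the ½ figure (my own sketch, not the other poster’s argument): the self-information of an outcome with probability p is -log2(p), so a 50/50 outcome carries exactly one shannon, which is also the most a single stored binary digit can carry.

```python
import math

def self_information_bits(p):
    """Surprisal of an outcome with probability p, in bits (shannons): -log2(p)."""
    return -math.log2(p)

print(self_information_bits(0.5))   # 1.0    -> a 50/50 outcome is worth exactly one shannon
print(self_information_bits(0.99))  # ~0.014 -> a near-certain outcome carries almost no information
```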
It’s a bit like asking what the difference is between the letter "A" and ink on a page in the shape of the letter "A". Of course, one would first have to explain why there is usually no difference at all.
BTW, I don’t know what you mean by “external storage data”. The expression doesn’t make sense.
That is a fair criticism.
Did you actually read it?
Because it’s not:
Which is exactly what bit means.
Which is not bits, but the equivalent of 1 digit in base 10.