Comment by basementcat

7 hours ago

A bit is a unit of information-theoretic entropy. Specifically, one bit is defined as the uncertainty of the outcome of a single fair coin flip. A single less-than-fair coin has less than one bit of entropy; a coin that always lands heads up has zero bits; n independent fair coins have n bits of entropy; and so on.
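
For concreteness, the entropy of a coin that lands heads with probability p is given by Shannon's binary entropy function, H(p) = -p log2(p) - (1-p) log2(1-p). Here is a minimal Python sketch of that formula (my illustration, not part of the original comment; the helper name is made up):

    import math

    def binary_entropy(p: float) -> float:
        # Entropy, in bits, of a coin that lands heads with probability p.
        # Hypothetical helper for illustration only.
        if p == 0.0 or p == 1.0:
            return 0.0  # a fully predictable coin carries zero bits
        return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

    print(binary_entropy(0.5))  # 1.0 bit: one fair coin flip
    print(binary_entropy(0.9))  # ~0.469 bits: a less-than-fair coin
    print(binary_entropy(1.0))  # 0.0 bits: a coin that always lands heads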

https://en.m.wikipedia.org/wiki/Information_theory

https://en.m.wikipedia.org/wiki/Entropy_(information_theory)

That is the bit of information theory. It has nothing to do with the computer/digital-engineering term being discussed here.

  • I feel sure this comment would repulse Shannon in the deepest way. A (digital, stored) bit abstractly seeks to encode, and make useful through computation, the properties of information theory.

    Your comment must be sarcasm or satire, surely.