No.13385908
Hi /sci/. I'm confused about how information is quantified. Say you have a 4-digit PIN and you remember half the digits. That's like remembering half the code, but you don't remember enough for a 50/50 chance at guessing it: only one of the 100 values between 00 and 99 works, so it's a 1% chance instead, and each newly learned digit bumps you up an order of magnitude, to 10% and then 100%. Contrast that with forgetting all the digits but remembering something that narrows it down to two candidates, which puts your odds at 50%. Maybe you know your password was either 1234 or 4321, or, to take a blunter example, either 0000 or 1111. You remember none of the digits, yet in a sense this is closer to remembering half the password.
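To put rough numbers on that (just my own attempt, in Python, assuming the right measure is log2 of how many equally likely codes remain):

from math import log2

# Total information in a 4-digit PIN: 10^4 equally likely codes.
total_bits = log2(10**4)                 # ~13.29 bits

# Remembering 2 of the 4 digits: 100 codes still possible.
bits_missing_half_digits = log2(100)     # ~6.64 bits still unknown

# Remembering only that it's one of two candidates (e.g. 1234 or 4321).
bits_missing_two_candidates = log2(2)    # 1 bit still unknown

print(total_bits, bits_missing_half_digits, bits_missing_two_candidates)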

So let's put these two kinds of knowledge on equal footing: one person knows all but the last digit (a 10% chance of guessing), while another knows the difference between each digit and the next, just not what any of the digits actually are (again, 10%). Part of my intuition says it should take the same amount of space to store both forms of information, since they guess equally well. But they're not equally easy to complete: the second person could learn any one of the four digits and know the whole code, while the first person needs one specific digit. Most of the information the second person holds would not help the first, whereas even a third of what the first person knows (setting aside shared knowledge like the total number of digits) would be enough for the second. This suggests the quantities of information are in fact unequal, by a basic reductio: if they were equal, then each should need an equally large fraction of the other's knowledge to reach completion. So one of their completions should count as more information than the other, even though they ultimately arrive at the same code.
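Concretely, I picture each person's knowledge as the set of codes they still consider possible. A quick Python sketch of that, assuming the true code is 1234 and taking "difference between digits" to mean each digit is the previous one plus 1 (mod 10), just to have something runnable:

from math import log2
from itertools import product

ALL_PINS = [''.join(p) for p in product('0123456789', repeat=4)]

# Person A: knows the first three digits are 1, 2, 3; last digit unknown.
candidates_a = {pin for pin in ALL_PINS if pin.startswith('123')}

# Person B: knows each digit is the previous one plus 1 (mod 10),
# but not what any digit actually is.
def has_step_of_one(pin):
    return all((int(pin[i + 1]) - int(pin[i])) % 10 == 1 for i in range(3))

candidates_b = {pin for pin in ALL_PINS if has_step_of_one(pin)}

# Both have narrowed 10000 codes down to 10, so both hold the same bits...
print(len(candidates_a), len(candidates_b))        # 10 10
print(log2(len(ALL_PINS) / len(candidates_a)))     # ~9.97 bits each

# ...but the overlap of the two candidate sets is what decides how much one
# person's knowledge helps the other. Here only the true code survives both.
print(candidates_a & candidates_b)                 # {'1234'}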

Which brings me to the core of my confusion and to my question: is there any direct relation between units like bytes and units like possibilities?
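If such a relation exists, my naive guess is that it looks something like this (Python again, assuming a byte is 8 bits and the possibilities are equally likely):

from math import log2

def possibilities_to_bytes(n):
    # How many bytes it takes (fractionally) to single out one of n
    # equally likely options.
    return log2(n) / 8

print(possibilities_to_bytes(256))     # 1.0 -- one byte covers 256 possibilities
print(possibilities_to_bytes(10**4))   # ~1.66 -- a whole 4-digit PIN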