• VoterFrog@lemmy.world
    link
    fedilink
    English
    arrow-up
    13
    ·
    10 hours ago

    ITT: A bunch of people who have never heard of information theory suddenly have very strong feelings about it.

    • General_Effort@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      ·
      9 hours ago

      If they had heard of it, we’d probably get statements like: “It’s just statistics.” or “It’s not information. It’s just a probability.”

    • w3dd1e@lemm.ee
      link
      fedilink
      English
      arrow-up
      3
      ·
      2 hours ago

      Also, supposing it did, I’m quite sure that everyone’s brain would function at a different rate. And how do you even measure people who don’t have an internal monologue? Seems like there is a lot missing here.

    • nelly_man@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      ·
      2 hours ago

      Bit in this context refers to the Shannon from information theory. 1 bit of information (that is, 1 shannon) is the amount of information you receive from observing an event with a 50% chance of occurring. 10 bits would be equivalent to the amount of information learned from observing an event with about a 0.1% chance of occurring. So 10 bits in this context is actually not that small of a number.
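
      To make that concrete, here’s a tiny Python sketch (my own illustration, not anything from the paper) of the standard formula: the information in an observation is minus log base 2 of its probability.

      ```python
      import math

      def self_information_bits(probability: float) -> float:
          """Shannon information (in bits/shannons) of observing an event with this probability."""
          return -math.log2(probability)

      print(self_information_bits(0.5))       # 1.0 bit: a 50/50 event
      print(self_information_bits(1 / 1024))  # 10.0 bits: roughly a 0.1% chance event
      ```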

    • FooBarrington@lemmy.world
      link
      fedilink
      English
      arrow-up
      7
      arrow-down
      8
      ·
      10 hours ago

      I also don’t have 10 fingers. That doesn’t make any sense - my hands are not numbers!

      Ooooor “bits” has a meaning beyond what you assume, but it’s probably just science that’s stupid.

        • FooBarrington@lemmy.world
          link
          fedilink
          English
          arrow-up
          10
          arrow-down
          5
          ·
          edit-2
          9 hours ago

          You say “we don’t think in bits because our brains function nothing like computers”, but bits aren’t strictly related to computers. Bits are about information. And since our brains are machines that process information, bits are also applicable to those processes.

          To show this, I chose an analogy. We say that people have 10 fingers, yet our hands have nothing to do with numbers. That’s because the concept of “10” is applicable both to math and topics that math can describe, just like “bits” are applicable both to information theory and topics that information theory can describe.

          For the record: I didn’t downvote you, it was a fair question to ask.

          I also thought about a better analogy - imagine someone tells you they measured the temperature of a distant star, and you say “that’s stupid, you can’t get a thermometer to a star and read the measurement, you’d die”, just because you don’t know how one could measure it.

          • renegadespork@lemmy.jelliefrontier.net
            link
            fedilink
            English
            arrow-up
            6
            arrow-down
            1
            ·
            6 hours ago

            Bits are binary digits used by digital computers. Human brains are constantly changing chemical systems that don’t “process” binary bits of information, so it makes no sense as a metric.

            imagine someone tells you they measured the temperature of a distant star, and you say “that’s stupid, you can’t get a thermometer to a star and read the measurement, you’d die”, just because you don’t know how one could measure it.

            It’s not about how you measure it; it’s about using a unit system that doesn’t apply. It’s more like trying to calculate how much a star costs in USD.

            • scratchee@feddit.uk
              link
              fedilink
              English
              arrow-up
              1
              ·
              9 minutes ago

              Bits are also a unit of information from information theory. In that context they are relevant for anything that processes information, regardless of methodology, you can convert analogue signals into bits just fine.
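
              As a toy illustration of that last point (my own sketch, nothing from the article): sampling and quantizing an analog signal turns it into a stream of bits.

              ```python
              import math

              def quantize(samples, bits=8, lo=-1.0, hi=1.0):
                  """Map each analog sample to one of 2**bits discrete levels (a crude ADC)."""
                  levels = 2 ** bits
                  step = (hi - lo) / levels
                  return [min(int((s - lo) / step), levels - 1) for s in samples]

              # One cycle of a 1 kHz sine sampled at 8 kHz, quantized to 8 bits per sample:
              analog = [math.sin(2 * math.pi * 1000 * t / 8000) for t in range(8)]
              print(quantize(analog))  # eight integers in 0..255, i.e. 8 bits each
              ```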

    • Buffalox@lemmy.world
      link
      fedilink
      English
      arrow-up
      8
      arrow-down
      2
      ·
      19 hours ago

      Base 2 gives the unit of bits

      Which is exactly what bit means.

      base 10 gives units of “dits”

      Which is not bits, but the equivalent of one digit in base 10.

      I have no idea how you think this changes anything about what a bit is?

      • Aatube@kbin.melroy.org
        link
        fedilink
        arrow-up
        7
        arrow-down
        2
        ·
        18 hours ago

        The external storage data and shannon are both called bits, exactly because they’re both base 2. That does not mean they’re the same. As the article explains it, a shannon is like a question from 20 questions.

        • General_Effort@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          12 hours ago

          Wrong. They are called the same because they are fundamentally the same. That’s how you measure information.

          In some contexts, one wants to make a difference between the theoretical information content and what is actually stored on a technical device. But that’s a fairly subtle thing.

          • typeswithpenis@lemmynsfw.com
            link
            fedilink
            English
            arrow-up
            3
            ·
            9 hours ago

            A bit in the data sense is just an element of the set of booleans. A bit in the entropy sense is the amount of information revealed by an observation with two equally probable outcomes. These are not the same thing because the amount of information contained in a bit is not always equal to one bit of entropy. For example, if a boolean is known to be 0, then the amount of information it contains is 0 bits. If it is known that the boolean is equally 0 or 1, then the information content is 1 bit. It depends on the prior probability distribution.
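
            A minimal Python sketch of that dependence on the prior (my own example): the binary entropy function gives the bits of information a boolean carries.

            ```python
            import math

            def binary_entropy(p: float) -> float:
                """Entropy in bits (shannons) of a boolean that is 1 with probability p."""
                if p in (0.0, 1.0):
                    return 0.0
                return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

            print(binary_entropy(0.5))  # 1.0   -> a maximally uncertain boolean carries a full bit
            print(binary_entropy(1.0))  # 0.0   -> a boolean known in advance carries nothing
            print(binary_entropy(0.9))  # ~0.47 -> a biased boolean carries less than 1 bit
            ```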

          • Aatube@kbin.melroy.org
            link
            fedilink
            arrow-up
            2
            ·
            12 hours ago

            I don’t see how that can be a subtle difference. How is a bit of external storage data only subtly different from the information content of an event whose probability of occurring is ½?

            • General_Effort@lemmy.world
              link
              fedilink
              English
              arrow-up
              1
              ·
              11 hours ago

              It’s a bit like asking what is the difference between the letter “A” and ink on a page in the shape of the letter “A”. Of course, first one would have to explain how they are usually not different at all.

              BTW, I don’t know what you mean by “external storage data”. The expression doesn’t make sense.

      • Buffalox@lemmy.world
        link
        fedilink
        English
        arrow-up
        7
        arrow-down
        1
        ·
        19 hours ago

        Did you actually read it?
        Because it’s not:

        Base 2 gives the unit of bits

        Which is exactly what bit means.

        base 10 gives units of “dits”

        Which is not bits, but the equivalent of one digit in base 10.

  • Buffalox@lemmy.world
    link
    fedilink
    English
    arrow-up
    51
    arrow-down
    6
    ·
    21 hours ago

    Bullshit. Just reading this and comprehending it, which is thought, far exceeds 10 bits per second.
    Speaking, which is conveying thought, also far exceeds 10 bits per second.

    This piece is garbage.

    • GamingChairModel@lemmy.world
      link
      fedilink
      English
      arrow-up
      12
      ·
      edit-2
      14 hours ago

      Speaking, which is conveying thought, also far exceeds 10 bits per second.

      There was a study in 2019 that analyzed 17 different spoken languages and found that languages with a lower information density (bits of information per syllable) tend to be spoken faster, in a way that keeps the information rate roughly the same across spoken languages, at about 39 bits per second.

      Of course, it could be that the actual ideas and information in that speech is inefficiently encoded so that the actual bits of entropy are being communicated slower than 39 per second. I’m curious to know what the underlying Caltech paper linked says about language processing, since the press release describes deriving the 10 bits from studies analyzing how people read and write (as well as studies of people playing video games or solving Rubik’s cubes). Are they including the additional overhead of processing that information into new knowledge or insights? Are they defining the entropy of human language with a higher implied compression ratio?

      EDIT: I read the preprint, available here. It purports to measure externally measurable output of human behavior. That’s an important limitation in that it’s not trying to measure internal richness in unobserved thought.

      So it analyzes people performing external tasks, including typing and speech with an assumed entropy of about 5 bits per English word. A 120 wpm typing speed therefore translates to 600 bits per minute, or 10 bits per second. A 160 wpm speaking speed translates to 13 bits/s.
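
      To spell out that arithmetic, here’s a quick sketch (my own, using the ~5 bits per English word the preprint borrows from earlier work):

      ```python
      BITS_PER_WORD = 5  # assumed entropy per English word, per the preprint

      def wpm_to_bits_per_second(words_per_minute: float) -> float:
          return words_per_minute * BITS_PER_WORD / 60

      print(wpm_to_bits_per_second(120))  # typing at 120 wpm   -> 10.0 bits/s
      print(wpm_to_bits_per_second(160))  # speaking at 160 wpm -> ~13.3 bits/s
      ```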

      The calculated bits of information are especially interesting for the other tasks (blindfolded Rubik’s cube solving, memory contests).

      It also explicitly cited the 39 bits/s study that I linked as being within the general range, because the actual meat of the paper is analyzing how the human brain brings 10^9 bits of sensory perception down 9 orders of magnitude. If it turns out to be 8.5 orders of magnitude, that doesn’t really change the result.

      There’s also a whole section addressing criticisms of the 10 bit/s number. It argues that claims of photographic memory tend to break down, on closer inspection, into longer periods of study (e.g., a 45-minute flyover of Rome to recognize and recreate 1000 buildings of 1000 architectural styles translates into 4 bits/s of memorization). And it argues that the human brain tends to trick itself into perceiving a much higher complexity than it is actually processing (known as “subjective inflation”), implicitly arguing that a lot of that is actually lossy compression that fills in fake details from what it assumes is consistent with the portions actually perceived, and that the observed bitrate from other experiments might not properly categorize the bits of entropy involved in less accurate shortcuts taken by the brain.

      I still think visual processing seems to be faster than 10, but I’m now persuaded that it’s within an order of magnitude.

      • RustyEarthfire@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        12 hours ago

        Thanks for the link and breakdown.

        It sounds like a better description of the estimated thinking speed would be 5-50 bits per second. And when summarizing capacity/capability, one generally uses a number near the top end. It makes far more sense to say we are capable of 50 bps but often use less, than to say we are only capable of 10 but sometimes do more than we are capable of doing. And the paper leans hard into 10 bps being an internally imposed limit rather than conditional, going so far as to say a neural-computer interface would be limited to this rate.

        “Thinking speed” is also a poor description for input/output measurement, akin to calling a monitor’s bitrate the computer’s FLOPS.

        Visual processing is multi-faceted. I definitely don’t think all of vision can be reduced to 50bps, but maybe the serial part after the parallel bits have done stuff like detecting lines, arcs, textures, areas of contrast, etc.

      • Buffalox@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        ·
        13 hours ago

        with an assumed entropy of about 5 bits per English word. A 120 wpm typing speed therefore translates to 600 bits per minute, or 10 bits per second. A 160 wpm speaking speed translates to 13 bits/s.

        The problem here is that the bits of information need to be clearly defined; otherwise we are not talking about actually quantifiable information. Normally a bit can only have 2 values; here they are talking about a very different kind of bit, which AFAIK is not a specific quantity.

        the human brain tends to trick itself into perceiving a much higher complexity than it is actually processing

        This is of course a thing.

        • GamingChairModel@lemmy.world
          link
          fedilink
          English
          arrow-up
          3
          ·
          12 hours ago

          The problem here is that the bits of information need to be clearly defined; otherwise we are not talking about actually quantifiable information

          here they are talking about a very different kind of bit

          I think everyone agrees on the definition of a bit (a binary two-value variable), but the active area of debate is which pieces of information actually matter. If information can be losslessly compressed into smaller representations of that same information, then the smaller compressed size represents the informational complexity in bits.

          The paper itself describes the information that can be recorded but ultimately discarded as not relevant: for typing, the forcefulness of each key press or duration of each key press don’t matter (but that exact same data might matter for analyzing someone playing the piano). So in terms of complexity theory, they’ve settled on 5 bits per English word and just refer to other prior papers that have attempted to quantify the information complexity of English.
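
          If you want a crude feel for that kind of estimate, here’s a sketch (my own, not the paper’s method) that upper-bounds bits per word via lossless compression. The file name is just a placeholder for any large English text sample.

          ```python
          import zlib

          def compressed_bits_per_word(text: str) -> float:
              """Upper-bound the information content of English text by lossless compression:
              compressed size in bits divided by the number of words."""
              compressed = zlib.compress(text.encode("utf-8"), 9)
              return len(compressed) * 8 / len(text.split())

          # Usage (hypothetical file; small inputs are dominated by compressor overhead
          # and will read high, and better compressors/models push the number lower):
          #   print(compressed_bits_per_word(open("corpus.txt").read()))
          ```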

    • scarabic@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      ·
      12 hours ago

      Right? They do nothing to expand upon what this plainly wrong claim is supposed to actually mean. Goddamn scientists need a marketing department of their own, because the media sphere in general sells their goods however the fuck they feel like packaging them.

    • meyotch@slrpnk.net
      link
      fedilink
      English
      arrow-up
      5
      ·
      19 hours ago

      You may be misunderstanding the bit measure here. It’s not ten bits of stored data, basically a single byte. It’s ten binary yes/no decisions, enough to pick out one of 1024 distinct possibilities.

      The measure comes from information theory but it is easy to confuse it with other uses of ‘bits’.

        • meyotch@slrpnk.net
          link
          fedilink
          English
          arrow-up
          4
          ·
          16 hours ago

          Only when you are framing it in terms of information entropy. I think many of those misunderstanding the study are thinking of bits as part of a standard byte. It’s a subtle distinction but that’s where I think the disconnect is

          • Buffalox@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            ·
            edit-2
            13 hours ago

            Yes, the study is probably fine; it’s the article that fails to clarify, before using the term, that they are not talking about bits the way bits are normally understood.

        • credo@lemmy.world
          link
          fedilink
          English
          arrow-up
          3
          ·
          edit-2
          17 hours ago

          I think we understand that a computer can read this text far faster than any of us. That is not the same as conscious thought, though; it’s simply following an algorithm of yes/no decisions.

          I’m not arguing with anything here, just pointing out the difference in what CPUs do and what human brains do.

          • Buffalox@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            arrow-down
            1
            ·
            17 hours ago

            a computer can read this text far faster than any of us.

            I think you missed the comprehend part.

            • credo@lemmy.world
              link
              fedilink
              English
              arrow-up
              1
              arrow-down
              5
              ·
              edit-2
              16 hours ago

              No I didn’t. Feed a book into ChatGPT. You will see what fast comprehension is. I think you missed the consciousness part.

              Stop being an ass.

              Edit: The average person knows approximately 15-20,000 words. This is between 14 and 15 bits minimum to address every word independently. But I’m no brainologist, and I don’t know that’s how processing speech actually works. This is all just for comparison to bitwise operations.
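
              Back-of-the-envelope check of those numbers (my own arithmetic, just for illustration):

              ```python
              import math

              # Bits needed to index every word in a vocabulary of a given size:
              for vocab_size in (15_000, 20_000):
                  print(vocab_size, math.ceil(math.log2(vocab_size)))
              # 15000 -> 14 bits, 20000 -> 15 bits
              ```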

              • deranger@sh.itjust.works
                link
                fedilink
                English
                arrow-up
                1
                ·
                13 hours ago

                comprehension - the action or capability of understanding something.

                LLMs don’t understand anything they read.

    • sugar_in_your_tea@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      3
      ·
      18 hours ago

      You’re misunderstanding the terminology used then.

      In information theory, “bit” doesn’t mean “bitrate” like you’d see in networks, but something closer to “compressed bitrate.”

      For example, let’s say I build a computer that only computes small sums, where the input is two positive numbers from 0-127. However, this computer only understands spoken French, and it will ignore anything that’s not a French number in that range. Information theory would say this machine receives 14 bits of information (two 7-bit numbers) and returns 8 bits. The extra processing of understanding French is waste and ignored for the purposes of calculating entropy.

      The article also mentions that our brains take in billions of bits of sensory data, but that’s ignored for the calculation because we only care about the thought process (the useful computation), not all of the overhead of the system.
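
      A quick sketch of how those numbers are counted (my own illustration; like the rough figure above, it treats the output range as if it were uniform):

      ```python
      import math

      def bits_for_uniform_choice(n_outcomes: int) -> float:
          """Information in bits carried by one value drawn uniformly from n outcomes."""
          return math.log2(n_outcomes)

      input_bits = 2 * bits_for_uniform_choice(128)  # two numbers in 0..127 -> 14.0 bits
      output_bits = bits_for_uniform_choice(255)     # sums span 0..254      -> ~8 bits
      print(input_bits, output_bits)
      ```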

      • Buffalox@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        arrow-down
        1
        ·
        17 hours ago

        I think I was pretty clear about the understanding or comprehension part, which is not merely input/output.

    • Imgonnatrythis@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      5
      arrow-down
      2
      ·
      edit-2
      17 hours ago

      Indeed it is. If you want to illustrate the point that silicon and copper are faster than bioelectric lumps of fat, there are lots of ways to do it and it’s no contest, but this is not a well-done study.

    • tabular@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      arrow-down
      2
      ·
      edit-2
      16 hours ago

      Try to read this next part without understanding it. If you know English, you will find it impossible NOT to find meaning in these letters displayed in a row. That’s more like subconscious processing. If you’re learning to read English, then there’s likely an active “thought” appearing in your experience. See a difference?

      • Buffalox@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        ·
        19 hours ago

        I absolutely do, which is why I concentrated on the THOUGHT part, as in understanding. You obviously can’t have understanding without thought. That’s the difference between data and information.
        Please, I have 40 years of experience with philosophical issues regarding intelligence and consciousness, also from a programmer’s perspective.

        • tabular@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          19 hours ago

          Sorry if my replies are annoying, I merely find this subject interesting. Feel free to ignore this.

          It is not obvious to me why a being couldn’t have an “understanding” without a “thought”. I do not believe it’s possible to verify whether a creature has a subjective experience, but an “understanding” of the environment could be attributed to how a being performs in an environment (a crow gets food that was just out of reach inside a tube of water by adding rocks to raise the water level). I have some experience in game-dev programming and a limited understanding of consciousness as a personal interest, if that’s helpful.

          • Buffalox@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            18 hours ago

            Oh no this is not annoying, this is a very interesting question.
            I suppose with the crow, it doesn’t need to understand the volume of water and rocks displacing it, but merely has a more basic understanding that adding rocks raises the water, or maybe even just makes the food easier to get at.
            So I suppose we can agree that there are multiple levels of understanding.
            But still, the crow must have observed this, unless it actually figured it out? And some thought process must have led it to believe that dropping stones in the water might have the desired effect.
            Now even if the crow had observed another crow doing this and seen it demonstrated, it must have had a thought process concluding that it could try this too, and that perhaps it would work.

            But there are other situations that are more challenging IMO, and that’s with LLMs: how do we decide thought and understanding with those?
            LLMs are extremely stupid and clever at the same time, with loads of examples of them not understanding the simplest things, like how many R’s are in “strawberry”, the AI stubbornly answering that there are only 2! But on the other hand they’re able to spell it out and count the letters, then realize that there are indeed 3, which they previously denied.

            IMO animal studies are crucial to understanding our own intelligence, because the principle is the same, but animals are a simpler “model” of it, so to speak.
            It seems to me that thought is a requirement for understanding. You think about something before you understand it.
            Without the thought process it would have to be instinctive. But I don’t think it can be argued that crows dropping rocks in water is instinctive.
            But even instinctive understanding is a kind of understanding, it’s just not by our consciousness, but by certain behavior traits having an evolutionary advantage, causing that behavior to become more common.

            So you are absolutely right that thought is not always required for some sort of “understanding”, which is a good point.
            But what I meant was conscious understanding, as in really understanding a concept and, for humans, understanding abstract terms; for that type of understanding, thought is definitely a requirement.

      • conciselyverbose@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        1
        ·
        20 hours ago

        Understanding it is active thought. And processing the words, as words with meaning, is required to formulate a relevant response.

        The more-than-10 bits that each word carries are part of your active thought.

        • tabular@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          edit-2
          19 hours ago

          I think we may disagree on term definitions.

          I perceive “active thought” when trying to decipher parts of a sentence I do not already have an understanding of. If I already understand a part then no active thought is perceived by me - like driving a car when nothing eventful is happening. [Note: I don’t believe I have 100% accurate perception of my own subjective experience. Trying to focus on subjective experience at all instead of constantly being “lost in thought” is very short lived]

      • Buffalox@lemmy.world
        link
        fedilink
        English
        arrow-up
        11
        ·
        edit-2
        19 hours ago

        That doesn’t really matter, because 1 bit merely distinguishes between 1 and 0, or some other two-valued choice.
        Just reading a single word, you pick it out of the roughly 30,000 words you know. That’s about 15 bits of information comprehended.
        Don’t tell me you take more than 1.5 seconds to read and comprehend one word.

        Without having it as text, free thought is CLEARLY much faster, and the complexity of abstract thinking would move the number way up.
        One thought is not 1 bit; it can be thousands of bits.

        BTW the mind has insane levels of compression. For instance, if you think “bicycle”, it’s a concept that covers many parts. You don’t have to think about every part; you know it has a handlebar, frame, pedals and wheels. You also know its purpose, size, weight, range of speed and many other more or less relevant details. Just thinking “bicycle” is easily way more than 10 bits worth of information. But they are “compressed” to only the parts relevant to the context.

        Reading and understanding one word is not just understanding a word, but also understanding a concept and putting it into context. I’m not sure how to quantify that, but to quantify it as 1 bit is so horrendously wrong I find it hard to understand how this can in any way be considered scientific.
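
        Rough arithmetic behind those numbers (my own sketch; it treats each word as an independent, equally likely pick, which overstates the entropy of connected text):

        ```python
        import math

        bits_per_word = math.log2(30_000)  # ~14.9 bits to pick one word out of ~30,000
        print(bits_per_word)

        reading_speed_wpm = 200            # a comfortable reading speed (assumed)
        print(bits_per_word * reading_speed_wpm / 60)  # ~50 bits/s by this word-identity measure
        ```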

        • Flying Squid@lemmy.worldOP
          link
          fedilink
          English
          arrow-up
          4
          arrow-down
          3
          ·
          21 hours ago

          You are confusing input with throughput. They agree that the input is much greater. It’s the throughput that is so slow. Here’s the abstract:

          This article is about the neural conundrum behind the slowness of human behavior. The information throughput of a human being is about 10 bits/s. In comparison, our sensory systems gather data at ∼10^9 bits/s. The stark contrast between these numbers remains unexplained and touches on fundamental aspects of brain function: what neural substrate sets this speed limit on the pace of our existence? Why does the brain need billions of neurons to process 10 bits/s? Why can we only think about one thing at a time? The brain seems to operate in two distinct modes: the “outer” brain handles fast high-dimensional sensory and motor signals, whereas the “inner” brain processes the reduced few bits needed to control behavior. Plausible explanations exist for the large neuron numbers in the outer brain, but not for the inner brain, and we propose new research directions to remedy this.

          • dosaki@lemmy.world
            link
            fedilink
            English
            arrow-up
            8
            ·
            21 hours ago

            Why can we only think about one thing at a time?

            Someone tell that to the random tab in my brain who keeps playing music

          • conciselyverbose@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            4
            arrow-down
            1
            ·
            edit-2
            21 hours ago

            He’s not.

            Executive function has limited capacity, but executive function isn’t your brain (and there’s no reasonable definition that limits it to anything as absurd as 10 bits). Your visual center is processing all those bits that enter the eyes. All the time. You don’t retain all of it, but retaining any of it necessarily requires processing a huge chunk of it.

            Literally just understanding the concept of car when you see one is much more than 10 bits of information.

            • Flying Squid@lemmy.worldOP
              link
              fedilink
              English
              arrow-up
              6
              arrow-down
              4
              ·
              21 hours ago

              I think that we are all speaking without being able to read the paper (and in my case, I know I wouldn’t understand it), so I think dismissing it outright without knowing how they are defining things or measuring them is not really the best course here.

              I would suggest that Caltech studies don’t tend to be poorly-done.

              • conciselyverbose@sh.itjust.works
                link
                fedilink
                English
                arrow-up
                3
                arrow-down
                4
                ·
                edit-2
                20 hours ago

                There is literally nothing the paper could say and no evidence they could provide to make the assertion in the title anything less than laughable.

                There are hundreds of systems in your brain that are actively processing many, many orders of magnitude more than ten bits of information per second all the time. We can literally watch them do so.

                It’s possible the headline is a lie by someone who doesn’t understand the research. It’s not remotely within the realm of plausibility that it resembles reality in any way.

                • Flying Squid@lemmy.worldOP
                  link
                  fedilink
                  English
                  arrow-up
                  7
                  arrow-down
                  2
                  ·
                  20 hours ago

                  There is literally nothing the paper could say and no evidence they could provide to make the assertion in the title anything less than laughable.

                  That is quite the claim from someone who has apparently not even read the abstract of the paper. I pasted it in the thread.

          • Buffalox@lemmy.world
            link
            fedilink
            English
            arrow-up
            3
            arrow-down
            2
            ·
            edit-2
            19 hours ago

            You are confusing input with throughput.

            No I’m not, I read that part. Input is for instance hearing a sound wave, which the brain can process at amazing speed, separating a multitude of simultaneous sounds and translating them into meaningful information. Be it music, speech, or a noise that shouldn’t be there. It’s true that this part is easier to measure, as we can do something similar, although not nearly as well, on computers: we can determine not only the content of sounds, but also extrapolate from it in real time. The sound may only be about 2x22k bits, but the processing required is way higher. And that’s even more obviously way, way above 10 bits per second.
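
            For scale, a quick sketch of the raw data rate of CD-quality audio (my numbers, not anything from the paper):

            ```python
            channels = 2
            sample_rate_hz = 44_100   # samples per second per channel
            bits_per_sample = 16
            print(channels * sample_rate_hz * bits_per_sample)  # 1,411,200 bits/s, ~1.4 Mbit/s
            ```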

            This is a very complex function that requires loads of processing, and the brain can distinguish, with microsecond precision, when a sound reaches each ear, to determine direction.
            The same is the case with vision, which, although not at all at the resolution we think it is, also requires massive processing to interpret into something meaningful.

            Now the weird thing is: why in the world do they think consciousness, which is even MORE complex, should operate at a lower speed? That idea is outright moronic!!!

            Edit:

            Changed nanosecond to microsecond.

            • Flying Squid@lemmy.worldOP
              link
              fedilink
              English
              arrow-up
              4
              arrow-down
              3
              ·
              21 hours ago

              As I suggested to someone else, without any of us actually reading the paper, and I know I do not have the requisite knowledge to understand it if I did, dismissing it with words like “moronic” is not warranted. And as I also suggested, I don’t think such a word can generally be applied to Caltech studies. They have a pretty solid reputation as far as I know.

              • Buffalox@lemmy.world
                link
                fedilink
                English
                arrow-up
                3
                arrow-down
                1
                ·
                edit-2
                19 hours ago

                I’m not fucking reading a paper with such ridiculous claims. I gave it a chance, but it simply isn’t worth it. And I understand their claims and argumentation perfectly; they simply don’t have a clue about the things they make claims about.
                I’ve been investigating and researching these issues for 40 years with an approach grounded in scientific evidence, so please piss off with your claims of me not understanding it.

                • Aatube@kbin.melroy.org
                  link
                  fedilink
                  arrow-up
                  2
                  arrow-down
                  1
                  ·
                  19 hours ago

                  What is your realm of research? How have you represented abstract thought by digital storage instead of information content?

                • Flying Squid@lemmy.worldOP
                  link
                  fedilink
                  English
                  arrow-up
                  4
                  arrow-down
                  3
                  ·
                  19 hours ago

                  Without evaluating the data or methodology, I would say that the chance you gave it was not a fair one. Especially since you decided to label it “moronic.” That’s quite a claim.

  • Australis13@fedia.io
    link
    fedilink
    arrow-up
    14
    ·
    20 hours ago

    Some parts of the paper are available here: https://www.sciencedirect.com/science/article/abs/pii/S0896627324008080?via%3Dihub

    It doesn’t look like these “bits” are binary, but “pieces of information” (which I find a bit misleading):

    “Quick, think of a thing… Now I’ll guess that thing by asking you yes/no questions.” The game “Twenty Questions” has been popular for centuries as a thinking challenge. If the questions are properly designed, each will reveal 1 bit of information about the mystery thing. If the guesser wins routinely, this suggests that the thinker can access about a million possible items in the few seconds allotted. Therefore, the speed of thinking—with no constraints imposed—corresponds to 20 bits of information over a few seconds: a rate of 10 bits/s or less.
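
    To spell out the arithmetic in that passage (my own check):

    ```python
    import math

    print(2 ** 20)               # 1,048,576 -> twenty yes/no questions distinguish about a million items
    print(math.log2(1_000_000))  # ~19.93    -> a million equally likely items is ~20 bits
    ```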

    The authors do draw a distinction between the sensory processing and cognition/decision-making, at least:

    To reiterate: human behaviors, including motor function, perception, and cognition, operate at a speed limit of 10 bit/s. At the same time, single neurons can transmit information at that same rate or faster. Furthermore, some portions of our brain, such as the peripheral sensory regions, clearly process information dramatically faster.

    • scarabic@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      ·
      edit-2
      12 hours ago

      So ten concepts per second? Ten ideas per second? This sounds a little more reasonable. I guess you have to read the word “bit” like you’re British, and it just means “part.” Of course this is still miserably badly defined.

      • conciselyverbose@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        17
        arrow-down
        3
        ·
        20 hours ago

        There is no other definition of bit that is valid in a scientific context. Bit literally means “binary digit”.

        Information theory, using bits, is applied to the workings of the brain all the time.

        • Flying Squid@lemmy.worldOP
          link
          fedilink
          English
          arrow-up
          3
          arrow-down
          12
          ·
          20 hours ago

          How do you know there is no other definition of bit that is valid in a scientific context? Are you saying a word can’t have a different meaning in a different field of science?

            • Flying Squid@lemmy.worldOP
              link
              fedilink
              English
              arrow-up
              3
              arrow-down
              11
              ·
              20 hours ago

              Actual neuroscientists define their terms in their papers. Like the one you refuse to read because you’ve already decided it’s wrong.

              • conciselyverbose@sh.itjust.works
                link
                fedilink
                English
                arrow-up
                8
                arrow-down
                3
                ·
                19 hours ago

                Actual neuroscientists do not create false definitions for well defined terms. And they absolutely do not need to define basic, unambiguous terminology to be able to use it.

      • Australis13@fedia.io
        link
        fedilink
        arrow-up
        5
        arrow-down
        1
        ·
        20 hours ago

        Indeed not. So using language specific to binary systems - e.g. bits per second - is not appropriate in this context.

      • Tramort@programming.dev
        link
        fedilink
        English
        arrow-up
        4
        arrow-down
        2
        ·
        19 hours ago

        All information can be stored in a digital form, and all information can be measured in base 2 units (of bits).

        • Flying Squid@lemmy.worldOP
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          4
          ·
          19 hours ago

          But it isn’t stored that way and it isn’t processed that way. The preprint appears to give an equation (beyond my ability to understand) which explains how they came up with it.

          • Tramort@programming.dev
            link
            fedilink
            English
            arrow-up
            9
            arrow-down
            1
            ·
            19 hours ago

            Your initial claim was that they couldn’t be measured that way. You’re right that they aren’t stored as bits, but it’s irrelevant to whether you can measure them using bits as the unit of information size.

            Think of it like this: in the 1980s there were breathless articles about CD ROM technology, and how, in the future, “the entire encyclopedia Britannica could be stored on one disc”. How was that possible to know? Encyclopedias were not digitally stored! You can’t measure them in bits!

            It’s possible because you could define a hypothetical analog to digital encoder, and then quantify how many bits coming off that encoder would be needed to store the entire corpus.

            This is the same thing. You can ADC anything, and the spec on your ADC defines the bitrate you need to store the stream coming off… in bits (per second)
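
            A minimal sketch of that last point (hypothetical numbers, just to show how the unit works out to bits per second):

            ```python
            def adc_stream_rate(sample_rate_hz: float, bits_per_sample: int, channels: int = 1) -> float:
                """Bits per second needed to store the raw output of an analog-to-digital converter."""
                return sample_rate_hz * bits_per_sample * channels

            print(adc_stream_rate(10_000, 12))  # a 12-bit ADC sampling at 10 kHz -> 120,000 bits/s
            ```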

            • Flying Squid@lemmy.worldOP
              link
              fedilink
              English
              arrow-up
              2
              arrow-down
              6
              ·
              19 hours ago

              As has been shown elsewhere in this thread by Aatube a couple of times, they are not defining ‘bit’ the way you are defining it, but still in a valid way.

  • leaky_shower_thought@feddit.nl
    link
    fedilink
    English
    arrow-up
    6
    ·
    20 hours ago

    I can agree to some extent why it could be 10 bits/sec.

    The brain is known to take some shortcuts when parsing/speed-reading, but it slows down when we try to extract details from written works. It is also more tiring to scrutinize details than to just read articles.

    I was surprised that they managed to measure the speed at all.

    • GamingChairModel@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      ·
      16 hours ago

      The Caltech release says they derived it from “a vast amount of scientific literature” including studies of how people read and write. I think the key is going to be how they derived that number from existing studies.

  • essell@lemmy.world
    link
    fedilink
    English
    arrow-up
    8
    arrow-down
    1
    ·
    21 hours ago

    “They also explain why we can only think one thought at a time”

    I know a lot of people who would disagree with that

    • Buffalox@lemmy.world
      link
      fedilink
      English
      arrow-up
      14
      ·
      edit-2
      21 hours ago

      Those people are probably lying. Try to count in your head from 1 to 100, and simultaneously count down from 100 to 1.
      This is surprisingly hard despite the simplicity of the task. Of course I can’t know how it is for other people, but in my experience it is true that we can only “process” one conscious task at a time. I have tried to train myself to exceed this limitation, but frankly had to give up; I even suspect that if you try too hard, you might risk going crazy.
      We can, however, learn things so-called “by heart”, in which case we don’t have to focus on them consciously, and can do them at the same time as we focus on something else. Even things that can be pretty hard to learn, like riding a bicycle.

      • eran_morad@lemmy.world
        link
        fedilink
        English
        arrow-up
        6
        ·
        21 hours ago

        Musicians playing polyrhythms count at two or more rates simultaneously. Further, they operate their limbs and fingers accordingly.

        • MurrayL@lemmy.world
          link
          fedilink
          English
          arrow-up
          6
          ·
          edit-2
          21 hours ago

          True, but I wouldn’t call that conscious counting - you’re not literally counting out multiple simultaneous time signatures in your head, it’s done by feel.

          • Opisek@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            ·
            15 hours ago

            Proof enough that brains are more complex than a single processor calculating one thing at a time.

            Furthermore, what about people who have had split-brain surgery (a severed corpus callosum)? Weird things happen with the two halves of the brain (now separate) “thinking” two different things at the same time!

    • Flying Squid@lemmy.worldOP
      link
      fedilink
      English
      arrow-up
      3
      arrow-down
      1
      ·
      edit-2
      20 hours ago

      Thank you! I’ll add these to the body.

      Edit: Never mind, it doesn’t seem to want to let me save. Oh well.

      Edit 2: Weird, it did when I tried it again, so thanks!

  • Spacehooks@reddthat.com
    link
    fedilink
    English
    arrow-up
    5
    ·
    20 hours ago

    In fact, the 10 bits per second are needed only in worst-case situations, and most of the time our environment changes at a much more leisurely pace."

    Bruh, some tech bro is going to read this and interpret it in a terrible fashion, but then again humans already change our environment.

      • Spacehooks@reddthat.com
        link
        fedilink
        English
        arrow-up
        1
        ·
        20 hours ago

        So the article is going with humans only thinking this fast because evolution determined this speed was sufficient. So if I were a highly misguided individual wanting to up the average human processing speed, I’d need to create an environment where there is a need to process data faster. Sounds like horror cyberpunk, but in reality human progress is super fast now relative to 10k years ago. So the change may happen naturally.

        • stinky@redlemmy.com
          link
          fedilink
          English
          arrow-up
          1
          ·
          20 hours ago

          Oh, you have the full text of the paper?? Please share it! We’d like to read it for ourselves.

  • ReadMoreBooks@lemmy.zip
    link
    fedilink
    English
    arrow-up
    5
    ·
    21 hours ago

    Yet, it takes an enormous amount of processing power to produce a comment such as this one. How much would it take to reason why the experiment was structured as it was?

    • sugar_in_your_tea@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      4
      ·
      18 hours ago

      Information theory is all about cutting through the waste of a given computation to compare apples to apples.

      I’ll replicate an example I posted elsewhere:

      Let’s say I make a machine that sums two numbers between 0-127, and returns the output. Let’s say this machine also only understands spoken French. According to information theory, this machine receives 14 bits of information (two 7-bit numbers with equal probability for all values) and returns 8 bits of information. The fact that it understands spoken French is irrelevant to the computation and is ignored.

      That’s the same line of reasoning here, and the article makes this clear by indicating that brains take in billions of bits of sensory data. But they’re not looking at overall processing power, they’re looking at cognition, or active thought. Performing a given computational task happens at about 10 bits/s, which is completely separate from the billions of bits per second of background processing we do.

  • vext01@lemmy.sdf.org
    link
    fedilink
    English
    arrow-up
    3
    ·
    21 hours ago

    I could believe that we take 10 decisions based on pre-learned information per second, but we must be able to ingest new information at a much quicker rate.

    I mean: look at an image for a second. Can you only remember 10 things about it?

    It’s hard to speculate based on such a short and undoubtedly watered-down press summary. You’d have to read the paper to get the full nuance.

    • GamingChairModel@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      ·
      3 hours ago

      I mean: look at an image for a second. Can you only remember 10 things about it?

      The paper actually talks about the winners of memory championships (memorizing random strings of numbers or the precise order of a randomly shuffled 52-card deck). The winners tend to have to study the information for an amount of time that works out to memorizing roughly 10 bits per second.

      It even talks about the guy who was given a 45 minute helicopter ride over Rome and asked to draw the buildings from memory. He made certain mistakes, showing that he essentially memorized the positions and architectural styles of 1000 buildings chosen out of 1000 possibilities, for an effective bit rate of 4 bits/s.
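
      Rough reconstruction of that estimate (my own arithmetic, not the paper’s exact model):

      ```python
      import math

      buildings = 1000
      bits_per_building = math.log2(1000)          # ~10 bits to identify each building/style
      total_bits = buildings * bits_per_building   # ~9,966 bits
      seconds = 45 * 60                            # the 45-minute flight
      print(total_bits / seconds)                  # ~3.7 bits/s, i.e. roughly 4 bits/s
      ```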

      That experience suggests that we may compress our knowledge by taking shortcuts, some of which are inaccurate. It’s much easier to memorize details in a picture where everything looks normal, than it is to memorize details about a random assortment of shapes and colors.

      So even if I can name 10 things about a picture, it might be that those 10 things aren’t sufficiently independent from one another to represent 10 bits of entropy.

        • loppy@fedia.io
          link
          fedilink
          arrow-up
          1
          ·
          12 hours ago

          I was responding to “Look at an image for a second. Can you only remember 10 things about it?” I didn’t think that was a fair characterization. I see you probably specifically meant 10 yes/no questions about an image, but I don’t think yes/no questions are a fair proxy for “things”.

          In any case you can read the preprint here https://arxiv.org/abs/2408.10234v2 and they make it immediately clear that 10 bits/s is an order-of-magnitude estimate, and also specifically list (for example) object recognition at 30-50 bits/s.

  • stinky@redlemmy.com
    link
    fedilink
    English
    arrow-up
    2
    arrow-down
    2
    ·
    20 hours ago

    Caltech article: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior

    The full text of the paper costs $35 to read once.

    “Look, I made a really exciting controversial discovery! It’s really emotional and intriguing! You’re missing out! Only smart rich people can read it! Put your money in the basket please :)” Our education system is dead and the populace is too stupid to care.

    • AmidFuror@fedia.io
      link
      fedilink
      arrow-up
      3
      arrow-down
      1
      ·
      19 hours ago

      The educational system isn’t setting the prices. The publishers are separate private enterprises which are mostly profit-driven.

      In the last 20 years, “open access” journals have been created where the author (author’s grant money, mostly from the government) pays the charges instead of the readers. That has led to a whole slew of other problems including predatory and phony journals springing up.

      • stinky@redlemmy.com
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        2
        ·
        18 hours ago

        author’s grant money, mostly from the government, paid for by tax dollars, by US citizens, as part of taxes attributed to education and healthcare. yaaaaaaaawn.