• Buffalox@lemmy.world · +11/-1 · 1 day ago

      That doesn’t really matter, because 1 bit merely distinguishes between one and zero, or between some other two-state values.
      Just reading a single word, you pick it out from the roughly 30,000 words you know. That’s about 15 bits of information comprehended.
      Don’t tell me you take more than 1.5 seconds to read and comprehend one word.
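
      A quick back-of-the-envelope check of those numbers (a minimal sketch, assuming a 30,000-word vocabulary, equally likely words, and 1.5 seconds per word):

      ```python
      import math

      vocabulary_size = 30_000   # assumed active vocabulary, as above
      seconds_per_word = 1.5     # assumed reading/comprehension time per word

      bits_per_word = math.log2(vocabulary_size)   # ~14.9 bits to pick one word out of 30,000

      print(f"bits per word:      {bits_per_word:.1f}")
      print(f"implied throughput: {bits_per_word / seconds_per_word:.1f} bits/s")
      ```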

      Without having it as text, free thought is CLEARLY much faster, and the complexity of abstract thinking would push the number way up.
      One thought is not 1 bit; it can be thousands of bits.

      BTW the mind has insane levels of compression. For instance, if you think “bicycle”, it’s a concept that covers many parts. You don’t have to think about every part; you know it has a handlebar, frame, pedals and wheels. You also know its purpose, its size, its weight, its range of speeds and many other more or less relevant details. Just thinking “bicycle” is easily way more than 10 bits’ worth of information, but it is “compressed” down to only the parts relevant to the context.
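
      A toy sketch of that “compression” point, with made-up fields and numbers, just to show the gap between naming a concept and spelling out what it contains:

      ```python
      import json
      import math

      # Hypothetical "mental lexicon" entry: the label "bicycle" points at a much
      # larger bundle of stored knowledge (illustrative values only).
      concepts = {
          "bicycle": {
              "parts": ["handlebar", "frame", "pedals", "wheels", "chain", "saddle"],
              "purpose": "human-powered transport",
              "typical_speed_kmh": [15, 30],
              "typical_weight_kg": [8, 15],
          },
      }

      known_concepts = 30_000  # assumed number of concepts one could have picked instead
      pointer_bits = math.log2(known_concepts)                           # cost of *selecting* the concept
      expanded_bits = len(json.dumps(concepts["bicycle"]).encode()) * 8  # cost of spelling it all out

      print(f"selecting 'bicycle': ~{pointer_bits:.0f} bits")
      print(f"spelling it out:     ~{expanded_bits} bits")
      ```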

      Reading and understanding one word is not just understanding a word, but also understanding a concept and putting it into context. I’m not sure how to quantify that, but to count it as 1 bit is so horrendously wrong that I find it hard to understand how this can in any way be considered scientific.

      • Flying Squid@lemmy.worldOP · +5/-3 · 2 days ago

        You are confusing input with throughput. They agree that the input is much greater. It’s the throughput that is so slow. Here’s the abstract:

        This article is about the neural conundrum behind the slowness of human behavior. The information throughput of a human being is about 10 bits/s. In comparison, our sensory systems gather data at ∼10⁹ bits/s. The stark contrast between these numbers remains unexplained and touches on fundamental aspects of brain function: what neural substrate sets this speed limit on the pace of our existence? Why does the brain need billions of neurons to process 10 bits/s? Why can we only think about one thing at a time? The brain seems to operate in two distinct modes: the “outer” brain handles fast high-dimensional sensory and motor signals, whereas the “inner” brain processes the reduced few bits needed to control behavior. Plausible explanations exist for the large neuron numbers in the outer brain, but not for the inner brain, and we propose new research directions to remedy this.
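
        For scale, these are the two figures the abstract contrasts and the implied reduction factor (both numbers taken from the abstract itself):

        ```python
        # The two rates quoted in the abstract, and the reduction factor between them.
        sensory_input_bps = 1e9          # ~10^9 bits/s gathered by the sensory systems
        behavioral_throughput_bps = 10   # ~10 bits/s information throughput of behavior

        print(f"reduction factor: ~{sensory_input_bps / behavioral_throughput_bps:.0e}")  # ~1e+08
        ```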

        • dosaki@lemmy.world · +8 · 2 days ago

          Why can we only think about one thing at a time?

          Someone tell that to the random tab in my brain that keeps playing music

        • conciselyverbose@sh.itjust.works · +4/-1 · 2 days ago

          He’s not.

          Executive function has limited capacity, but executive function isn’t your brain (and there’s no reasonable definition that limits it to anything as absurd as 10 bits). Your visual center is processing all those bits that enter the eyes. All the time. You don’t retain all of it, but retaining any of it necessarily requires processing a huge chunk of it.

          Literally just understanding the concept of car when you see one is much more than 10 bits of information.

          • Flying Squid@lemmy.worldOP · +7/-4 · 2 days ago

            I think we are all speaking without having read the paper (and in my case, I know I wouldn’t understand it), so dismissing it outright without knowing how they define or measure things is not really the best course here.

            I would suggest that Caltech studies don’t tend to be poorly done.

            • conciselyverbose@sh.itjust.works · +3/-6 · 2 days ago

              There is literally nothing the paper could say and no evidence they could provide to make the assertion in the title anything less than laughable.

              There are hundreds of systems in your brain that are actively processing many, many orders of magnitude more than ten bits of information per second all the time. We can literally watch them do so.

              It’s possible the headline is a lie by someone who doesn’t understand the research. It’s not remotely within the realm of plausibility that it resembles reality in any way.

              • Flying Squid@lemmy.worldOP · +8/-2 · 2 days ago

                There is literally nothing the paper could say and no evidence they could provide to make the assertion in the title anything less than laughable.

                That is quite the claim from someone who has apparently not even read the abstract of the paper. I pasted it in the thread.

                  • Flying Squid@lemmy.worldOP · +8/-2 · 2 days ago

                    You know, dismissing a paper without even taking a minute to read the abstract and basing everything on a headline to claim it’s all nonsense is not a good look. I’m just saying.

        • Buffalox@lemmy.world · +3/-3 · 1 day ago

          You are confusing input with throughput.

          No I’m not, I read that part. Input is, for instance, hearing a sound wave, which the brain can process at amazing speed, separating a multitude of simultaneous sounds and translating them into meaningful information, be it music, speech, or a noise that shouldn’t be there. It’s true that this part is easier to measure, because we can do something similar, although not nearly as well, on computers, which can determine not only the content of sounds but also extrapolate from it in real time. The raw sound may only be about 2x22k bits, but the processing required is way higher. And that’s even more obviously way, way above 10 bits per second.
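
          For a sense of scale, here is the raw bit rate of CD-quality stereo audio (a common benchmark, not necessarily the exact figure the “2x22k” above refers to):

          ```python
          # A standard benchmark for a raw audio stream: CD-quality stereo.
          sample_rate_hz = 44_100   # samples per second, per channel
          bit_depth = 16            # bits per sample
          channels = 2              # stereo

          raw_bps = sample_rate_hz * bit_depth * channels
          print(f"raw stereo audio: {raw_bps:,} bits/s")   # 1,411,200 bits/s, roughly 1.4 Mbit/s
          ```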

          This is a very complex function that requires loads of processing, and the brain can distinguish, with microsecond precision, when a sound reaches each ear in order to determine its direction.
          The same is the case with vision, which, although not at all the resolution we think it is, also requires massive processing to be interpreted into something meaningful.
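
          Rough arithmetic behind that microsecond figure, using a simple straight-line model and an assumed ~21 cm ear-to-ear path:

          ```python
          import math

          # Simple straight-path model of the interaural time difference (ITD).
          head_width_m = 0.21          # assumed ear-to-ear distance
          speed_of_sound_m_s = 343.0   # in air at roughly room temperature

          max_itd_us = head_width_m / speed_of_sound_m_s * 1e6
          itd_10deg_us = max_itd_us * math.sin(math.radians(10))

          print(f"max interaural delay (source to the side): ~{max_itd_us:.0f} microseconds")   # ~610
          print(f"delay for a source 10 degrees off-center:  ~{itd_10deg_us:.0f} microseconds") # ~105
          ```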

          Now the weird thing is: why in the world do they think consciousness, which is even MORE complex, should operate at a lower speed? That idea is outright moronic!!!

          Edit:

          Changed nanosecond to microsecond.

          • Flying Squid@lemmy.worldOP · +5/-3 · 2 days ago

            As I suggested to someone else, since none of us has actually read the paper (and I know I wouldn’t have the requisite knowledge to understand it if I did), dismissing it with words like “moronic” is not warranted. And as I also suggested, I don’t think such a word can generally be applied to Caltech studies. They have a pretty solid reputation as far as I know.

            • Buffalox@lemmy.world · +3/-3 · 1 day ago

              I’m not fucking reading a paper with such ridiculous claims. I gave it a chance, but it simply isn’t worth it. And I understand their claims and argumentation perfectly. They simply don’t have a clue about the things they make claims about.
              I’ve been investigating and researching these issues for 40 years with an approach grounded in scientific evidence, so please piss off with your claims that I don’t understand it.

              • Aatube@kbin.melroy.org · +3/-1 · 1 day ago

                What is your field of research? How have you represented abstract thought in terms of digital storage rather than information content?

                • Buffalox@lemmy.world · +3/-1 · 1 day ago

                  Mostly philosophical, but since I’m also a programmer, I’ve always had the quantized elements in mind too.

                  In the year 2000 I estimated human-level or general/strong AI by about 2035. I remember because it was during a very interesting philosophy debate at Copenhagen University, where, to my surprise, there were also a number of physics majors.
                  That’s supposed to be an actually conscious AI. I suppose the chances of being correct were slim at the time, but now it seems more likely than ever.

              • Flying Squid@lemmy.worldOP · +5/-3 · 1 day ago

                Without evaluating the data or methodology, I would say that the chance you gave it was not a fair one, especially since you decided to label it “moronic.” That’s quite a claim.

                • Buffalox@lemmy.world · +3/-2 · 1 day ago

                  It’s 100% moronic; they use terminology that clearly isn’t fit for the task.

                  • Flying Squid@lemmy.worldOP · +4/-3 · 1 day ago

                    “100% moronic” is an even bolder claim for someone who has not evaluated any of the claims in the paper.

                    One might even say that calling scientific claims “100%” false is not an especially scientific approach.