• conciselyverbose@sh.itjust.works · 3 days ago

    No, I am saying that I do have a meaningful working knowledge of how the brain works and of information theory, beyond the surface level it would take to understand that the headline is bullshit.

    You don’t need to be a Nobel Prize-winning physicist to laugh at a paper claiming gravity is impossible. This headline is at that level. Just processing one word per second completely invalidates it, because an average vocabulary of about 20,000 words means that each word, by itself, carries ~14 bits of information (log2(20,000) ≈ 14.3).
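
    As a rough sanity check of that back-of-the-envelope figure (a minimal sketch, assuming a uniform distribution over a 20,000-word vocabulary, which is the simplification used above):

    ```python
    import math

    VOCAB_SIZE = 20_000  # assumed average vocabulary size from the comment above

    # If every word were equally likely, each word would carry log2(VOCAB_SIZE) bits.
    bits_per_word = math.log2(VOCAB_SIZE)

    WORDS_PER_SECOND = 1  # the "one word per second" rate used in the comment
    bits_per_second = bits_per_word * WORDS_PER_SECOND

    print(f"{bits_per_word:.1f} bits per word")      # ~14.3
    print(f"{bits_per_second:.1f} bits per second")  # ~14.3, already above 10
    ```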

    • Flying Squid@lemmy.worldOP · 3 days ago

      You are already not using ‘bit’ the way it is defined in the paper. Again, not a good look.

        • Aatube@kbin.melroy.org · 3 days ago

          From a cursory glance, the paper’s usage seems at least quite close to the entropy-based definition of a bit, also known as a shannon.

          Nevertheless, the term bits of information or simply bits is more often heard, even in the fields of information and communication theory, rather than shannons; just saying bits can therefore be ambiguous. Using the unit shannon is an explicit reference to a quantity of information content, information entropy or channel capacity, and is not restricted to binary data, whereas bits can as well refer to the number of binary symbols involved, as is the term used in fields such as data processing. —Wikipedia article for shannons

          • conciselyverbose@sh.itjust.works · 3 days ago

            If it’s not re-defining the term, then I’m using it the way the paper defines it.

            Because just understanding words well enough to respond to them, ignoring all the sub-processes that are also part of “thought” and that directly shape both your internal narration and your actual behavior, takes more than 10 bits of information per second to manage. (And yes, I understand that each word isn’t actually equally likely, as I assumed in order to give a rough number, but words also require your brain to handle far more context than just the information-theoretic “information” of the word itself.)
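
            A toy sketch of that caveat (assuming, purely for illustration, a Zipf-like frequency distribution over the same 20,000-word vocabulary; this is not the paper’s model, just a way to see how much a non-uniform assumption lowers the per-word estimate):

            ```python
            import math

            VOCAB_SIZE = 20_000  # same assumed vocabulary size as above

            # Illustrative Zipf-like distribution: p(rank) proportional to 1/rank.
            weights = [1 / rank for rank in range(1, VOCAB_SIZE + 1)]
            total = sum(weights)
            probs = [w / total for w in weights]

            # Shannon entropy in bits: H = -sum(p * log2(p)).
            entropy_bits = -sum(p * math.log2(p) for p in probs)
            uniform_bits = math.log2(VOCAB_SIZE)

            print(f"uniform upper bound: {uniform_bits:.1f} bits/word")  # ~14.3
            print(f"Zipf-like entropy:   {entropy_bits:.1f} bits/word")  # ~10.1
            ```

            Under that toy distribution the per-word figure drops to roughly 10 bits, before counting any of the extra context the comment mentions.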

        • Flying Squid@lemmy.worldOP · 3 days ago

          And now it’s “it’s the paper’s fault it’s wrong because it defined a term the way I didn’t want it defined.”

          • conciselyverbose@sh.itjust.works · 3 days ago

            Yes.

            Science is built on a shared, standardized base of knowledge. Laying claim to a standard term and using it to mean something entirely incompatible with its actual definition makes your paper objectively incorrect and without merit.

            • Flying Squid@lemmy.worldOP · 3 days ago

              Cool. Let me know when you feel like reading the paper, since Aatube already showed you they are using it properly. Or at least admit that you might not know as much about this as you think you do…