JohnFen 11 hours ago

My guess is that it's because humans are intelligent. What I mean by that is that humans are actually understanding what they're reading. If you understand what the words you're reading mean, that makes it easier to read the same words in other contexts.

tolerance 12 hours ago

This is a great question.

My remedial guess is that the human mind is more efficient at the pattern recognition that LLMs excel at in their own right.

We can do a lot more with less data, exert less effort and come to a reasonably accurate conclusion.

LLMs can reason artificially, but it takes intricate software that needed decades to develop to the standard it's reached now, computers that drain the earth's resources at a hair-raising scale, and, like you've mentioned, a lot of data. A lot of data. Apparently the entire internet and then some, on a carousel.

Intelligence is an innate faculty of man, and a man's measure of intelligence generally doesn't require that much, depending on what's expected of him throughout the course of his life.

Because AI is a technology, the expectations we place on it are way higher.

A manuscript with a few errors, blotches, misspellings, omissions, what have you, is excused. If your printer does the same thing every four or five jobs, it's defective.

  • fasthands9 12 hours ago

    I think this is mostly right, but also I'm not sure I agree completely with the premise. Humans have years of conversations they've heard before they attempt to read or write. They already have a concept of what a 'dog' is before they see the word, and know what it is likely to do. Not the same with something that only sees text.

    • tolerance 12 hours ago

      I agree with you 100% and I'm not sure if it contradicts my point that humans have a natural advantage over LLMs in the way I tried to illustrate.

      My initial comment was going to make an abstract reference to how human beings are pretty much wired for reasoning from the time that they're being breastfed, or at least reared in the clutch of their mother. It has something to do with the impression I've picked up of how the inheritance of a language, and subsequently literacy, starts with your mom—in ideal cases.

      I don't know if this is a strike against humans in the whole argument for efficiency, but I don't think it is.

      Computers don't have Moms. Go Moms.

    • techpineapple 12 hours ago

      Yeah, one thing I’ve wondered (and maybe they do this) is whether there are ways to cross-encode different kinds of data: words, yes, but auditory and visual data too. The algorithms to do this might be complicated (or incomprehensible), but a lot of creativity surely comes from the interrelationship between the senses. Combine that with emotion as well, and I imagine it partially comes down to this: our writing ability isn’t limited to the collection of what we’ve read.

      Then maybe the other thing is that rules and relationships must be encoded in a special way. In LLMs I assume rules are emergent, but maybe we have a specific rules engine that gets trained based on the emotional salience of what we read/hear.

      Maybe another reason is what’s encoded in our DNA, which might mean our brain structure is fundamentally designed for some of this stuff.

  • NoahZuniga 8 hours ago

    Humans have tons of "pretraining" encoded in their DNA