  • Think of it this way:

    If I ask you, “Can a car fly?” you might say, “Well, if you put wings on it, or a rocket engine, or something… maybe?” OK, I say, and then I point at a car on the street and ask: “Do you think that specific car can fly?” You will probably say no.

    Why? Even though you might not fully understand how a car works or know all the parts that go into it, you can easily tell it does not have any of the things it would need to fly.

    It’s the same with an LLM. We know what kinds of things are needed for true intelligence, and we can easily tell an LLM does not have the required parts. So an LLM alone can never lead to AGI; more parts are needed. That holds even though we might not fully understand how an LLM’s internals function in specific cases, and might not know exactly which parts intelligence requires or how those parts work.

    A full understanding of all the parts isn’t required to discern large-scale capabilities.