• 0 Posts
  • 11 Comments
Joined 2 years ago
Cake day: June 14th, 2023

  • That’s because AI doesn’t know anything. All it does is make stuff up. This is called bullshitting, and lots of people do it, even as a deliberate pastime. There was even a fantastic Star Trek TNG episode where Data learned to do it!

    The key to bullshitting is to never look back. Just keep going forward, constantly constructing sentences from the raw material of thought. Knowledge is something else entirely: justified true belief. It’s not sufficient to merely believe things; we need to have some justification (however flimsy). This means that true knowledge isn’t merely a feature of our brains; it includes a causal relation between ourselves and the world, however distant that may be.

    A large language model could at best be said to have a lot of beliefs but zero justification. After all, no one has vetted the gargantuan training sets that go into an LLM to make sure only facts are incorporated into the model. Thus the only indicator of a fact’s trustworthiness is that it’s repeated many times and in many different places in the training set. But that’s no help for obscure facts or widespread myths!


  • A lack of communities. Communal child care works great when you live in a village where you know everyone and most of the people around you are related to you.

    We don’t have that anymore. People live in suburbs where they don’t even want to talk to their neighbours. Their relatives live far away, potentially in other provinces/states or even other countries.

    Heck, a lot of people don’t even like their own relatives!