A prevailing sentiment online is that GPT-4 still does not understand what it talks about. We can argue semantics over what “understanding” truly means. I think it’s useful, at least today, to draw the line at whether GPT-4 has successfully modeled parts of the world. Is it just picking words and connecting them with correct grammar? Or does the token selection actually reflect parts of the physical world?
One of the most remarkable things I’ve heard about GPT-4 comes from an episode of This American Life titled “Greetings, People of Earth”.
You really, truly don’t understand what you’re talking about.
If this community values good discussion, it should probably just ban statements that manage to be this wrong. It’s like when creationists say things like “if we came from monkeys why are they still around???”. The person has just demonstrated such a fundamental lack of understanding that it’s better to not engage.
Oh, you again – it’s incredibly ironic you’re talking about wrong statements when you are basically the poster child for them. Nothing you’ve said has any grounding in reality, and is just a series of bald assertions that are as ignorant as they are incorrect. I thought you would’ve picked up on it when I started ignoring you, but: you know nothing about this and need to do a ton more research to participate in these conversations. Please do that instead of continuing to reply to people who actually know what they’re talking about.