Let me ask you a question, human.
Shouldn’t the “math” and “natural language” rows of the table be swapped? I think you got both of them backwards?
Good catch - fixed - thanks! 🙏
You make some good points, but I think you misunderstand the issue of hallucination.
For example, humans might forget where they read something, but they don't usually generate fake citations that sound plausible.
Assuming that AIs work mostly like humans is a quick way to be badly mistaken, because they fail in completely novel and unexpected ways.