If you're not using AI for history you're missing the boat

That’s not a good idea. I tried using Grok to balance my Croatians civ and it couldn’t even get basic math right or understand anything about the AoE2 civ bonuses. It was massively confused.

I would just use Google and do research yourself.


Imagine being proud of being unable to read books or articles, or even watch videos, about history, and relying solely on an LLM to “understand” things.
Sure, I am missing the boat (to cuckoo land)


I’m not talking about playing a game; I’m talking about studying history. It’s outstanding at that.


You’re not understanding my point. If it fails at a simple mathematical comparison, it’s going to be terrible at anything more complex than that.


You obviously have zero idea how AI works and what its strengths and weaknesses are. If you don’t like this thread, don’t read it.

AI is actually incredibly dumb. I wouldn’t recommend using it to research anything.


I’m really not missing the boat, because AI is not only notoriously unreliable, but its damaging effects on people’s ability to get jobs, among its other disastrous consequences, make relying on it completely and utterly stupid.


AI “thinks” glue is used for sticking cheese on pizza and that suicide is a solution to depression. Don’t ask it for history lessons; it’ll probably try to teach you the Aztecs invented gunpowder.


It doesn’t even know how much Castles cost in Age of Empires II, when this is very easy information to obtain.


LLMs hallucinate. They don’t have precise, factual knowledge baked in, especially when it comes to intricate details like the specific cost of a building in a particular computer game. The same goes for exact dates in history. I’d be very careful when asking them for facts — that’s not where they shine.

Common knowledge, yes — that flows into their parameters during training. And newer, larger models will likely keep getting better at recalling more specific facts. But for now, edge cases and fine details can easily be wrong or made up.

Where they do shine is in language and information processing. Especially the reasoning models (talking about o3/o4-mini-high) — they’re better at math (not calculation, but reasoning) than most engineering graduates. For actual calculations, they usually generate and run small Python scripts, which yield deterministic results.
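To illustrate the last point: below is a minimal sketch of the kind of throwaway Python script a model might emit instead of doing the arithmetic "in its head". The building costs are the commonly cited AoE2 values (Castle: 650 stone; Town Center: 275 wood + 100 stone), but treat them as illustrative, not authoritative:

```python
# Sketch: a deterministic cost comparison, the kind of script a
# reasoning model might generate and run rather than guessing.
# Costs below are commonly cited AoE2 values; verify in-game.
costs = {
    "Castle": {"stone": 650},
    "Town Center": {"wood": 275, "stone": 100},
}

def total_cost(building: str) -> int:
    """Sum all resource costs for a building."""
    return sum(costs[building].values())

print(total_cost("Castle"))       # 650
print(total_cost("Town Center"))  # 375
# The comparison itself is now exact, not a language-model guess:
print(total_cost("Castle") > total_cost("Town Center"))  # True
```

Once the numbers live in code, the comparison is exact every time, which is precisely why tool use beats asking the model to recall or compute figures directly.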


I love it. I use ChatGPT a lot for some networking studies I’m doing at the moment, but I also check the information against the courses I’m taking, and only extremely rarely has it given me a somewhat wrong answer. It is very, very capable, and it made me understand concepts I was struggling with.
And for fun I also sometimes use it for general science or history. It really is like having your own personal teacher, pretty awesome.


ChatGPT helps me code repetitive tasks, or stuff that would never have occurred to me. I would NOT ask it for history stuff unless I had already uploaded a history book to it, which I have.


AI can only return whatever it has been fed a priori.
I often read history books that haven’t even been properly digitised with OCR. AI has no way of knowing such content.


It’s like having your own personal teacher who is schizophrenic, has vivid hallucinations, and believes every single conspiracy theory on the Internet. A teacher like that shouldn’t be let within 100 yards of a school.
