Keeping in line with my controversial threads that solve nothing, I ask, retrospectively, whether the U.S. should have conquered the world after World War 2, when it was the only country with the bomb.
Conquered places such as Russia after the war, and kept Islamic countries in check. I commonly hear "turn the sand into a crater of glass." The U.S. could also have expanded into Mexico. Or something.
I'm obviously going to say no, because I think it would be immoral, but would the world honestly be a better place?
This is all retrospective though.
Weird topic, I know. Just basically: would the world be a better place under U.S. control?