Do you think the US will be able to change for the better regarding things like gun laws, healthcare, equality and overall politics?
History says no. Once an empire starts down the path the US is on, there is no going back.