Do you think the US will be able to change for the better regarding things like gun laws, healthcare, equality and overall politics?
Gun laws: no. But on the other issues, I think they might. Things could get worse as well…
I can only hope that people wake up, see what is happening around them, and realize how these policies are destroying society. Maybe there is a small chance of that, but I'm still pessimistic about it.