I don’t want to live in America anymore.

Everything is a contest for insecure dudes. Who has the biggest truck. Who can have the most pro-Second Amendment shit on display. And all these people are only tough skin deep. Anytime you actually challenge one of them, they bend over and apologize.

Everyone is so fuckin hateful and angry here. Our political discourse has become about owning the other side, not helping anyone. We actively try to make people's lives harder because it makes us feel better. It's fuckin disgusting.

Everyone acts so tough here and hates everyone else, yet the majority of us are a paycheck away from financial ruin. We're just constantly brainwashed into thinking the other side is the problem. Women can't live in certain states without risking their lives if they're pregnant, all because the American version of Christianity/conservatism is closer to the National Front and the KKK than it is to literally anything in the Bible.

Fuckin over it. I used to love this place, but it's becoming something I don't even recognize. And I live in a blue state.