
[–]soundsituation 3 insightful - 2 fun (1 child)

When did you start to notice these changes, and what/who exactly are you basing your observations on? I just ask because this hasn't been my experience. None of the Christians I've personally interacted with act like the ones on this site.

[–]Vulptex 2 insightful - 2 fun (0 children)

I'm seeing it a lot irl too. My circle is mostly evangelicals, and I barely even recognize them anymore. Up until about 2 years ago we were all about God, salvation, love, the spirit, freedom, justice, morals, etc. Now all they ever talk about is how we need to destroy the gays and trannies, who are apparently the cause of all the world's problems, and they literally try to use the New Testament to argue that judging people and hating your enemies is fighting the good fight. That has to be the definition of irony. They also suddenly support a dictatorship that would persecute believers, because they say we need a "strong leader". What's really frightening is that you can read the New Testament itself and see that they're behaving almost exactly like the Pharisees; the only difference is that they're "Christian" instead of Jewish. I don't even remember the last time I heard them talk about Jesus. It's all this made-up stuff that they treat like the be-all and end-all of the world.

There's also been a sudden resurgence of interest in 1950s gender roles, which had been dead for decades, and it baffles me. Just yesterday one of my close relatives accused me of being unbiblical and communist (I'm literally the last person who would support communism, and they know it; it was clearly an us-vs-them tactic) for not thinking that men should be treated like second-class citizens. I highly doubt any of them would've thought that was okay 2 years ago, but now you're evil if you don't support it. I'm being bombarded with "chivalry" and told I always have to open doors for women, pay for them, do everything for them, and treat them like royalty, while never seeking help for my own problems, because men are supposed to be one way and women another. Yet 2 years ago they were criticizing Muslims for their treatment of women, and liberals for trying to take rights away from men.

Even worse is what they've been pushing lately: "always obey authority". I'm taking a university elective on the New Testament, and one day the usually sweet professor shouted at full force at the whole class that people who question authority are evil. Nothing else was proclaimed with such power, and I've been seeing the same pattern in popular preachers and Christian media. That doesn't sound at all like the Christians who were willing to die brutal deaths for refusing to bow to the Roman emperor or the Jewish priests. It's almost as if those power-hungry authority figures have infiltrated Christianity and started pushing their propaganda, tearing it down from the inside.

Another thing I've noticed is that hardly anyone believes in "faith without works" anymore. I'm not taking a stance here; it's just interesting to me, considering it was a central doctrine for the vast majority of Christians only a couple of years ago. Maybe they stopped liking it because it would mean gay people don't all go to hell or something.

This is happening to most people I know irl, to most of the popular figures (like John MacArthur), and to people online. And God is not happy with what the people who bear his name have been doing.

It's really sad. They're basically becoming the strawman that crazy SJWs falsely accused Christians of being before. It's been repeated so much that they've started believing it themselves.