I still believe in God and I still go to church and stuff, but lately Christianity has been turning me off! Why?
1. Many of the people who claim to be Christian.

I know just as many fucked up people who claim to be Christian as those who aren't.
2. The view of men and women and their roles.
3. THE OVER-RELIANCE ON FAITH!
4. The belief that Jesus is the one and only way into heaven.
I'm really trying to get my faith back, but it isn't working at all! It's so strange: every time I go to Bible study or something, I just want to run away from the religion even more!