I've gotta know, what's all the hype about this show "Grey's Anatomy"? I have to say, I don't think I've ever watched this show, but I have seen the commercials. Judging by the commercials, there's A LOT of sex, A LOT of back-stabbing drama, with perhaps a LITTLE bit of Anatomy (I mean science, people, not sex, which has already been covered). Here's why I ask: I've noticed a trend of women who claim to be Christians and LOVE this show! I mean, every week when it airs, they are watching! Does it not go completely against what Christian women stand for? We are to be gentle and lovely and ladylike, not women looking for men to arouse and gain interest from, to pursue or be pursued with the intent of filling one's life with worldly pleasures! We are not to cause drama among our friends, co-workers, or even enemies!
I want every Christian woman who reads this to think about what that show tells the women who are watching. Does it tell them to lead godly, or even moral, lives?? Can't find anything wrong with this show? Do you find it to just be a good-humored show, no harm done? Let me tell you, before long, you'll start thinking about this show and what happens in it; it might appeal to you, and you may start desiring the same pleasures. But would you ever act on those pleasures? Of course not, right?!?! WRONG! Fill your mind with the thoughts this show offers, and you'll soon have a heart that pours out the very actions you see!
Tell me, is "Desperate Housewives" any better? If you watch this show and you're a wife, I pray that you don't act like they do... it only takes the commercials to see what these shows are about.
Think I'm being too harsh? Please tell me, after you pray about it ;)