Steve Harvey Says Hollywood is More Racist than America


Radio Facts: We are still in deep shock that a celebrity as big as Steve Harvey would state something that most other major black celebrities would never say. You would NEVER hear Oprah, Denzel, Will Smith or Eddie Murphy say what Harvey said about racism in Hollywood. We're not sure which is more shocking: Harvey's statement, or the fact that The Hollywood Reporter, a magazine that puts a black person on the cover once a decade, actually PRINTED it. KUDOS to Steve. Here is a quote from a recent Hollywood Reporter story:

"Hollywood is still very racist," Harvey says. "Hollywood is more racist than America is. They put things on TV that they think the masses will like. Well, the masses have changed. The election of President Obama should prove that. And television should look entirely different. [Scandal star] Kerry Washington should not be the first African-American female to head up a drama series in 40 years. In 40 years! That's crazy."


