The word "Christian" means completely different things depending on who is using it. From a secular standpoint, it's a very wide umbrella that covers most Americans, from the IFB to Mormonism to Catholicism to Unitarian Universalism. Most people on the fundamentalist side of the spectrum would deny that Mormons, Unitarians, and a large percentage of Catholics are truly Christian. On the opposite end, I've heard IFB preachers say that if you aren't a card-carrying member of a Bible-believing IFB church, then you either aren't a Christian or are a very backslidden one. In the media, it generally means the "Christian right": the political movement at the forefront of the culture wars and the base of the Republican Party's platform. From that, the word has somehow developed a stigma in the eyes of the world, conjuring someone who is hypocritical, intolerant, and judgmental.
None of that is new. "Christian" was initially used as a derogatory term in the days of the early church. Even so, I don't believe it would be right for me, as a Christian, to call myself something else because of the stigma that surrounds the word these days. Besides, how will the world ever see what a true Christian looks like if believers drop the name to avoid being associated with the stereotype?