Many Christians have become savvier about selling their fiction. To reach a secular audience, they write good plots and good stories, and rather than putting the Christian element front and center, they embed it in the book. This is seen as wise marketing and wise outreach. However, many secular readers are crying foul, some saying that Christian fiction should come with a warning label.
If we’re going to put warning labels on books as a general rule, then I’m okay with that. However, it seems to me that singling out Christian books as the ones that have to provide some level of warning suggests there is something especially toxic about Christianity, and that is incredibly offensive.
As a reader, I often find myself coming into books full of all kinds of biases and agendas. I remember reading a children’s book by Isaac Asimov and stumbling into New Age religion. I picked up The Pelican Brief as a kid and, in the first few pages, found a boatload of profanity, sexual immorality, and Republican-bashing.
Writers of various types of novels use their books to push environmentalism, feminism, gun control, gay rights, atheism, and humanism, and no one ever asks them to put a warning label on the book. Yet if a Christian writes a book and straightforwardly explains the plot but doesn’t explicitly state, “Religion warning! Religion warning!” he’s guilty of some sort of fraud.
If you truly don’t want to read Christian fiction and it really bothers you when you stumble into it, then Google the author and read the reviews. Don’t make an impulse buy and then whine that the author didn’t explicitly tell you his biases up front.
If there is a warning label that should go on books, perhaps it is this: “Warning: The author of this book has biases, and this book may be a conscious or unconscious attempt to influence you toward his way of thinking. This is the freedom that authors of all stripes enjoy in a free society. Be advised.” Such a warning label ought to be common sense, but who believes in that anymore?