Yes, the Bible should be taught in our schools, because understanding the Bible is necessary if we are to truly understand our own culture and how it came to be. The Bible has influenced every part of Western culture, from our art, music, and history to our sense of fairness, charity, and business.