Many Americans think these religious ideas should be taught in school because they believe them to be true. But many others support teaching them not out of conviction, but because they think it's only fair.