Yes, history should definitely be taught in schools. History teaches us how we got here and why our society is the way it is. If we don’t understand our own history, we can’t learn from it, and if we don’t know the history of other nations, we can’t understand why their cultures have developed the way they have. History is crucial to understanding not only our own culture but cultures all over the world, so it should without a doubt be taught in schools.