Americans are taught that white people did everything, but that is changing. American history, and our dealings with other cultures, have long been marked by a conflict of understanding.