Other than the fact that they existed, pretty much everything Hollywood and TV have taught you about Native Americans is pure fiction, drummed up by racist white people desperate to demonize the people who already lived on the lands they wanted. Hopefully, we can set at least a few records straight with this article.