"Gritty" can mean different things to different people, but when a Western is gritty, we usually know what that means: an uncompromising, often unpleasant depiction of what life in the Old West really felt like. Though we can never truly know, well-researched Western movies and TV shows, built on all the research and exploration done so far, can help us feel closer to it all.