I heard that when Saving Private Ryan came out, a lot of people left the theaters upset because of the brutality. They walked out during the opening, when the Allies are invading Omaha Beach. While war is awful, I'm kind of glad the filmmakers didn't shy away from or tone down the violence. War needs to be shown for what it is: a brutal, gory mess. Believe it or not, a lot of soldiers in the American Civil War were excited to go fight because they thought it would be fun. It wasn't until they got their first taste of combat that they realized what they had signed up for. Even today people glorify that war.
Too many times people glorify war as this amazing thing that it really isn't. In fact, civilians never understood why soldiers returned 'damaged' when they themselves were fine. It wasn't until Mathew Brady and other Civil War photographers decided to take pictures of the aftermath of battle that the public realized what they had sent their husbands, sons, and family off to. Heck, even in WW2 the public didn't quite understand what was really going on, because all they were getting was radio news hyping up the war or exaggerated, comical takes on it through plays, cartoons, etc.
Anyway, I just saw someone posting about how they disliked war movies, and I wanted to chime in that it's better for these films to depict war for what it is rather than how it used to be depicted: as some amazing, glamorous journey.