You dodged the draft during Vietnam (with Daddy's help), and now you question the courage and patriotism of men who came back with medals (and scars) and who understand that wars are bad.
Is there any soul throughout the United States' history, president or Congress, who actually believed that wars are good?
Of course they are bad. They are also sometimes necessary. That's kind of how our country was founded.
Grow the hell up and move out of your parents' basement.