Stop implying that the American political terms "left" and "right" are spinoffs of radical dictatorships, all of which are fascist regardless of what these oligarchies call themselves.
The US doesn't have a monopoly on left or right; these are worldwide concepts. Of course there are, and were, bad right-wing governments. Just think about it for a moment. I could list most of the conflicts or politically turbulent periods of the 20th century, but which have been branded as truly terrible? It's the right-vs-left conflicts that get the most attention. Some examples:
the Spanish Civil War - the Republicans, supported by the West and the USSR, vs. the Nationalists, supported by Germany and Italy.
Vietnam - the US beating up on a poor communist (left-wing) country to support a right-wing dictatorship.
These conflicts got lots of media attention, and they're just a small sampling. Now look at some left-on-right (or even left-on-center) action:
the Soviet-Japanese border war of 1939 (Khalkhin Gol) - many people have never even heard of this war. Left on right.
Cambodia 75-79 - the killing fields: Maoist dictator (leftist) Pol Pot wipes out an estimated 1.7 million people. Most in the US knew little or nothing about it until the 1984 movie The Killing Fields.
So between this and my initial post, it is my perception that left-wing governments get a mostly free pass from the civilized world (despite having killed more people, in peace and in war), while right-wing governments are (rightly so) vilified for their actions.

Left and right in US politics is not exactly what I mean here, since there are wide degrees of affiliation (someone may be conservative on religion and liberal on spending, for example). My meaning in this context is left (Communist, and to a lesser degree Socialist) to right (Fascist, and to a lesser degree Nationalist). I'd consider the US Centrist, although IMO it is drifting towards Socialist.

So in a nutshell: the world turns a blind eye to atrocities under Communism, but cries out in anger when the US accidentally kills a civilian... even a blind man can see it.