Do you ever notice the way American mainstream culture, media, and people portray Black people and Arab people?
American mainstream media constantly shows Black people in a favourable light (e.g. music stars, sports stars, celebrities), but shows Arabs and Middle Easterners in ways that are mean, offensive, and demeaning (e.g. constant criticism of Iran, the whole "war on terror", offensive comments about Islam, etc.).
Isn't it interesting that white America seems to favour Blacks over Arabs?
Also, White Americans give so much money and aid to Black countries (African nations, Haiti), but when it comes to foreign aid to needy parts of the Middle East, America doesn't give enough.
Iraq and Gaza are two of the neediest places on Earth, and yet both are receiving less foreign aid than Haiti did.
Why are Americans so pro-Black and anti-Arab? Why is the foreign aid to Black countries so much more than to Arab countries that are just as needy? This is racism against Arabs.
Why do Americans always portray Arabs in bad ways, but seem to portray Blacks in positive ways?
Isn't it also true that white Americans are out of touch with reality (e.g. high rates of black-on-white crime and murder, etc.)?
Why does the American media portray Arabs negatively, but never mentions negative things about Blacks, such as the high rate of black-on-white murders?