The American Dream
An ideological term meaning that in America, everyone is free to pursue whatever life he or she wants to live. According to everything we have read so far, this has really only applied to whites in America.