IS THE WEST *REALLY* THE BEST?
We are constantly encouraged to believe that we in the West are the crown of creation, that we live in the elite and thoroughly exceptional kingdom of the Western world. How does this belief compare to reality?