Western world

When people say the Western world, they usually mean Europe and the Americas taken together.