The Far West is the region comprising the westernmost states of the United States. As American settlement expanded westward, the meaning of the term "the West" shifted with it. Before around 1800, the crest of the Appalachian Mountains marked the western frontier; as the frontier moved westward, the lands west of the Mississippi River eventually came to be considered the West.
The Wild West encompasses the history, geography, people, and culture of life in this region. It is associated with the wave of American expansion that began with European colonial settlement of the original Thirteen Colonies in the early 17th century and ended with the admission of the last contiguous western territories as states in 1912 (Alaska and Hawaii were admitted later). This era of mass migration and settlement was particularly encouraged by President Thomas Jefferson following the Louisiana Purchase, and it gave rise to the idea of manifest destiny and, later, to Turner's frontier thesis. For this reason, the Wild West is also known as the American frontier.