The Western United States—commonly referred to as the American West or simply the West—traditionally refers to the region comprising the westernmost states of the United States (see the geographical terminology section for further discussion of these terms). Because the United States has expanded westward since its founding, the definition of the West has evolved over time. The Mississippi River is often cited as the easternmost possible boundary of the West.
The "West" had played an important part in American history; the Old West is embedded in America's folklore.