The idea of nature is – or, rather, was – one of the fundamental American ideas. In its time it served – as the ideas of freedom, democracy, or progress did in theirs – to define the meaning of America. For some three centuries, in fact, from the founding of Jamestown in 1607 to the closing of the Western frontier in 1890, the encounter of white settlers with what they perceived as wilderness – unaltered nature – was the defining American experience. By the end of that era, however, the wilderness had come to seem a thing of the past, and the land of farms and villages was rapidly becoming a land of factories and cities. By 1920, half the population lived in cities, and as the natural world became a less immediate presence, images of the pristine landscape – chief icon of American nature – lost their power to express the nation’s vision of itself. Then, in the 1970s, with the onset of the ecological ‘crisis,’ the refurbished, matter-of-fact word environment took over a large part of the niche in public discourse hitherto occupied by the word nature. Before the end of the century, the marked loss of status and currency suffered by the idea of nature had become a hot subject in academic and intellectual circles. Reputable scholars and journalists published essays and books about the ‘death’ – or the ‘end’ – of nature; the University of California recruited a dozen humanities professors to participate in a semester-long research seminar designed to “reinvent nature”;1 and the association of European specialists in American studies chose, as the aim of its turn-of-the-century conference, to reassess the changing role played by the idea of nature in America.

What are we to make of the purported demise of nature? Can it be that the venerable idea is no longer meaningful? If that seems improbable on its face, it is because nature is our oldest, most nearly universal name for the material world, and despite the alarming extent of the transformation – and devastation – we humans have visited on it, that world is still very much with us. But why, then, is the general idea of nature – nature in all its meanings – falling into disuse? I do not mean to suggest that the imminent disappearance of nature – if that is what we are witnessing – is a peculiarly American development. What other reasons might there be for the seeming end of nature? With these questions in mind, I want to reconsider the idea’s changing role in American thought. But, first, some preliminary caveats.