American culture is rich,
complex, and unique. It emerged from the short and rapid European conquest of
an enormous landmass sparsely settled by diverse indigenous peoples. Although
European cultural patterns predominated, especially in language, the arts, and
political institutions, peoples from Africa, Asia, and North America also
contributed to American culture. All of these groups influenced popular tastes
in music, dress, entertainment, and cuisine. As a result, American culture
possesses an unusual mixture of patterns and forms forged from among its
diverse peoples. The many melodies of American culture have not always been
harmonious, but its complexity has created a society that struggles to achieve
tolerance and produces a uniquely casual personal style that identifies
Americans everywhere. The country is strongly committed to democracy, in which
views of the majority prevail, and strives for equality in law and
institutions.
Characteristics such as
democracy and equality flourished in the American environment long before
taking firm root in European societies, where the ideals originated. As early
as the 1780s, Michel Guillaume Jean de Crèvecoeur,
a French writer living in Pennsylvania who wrote under the pseudonym J. Hector
St. John, was impressed by the democratic nature of early American society. It
was not until the 19th century that these tendencies in America were most fully expressed. When French political writer Alexis de Tocqueville, an
acute social observer, traveled through the United States in the 1830s, he
provided an unusually penetrating portrait of the nature of democracy in America and its cultural consequences. He commented that in all areas of culture—family
life, law, arts, philosophy, and dress—Americans were inclined to emphasize the
ordinary and easily accessible, rather than the unique and complex. His insight
is as relevant today as it was when de Tocqueville visited the United States. As a result, American culture is more often defined by its popular and
democratically inclusive features, such as blockbuster movies, television
comedies, sports stars, and fast food, than by its more cultivated aspects as
performed in theaters, published in books, or viewed in museums and galleries.
Even the fine arts in modern America often partake of the energy and forms of
popular culture, and modern arts are often a product of the fusion of fine and
popular arts.
While America is probably best known for its popular arts, Americans partake in an enormous
range of cultural activities. Besides being avid readers of a great variety of
books and magazines catering to differing tastes and interests, Americans also
attend museums, operas, and ballets in large numbers. They listen to country
and classical music, jazz and folk music, as well as classic rock-and-roll and
new wave. Americans attend and participate in basketball, football, baseball,
and soccer games. They enjoy food from a wide range of foreign cuisines, such
as Chinese, Thai, Greek, French, Indian, Mexican, Italian, Ethiopian, and
Cuban. They have also developed their own regional foods, such as California cuisine and Southwestern, Creole, and Southern cooking. Still evolving and
drawing upon its ever more diverse population, American culture has come to
symbolize what is most up-to-date and modern. American culture has also become
increasingly international and is imported by countries around the world.
FORCES THAT SHAPED AMERICAN
CULTURE
Imported Traditions
Today American culture often
sets the pace in modern style. For much of its early history, however, the United States was considered culturally provincial and its arts second-rate, especially in
painting and literature, where European artists defined quality and form.
American artists often took their cues from European literary salons and art
schools, and cultured Americans traveled to Europe to become educated. In the
late 18th century, some American artists produced high-quality art, such as the
paintings of John Singleton Copley and Gilbert Charles Stuart and the silver
work of Paul Revere. However, wealthy Americans who collected art in the 19th
century still bought works by European masters and acquired European decorative
arts—porcelain, silver, and antique furniture. They then ventured further
afield seeking more exotic decor, especially items from China and Japan. By acquiring foreign works, wealthy Americans were able to obtain the
status inherent in a long historical tradition, which the United States lacked. Americans such as Isabella Stewart Gardner and Henry Clay Frick
amassed extensive personal collections, which overwhelmingly emphasized
non-American arts.
In literature, some
19th-century American writers believed that only the refined manners and
perceptions associated with the European upper classes could produce truly
great literary themes. These writers, notably Henry James and Edith Wharton,
often set their novels in the crosswinds of European and American cultural
contact. Britain especially served as the touchstone for culture and quality
because of its role in America's history and the links of language and
political institutions. Throughout the 19th century, Americans read and
imitated British poetry and novels, such as those written by Sir Walter Scott
and Charles Dickens.
The Emergence of an American
Voice
American culture first
developed a unique American voice during the 19th century. This voice included
a cultural identity that was strongly connected to nature and to a divine
mission. The new American voice had liberating effects on how the culture was
perceived, by Americans and by others. Writers Ralph Waldo Emerson and Henry
David Thoreau proposed that the American character was deeply individualistic
and connected to natural and spiritual sources rather than to the conventions
of social life. Many of the 19th century’s most notable figures of American
literature—Herman Melville, Emily Dickinson, and Mark Twain—also influenced
this tradition. The poetry of Walt Whitman, perhaps above all, spoke in a
distinctly American voice about people’s relation to one another, and described
American freedom, diversity, and equality with fervor.
Landscape painting in the United States during the 19th century vividly captured the unique American cultural
identity with its emphasis on the natural environment. This was evident in the
huge canvases set in the West by Albert Bierstadt and the more intimate
paintings of Thomas Cole. These paintings, which were part of the Hudson River School, were often enveloped in a radiant light suggesting a special connection
to spiritual sources. But very little of this American culture moved beyond the
United States to influence art trends elsewhere. American popular culture,
including craft traditions such as quilting or local folk music forged by
Appalachian farmers or former African slaves, remained largely local.
This sense of the special
importance of nature for American identity led Americans in the late 19th
century to become increasingly concerned that urban life and industrial
products were overwhelming the natural environment. Their concern led to calls
to preserve areas that had not been developed. Naturalists such as John Muir
were pivotal in establishing the first national parks and preserving scenic
areas of the American West. By the early 20th century, many Americans supported
the drive to preserve wilderness and the desire to make the great outdoors
available to everyone.
Immigration and Diversity
By the early 20th century, as
the United States became an international power, its cultural self-identity
became more complex. The United States was becoming more diverse as immigrants
streamed into the country, settling especially in America’s growing urban
areas. At this time, America's social diversity began to find significant
expression in the arts and culture. American writers of German, Irish, Jewish,
and Scandinavian ancestry began to find an audience, although some of the
cultural elite resisted the works, considering them crude and unrefined.
Many of these writers focused
on 20th-century city life and themes, such as poverty, efforts to assimilate
into the United States, and family life in the new country. These ethnically
diverse writers included Theodore Dreiser, of German ancestry; Henry Roth, a
Jewish writer; and Eugene O'Neill and James Farrell, of Irish background.
European influence now meant something very different than it once had: Artists
changed the core of American experience by incorporating their various
immigrant origins into the nation’s cultural vision. During the 1920s and 1930s, a host
of African American poets and novelists added their voices to this new American
vision. Langston Hughes, Zora Neale Hurston, and Countee Cullen, among others,
gathered in New York City’s Harlem district. They began to write about their
unique experiences, creating a movement called the Harlem Renaissance.
Visual artists of the early
20th century also began incorporating the many new sights and colors of the
multiethnic America visible in these new city settings. Painters associated
with a group known as The Eight (also called the Ashcan school), such as Robert
Henri and John Sloan, portrayed the picturesque sights of the city. Later
painters and photographers focused on the city’s squalid and seamier aspects.
Although nature remained a significant dimension of American cultural
self-expression, as the paintings of Georgia O'Keeffe demonstrated, it was no
longer at the heart of American culture. By the 1920s and 1930s few artists or
writers considered nature the singular basis of American cultural identity.
In popular music too, the songs
of many nations became American songs. Tin Pan Alley (Union Square in New York City, the center of music publishing at the turn of the 20th century) was full of
immigrant talents who helped define American music, especially in the form of
the Broadway musical. Some songwriters, such as Irving Berlin and George M.
Cohan, used their music to help define American patriotic songs and holiday
traditions. During the 1920s musical forms such as the blues and jazz began to
dominate the rhythms of American popular music. These forms had their roots in
Africa as adapted in the American South and then in cities such as New Orleans,
Louisiana; Kansas City, Missouri; Detroit, Michigan; and Chicago, Illinois.
Black artists and musicians such as Louis Armstrong, Duke Ellington, Ella
Fitzgerald, and Count Basie became the instruments of a classic American sound.
White composers such as George Gershwin and performers such as Bix Beiderbecke
also incorporated jazz rhythms into their music, while instrumentalists such as
Benny Goodman adopted jazz’s improvisational style to forge a racially blended
American form called swing music.
Development of Mass Media
In the late 19th century,
Americans who enjoyed the arts usually lived in big cities or had the money to
attend live performances. People who were poor or distant from cultural centers
settled for second-rate productions mounted by local theater troupes or touring
groups. New technologies, such as the motion-picture camera and the phonograph,
revolutionized the arts by making them available to the masses. The movies, the
phonograph, and, somewhat later, the radio made entertainment available daily
and allowed Americans to experience elaborately produced dramas and all types
of music.
While mass media made
entertainment available to more people, it also began to homogenize tastes,
styles, and points of view among different groups in the United States. Class and ethnic distinctions in American culture began to fade as mass
media transmitted movies and music to people throughout the United States. Some people criticized the growing uniformity of mass culture for lowering
the general standard of taste, since mass media sought to please the largest
number of people by appealing to simpler rather than more complex tastes.
However, culture became more democratic as modern technology and mass media
allowed it to reach more people.
During the 20th century, mass
entertainment extended the reach of American culture, reversing the direction
of influence as Europe and the world became consumers of American popular
culture. America became the dominant cultural source for entertainment and
popular fashion, from the jeans and T-shirts young people wear to the music
groups and rock stars they listen to and the movies they see. People all over
the world view American television programs, often years after the program’s
popularity has declined in the United States. American television has become
such an international fixture that American news broadcasts help define what
people in other countries know about current events and politics. American
entertainment is probably one of the strongest means by which American culture
influences the world, although some countries, such as France, resist this influence because they see it as a threat to their unique national
culture.
The Impact of Consumerism
Popular culture is linked to
the growth of consumerism, the repeated acquisition of an increasing variety of
goods and services. The American lifestyle is often associated with clothing,
houses, electronic gadgets, and other products, as well as with leisure time.
As advertising stimulates the desire for updated or improved products, people
increasingly equate their well-being with owning certain things and acquiring
the latest model. Television and other mass media broadcast a portrayal of a
privileged American lifestyle that many Americans hope to imitate.
Americans often seek
self-fulfillment and status through gaining material items. Indeed, products
consumed and owned, rather than professional accomplishments or personal
ideals, are often the standard of success in American society. The media
exemplify this success with the most glamorous models of consumption: Hollywood actors, sports figures, or music celebrities. This dependence on products and on
constant consumption defines modern consumer society everywhere. Americans have
set the pace for this consumer ideal, especially young people, who have helped
fuel this consumer culture in the United States and the world. Like the mass media
with which it is so closely linked, consumption has been extensively
criticized. Portrayed as a dizzying cycle of induced desire, consumerism seems to
erode older values of personal taste and economy. Despite this, the mass
production of goods has also allowed more people to live more comfortably and
made it possible for anyone to attain a sense of style, blurring the most
obvious forms of class distinction.
WAYS OF LIFE
Living Patterns
A fundamental element in the
life of the American people was the enormous expanse of land available. During
the colonial period, the access to open land helped scatter settlements. One
effect was to make it difficult to enforce traditional European social
conventions, such as primogeniture, in which the eldest son inherited the
parents’ estate. Because the United States had so much land, sons became less
dependent on inheriting the family estate. Religious institutions were also
affected, as the widely spread settlements created space for newer religious
sects and revivalist practices.
In the 19th century, Americans
used their land to grow crops, which helped create the dynamic agricultural
economy that defined American society. Many Americans were lured westward to
obtain more land. Immigrants sought land to settle, cattle ranchers wanted land
for their herds, Southerners looked to expand their slave economy into Western
lands, and railroad companies acquired huge tracts of land as they bound a
loose society into a coherent economic union. Although Native Americans had inhabited
most of the continent, European and American settlers often viewed it as
empty, virgin land that they were destined to occupy. Even before the late 19th
century, when the last bloody battles between U.S. troops and Native Americans
completed the white conquest of the West, the idea of possessing land was
deeply etched into American cultural patterns and national consciousness.
Throughout the 19th century,
agricultural settlements existed on large, separate plots of land, often
occupying hundreds of acres. The Homestead Act of 1862 promised up to 65
hectares (160 acres) of free land to anyone with enough fortitude and vision to
live on and cultivate the land. As a result, many settlements in the West
contained vast areas of sparsely settled land, where neighbors lived great
distances from one another. The desire for residential privacy has remained a
significant feature of American culture.
This heritage continues to
define patterns of life in the United States. More than any other Western
society, Americans are committed to living in private dwellings set apart from
neighbors. Despite the rapid urbanization that began in the late 19th century,
Americans insisted that each nuclear family (parents and their children) be
privately housed and that as many families as possible own their own homes.
This strong cultural standard sometimes seemed unusual to new immigrants who
were used to the more crowded living conditions of Europe, but they quickly
adopted this aspect of American culture.
As cities became more densely
populated, Americans moved to the suburbs. Streetcars, first used during the
1830s, opened suburban rings around city centers, where congestion was
greatest. Banks offered long-term loans that allowed individuals to invest in a
home. Above all, the automobile in the 1920s was instrumental in furthering the
move to the suburbs.
After World War II (1939-1945),
developers carved out rural tracts to build millions of single-family homes,
and more Americans than ever before moved to large suburban areas that were
zoned to prevent commercial and industrial activities. The federal government
directly fueled this process by providing loans to war veterans as part of the
Servicemen’s Readjustment Act of 1944, known as the GI Bill of Rights, which
provided a wide range of benefits to U.S. military personnel. In many of the
new housing developments, builders constructed homes according to a single
model, a process first established in Levittown, New York. These identical,
partially prefabricated units were rapidly assembled, making suburban life and
private land ownership available to millions of returning soldiers in search of
housing for their families.
American families still choose
to live in either suburbs or the sprawling suburban cities that have grown up
in newer regions of the country. Vast areas of the West, such as the Los Angeles metropolitan region in California, the area around Phoenix, Arizona, and the Puget Sound area of Washington state, became rapidly populated with new housing because of
the American desire to own a home on a private plot of land. In much of this
suburban sprawl, the central city has become largely indistinct. These suburban
areas almost invariably reflect Americans’ dependence on automobiles and on
government-supported highway systems.
As a result of Americans
choosing to live in the suburbs, a distinctly American phenomenon developed in
the form of the shopping mall. The shopping mall has increasingly replaced the
old-fashioned urban downtown, where local shops, restaurants, and cultural
attractions were located. Modern malls emphasize consumption as an exclusive
activity. The shopping mall, filled with department stores, specialty shops,
fast-food franchises, and movie multiplexes, has come to dominate retailing,
making suburban areas across America more and more alike. In malls, Americans
purchase food, clothing, and entertainment in an isolated environment
surrounded by parking lots.
The American preference for
living in the suburbs has also affected other living experiences. Because
suburbs emphasize family life, suburban areas also place a greater emphasis on
school and other family-oriented political issues than more demographically
diverse cities. At their most intense, the desire for privacy and fear of
crime have led to the development of gated suburban communities that keep out
those who are not wanted.
Despite the growth of suburbs,
American cities have maintained their status as cultural centers for theaters,
museums, concert halls, art galleries, and more upscale restaurants, shops, and
bookstores. In the past several decades, city populations grew as young and
trendy professionals with few or no children sought out the cultural
possibilities and the diversity not available in the suburbs. Housing can be
expensive and difficult to find in older cities such as New York; Boston, Massachusetts; and San Francisco, California. To cope, many city dwellers restored
older apartment buildings and houses. This process, called gentrification,
combines the American desire for the latest technology with a newer
appreciation for the classic and vintage.
Many poorer Americans cannot
afford homes in the suburbs or apartments in the gentrified areas of cities.
They often rely upon federal housing subsidies to pay for apartments in
less-desirable areas of the city or in public housing projects. Poorer people
often live crowded together in large apartment complexes in congested
inner-city areas. Federal public housing began when President Franklin
Roosevelt sought to relieve the worst conditions associated with poverty in the
1930s. It accelerated during the 1950s and 1960s, as the government subsidized
the renewal of urban areas by replacing slums with either new or refurbished
housing. In the late 20th century, many people criticized public housing
because it was often the site for crime, drug deals, gangs, and other social
ills. Nevertheless, given the expensive nature of rental housing in cities,
public housing is often the only option available to those who cannot afford to
buy their own home. Private efforts, such as Habitat for Humanity, have been
organized to help the urban poor move from crowded, high-rise apartments. These
organizations help construct low-cost homes in places such as the South Bronx in New York City, and they emphasize the pride and autonomy of home ownership.
In recent years, the importance
of home ownership has increased as higher real estate prices have made the
house a valuable investment. The newest home construction has made standard the
comforts of large kitchens, luxurious bathrooms, and small gardens. In line
with the rising cost of land, these houses often stand on smaller lots than
those constructed in the period following World War II, when one-story ranch
houses and large lawns were the predominant style. At the same time, many
suburban areas have added other kinds of housing in response to the needs of
single people and people without children. As a result, apartments and
townhouses—available as rentals and as condominiums—have become familiar parts
of suburban life.
Food and Cuisine
The United States has rich and
productive land that has provided Americans with plentiful resources for a
healthy diet. Despite this, Americans did not begin to pay close attention to
the variety and quality of the food they ate until the 20th century, when they
became concerned about eating too much and becoming overweight. American food
also grew more similar around the country as American malls and fast-food outlets
tended to standardize eating patterns throughout the nation, especially among
young people. Nevertheless, American food has become more complex as it draws
from the diverse cuisines that immigrants have brought with them.
Historically, the rest of the
world has envied the good, wholesome food available in the United States. In the 18th and 19th centuries, fertile soil and widespread land ownership
made grains, meats, and vegetables widely available, and famine, which was common
elsewhere, was unknown in the United States. Some immigrants, such as the Irish,
moved to the United States to escape famine, while others saw the bounty of
food as one of the advantages of immigration. By the late 19th century, America’s food surplus was beginning to feed the world. After World War I (1914-1918) and
World War II, the United States distributed food in Europe to help countries
severely damaged by the wars. Throughout the 20th century, American food
exports have helped compensate for inadequate harvests in other parts of the
world. Although hunger does exist in the United States, it results more from
food being poorly distributed than from food being unavailable.
Traditional American cuisine
has included conventional European foodstuffs such as wheat, dairy products,
pork, beef, and poultry. It has also incorporated products that were either
known only in the New World or that were grown there first and then introduced
to Europe. Such foods include potatoes, corn, codfish, molasses, pumpkin and
other squashes, sweet potatoes, and peanuts. American cuisine also varies by
region. Southern cooking was often different from cooking in New England and
its upper Midwest offshoots. Doughnuts, for example, were a New England staple,
while Southerners preferred corn bread. The availability of foods also affected
regional diets, such as the different kinds of fish eaten in New England and
the Gulf Coast. For instance, Boston clam chowder and Louisiana gumbo are
widely different versions of fish soup. Other variations often depended on the
contributions of indigenous peoples. In the Southwest, for example, Mexican and
Native Americans made hot peppers a staple and helped define the spicy hot
barbecues and chili dishes of the area. In Louisiana, Cajun influence similarly
created spicy dishes as a local variation of Southern cuisine, and African
slaves throughout the South introduced foods such as okra and yams.
By the late 19th century,
immigrants from Europe and Asia were introducing even more variations into the
American diet. American cuisine began to reflect these foreign cuisines, not
only in their original forms but in Americanized versions as well. Immigrants
from Japan and Italy introduced a range of fresh vegetables that added
important nutrients as well as variety to the protein-heavy American diet.
Germans and Italians contributed new skills and refinements to the production
of alcoholic beverages, especially beer and wine, which supplemented the more
customary hard cider and indigenous corn-mash whiskeys. Some imports became
distinctly American products, such as hot dogs, which are descended from German
wurst, or sausage. Spaghetti and pizza from Italy, especially, grew
increasingly American and developed many regional spin-offs. Americans
even adapted chow mein from China into a simple American dish. Not until the
late 20th century did Americans rediscover these cuisines, and many others,
paying far more attention to their original forms and cooking styles.
Until the early 20th century,
the federal government did not regulate food for consumers, and food was
sometimes dangerous and impure. During the Progressive period in the early 20th
century, the federal government intervened to protect consumers against the
worst kinds of food adulterations and diseases by passing legislation such as
the Pure Food and Drug Acts. As a result, American food became safer. By the
early 20th century, Americans began to consume convenient, packaged foods such
as breads and cookies, preserved fruits, and pickles. By the mid-20th century,
packaged products had expanded greatly to include canned soups, noodles,
processed breakfast cereals, preserved meats, frozen vegetables, instant
puddings, and gelatins. These prepackaged foods became staples used in recipes
contained in popular cookbooks, while peanut butter sandwiches and packaged
cupcakes became standard lunchbox fare. As a result, the American diet became
noteworthy for its blandness rather than its flavors, and for its wholesomeness
rather than its subtlety.
Americans were proud of their
technology in food production and processing. They used fertilizers,
hybridization (genetically combining two varieties), and other technologies to
increase crop yields and consumer selection, making foods cheaper if not always
better tasting. Additionally, by the 1950s, the refrigerator had replaced the
old-fashioned icebox and the cold cellar as a place to store food.
Refrigeration, because it allowed food to last longer, made the American
kitchen a convenient place to maintain readily available food stocks. However,
plentiful wholesome food, when combined with the sedentary 20th-century
lifestyle and work habits, brought its own unpleasant consequences—overeating
and excess weight. During the 1970s, 25 percent of Americans were overweight;
by the 1990s that had increased to 35 percent.
America’s foods began to affect the rest of the
world—not only raw staples such as wheat and corn, but a new American cuisine
that spread throughout the world. American emphasis on convenience and rapid
consumption is best represented in fast foods such as hamburgers, french fries,
and soft drinks, which almost all Americans have eaten. By the 1960s and 1970s
fast foods became one of America's strongest exports as franchises for
McDonald’s and Burger King spread through Europe and other parts of the world,
including the former Soviet Union and Communist China. Traditional meals cooked
at home and consumed at a leisurely pace—common in the rest of the world, and
once common in the United States—gave way to quick lunches and dinners eaten on
the run as other countries mimicked American cultural patterns.
By the late 20th century,
Americans had become more conscious of their diets, eating more poultry, fish,
and fresh fruits and vegetables and fewer eggs and less beef. They also began
appreciating fresh ingredients and livelier flavors, and cooks began to
rediscover many world cuisines in forms closer to their original. In California, chefs combined the fresh fruits and vegetables available year-round with
ingredients and spices sometimes borrowed from immigrant kitchens to create an
innovative cooking style that was lighter than traditional French, but more
interesting and varied than typical American cuisine. Along with the state’s
wines, California cuisine eventually took its place among the acknowledged
forms of fine dining.
As Americans became more
concerned about their diets, they also became more ecologically conscious. This
consciousness often included an antitechnology aspect that led some Americans
to switch to a partially or wholly vegetarian diet, or to emphasize products
produced organically (without chemical fertilizers and pesticides). Many
considered these foods more wholesome and socially responsible because their
production was less taxing to the environment. In the latter 20th century,
Americans also worried about the effects of newly introduced genetically
altered foods and irradiation processes for killing bacteria. They feared that
these new processes made their food less natural and therefore harmful.
These concerns and the emphasis
on variety were by no means universal, since food habits in the late 20th
century often reflected society’s ethnic and class differences. Not all
Americans appreciated California cuisine or vegetarian food, and many recent
immigrants, like their immigrant predecessors, often continued eating the foods
they knew best.
At the end of the 20th century,
American eating habits and food production were increasingly taking place
outside the home. Many people relied on restaurants and on new types of fully
prepared meals to help busy families in which both adults worked full-time.
Another sign of the public’s changing food habits was the microwave oven,
probably the most widely used new kitchen appliance, since it can quickly cook
foods and reheat prepared foods and leftovers. Since Americans are generally
cooking less of their own food, they are more aware than at any time since the
early 20th century of the quality and health standards applied to food. Recent
attention to cases in which children have died from contaminated and poorly
prepared food has once again directed the public’s attention to the
government's role in monitoring food safety.
In some ways, American food
developments are contradictory. Americans are more aware of food quality
despite, and maybe because of, their increasing dependence on convenience. They
eat a more varied diet, drawing on the cuisines of immigrant groups (Thai,
Vietnamese, Greek, Indian, Cuban, Mexican, and Ethiopian), but they also
regularly eat fast foods found in every shopping mall and along every highway.
They are more suspicious of technology, although they rely heavily on it for
their daily meals. In many ways, these contradictions reflect the many
influences on American life in the late 20th century—immigration, double-income
households, genetic technologies, domestic and foreign travel—and food has
become an even deeper expression of the complex culture of which it is part.
Dress
In many regions of the world,
people wear traditional costumes at festivals or holidays, and sometimes more
regularly. Americans, however, do not have distinctive folk attire with a long
tradition. Except for the varied and characteristic clothing of Native American
peoples, dress in the United States has rarely been specific to a certain
region or based on the careful preservation of decorative patterns and crafts.
American dress is derived from the fabrics and fashions of the Europeans who
began colonizing the country in the 17th century. Early settlers incorporated some
of the forms worn by indigenous peoples, such as moccasins and garments made
from animal skins (Benjamin Franklin is famous for flaunting a raccoon cap when
he traveled to Europe), but in general, fashion in the United States adapted
and modified European styles. Despite the number and variety of immigrants in
the United States, American clothing has tended to be homogeneous, and attire
from an immigrant’s homeland was often rapidly exchanged for American apparel.
American dress is distinctive
because of its casualness. American style in the 20th century is recognizably
more informal than in Europe, and for its fashion sources it is more dependent
on what people on the streets are wearing. European fashions take their cues
from the top of the fashion hierarchy, dictated by the world-famous haute
couture (high fashion) houses of Paris, France, and recently those of Milan, Italy, and London, England. Paris designers, both today and in the past, have also
dressed wealthy and fashionable Americans, who copied French styles. Although
European designs remain a significant influence on American tastes, American
fashions more often come from popular sources, such as the school and the
street, as well as television and movies. In the last quarter of the 20th century,
American designers often found inspiration in the imaginative attire worn by
young people in cities and ballparks, and that worn by workers in factories and
fields.
Blue jeans are probably the
single most representative article of American clothing. They were originally
invented by tailor Jacob Davis, who together with dry-goods salesman Levi
Strauss patented the idea in 1873 as durable clothing for miners. Blue jeans
(also known as dungarees) spread among workers of all kinds in the late 19th
and early 20th centuries, especially among cowboys, farmers, loggers, and
railroad workers. During the 1950s, actors Marlon Brando and James Dean made
blue jeans fashionable by wearing them in movies, and jeans became part of the
image of teenage rebelliousness. This fashion statement exploded in the 1960s
and 1970s as Levi's became a fundamental part of the youth culture focused on
civil rights and antiwar protests. By the late 1970s, almost everyone in the United States wore blue jeans, and youths around the world sought them. As designers began
to create more sophisticated styles of blue jeans and to adjust their fit,
jeans began to express the American emphasis on informality and the importance
of subtlety of detail. By highlighting the right label and achieving the right
look, blue jeans, despite their worker origins, ironically embodied the status
consciousness of American fashion and the eagerness to approximate the latest
fad.
American informality in dress
is such a strong part of American culture that many workplaces have adopted the
idea of “casual Friday,” a day when workers are encouraged to dress down from
their usual professional attire. For many high-tech industries located along
the West Coast, as well as among faculty at colleges and universities, this emphasis
on casual attire is a daily occurrence, not just reserved for Fridays.
The fashion industry in the United States, along with its companion cosmetics industry, grew enormously in the second
half of the 20th century and became a major source of competition for French
fashion. Especially notable during the late 20th century was the incorporation
of sports logos and styles, from athletic shoes to tennis shirts and baseball
caps, into standard American wardrobes. American informality is enshrined in the
wardrobes created by world-famous U.S. designers such as Calvin Klein, Liz
Claiborne, and Ralph Lauren. Lauren especially adopted the American look, based
in part on the tradition of the old West (cowboy hats, boots, and jeans) and in
part on the clean-cut sportiness of suburban style (blazers, loafers, and
khakis).
Sports and Recreation
Large numbers of Americans
watch and participate in sports activities, which are a deeply ingrained part
of American life. Americans use sports to express interest in health and
fitness and to occupy their leisure time. Sports also allow Americans to
connect and identify with mass culture. Americans pour billions of dollars into
sports and their related enterprises, affecting the economy, family habits,
school life, and clothing styles. Americans of all classes, races, sexes, and
ages participate in sports activities—from toddlers in infant swimming groups
and teenagers participating in school athletics to middle-aged adults bowling
or golfing and older persons practicing t’ai chi.
Public subsidies and private
sponsorships support the immense network of outdoor and indoor sports,
recreation, and athletic competitions. Except for those sponsored by public
schools, most sports activities are privately funded, and even American Olympic
athletes receive no direct national sponsorship. Little League baseball teams,
for example, are usually sponsored by local businesses. Many commercial
football, basketball, baseball, and hockey teams reflect large private
investments. Although sports teams are privately owned, they play in stadiums
that are usually financed by taxpayer-provided subsidies such as bond measures.
State taxes provide some money for state university sporting events. Taxpayer
dollars also support state parks, the National Park Service, and the Forest
Service, which provide places for Americans to enjoy camping, fishing, hiking,
and rafting. Public money also funds the Coast Guard, whose crews protect those
enjoying boating around the nation's shores.
Sports in North America go back
to the Native Americans, who played forms of lacrosse and field hockey. During
colonial times, early Dutch settlers bowled on New York City's Bowling Green, still a small park in southern Manhattan. However, organized sports
competitions and local participatory sports on a substantial scale go back only
to the late 19th century. Schools and colleges began to encourage athletics as
part of a balanced program emphasizing physical as well as mental vigor, and
churches began to loosen strictures against leisure and physical pleasures. As
work became more mechanized, more clerical, and less physical during the late
19th century, Americans became concerned with diet and exercise. With sedentary
urban activities replacing rural life, Americans used sports and outdoor
relaxation to balance lives that had become hurried and confined. Biking,
tennis, and golf became popular for those who could afford them, while sandlot
baseball and an early version of basketball became popular city activities. At
the same time, organizations such as the Boy Scouts and the Young Men’s
Christian Association (YMCA) began to sponsor sports as part of their efforts
to counteract unruly behavior among young people.
Baseball teams developed in
Eastern cities during the 1850s and spread to the rest of the nation during the
Civil War in the 1860s. Baseball quickly became the national pastime and began
to produce sports heroes such as Cy Young, Ty Cobb, and Babe Ruth in the first
half of the 20th century. With its city-based loyalties and all-American aura,
baseball appealed to many immigrants, who as players and fans used the game as
a way to fit into American culture.
Starting in the latter part of
the 19th century, football was played on college campuses, and intercollegiate
games quickly followed. By the early 20th century, football had become a
feature of college life across the nation. In the 1920s football pep rallies
were commonly held on college campuses, and football players were among the
most admired campus leaders. That enthusiasm has now spilled well beyond college
to Americans throughout the country. Spectators also watch the professional
football teams of the National Football League (NFL) with enthusiasm.
Basketball is another sport
that is very popular as both a spectator and participant sport. The National
Collegiate Athletic Association (NCAA) hosts championships for men’s and
women’s collegiate teams. Held annually in March, the men’s NCAA national
championship is one of the most popular sporting events in the United States.
The top men’s professional basketball league in the United States is the
National Basketball Association; the top women’s league is the Women’s National
Basketball Association (WNBA). In addition, many people play basketball in amateur leagues and
organizations. It is also common to see people playing basketball in parks and
local gymnasiums around the country.
Another major sport played in
the United States is ice hockey. Ice hockey began as an amateur sport played
primarily in the Northeast. The first U.S. professional ice hockey team was
founded in Boston in 1924. Ice hockey’s popularity has spread throughout the
country since the 1960s. The NCAA holds a national collegiate ice hockey
championship in April of each year. The country’s top professional league is the
National Hockey League (NHL). NHL teams play a regular schedule that culminates
in the championship series. The winner is awarded the Stanley Cup, the league’s
top prize.
Television transformed sports
in the second half of the 20th century. As more Americans watched sports on
television, the sports industry grew into an enormous business, and sports
events became widely viewed among Americans as cultural experiences. Many
Americans shared televised moments of exaltation and triumph throughout the year:
baseball during the spring and summer and its World Series in the early fall,
football throughout the fall crowned by the Super Bowl in January, and the
National Basketball Association (NBA) championships in the spring. The Olympic
Games, watched by millions of people worldwide, similarly rivet Americans to
their televisions as they watch outstanding athletes compete on behalf of their
nations. Commercial sports are part of practically every home in America and
have allowed sports heroes to gain prominence in the national imagination and
to become fixtures of the consumer culture. As well-known faces and bodies,
sports celebrities such as basketball player Michael Jordan and baseball player
Mark McGwire are hired to endorse products.
Although televised games remove
the viewing public from direct contact with events, they have neither
diminished the fervor of team identification nor dampened the enthusiasm for
athletic participation. Americans watch more sports on television than ever,
and they personally participate in more varied sporting activities and athletic
clubs. Millions of young girls and boys across the country play soccer,
baseball, tennis, and field hockey.
At the end of the 20th century,
Americans were taking part in individual sports of all kinds—jogging,
bicycling, swimming, skiing, rock climbing, playing tennis, as well as more
unusual sports such as bungee jumping, hang gliding, and wind surfing. As
Americans enjoy more leisure time, and as Hollywood and advertising emphasize
trim, well-developed bodies, sports have become a significant component of many
people's lives. Many Americans now invest significant amounts of money in
sports equipment, clothing, and gym memberships. As a result, more people are
dressing in sporty styles of clothing. Sports logos and athletic fashions have
become common aspects of people’s wardrobes, as people need to look as though
they participate in sports to be in style. Sports have even influenced the cars
Americans drive, as sport utility vehicles accommodate the rugged terrain,
elaborate equipment, and sporty lifestyles of their owners.
Probably the most significant
long-term development in 20th-century sports has been the increased
participation of minorities and women. Throughout the early 20th century,
African Americans made outstanding contributions to sports, despite being
excluded from organized white teams. The exclusion of black players from white
baseball led to the creation of a separate Negro National League in 1920. On
the world stage, track-and-field star Jesse Owens became a national hero when
he won four gold medals and set world and Olympic records at the Berlin
Olympics in 1936. The racial segregation that prevented African Americans from
playing major league baseball until 1947 has been replaced by the
enormous successes of African Americans in all fields of sport.
Before the 20th century women
could not play in most organized sports. Early in the 20th century, however, they began to enter the
sports arena. Helen Wills Moody, a tennis champion during the 1920s, and Babe
Didrikson Zaharias, one of the 20th century’s greatest women athletes, were
examples of physical grace and agility. In 1972 Title IX of the Education
Amendments Act outlawed discrimination based on gender in education, including
school sports. Schools then spent additional funding on women's athletics,
which provided an enormous boost to women’s sports of all kinds, especially
basketball, which became very popular. Women's college basketball, part of the
National Collegiate Athletic Association (NCAA), is a popular focus of
interest. By the end of the 20th century, this enthusiasm led to the creation
of a major professional women’s basketball league. Women have become a large
part of athletics, making their mark in a wide range of sports.
Sports have become one of the
most visible expressions of the vast extension of democracy in 20th-century America. They have become more inclusive, with many Americans both participating in sports personally
and enjoying them as spectators. Once readily available only to the
well-to-do, sports and recreation attract many people, aided by the mass media,
the schools and colleges, the federal and state highway and park systems, and
increased leisure time.
Celebrations and Holidays
Americans celebrate an enormous
variety of festivals and holidays because they come from around the globe and
practice many religions. They also celebrate holidays specific to the United States that commemorate historical events or encourage a common national memory.
Holidays in America are often family or community events. Many Americans travel
long distances for family gatherings or take vacations during holidays. In
fact, by the end of the 20th century, many national holidays in the United States had become three-day weekends, which many people used as mini vacations.
Except for the Fourth of July and Veterans Day, most commemorative federal
holidays, including Memorial Day, Labor Day, Columbus Day, and Presidents’ Day,
are celebrated on Mondays so that Americans can enjoy a long weekend. Because
many Americans tend to create vacations out of these holiday weekends rather
than celebrate a particular event, some people believe the original
significance of many of these occasions has been eroded.
Because the United States is a secular society founded on the separation of church and state, many of
the most meaningful religiously based festivals and rituals, such as Easter,
Rosh Hashanah, and Ramadan, are not enshrined as national events, with one
major exception. Christmas, and the holiday season surrounding it, is an
enormous commercial enterprise, a fixture of the American social calendar, and
deeply embedded in the popular imagination. Not until the 19th century did
Christmas in the United States begin to take on aspects of the modern holiday
celebration, such as exchanging gifts, cooking and eating traditional foods,
and putting up often-elaborate Christmas decorations. The holiday has grown in
popularity and significance ever since. Santa Claus; brightly decorated
Christmas trees; and plenty of wreaths, holly, and ribbons help define the
season for most children. Indeed, because some religious faiths do not
celebrate Christmas, the Christmas season has expanded in recent years to
become the “holiday season,” embracing Hanukkah, the Jewish Festival of Lights,
and Kwanzaa, a celebration of African heritage. Thus, the Christmas season has
become the closest thing to a true national festival in the United States.
The expansion of Christmas has
even begun to encroach on the most indigenous of American festivals,
Thanksgiving. Celebrated on the fourth Thursday in November, Thanksgiving has
largely shed its original religious meaning (as a feast of giving thanks to
God) to become a celebration of the bounty of food and the warmth of family
life in America. American children usually commemorate the holiday’s origins at
school, where they re-create the original event: Pilgrims sharing a harvest
feast with Native Americans. Both the historical and the religious origins of
the event have largely given way to a secular celebration centered on the
traditional Thanksgiving meal: turkey—an indigenous American bird—accompanied
by foods common in early New England settlements, such as pumpkins, squashes,
and cranberries. Since many Americans enjoy a four-day holiday at Thanksgiving,
the occasion encourages family reunions and travel. Some Americans also
contribute time and food to the needy and the homeless during the Thanksgiving
holiday.
Another holiday that has lost
its older, religious meaning in the United States is Halloween, the eve of All
Saints’ Day. Halloween has become a celebration of witches, ghosts, goblins,
and candy that is especially attractive to children. On this day and night,
October 31, many homes are decorated and lit by jack-o'-lanterns, pumpkins that
have been hollowed out and carved. Children dress up and go trick-or-treating,
during which they receive treats from neighbors. An array of orange-colored
candies has evolved from this event, and most trick-or-treat bags brim
with chocolate bars and other confections.
The Fourth of July, or
Independence Day, is the premier American national celebration because it
commemorates the day the United States proclaimed its freedom from Britain with the Declaration of Independence. Very early in its development, the holiday
was an occasion for fanfare, parades, and speeches celebrating American freedom
and the uniqueness of American life. Since at least the 19th century, Americans
have commemorated their independence with fireworks and patriotic music.
Because the holiday marks the founding of the republic in 1776, flying the flag
of the United States (sometimes with the original 13 stars) is common, as are
festive barbecues, picnics, fireworks, and summer outings.
Most other national holidays
have become less significant over time and receded in importance as ways in
which Americans define themselves and their history. For example, Columbus Day
was formerly celebrated on October 12, the day explorer Christopher Columbus
first landed in the West Indies, but it is now celebrated on the second Monday
of October to allow for a three-day weekend. The holiday originally served as a
traditional reminder of the "discovery" of America in 1492, but as
Americans became more sensitive to their multicultural population, celebrating
the conquest of Native Americans became more controversial.
Holidays honoring wars have
also lost much of their original significance. Memorial Day, first called
Decoration Day and celebrated on May 30, was established to honor those who
died during the American Civil War (1861-1865), then subsequently those who
died in all American wars. Similarly, Veterans Day was first named Armistice
Day and marked the end of World War I (1914-1918). During the 1950s the name of
the holiday was changed in the United States, and its significance expanded to
honor armed forces personnel who served in any American war.
The memory of America's first president, George Washington, was once celebrated on his birthday, February
22nd. The date was changed to the third Monday in February to create a
three-day weekend, as well as to incorporate the birthday of another president,
Abraham Lincoln, who was born on February 12th. The holiday is now popularly
called Presidents’ Day and is less likely to be remembered as honoring the
first and 16th American
presidents than as a school and work holiday. Americans also memorialize Martin
Luther King, Jr., the great African American civil rights leader who was
assassinated in 1968. King’s birthday is celebrated as a national holiday in
mid-January. The celebration of King's birthday has become a sign of greater
inclusiveness in 20th-century American society.
EDUCATION
Role of Education
The United States has one of
the most extensive and diverse educational systems in the world. Educational
institutions exist at all learning levels, from nursery schools for the very
young to higher education for older youths and adults of all ages. Education in
the United States is notable for the many goals it aspires to accomplish—promoting
democracy, assimilation, nationalism, equality of opportunity, and personal
development. Because Americans have historically insisted that their schools
work toward these sometimes conflicting goals, education has often been the
focus of social conflict.
While schools are expected to
achieve many social objectives, education in America is neither centrally
administered nor supported directly by the federal government, unlike education
in other industrialized countries. In the United States, each state is
responsible for providing schooling, which is funded through local taxes and
governed by local school boards. In addition to these government-funded public
schools, the United States has many schools that are privately financed and
maintained. More than 10 percent of all elementary and secondary students in
the United States attend private schools. Religious groups, especially the
Roman Catholic Church, run many of these. Many of America's most renowned
universities and colleges are also privately endowed and run. As a result,
although American education is expected to provide equality of opportunity, it
is not easily directed toward that goal. This complex enterprise, once one of
the proudest achievements of American democracy because of its diversity and
inclusiveness, became the subject of intense debate and criticism during the
second half of the 20th century. People debated the goals of schools as well as
whether schools were educating students well enough.
History of Education in America
Until the 1830s, most American
children attended school irregularly, and most schools were either run
privately or by charities. This irregular system was replaced in the Northeast
and Midwest by publicly financed elementary schools, known as common schools.
Common schools provided rudimentary instruction in literacy and trained
students in citizenship. This democratic ideal expanded after the Civil War to
all parts of the nation. By the 1880s and 1890s, schools began to expand
attendance requirements so that more children and older children attended
school regularly. These more rigorous requirements were intended to ensure that
all students, including those whose families had immigrated from elsewhere,
were integrated into society. In addition, the schools tried to equip children
with the more complex skills required in an industrialized urban society.
Education became increasingly
important during the 20th century, as America’s sophisticated industrial
society demanded a more literate and skilled workforce. In addition, school
degrees provided a sought-after means to obtain better-paying and higher-status
jobs. Schools were the one American institution that could provide the literacy
skills and work habits necessary for Americans of all backgrounds to compete in
industry. As a result, education expanded rapidly. In the first decades of
the 20th century, mandatory education laws required children to complete grade
school. By the end of the 20th century, many states required children to attend
school until they were at least 16. In 1960, 45 percent of high school
graduates enrolled in college; by 1996 that enrollment rate had risen to 65
percent. By the late 20th century, an advanced education was necessary for
success in the globally competitive and technologically advanced modern
economy. According to the U.S. Census Bureau, workers with a bachelor’s degree
in 1997 earned an average of ,000 annually, while those with a high school
degree earned about ,000. Those who did not complete high school earned
about ,000.
In the United States, higher education is widely
available through thousands of private, religious, and state-run
institutions, which offer advanced professional, scientific, and other training
programs that enable students to become proficient in diverse subjects.
Colleges vary in cost and level of prestige. Many of the oldest and most famous
colleges on the East Coast are expensive and set extremely high admissions
standards. Large state universities are less difficult to enter, and their fees
are substantially lower. Other types of institutions include state universities
that provide engineering, teaching, and agriculture degrees; private
universities and small privately endowed colleges; religious colleges and
universities; and community and junior colleges that offer part-time and
two-year degree programs. This complex and diverse range of schools has made
American higher education the envy of other countries and one of the nation’s
greatest assets in creating and maintaining a technologically advanced society.
When more people began to
attend college, there were a number of repercussions. Going to college delayed
maturity and independence for many Americans, extending many of the stresses of
adolescence into a person’s 20s and postponing the rites of adulthood, such as
marriage and childbearing. As society paid more attention to education, it also
devoted a greater proportion of its resources to it. Local communities were
required to spend more money on schools and teachers, while colleges and
universities were driven to expand their facilities and course offerings to
accommodate an ever-growing student body. Parents were also expected to support
their children longer and to forgo their children's contribution to the
household.
Funding
Education is an enormous
investment that requires contributions from many sources. American higher
education is especially expensive, with its heavy investment in laboratory
space and research equipment. It receives funding from private individuals,
foundations, and corporations. Many private universities have large endowments,
or funds, that sustain the institutions beyond what students pay in tuition and
fees. Many, such as Harvard University in Massachusetts and Stanford University in California, raise large sums of money through fund drives. Even many
state-funded universities seek funds from private sources to augment their
budgets. Most major state universities, such as those in Michigan and California, now rely on a mixture of state and private resources.
Before World War II, the
federal government generally played a minor role in financing education, with
the exception of the Morrill Acts of 1862 and 1890. These acts granted the
states public lands that could be sold for the purpose of establishing and
maintaining institutions of higher education. Many so-called land-grant state
universities were founded during the 19th century as a result of this funding.
Today, land-grant colleges include some of the nation’s premier state
universities. The government also provided some funding for basic research at
universities.
The American experience in
World War II (especially the success of the Manhattan Project, which created
the atomic bomb) made clear that scientific and technical advances, as well as
human resources, were essential to national security. As a result, the federal
government became increasingly involved in education at all levels and
substantially expanded funding for universities. The federal government began
to provide substantial amounts of money for university research programs
through agencies such as the National Science Foundation, and later through the
National Institutes of Health and the departments of Energy and Defense. At the
same time, the government began to focus on providing equal educational
opportunities for all Americans. Beginning with the GI Bill, which financed
educational programs for veterans, and later in the form of fellowships and
direct student loans in the 1960s, more and more Americans were able to attend
colleges and universities.
During the 1960s the federal
government also began to play more of a role in education at lower levels. The
Great Society programs of President Lyndon Johnson developed many new
educational initiatives to assist poor children and to compensate for
disadvantage. Federal money was funneled through educational institutions to
establish programs such as Head Start, which provides early childhood education
to disadvantaged children. Some Americans, however, resisted the federal
government’s increased presence in education, which they believed contradicted
the long tradition of state-sponsored public schooling.
By the 1980s many public
schools were receiving federal subsidies for textbooks, transportation,
breakfast and lunch programs, and services for students with disabilities. This
funding enriched schools across the country, especially inner-city schools, and
affected the lives of millions of schoolchildren. Although federal funding
increased, as did federal supervision, to guarantee an equitable distribution
of funds, the government did not exercise direct control over the academic
programs schools offered or over decisions about academic issues. During the
1990s, the administration of President Bill Clinton urged the federal
government to move further in exercising leadership by establishing academic
standards for public schools across the country and by evaluating schools through
testing.
Concerns in Elementary
Education
The United States has
historically contended with the challenges that come with being a nation of
immigrants. Schools are often responsible for modifying educational offerings
to accommodate immigrants. Early schools reflected many differences among
students and their families but were also a mechanism by which to overcome these
differences and to forge a sense of American commonality. Common schools, or
publicly financed elementary schools, were first introduced in the mid-19th
century in the hopes of creating a common bond among a diverse citizenship. By
the early 20th century, massive immigration from Europe caused schools to
restructure and expand their programs to more effectively incorporate immigrant
children into society. High schools began to include technical, business, and
vocational curricula to accommodate the various goals of their more diverse
populations. The United States continues to be concerned about how to
incorporate immigrant groups.
The language in which students
are taught is one of the most significant issues for schools. Many Americans
have become concerned about how best to educate students who are new to the
English language and to American culture. As children of all ages and from
dozens of language backgrounds seek an education, most schools have adopted
some variety of bilingual instruction. Students are taught in their native
language until their knowledge of English improves, which is often accomplished
through an English as a Second Language (ESL) program. Some people have
criticized these bilingual programs for not encouraging students to learn English
more quickly, or at all. Some Americans fear that English will no longer
provide a uniform basis for American identity; others worry that immigrant
children will have a hard time finding employment if they do not become fluent
in English. In response to these criticisms, voters in California, the state
that has seen the largest influx of recent immigrants, passed a law in 1998
requiring that all children attending public schools be taught in English and
prohibiting more than one year of bilingual instruction.
Many Americans, including
parents and business leaders, are also alarmed by what they see as inadequate
levels of student achievement in subjects such as reading, mathematics, and
science. On many standardized tests, American students lag behind their
counterparts in Europe and Asia. In response, some Americans have urged the
adoption of national standards by which individual schools can be evaluated.
Some have supported more rigorous teacher competency standards. Another
response that became popular in the 1990s is the creation of charter schools.
These schools are directly authorized by the state and receive public funding,
but they operate largely outside the control of local school districts. Parents
and teachers enforce self-defined standards for these charter schools.
Schools are also working to
incorporate computers into classrooms. The need for computer literacy in the
21st century has put an additional strain on school budgets and local
resources. Schools have struggled to catch up by providing computer equipment
and instruction and by making Internet connections available. Some companies,
including Apple Computer, Inc., have provided computer equipment to help
schools meet their students’ computer-education needs.
Concerns in Higher Education
Throughout the 20th century,
Americans have attended schools to obtain the economic and social rewards that
come with highly technical or skilled work and advanced degrees. However, as
the United States became more diverse, people debated how to include different
groups, such as women and minorities, into higher education. Blacks have
historically been excluded from many white institutions, or made to feel
unwelcome. Since the 19th century, a number of black colleges have existed to
compensate for this broad social bias, including federally chartered and funded
Howard University. In the early 20th century, when Jews and other Eastern
Europeans began to apply to universities, some of the most prestigious colleges
imposed quotas limiting their numbers.
Americans tried various means
to eliminate the most egregious forms of discrimination. In the early part of
the century, "objective" admissions tests were introduced to
counteract the bias in admissions. Some educators now view admissions tests
such as the Scholastic Aptitude Test (SAT), originally created to simplify
admissions testing for prestigious private schools, as disadvantageous to women
and minorities. Critics of the SAT believe the test does not adequately account
for differences in social and economic background. Whenever something as
subjective as ability or merit is evaluated, and when the rewards are
potentially great, people hotly debate the best means of evaluating it fairly.
Until the middle of the 20th
century, most educational issues in the United States were handled locally.
After World War II, however, the federal government began to assume a new
obligation to assure equality in educational opportunity, and this issue began
to affect college admissions standards. In the last quarter of the 20th
century, the government increased its role in questions relating to how all
Americans could best secure equal access to education.
Schools had problems providing
equal opportunities for all because quality, costs, and admissions criteria
varied greatly. To deal with these problems, the federal government introduced
the policy of affirmative action in education in the early 1970s. Affirmative
action required that colleges and universities take race, ethnicity, and gender
into account in admissions to provide extra consideration to those who have
historically faced discrimination. It was intended to assure that Americans of
all backgrounds have an opportunity to train for professions in fields such as
medicine, law, education, and business administration.
Affirmative action became a
general social commitment during the last quarter of the 20th century. In
education, it meant that universities and colleges gave extra advantages and
opportunities to blacks, Native Americans, women, and other groups that were
generally underrepresented at the highest levels of business and in other
professions. Affirmative action also included financial assistance to members
of minorities who could not otherwise afford to attend colleges and universities.
Affirmative action has allowed many minority members to achieve new prominence
and success.
At the end of the 20th century,
the policy of affirmative action was criticized as unfair to those who were
denied admission in order to admit those in designated group categories. Some
considered affirmative action policies a form of reverse discrimination, some
believed that special policies were no longer necessary, and others believed
that only some groups should qualify (such as African Americans because of the
nation’s long history of slavery and segregation). The issue became a matter of
serious discussion and is one of the most highly charged topics in education
today. In the 1990s three states—Texas, California, and Washington—eliminated
affirmative action in their state university admissions policies.
Several other issues have become troubling to higher
education. Because tuition costs have risen to very high levels, many smaller
private colleges and universities are struggling to attract students. Many students
and their parents choose state universities where costs are much lower. The
decline in federal research funds has also caused financial difficulties to
many universities. Many well-educated students, including those with doctoral
degrees, have found it difficult to find and keep permanent academic jobs, as
schools seek to lower costs by hiring part-time and temporary faculty. As a
result, despite its great strengths and its history of great variety, American
higher education may face serious changes in the future because of its expense.
Education is fundamental to
American culture in more ways than providing literacy and job skills.
Educational institutions are the setting where scholars interpret and pass on
the meaning of the American experience. They analyze what America is as a society by interpreting the nation’s past and defining objectives for the
future. That information eventually forms the basis for what children learn
from teachers, textbooks, and curricula. Thus, the work of educational
institutions is far more important than even job training, although this is
usually foremost in people’s minds.
ARTS AND LETTERS
The arts, more than other
features of culture, provide avenues for the expression of imagination and
personal vision. They offer a range of emotional and intellectual pleasures to
consumers of art and are an important way in which a culture represents itself.
There has long been a Western tradition distinguishing those arts that appeal
to the multitude, such as popular music, from those—such as classical
orchestral music—normally available to the elite of learning and taste. Popular
art forms are usually seen as more representative American products. In the United States in the recent past, there has been a blending of popular and elite art forms,
as all the arts experienced a period of remarkable cross-fertilization. Because
popular art forms are so widely distributed, arts of all kinds have prospered.
The arts in the United States express the many faces and the enormous creative range of the American
people. Especially since World War II, American innovations and the immense
energy displayed in literature, dance, and music have made American cultural
works world famous. Arts in the United States have become internationally
prominent in ways that are unparalleled in history. American art forms during
the second half of the 20th century often defined the styles and qualities that
the rest of the world emulated. At the end of the 20th century, American art
was considered equal in quality and vitality to art produced in the rest of the
world.
Throughout the 20th century,
American arts have grown to incorporate new visions and voices. Much of this
new artistic energy came in the wake of America’s emergence as a superpower
after World War II. But it was also due to the growth of New York City as an
important center for publishing and the arts, and the immigration of artists
and intellectuals fleeing fascism in Europe before and during the war. An
outpouring of talent also followed the civil rights and protest movements of
the 1960s, as cultural discrimination against blacks, women, and other groups
diminished.
American arts flourish in many
places and receive support from private foundations, large corporations, local
governments, federal agencies, museums, galleries, and individuals. What is
considered worthy of support often depends on definitions of quality and of
what constitutes art. This is a tricky subject when the popular arts are
increasingly incorporated into the domain of the fine arts and new forms such
as performance art and conceptual art appear. As a result, defining what is art
affects what students are taught about past traditions (for example, Native
American tent paintings, oral traditions, and slave narratives) and what is
produced in the future. While some practitioners, such as studio artists, are
more vulnerable to these definitions because they depend on financial support
to exercise their talents, others, such as poets and photographers, are less
immediately constrained.
Artists operate in a world
where those who theorize and critique their work have taken on an increasingly
important role. Audiences are influenced by a variety of
intermediaries—critics, the schools, foundations that offer grants, the
National Endowment for the Arts, gallery owners, publishers, and theater
producers. In some areas, such as the performing arts, popular audiences may
ultimately define success. In other arts, such as painting and sculpture,
success is far more dependent on critics and a few, often wealthy, art
collectors. Writers depend on publishers and on the public for their success.
Unlike their predecessors, who
relied on formal criteria and appealed to aesthetic judgments, critics at the
end of the 20th century leaned more toward popular tastes, taking into account
groups previously ignored and valuing the merger of popular and elite forms.
These critics often relied less on aesthetic judgments than on social measures
and were eager to place artistic productions in the context of the time and social
conditions in which they were created. Whereas earlier critics attempted to
create an American tradition of high art, later critics used art as a means to
give power and approval to nonelite groups who were previously not considered
worthy of inclusion in the nation’s artistic heritage.
Not so long ago, culture and
the arts were assumed to be an unalterable inheritance—the accumulated wisdom
and highest forms of achievement that were established in the past. In the 20th
century generally, and certainly since World War II, artists have been boldly
destroying older traditions in sculpture, painting, dance, music, and
literature. The arts have changed rapidly, with one movement replacing another
in quick succession.
Visual Arts
The visual arts have traditionally
included forms of expression that appeal to the eyes through painted surfaces,
and to the sense of space through carved or molded materials. In the 19th
century, photographs were added to the paintings, drawings, and sculpture that
make up the visual arts. The visual arts were further augmented in the 20th
century by the addition of other materials, such as found objects. These
changes were accompanied by a profound alteration in tastes, as earlier
emphasis on realistic representation of people, objects, and landscapes made
way for a greater range of imaginative forms.
During the late 19th and early
20th centuries, American art was considered inferior to European art. Despite
noted American painters such as Thomas Eakins, Winslow Homer, Mary Cassatt, and
John Marin, American visual arts barely had an international presence.
American art began to flourish
during the Great Depression of the 1930s as New Deal government programs
provided support to artists along with other sectors of the population. Artists
connected with each other and developed a sense of common purpose through
programs of the Works Progress Administration, such as the Federal Art Project,
as well as programs sponsored by the Treasury Department. Most of the art of
the period, including painting, photography, and mural work, focused on the
plight of the American people during the depression, and most artists painted
real people in difficult circumstances. Artists such as Thomas Hart Benton and
Ben Shahn expressed the suffering of ordinary people through their
representations of struggling farmers and workers. While artists such as Benton
and Grant Wood focused on rural life, many painters of the 1930s and 1940s
depicted the multicultural life of the American city. Jacob Lawrence, for example,
re-created the history and lives of African Americans. Other artists, such as
Andrew Wyeth and Edward Hopper, tried to use human figures to describe
emotional states such as loneliness and despair.
Abstract Expressionism
Shortly after World War II,
American art began to garner worldwide attention and admiration. This change
was due to the innovative fervor of abstract expressionism in the 1950s and to
subsequent modern art movements and artists. The abstract expressionists of the
mid-20th century broke from the realist and figurative tradition set in the
1930s. They emphasized their connection to international artistic visions
rather than the particularities of people and place, and most abstract
expressionists did not paint human figures (although artist Willem de Kooning
did portrayals of women). Color, shape, and movement dominated the canvases of
abstract expressionists. Some artists broke with the Western art tradition by
adopting innovative painting styles—during the 1950s Jackson Pollock "painted"
by dripping paint on canvases without the use of brushes, while the paintings
of Mark Rothko often consisted of large patches of color that seem to vibrate.
Abstract expressionists felt
alienated from their surrounding culture and used art to challenge society’s
conventions. The work of each artist was quite individual and distinctive, but
all the artists identified with the radicalism of artistic creativity. The
artists were eager to challenge conventions and limits on expression in order
to redefine the nature of art. Their radicalism came from liberating themselves
from the confining artistic traditions of the past.
The most notable activity took
place in New York City, which became one of the world’s most important art
centers during the second half of the 20th century. The radical fervor and
inventiveness of the abstract expressionists, their frequent association with
each other in New York City’s Greenwich Village, and the support of a group of
gallery owners and dealers turned them into an artistic movement. Also known as
the New York School, the participants included Barnett Newman, Robert
Motherwell, Franz Kline, and Arshile Gorky, in addition to Rothko and Pollock.
The members of the New York School came from diverse backgrounds such as the American Midwest and Northwest, Armenia, and Russia, bringing an international flavor to the group and its artistic
visions. They hoped to appeal to art audiences everywhere, regardless of
culture, and they felt connected to the radical innovations introduced earlier
in the 20th century by European artists such as Pablo Picasso and Marcel
Duchamp. Some of the artists—Hans Hofmann, Gorky, Rothko, and de Kooning—were
not born in the United States, but all the artists saw themselves as part of an
international creative movement and an aesthetic rebellion.
As artists felt released from
the boundaries and conventions of the past and free to emphasize expressiveness
and innovation, the abstract expressionists gave way to other innovative styles
in American art. Beginning in the 1930s Joseph Cornell created hundreds of
boxed assemblages, usually from found objects, with each based on a single
theme to create a mood of contemplation and sometimes of reverence. Cornell's
boxes exemplify the modern fascination with individual vision, art that breaks
down boundaries between forms such as painting and sculpture, and the use of
everyday objects toward a new end. Other artists, such as Robert Rauschenberg,
combined disparate objects to create large, collage-like sculptures known as
combines in the 1950s. Jasper Johns, a painter, sculptor, and printmaker,
recreated countless familiar objects, most memorably the American flag.
The most prominent American
artistic style to follow abstract expressionism was the pop art movement that
began in the 1950s. Pop art attempted to connect traditional art and popular
culture by using images from mass culture. To shake viewers out of their
preconceived notions about art, sculptor Claes Oldenburg used everyday objects
such as pillows and beds to create witty, soft sculptures. Roy Lichtenstein
took this a step further by elevating the techniques of commercial art, notably
cartooning, into fine art worthy of galleries and museums. Lichtenstein's
large, blown-up cartoons fill the surface of his canvases with grainy black
dots and question the existence of a distinct realm of high art. These artists
tried to make their audiences see ordinary objects in a refreshing new way,
thereby breaking down the conventions that formerly defined what was worthy of
artistic representation.
Probably the best-known pop artist, and a leader in
the movement, was Andy Warhol, whose images of a Campbell’s soup can and of the
actress Marilyn Monroe explicitly eroded the boundaries between the art world
and mass culture. Warhol also cultivated his status as a celebrity. He worked
in film as a director and producer to break down the boundaries between
traditional and popular art. Unlike the abstract expressionists, whose
abstract works were often difficult to understand, Andy Warhol's pictures,
and his own face, were instantly recognizable.
Conceptual art, as it came to
be known in the 1960s, like its predecessors, sought to break free of
traditional artistic associations. In conceptual art, as practiced by Sol
LeWitt and Joseph Kosuth, the concept takes precedence over the actual object:
the work stimulates thought rather than following an art tradition based on
conventional standards of beauty and artisanship.
Modern artists changed the
meaning of traditional visual arts and brought a new imaginative dimension to
ordinary experience. Art was no longer viewed as separate and distinct, housed
in museums as part of a historical inheritance, but as a continuous creative
process. This emphasis on constant change, as well as on the ordinary and
mundane, reflected a distinctly American democratizing perspective. Viewing art
in this way removed the emphasis from technique and polished performance, and
many modern artworks and experiences became more about expressing ideas than
about perfecting finished products.
Photography
Photography is probably the
most democratic modern art form because it can be, and is, practiced by most
Americans. Since 1888, when George Eastman developed the Kodak camera that
allowed anyone to take pictures, photography has struggled to be recognized as
a fine art form. In the early part of the 20th century, photographer, editor,
and artistic impresario Alfred Stieglitz established 291, a gallery in New York City, with fellow photographer Edward Steichen, to showcase the works of
photographers and painters. They also published a magazine called Camera
Work to increase awareness about photographic art. In the United States, photographic art had to compete with the widely available commercial
photography in news and fashion magazines. By the 1950s the tradition of
photojournalism, which presented news stories primarily with photographs, had
produced many outstanding works. In 1955 Steichen, who was director of
photography at the Museum of Modern Art in New York, called attention to this
work in an exhibition called The Family of Man.
Throughout the 20th century,
most professional photographers earned their living as portraitists or
photojournalists, not as artists. One of the most important exceptions was
Ansel Adams, who took majestic photographs of the Western American landscape. Adams used his art to stimulate social awareness and to support the conservation cause of
the Sierra Club. He helped found the photography department at the Museum of Modern Art in 1940, and six years later helped establish the photography
department at the California School of Fine Arts in San Francisco (now the San
Francisco Art Institute). He also held annual photography workshops at Yosemite National Park from 1955 to 1981 and wrote a series of influential books on
photographic technique.
Adams's elegant landscape photography was only
one small stream in a growing current of interest in photography as an art
form. Early in the 20th century, teacher-turned-photographer Lewis Hine
established a documentary tradition in photography by capturing actual people,
places, and events. Hine photographed urban conditions and workers, including
child laborers. Along with their artistic value, the photographs often
implicitly called for social reform. In the 1930s and 1940s, photographers
joined with other depression-era artists supported by the federal government to
create a photographic record of rural America. Walker Evans, Dorothea Lange,
and Arthur Rothstein, among others, produced memorable and widely reproduced
portraits of rural poverty and American distress during the Great Depression
and during the dust storms of the period.
In 1959, after touring the United States for two years, Swiss-born photographer Robert Frank published The
Americans, one of the landmarks of documentary photography. His photographs
of everyday life in America introduced viewers to a depressing, and often
depressed, America that existed in the midst of prosperity and world power.
Photographers continued to
search for new photographic viewpoints. This search was perhaps most
disturbingly embodied in the work of Diane Arbus. Her photos of mental patients
and her surreal depictions of Americans altered the viewer’s relationship to
the photograph. Arbus emphasized artistic alienation and forced viewers to
stare at images that often made them uncomfortable, thus changing the meaning
of the ordinary reality that photographs are meant to capture.
American photography continues
to flourish. The many variants of art photography and socially conscious
documentary photography are widely available in galleries, books, and
magazines.
A host of other visual arts
thrive, although they are far less connected to traditional fine arts than
photography. Decorative arts include, but are not limited to, art glass,
furniture, jewelry, pottery, metalwork, and quilts. Often exhibited in craft
galleries and studios, these decorative arts rely on ideals of beauty in shape
and color as well as an appreciation of well-executed crafts. Some of these
forms are also developed commercially. The decorative arts provide a wide range
of opportunity for creative expression and have become a means for Americans to
actively participate in art and to purchase art for their homes that is more
affordable than works produced by many contemporary fine artists.
Literature
American literature since World
War II is much more diverse in its voices than ever before. It has also
expanded its view of the past as people rediscovered important sources from
non-European traditions, such as Native American folktales and slave
narratives. Rediscovering these traditions expanded the range of American
literary history.
American Jewish writing from
the 1940s to the 1960s was the first serious outpouring of an American
literature that contained many voices. Some Jewish writers had begun to be
heard as literary critics and novelists before World War II, part of a general
broadening of American literature during the first half of the 20th century.
After the war, talented Jewish writers appeared in such numbers and became so
influential that they stood out as a special phenomenon. They represented at
once a subgroup within literature and the new voice of American literature.
Several Jewish American
novelists, including Herman Wouk and Norman Mailer, wrote important books about
the war without any special ethnic resonance. But writers such as novelists
Saul Bellow, Bernard Malamud, and Philip Roth, and storytellers Grace Paley and
Cynthia Ozick wrote most memorably from within the Jewish tradition. Using
their Jewish identity and history as background, these authors asked how moral
behavior was possible in modern America and how the individual could survive in
the contemporary world. Saul Bellow most conspicuously posed these questions,
framing them even before the war was over in his earliest novel, Dangling
Man (1944). He continued to ask them in various ways through a series of
novels paralleling the life cycle, including The Adventures of Augie March (1953),
Herzog (1964), and Mr. Sammler’s Planet (1970). One novel in the
series earned a Pulitzer Prize (Humboldt's Gift, 1975). Bellow was
awarded the Nobel Prize for literature in 1976. Like Bellow, Philip Roth and
Bernard Malamud struggled with identity and selfhood as well as with morality
and fate. However, Roth often resisted being categorized as a Jewish writer.
Playwright Arthur Miller rarely invoked his Jewish heritage, but his plays
contained similar existential themes.
Isaac Bashevis Singer was also
part of this postwar group of American Jewish writers. His novels conjure up
his lost roots and life in prewar Poland and the ghostly, religiously inspired
fantasies of Jewish existence in Eastern Europe before World War II. Written in
Yiddish and much less overtly American, Singer’s writings were always about his
own specific past and that of his people. Singer's re-creation of an earlier
world as well as his stories of adjusting to the United States won him a Nobel
Prize in literature in 1978.
Since at least the time of the
Harlem Renaissance in the 1920s, American writers of African descent, such as
Richard Wright, have sought to express the separate experiences of their people
while demanding to be recognized as fully American. The difficulty of that
pursuit was most completely and brilliantly realized in the haunting novel Invisible
Man (1952) by Ralph Ellison. African American writers since then have
contended with the same challenge of giving voice to their experiences as a
marginalized and often despised part of America.
Several African American
novelists in recent decades have struggled to represent the wounded manner in
which African Americans have participated in American life. In the 1950s and
1960s, James Baldwin discovered how much he was part of the United States after a period of self-imposed exile in Paris, and he wrote about his dark
and hurt world in vigorous and accusatory prose. The subject has also been at
the heart of an extraordinary rediscovery of the African American past in the
plays of Lorraine Hansberry and the fiction of Alice Walker, Charles Johnson,
and Toni Morrison. Probably more than any American writer before her, Morrison
has grappled with the legacy that slavery inflicted upon African Americans and
with what it means to live with a separate consciousness within American
culture. In 1993 Morrison became the first African American writer to be
awarded a Nobel Prize in literature.
Writers from other groups,
including Mexican Americans, Native Americans, Chinese Americans, Korean
Americans, and Filipino Americans, also grappled with their separate
experiences within American culture. Among them, N. Scott Momaday, Leslie
Marmon Silko, and Louise Erdrich have dealt with issues of poverty, life on
reservations, and mixed ancestry among Native Americans. Rudolfo Anaya and Sandra
Cisneros have dealt with the experiences of Mexican Americans, and Amy Tan and
Maxine Hong Kingston have explored Chinese American family life.
Even before World War II,
writers from the American South reflected on what it meant to have a separate
identity within American culture. The legacy of slavery, the Civil War, and
Reconstruction left the South with a sense of a lost civilization, embodied in
popular literature such as Gone With the Wind (1936) by Margaret
Mitchell, and with questions about how a Southern experience could frame a
literary legacy. Southern literature in the 20th century draws deeply on
distinct speech rhythms, undercurrents of sin, and painful reflections on evil
as part of a distinctly Southern tradition. William Faulkner most fully
expressed these issues in a series of brilliant and difficult novels set in a
fictional Mississippi county. These novels, most of them published in the
1930s, include The Sound and the Fury (1929), Light in August
(1932), and Absalom, Absalom! (1936). For his contribution,
Faulkner received the Nobel Prize in literature in 1949. More recent Southern
writers, such as Carson McCullers, Flannery O'Connor, Walker Percy, James
Dickey, and playwright Tennessee Williams, have continued this tradition of
Southern literature.
In addition to expressing the
minority consciousness of Southern regionalism, Faulkner's novels also
reflected the artistic modernism of 20th-century literature, in which reality
gave way to frequent interruptions of fantasy and the writing was characterized
by streams of consciousness rather than by precise sequences in time. Other
American writers, such as Thomas Pynchon, Kurt Vonnegut, Jr., and E. L.
Doctorow also experimented with different novel forms and tried to make their
writing styles reflect the peculiarities of consciousness in the chaos of the
modern world. Doctorow, for example, in his novel Ragtime juxtaposed
real historical events and people with those he made up. Pynchon questioned the
very existence of reality in The Crying of Lot 49 (1966) and Gravity’s
Rainbow (1973).
Aside from Faulkner, perhaps
the greatest modernist novelist writing in the United States was émigré Vladimir Nabokov. Nabokov
first wrote in his native Russian, and then in French, before settling in the
United States and writing in English. Nabokov saw no limits to the
possibilities of artistic imagination, and he believed the artist's ability to
manipulate language could be expressed through any subject. In a series of
novels written in the United States, Nabokov demonstrated that he could develop
any situation, even the most alien and forbidden, to that end. This was
demonstrated in Lolita (1955), a novel about sexual obsession that
caused a sensation and was first banned as obscene.
Despite its obvious achievements,
modernism in the United States had its most profound effect on other forms of
literature, especially in poetry and in a new kind of personal journalism that
gradually erased the sharp distinctions between news reporting, personal
reminiscence, and fiction writing.
20th-Century Poetry
Modern themes and styles of
poetry have been part of the American repertoire since the early part of the
20th century, especially in the work of T. S. Eliot and Ezra Pound. Their works
were difficult, emotionally restrained, full of non-American allusions, and
often inaccessible. After World War II, new poetic voices developed that were
more exuberant and much more American in inspiration and language. The poets
who wrote after the war often drew upon the work of William Carlos Williams and
returned to the legacy of Walt Whitman, which was democratic in identification
and free-form in style. These poets provided postwar poetry with a uniquely
American voice.
The Beatnik, or Beat, poets of
the 1950s notoriously followed in Whitman’s tradition. They adopted a radical
ethic that included drugs, sex, art, and the freedom of the road. Jack Kerouac
captured this vision in On the Road (1957), a quintessential book about
Kerouac’s adventures wandering across the United States. The most significant
poet in the group was Allen Ginsberg, whose sexually explicit poem Howl
(1956) became the subject of a court battle after it was initially banned as
obscene. The Beat poets were scattered across the country but adopted San Francisco as their
special outpost. The city continued to serve as an important arena for poetry
and unconventional ideas, especially at the City Lights Bookstore co-owned by
writer and publisher Lawrence Ferlinghetti. Other modernist poets included
Gwendolyn Brooks, who retreated from the conventional forms of her early poetry
to write about anger and protest among African Americans, and Adrienne Rich,
who wrote poetry focused on women's rights, needs, and desires.
Because it is open to
expressive forms and innovative speech, modern poetry is able to convey the
deep personal anguish experienced by several of the most prominent poets of the
postwar period, among them Robert Lowell, Sylvia Plath, Theodore Roethke, Anne
Sexton, and John Berryman. Sometimes called confessional poets, they used
poetry to express nightmarish images of self-destruction. As in painting,
removing limits and conventions on form permitted an almost infinite capacity
for conveying mood, feeling, pain, and inspiration. This personal poetry also
brought American poetry closer to the European modernist tradition of emotional
anguish and madness. Robert Frost, probably the most famous and beloved of
modern American poets, wrote evocative and deeply felt poetry that conveyed
some of these same qualities within a conventional pattern of meter and rhyme.
Another tradition of modern
poetry moved toward playful engagement with language and the creative process.
This tradition was most completely embodied in the brilliant poetry of Wallace
Stevens, whose work dealt with the role of creative imagination. This tradition
was later developed in the seemingly simple and prosaic poetry of John Ashbery,
who created unconventional works that were sometimes records of their own
creation. Thus, poetry after World War II, like the visual arts, expanded the
possibilities of emotional expression and reflected an emphasis on the creative
process. The idea of exploration and pleasure through unexpected associations
and new ways of viewing reality connected poetry to the modernism of the visual
arts.
Journalism
Modernist sensibilities were
also evident in the emergence of a new form of journalism. Journalism
traditionally tried to be factual and objective in presentation. By the
mid-1970s, however, some of America's most creative writers were using
contemporary events to create a new form of personal reporting. This new
approach stretched the boundaries of journalism and brought it closer to
fiction because the writers were deeply engaged and sometimes personally
involved in events. Writers such as Norman Mailer, Truman Capote, and Joan
Didion created a literary journalism that infused real events with their own
passion. In Armies of the Night (1968), the record of his involvement in
the peace movement, Mailer helped to define this new kind of writing. Capote's In
Cold Blood (1966), the retelling of the senseless killing of a Kansas family, and Mailer’s story of a murderer's fate in The Executioner's Song (1979)
brought this hyperrealism to chilling consummation. No less vivid were Didion's
series of essays on California culture in the late 1960s and her reporting of
the sensational trial of football star O. J. Simpson in 1995.
Performing Arts
As in other cultural spheres,
the performing arts in the United States in the 20th century increasingly
blended traditional and popular art forms. The classical performing arts—music,
opera, dance, and theater—were not a widespread feature of American culture in
the first half of the 20th century. These arts were generally imported from or
strongly influenced by Europe and were mainly appreciated by the wealthy and
well educated. Traditional art usually referred to classical forms in ballet
and opera, orchestral or chamber music, and serious drama. The distinctions
between traditional music and popular music were firmly drawn in most areas.
During the 20th century, the
American performing arts began to incorporate wider groups of people. The
African American community produced great musicians who became widely known
around the country. Jazz and blues artists such as Bessie Smith, Louis
Armstrong, Duke Ellington, and Billie Holiday spread their sounds to black and
white audiences. In the 1930s and 1940s, the swing music of Benny Goodman,
Tommy Dorsey, and Glenn Miller adapted jazz to make a unique American music
that was popular around the country. The American performing arts also blended
Latin American influences beginning in the 20th century. Between 1900 and 1940,
Latin American dances, such as the tango from Argentina and the rumba from Cuba, were introduced into the United States. In the 1940s a fusion of Latin and jazz
elements was stimulated first by the Afro-Cuban mambo and later on by the
Brazilian bossa nova.
Throughout the 20th century,
dynamic classical institutions in the United States attracted international
talent. Noted Russian-born choreographer George Balanchine established the
short-lived American Ballet Company in the 1930s; later he founded the company
that in the 1940s would become the New York City Ballet. The American Ballet
Theatre, also established during the 1940s, brought in non-American dancers as
well. By the 1970s this company had attracted Soviet defector Mikhail
Baryshnikov, an internationally acclaimed dancer who served as the company’s
artistic director during the 1980s.
In classical music, influential
Russian composer Igor Stravinsky, who composed symphonies using innovative
musical styles, moved to the United States in 1939. German-born pianist,
composer, and conductor André Previn, who started out as a
jazz pianist in the 1940s, went on to conduct a number of distinguished
American symphony orchestras. Soviet-born cellist Mstislav Rostropovich
became conductor of the National Symphony Orchestra in Washington, D.C., in 1977.
Some of the most innovative
artists in the first half of the 20th century successfully incorporated new
forms into classical traditions. Composers George Gershwin and Aaron Copland,
and dancer Isadora Duncan were notable examples. Gershwin combined jazz and
spiritual music with classical in popular works such as Rhapsody in Blue
(1924) and the opera Porgy and Bess (1935). Copland developed a unique
style that was influenced by jazz and American folk music. Early in the
century, Duncan redefined dance along more expressive and free-form lines.
Some artists in music and
dance, such as composer John Cage and dancer and choreographer Merce
Cunningham, were even more experimental. During the 1930s Cage worked with
electronically produced sounds and sounds made with everyday objects such as
pots and pans. He even invented a new kind of piano, the prepared piano. During the late 1930s,
avant-garde choreographer Cunningham began to collaborate with Cage on a number
of projects.
Perhaps the greatest, and
certainly the most popular, American innovation was the Broadway musical, which
also became a movie staple. Beginning in the 1920s, the Broadway musical
combined music, dance, and dramatic performance in ways that surpassed the
older vaudeville shows and musical revues but without being as complex as
European grand opera. By the 1960s, this American musical tradition was well
established and had produced extraordinary works by important musicians and
lyricists such as George and Ira Gershwin, Irving Berlin, Cole Porter, Richard
Rodgers, Lorenz Hart, Jerome Kern, and Oscar Hammerstein II. These productions
required an immense effort to coordinate music, drama, and dance. Because of
this, the musical became the incubator of an American modern dance tradition
that produced some of America's greatest choreographers, among them Jerome
Robbins, Gene Kelly, and Bob Fosse.
In the 1940s and 1950s the
American musical tradition was so dynamic that it attracted outstanding
classically trained musicians such as Leonard Bernstein. Bernstein composed the
music for West Side Story, an updated version of Romeo and Juliet
set in New York that became an instant classic in 1957. The following year,
Bernstein became the first American-born conductor to lead a major American
orchestra, the New York Philharmonic. He was an international sensation who
traveled the world as an ambassador of the American style of conducting. He
brought the art of classical music to the public, especially through his
"Young People's Concerts," television shows that were seen around the
world. Bernstein used the many facets of the musical tradition as a force for
change in the music world and as a way of bringing attention to American
innovation.
In many ways, Bernstein
embodied a transformation of American music that began in the 1960s. The
changes that took place during the 1960s and 1970s resulted from a significant
increase in funding for the arts and their increased availability to larger
audiences. New York City, the American center for art performances, experienced
an artistic explosion in the 1960s and 1970s. Experimental off-Broadway
theaters opened, new ballet companies were established that often emphasized
modern forms or blended modern with classical (Martha Graham was an especially
important influence), and an experimental music scene developed that included
composers such as Philip Glass and performance groups such as the Guarneri
String Quartet. Dramatic innovation also continued to expand with the works of
playwrights such as Edward Albee, Tony Kushner, and David Mamet.
As the variety of performances
expanded, so did the serious crossover between traditional and popular music
forms. Throughout the 1960s and 1970s, an expanded repertoire of traditional
arts was being conveyed to new audiences. Popular music and jazz could be heard
in formal settings such as Carnegie Hall, which had once been restricted to
classical music, while the Brooklyn Academy of Music became a venue for
experimental music, exotic and ethnic dance presentations, and traditional
productions of grand opera. Innovative producer Joseph Papp had been staging
Shakespeare in Central Park since the 1950s. Boston conductor Arthur Fiedler
was playing a mixed repertoire of classical and popular favorites to large
audiences, often outdoors, with the Boston Pops Orchestra. By the mid-1970s the
United States had several world-class symphony orchestras, including those in
Chicago; New York; Cleveland, Ohio; and Philadelphia, Pennsylvania. Even
grand opera was affected. Once a specialized taste that often required
extensive knowledge, opera in the United States increased in popularity as the
roster of respected institutions grew to include companies in Seattle, Washington; Houston, Texas; and Santa Fe, New Mexico. American composers such as John
Adams and Philip Glass began composing modern operas in a new minimalist style
during the 1970s and 1980s.
The crossover in tastes also influenced the Broadway
musical, probably America's most durable music form. Starting in the 1960s,
rock music became an ingredient in musical productions such as Hair
(1967). By the 1990s, it had become an even stronger presence in musicals such
as Bring in Da Noise, Bring in Da Funk (1996), which used African
American music and dance traditions, and Rent (1996), a modern rock
version of the classic opera La Bohème. This updating of the musical opened the
theater to new ethnic audiences who had not previously attended Broadway shows,
as well as to young audiences who had been raised on rock music.
Performances of all kinds have
become more available across the country. This is due both to the sheer
increase in the number of performance groups and to advances in
transportation. In the last quarter of the 20th century, the number of major
American symphonies doubled, the number of resident theaters increased
fourfold, and the number of dance companies increased tenfold. At the same
time, planes made it easier for artists to travel. Artists and companies
regularly tour, and they expand the audiences for individual artists such as
performance artist Laurie Anderson and opera singer Jessye Norman, for musical
groups such as the Juilliard Quartet, and for dance troupes such as the Alvin
Ailey American Dance Theater. Full-scale theater productions and musicals first
presented on Broadway now reach cities across the country. The United States, once a provincial outpost with a limited European tradition in performance,
has become a flourishing center for the performing arts.
Libraries and Museums
Libraries, museums, and other
collections of historical artifacts have been a primary means of organizing and
preserving America’s legacy. In the 20th century, these institutions became an
important vehicle for educating the public about the past and for providing
knowledge about the society of which all Americans are a part.
Libraries
Private book collections go
back to the early European settlement of the New World, beginning with the
founding of the Harvard University library in 1638. Colleges and universities
acquire books because they are a necessary component of higher education.
University libraries have many of the most significant and extensive book
collections. In addition to Harvard’s library, the libraries at Yale University, Columbia University, the University of Illinois at Urbana-Champaign, and the University of California at Berkeley and Los Angeles are among the most
prominent, both in scope and in number of holdings. Many of these libraries
also contain important collections of journals, newspapers, pamphlets, and
government documents, as well as private papers, letters, pictures, and
photographs. These libraries are essential for preserving America’s history and for maintaining the records of individuals, families, institutions,
and other groups.
Books in early America were scarce and expensive. Although some Americans owned books, Benjamin Franklin
made a much wider range of books and other printed materials available to many
more people when he created the first generally recognized public library in
1731. Although Franklin’s Library Company of Philadelphia loaned books only to
paying subscribers, the library became the first one in the nation to make
books available to people who did not own them. During the colonial period Franklin’s idea was adopted by cities such as Boston, Massachusetts; Providence, Rhode Island; and Charleston, South Carolina.
These libraries set the
precedent for the free public libraries that began to spread through the United States in the 1830s. Public libraries were seen as a way to encourage literacy among
the citizens of the young republic as well as a means to provide education in
conjunction with the public schools that were being set up at the same time. In
1848 Boston founded the first major public library in the nation. By the late
19th century, libraries were considered so essential to the nation's well-being
that industrialist Andrew Carnegie donated part of his enormous fortune to the
construction of library buildings. Because Carnegie believed that libraries
were a public obligation, he expected the books to be contributed through
public expenditure. Since the 19th century, locally funded public libraries
have become part of the American landscape, often occupying some of the most
imposing public buildings in cities such as New York, Los Angeles, Detroit, and Philadelphia. The belief that the knowledge and enjoyment that books provide
should be accessible to all Americans also resulted in bookmobiles that serve
in inner cities and in rural counties.
In addition to the numerous
public libraries and university collections, the United States boasts two major
libraries with worldwide stature: the Library of Congress in Washington, D.C., and the New York Public Library. In 1800 Congress passed legislation founding the
Library of Congress, which was initially established to serve the needs of the
members of Congress. Since then, this extraordinary collection has become one
of the world's great libraries and a depository for every work copyrighted in
the United States. Housed in three monumental buildings named after Presidents
John Adams, Thomas Jefferson, and James Madison, the library is open to the
public and maintains major collections of papers, photographs, films, maps, and
music in addition to more than 17 million books.
The New York Public Library was
founded in 1895. The spectacular and enormous building that today houses the
library in the heart of the city opened in 1911 with more than a million
volumes. The library is guarded by a famous set of lion statues, features a
world-famous reading room, and contains more than 40 million catalogued items.
Although partly funded through public dollars, the library also actively seeks
funds from private sources for its operations.
Institutions such as these libraries are fundamental to the work of scholars, who rely on the great breadth of their collections. Scholars also draw on many specialized library collections throughout the country. These collections vary greatly in the
nature of their holdings and their affiliations. The Schmulowitz Collection of
Wit and Humor at the San Francisco Public Library contains more than 20,000
volumes in 35 languages. The Schomburg Center for Research in Black Culture in Harlem, part of the New York Public Library, specializes in the history of peoples of African descent around the world. The Schlesinger Library on the History of Women in America, located at the Radcliffe Institute for Advanced Study in Massachusetts, houses the
papers of prominent American women such as Susan B. Anthony and Amelia Earhart.
The Bancroft Collection of Western Americana and Latin Americana is housed at the University of California at Berkeley. The Huntington Library in San Marino, California, was established by American railroad executive Henry Huntington
and contains a collection of rare and ancient books and manuscripts. The
Newberry Library in Chicago, one of the most prestigious research libraries in
the nation, contains numerous collections of rare books, maps, and manuscripts.
Scholars of American history
and culture also use the vast repository of the National Archives and Records
Administration in Washington, D.C., and its local branches. As the repository
and publisher of federal documents, the National Archives contains an
extraordinary array of printed material, ranging from presidential papers and
historical maps to original government documents such as the Declaration of
Independence, the Constitution, and the Bill of Rights. It houses hundreds of millions of documents, photographs, maps, and other government records that chronicle the life of the American people and their government. The library system is deeply embedded in the cultural life of Americans, who have from their earliest days insisted on the importance of literacy and education, not just for the elite but for all.
Museums
The variety of print resources
available in libraries is enormously augmented by the collections housed in
museums. Although people often think of museums as places to view art, in fact
museums house a great variety of collections, from rocks to baseball
memorabilia. In the 20th century the number of museums exploded, and by the late 20th century, as institutions became increasingly aware of their important role as interpreters of culture, they attempted to bring their collections to
the general public. Major universities have historically also gathered various
kinds of collections in museums, sometimes as a result of gifts. The Yale University Art Gallery, for example, contains an important collection of American
arts, including paintings, silver, and furniture, while the Phoebe Hearst
Museum of Anthropology at the University of California at Berkeley specializes
in archaeological objects and Native American artifacts.
The earliest museums in the United States grew out of private collections, and throughout the 19th century they
reflected the tastes and interests of a small group. Often these groups
included individuals who cultivated a taste for the arts and for natural
history, so that art museums and natural history museums often grew up side by
side. American artist Charles Willson Peale established the first museum of
this kind in Philadelphia in the late 18th century.
The largest and most varied
collection in the United States is contained in the separate branches of the
Smithsonian Institution in Washington, D.C. The Smithsonian, founded in 1846 as
a research institution, developed its first museums in the 1880s. It now
encompasses 16 museums devoted to various aspects of American history, as well
as to artifacts of everyday life and technology, aeronautics and space, gems
and geology, and natural history.
The serious public display of
art began when the Metropolitan Museum of Art in New York City, founded in
1870, moved to its present location in Central Park in 1880. At its dedication, the keynote speaker announced that the museum’s goal was
education, connecting the museum to other institutions with a public mission.
The civic leaders, industrialists, and artists who supported the Metropolitan Museum, and their counterparts who established the Museum of Fine Arts, Boston, the Art Institute of Chicago, and the Philadelphia Museum of Art, were also
collectors of fine art. Their collections featured mainly works by European
masters, but also Asian and American art. They often bequeathed their
collections to these museums, thus shaping the museums’ policies and holdings.
Their taste in art helped define and develop the great collections of art in
major metropolitan centers such as New York, Chicago, Philadelphia, and Boston. In several cases, such as the Metropolitan and the National Gallery of Art in Washington, D.C., collectors created institutions whose holdings rivaled the cultural treasures of the great museums of Europe.
Funding
Museums continued to be largely
elite institutions through the first half of the 20th century, supported by
wealthy patrons eager to preserve collections and to assert their own
definitions of culture and taste. Audiences for most art museums remained an
educated minority of the population through the end of the 19th century and
into the 20th century. By the second decade of the 20th century, the tastes of
this elite became more varied. In many cases, women within the families of the
original art patrons (such as Gertrude Vanderbilt Whitney, Abby Aldrich
Rockefeller, and Peggy Guggenheim) encouraged the more avant-garde artists of
the modern period. Women founded new institutions to showcase modern art, such
as the Museum of Modern Art (established by three women in 1929) and the
Whitney Museum of American Art in New York. Although these museums still
catered to small, educated, cosmopolitan groups, they expanded the definition
of refined taste to include more nontraditional art. They also encouraged
others to become patrons for new artists, such as the abstract expressionists
in the mid-20th century, and helped establish the United States as a
significant place for art and innovation after World War II.
Although individual patronage
remained the most significant source of funding for the arts throughout the
20th century, private foundations began to support various arts institutions by
the middle of the century. Among these, the Carnegie Corporation of New York and the Rockefeller Foundation were especially important in the 1920s and 1930s,
and the Ford Foundation in the 1960s. The federal government also became an
active sponsor of the arts during the 20th century. Its involvement had
important consequences for expanding museums and for creating a larger
audience.
The federal government first
began supporting the arts during the Great Depression of the 1930s through New
Deal agencies, which provided monetary assistance to artists, musicians,
photographers, actors, and directors. The Work Projects Administration also helped museums survive the Depression by providing jobs to restorers, cataloguers,
clerical workers, carpenters, and guards. At the same time, innovative
arrangements between wealthy individuals and the government created a new kind
of joint patronage for museums. In the most notable of these, American
financier, industrialist, and statesman Andrew W. Mellon donated his extensive
art collection and a gallery to the federal government in 1937 to serve as the
nucleus for the National Gallery of Art. The federal government provides funds
for the maintenance and operation of the National Gallery, while private
donations from foundations and corporations pay for additions to the collection
as well as for educational and research programs.
Government assistance during
the Great Depression set a precedent for the federal government to start
funding the arts during the 1960s, when Congress appropriated money for the
National Endowment for the Arts (NEA) as part of the National Foundation on the
Arts and the Humanities. The NEA provides grants to individuals and nonprofit
organizations for the cultivation of the arts, although grants to institutions
require private matching funds. The need for matching funds increased private
and state support of all kinds, including large donations from newer arts
patrons such as the Lila Wallace-Reader's Digest Fund and the Pew Charitable
Trusts. Large corporations such as the DuPont Company, International Business
Machines Corporation (IBM), and the Exxon Corporation also donated to the arts.
Expansion
The increased importance placed
on art throughout the 20th century helped fuel a major expansion in museums. By
the late 1960s and 1970s, art museums were becoming aware of their potential
for popular education and pleasure. Audiences for museums increased as museums
received more funding and became more willing to appeal to the public with
blockbuster shows that traveled across the country. One such show, The
Treasures of Tutankhamun, which featured ancient Egyptian artifacts, toured
the country from 1976 to 1979. Art museums increasingly sought attractions that
would appeal to a wider audience, while at the same time expanding the
definition of art. This effort led some museums to exhibit even motorcycles as art, as the Guggenheim Museum in New York did in 1998.
Museums also began to expand
the kinds of art and cultural traditions they exhibited. By the 1990s, more and
more museums displayed natural and cultural artifacts and historical objects
from non-European societies. These included objects ranging from jade carvings,
baskets, and ceramics to calligraphy, masks, and furniture. Egyptian artifacts
had been conspicuous in the holdings of New York’s Metropolitan Museum and the Brooklyn Museum since the early 20th century. The addition to the Smithsonian of the National Museum of African Art and the National Museum of the American Indian indicated an awareness of a much
broader definition of the American cultural heritage. The Asian Art Museum of San Francisco and the Freer Gallery at the Smithsonian in Washington, D.C., maintain collections of Asian art and cultural objects. The 1987 opening of the
Arthur M. Sackler Gallery, a new Smithsonian museum dedicated to Asian and Near
Eastern arts, confirmed the importance of this tradition.
Collectors and museums did not
neglect the long-venerated Western tradition, as was clear from the personal
collection of ancient Roman and Greek art owned by American oil executive and
financier J. Paul Getty. Opened to the public in 1953, the museum named after
him was located in Malibu, California, but grew so large that in 1997 the J.
Paul Getty Museum expanded into a new Getty Center, a complex of six buildings
in Los Angeles. By the end of the 20th century, Western art was but one among
an array of brilliant cultural legacies that together celebrate the human
experience and the creativity of the American past.
Memorials and Monuments
The impulse to memorialize the past has a long history and is often associated with wars, heroes, and
battles. In the United States, monuments exist throughout the country, from the
Revolutionary site of Bunker Hill to the many Civil War battlefields. The
nation’s capital features a large number of monuments to generals, war heroes,
and leaders. Probably the greatest of all these is Arlington National Cemetery in
Virginia, where there are thousands of graves of veterans of American wars,
including the Tomb of the Unknowns and the gravesite of President John F.
Kennedy. In addition to these traditional monuments to history, millions of
people are drawn to the polished black wall that is the Vietnam Veterans
Memorial, located on the National Mall in Washington, D.C. The memorial is a
stark reminder of the losses suffered in a war in which more than 58,000
Americans died and of a time of turmoil in the nation.
No less important than
monuments to war heroes are memorials to other victims of war. The United States Holocaust Memorial Museum, which opened in 1993 in Washington, D.C., is dedicated to documenting the extermination of millions of Jews and others by the
Nazis during World War II. It contains photographs, films, oral histories, and
artifacts as well as a research institute, and has become an enormous tourist
attraction. It is one example of a new public consciousness about museums as
important sources of information and places in which to come to terms with
important and painful historical events. Less elaborate Holocaust memorials
have been established in cities across the country, including New York, San Francisco, and Los Angeles.
Monuments to national heroes
are an important part of American culture. These range from the memorials to
Presidents George Washington, Thomas Jefferson, and Abraham Lincoln on the
National Mall in Washington, D.C., to the larger-than-life faces of Washington,
Jefferson, Lincoln, and Theodore Roosevelt carved into Mount Rushmore in South Dakota. Some national memorials also include monuments to ordinary citizens, such as
the laborers, farmers, women, and African Americans who are part of the new
Franklin Delano Roosevelt Memorial in Washington, D.C.
Americans also commemorate
popular culture with museums and monuments such as the Rock and Roll Hall of
Fame and Museum in Cleveland, Ohio, and the Baseball Hall of Fame and Museum in
Cooperstown, New York. These collections of popular culture are as much a
part of American heritage as are fine arts museums and statues of national
heroes. As a result of this wide variety of institutions and monuments, more
people know about the breadth of America’s past and its many cultural
influences. This new awareness has even influenced the presentation of
artifacts in natural history museums. Where these once emphasized the
differences among human beings and their customs by presenting them as discrete
and unrelated cultures, today’s museums and monuments emphasize the flow of
culture among peoples.
The expansion in types of
museums and the increased attention to audiences are due in part to new groups
participating in the arts and in discussions about culture. In the early 20th
century, many museums were supported by wealthy elites. Today’s museums seek to
attract a wider range of people, including students from inner cities, families
from the suburbs, and Americans of all backgrounds. The diverse American population
is eager to have its many pasts and talents enshrined, and the funding now available through foundations and through federal and state governments helps make this possible. This development has not been without resistance. In the 1980s and
1990s people challenged the federal government’s role in sponsoring certain controversial forms of art and culture, threatening the existence of the National Endowment for the Arts and the National Endowment for the
Humanities. Nevertheless, even these controversies have made it clearer how much
art and cultural institutions express who we are as a people. Americans possess
many different views and pasts, and they constantly change what they create,
how they communicate, and what they appreciate about their past.