
The Perfect Storm: Regional Change, Big City Implosion and Second War Between the States



Appendix I Midcentury

Part III: Big City Decline and UR as Hinge

Chapter 14 Final Mid-Century Big City Suburbanization

Chapter 9 1920’s Suburbs

 

Should We Be Concerned about Suburbs? 1920’s

 

Sociologist R. D. McKenzie commented in 1933 that Big Cities were surrounded by growing suburbs and unincorporated areas, acknowledging that new economic, political and social interrelationships between City and hinterland meant changes in the function, roles and physical requirements of each–an “entirely new social and economic entity” (McKenzie, 1933). Since 1880 the Census Bureau had reported statistics for central cities and “metropolitan districts”; in 1910 the Bureau provided data on twenty-five metros. The 1920 census reported a majority of Americans lived in urban areas; more than two-thirds of that majority lived in fifty-eight metro areas accounting for half the nation’s overall population increase (McKelvey, 1968, p. 31). The Bureau also noted, however, that outlying peripheries grew at a faster rate than the central city itself. It wasn’t until the 1920’s that a critical mass, combined with a visible periphery exodus, caught the attention of planners and elements of the business community—and, of course, our RPNY planners.

 

But in this decade there was little consensus on whether suburbanization was to be feared or fostered. Continued central city dominance was simply assumed as more or less the natural order of things.

 

We may think of [the] metropolitan economy as an organization of people having a large city as a nucleus … of producers and consumers mutually dependent for goods and service … concentrated in a large city which is the focus of local trade and the center through which normal economic relations with the outside are established and maintained …. A closer examination of these dependent towns [suburbs] would show different types performing different functions, but all subordinate (Glaab & Brown, 1983, pp. 270-1)

 

Suburbs, it was thought, could be managed or coordinated; regional planning, centralized infrastructure, and a network of highways leading to the CBD were prospective solutions. Battered neighborhoods and a first-rate housing crisis were perceived as the central city’s chief obstacles to maintaining the status quo. The hope was to limit suburban growth by reducing the central city population exodus. Still, there was no disputing that the suburbs were gathering momentum.

 

Growth in the Twenties Suburb

No doubt a large part of the answer to suburban growth involved the car. Everyone had one—actually, with 8 million registrations in 1920, only one in thirteen households had one. By 1925, however, 17.5 million cars were registered and 2.5 million trucks were on the road (Jackson, 1985, p. 162). In the 1920’s the era of the streetcar (and streetcar suburbs) was over. The hated streetcar monopolies, with their all-too-corrupt streetcar politics, lost ridership, which opened the door to General Motors and municipally-owned bus systems and the ripping up of the lines. Streetcar ridership peaked in 1923. “Several cities made key decisions against major spending on public transit during the 1920’s …. [For example] Detroit voters in 1929 rejected a $280 million dollar proposal that would have resulted in subways, 65 miles of surface rapid transit, and 560 miles of trolley lines” (Abbott, 1987, p. 43).

 

Suburbs exploded in the interwar period “simply because millions of people wanted to live in them”. In 1920 a little less than 7 million lived in suburbs—by 1940 nearly 17.5 million (Wilson, 1974, pp. 34, 46-57). Baltimore grew by 9.7% (1920-1930), its suburbs by 52%; Shaker Heights, Ohio (a Cleveland suburb) grew 1000% and Elmwood Park, Illinois by 717%. The link between car registration and explosive suburban growth is undeniable. These were auto suburbs. One need not have been “elite” to buy a car; the era of solely elite suburbs was over. Dolores Hayden describes these years as “Mail-Order and Self-Built Suburbs”; she includes a section on Sears’s mail order business (whose motto was “A Home of Your Own is an Absolute Necessity”), from which she estimates 50,000 houses were constructed (Hayden, 2003, p. 97). The era of American home ownership, an era that lasted until 2008, had started. Manufacturing also decentralized to the suburbs. “In 1919 eleven central cities in the country’s forty largest manufacturing counties still accounted for 85 percent of the [nation’s] manufacturing workers. By 1937 this percentage had fallen to just under 60” (Glaab & Brown, 1983, p. 275).

 

Typical was Shaker Heights (a Cleveland suburb) pioneered by two brothers, Otis and Mantis (I did not make up their names) Van Sweringen, former clerks and bicycle shop owners. Purchasing 1400 acres on what had been the site of a Shaker religious community, they meticulously planned a suburb comprised of different subdivisions at varying price levels, ensuring each subdivision included similarly priced homes on 100-foot lots. Cheaper units would not impinge on their neighbors. They abandoned the city grid and platted curved and semi-elliptical roads off of large boulevards. Natural park areas were retained, and each subdivision was comprised of units with similar housing design and followed strict exterior and landscaping standards. As always, marketing and promotion were core to the model. From 1919 to 1929 (the Depression) three hundred homes were sold each year. Starting with a population of 1,500, by decade’s end Shaker Heights had 15,000 residents. Unfortunately for the two brothers, they sold their company, invested the profits, and went bankrupt in the Depression.

Pre-Depression Initiatives to Deal with Growing Suburbia

Since the 1850’s state legislatures (Dillon’s Law again) confronted repeated and divisive requests from central cities to annex and from prospective suburbs to incorporate. The longstanding truism that state legislatures were disproportionately controlled by rural and non-Big City politicians, while correct, is probably overstated in regards to annexation in the 19th century. Big Cities were authorized to annex repeatedly by state legislatures, but “special legislation” or special charters were more a colossal pain in the butt than an opportunity to poke a stick in the Big City eye. So, beginning in the Civil War period, state legislatures increasingly abandoned special legislation in favor of voter referendum. By 1910, 27 of 46 states had repealed these special charters in favor of referendum (Teaford, 1979, p. 38)—let the voter decide.

 

So long as the Big City could provide enhanced services, safe, clean, cheap water being the most central, suburbs were willing to vote for annexation. Yellow fever epidemics in the 1890’s convinced many suburbs to join the Big City. Big Cities, therefore, were very successful in annexation drives, most often led by chambers who sincerely believed “the bigger the city, the better” because stagnant cities risked non-competitiveness with growing cities (Teaford, 1979, p. 88). But after 1900, this Big City advantage dried up with each year. Several alternative ways for suburbs to acquire the needed services became increasingly available. Between 1900 and 1910 Big Cities were still successful, but this was Big City annexation’s last hurrah.

 

What made mail-order, self-built or builder-subdivision suburbanization possible was the development of the service district. The service district, another hybrid structure, set apart from gift and loan clauses and outside municipal government budgetary and debt constraints, was “governed” by private/public boards, sometimes elected, sometimes appointed. These infrastructure hybrids were responsible for water, sewers, roads, lighting districts and all the good stuff that earlier Big Cities had delegated to corporation charters, franchises, regulated utilities, and state and/or local independent boards and commissions. The service district, without doubt, put to rest the flux of imperfect public/private HEDOs that had dominated infrastructure since Chapter 3. Service districts took over the heavy lifting associated with installing the infrastructure necessary for a large urban center and an industrial economic base—an infrastructure that had been the highest priority and first task of early economic developers. For the astute reader who remembers our conceptual model, onionization takes a great step forward here. Service districts made large-scale suburbanization possible.

 

Possibly the first Big City service district was Boston’s 1895 Metropolitan Water District; in 1905 NYC established its massive Water Authority, and the North New Jersey Water District was formed in 1918.

Suburban service districts were possible in the Twenties because tax-exempt debt was attractive to bond markets and fell outside the debt limits of state constitutions. Moreover, by the Twenties many of the smaller and poorest suburbs, and the settled unincorporated areas, had consolidated or annexed to suburbs. A suburban landscape of larger suburban towns and cities resulted—they reached sufficient scale to justify their own service districts. Service districts cemented suburban autonomy—taking annexation off the table and locking Big Cities into a fixed geography.

 

“The year in which Illinois’ legislature enacted the Sanitary District Bill was the last year Chicago would make massive additions to its territory. Boston would annex only one town following the creation of the Metropolitan Water District…. Between 1900 and 1930, the state legislature of Ohio enacted twenty-five measures providing for intergovernmental contractual relations, including statutes that allowed municipalities to construct joint sewage systems and joint water works” (Teaford, 1979, pp. 80-2).

 

If not service districts, then the so-called “urban county” could also provide services. In 1898 only two states empowered counties to operate libraries; by 1930, 28 states had done so. The first county park (Massachusetts) was authorized in 1895; by 1930, 45 states authorized counties to operate parks. And function after function was added during these years to county governments. Baltimore County, Westchester (NY), and Los Angeles led the pack.

 

States, like Ohio, empowered counties during the Twenties to assume what formerly were municipal functions for suburbs and unincorporated areas. Sewage and water districts were subordinated to counties, and the scope of functions performed by counties exploded to include libraries, health and sanitation, parks and recreation, fire and police, and regional planning. County fiscal, budgetary and policy capacity reforms accompanied these new responsibilities. “In urban areas from New York to California, the county was extending its governmental functions and responsibilities more rapidly than the municipality, and the county’s share of local government expenditures was rising” (Teaford, 1979, pp. 81-2). Aside from infrastructure, however, urban counties did not become involved with economic development-related programs during the Twenties.

 

During these years (1900-1920) the muckrakers were active, and reaction to Big City machine corruption and inefficiency may have been at its highest level. Suburbanites feared the cost of corruption and the inefficiency of service delivery. Ethnic intolerance and racism were also evident, but more likely the crime and gangsters associated with Prohibition played the larger role in this period. “Dry” suburbs were an attraction to the rising women’s movement. The success of business structural reforms, and the city manager system with nonpartisan, at-large elections, also popular in these years, made an effective alternative form of suburban government possible. Zoning allowed suburbanites to “customize” the suburb to make it more attractive to voters in referendum campaigns. By 1920, Big City annexation efforts were an increasingly hard sell—by the end of the 1920’s, a near-impossible sell.

 

In other Big Cities, powerful forces coalesced around their best idea on how to marry city and suburb to effectively compete against rival cities: city/county consolidation. Cincinnati, for example, obsessed with Cleveland’s perceived growth and success, turned to city/county consolidation. The city manager municipal government, the chamber, the Citizen’s League and the League of Women Voters lined up in support. In accordance with Dillon’s Law, the battle was fought in the state legislature, where Cincinnati’s Republican political machine rallied state-wide suburban support to defeat the measure. Attempts in Seattle (1923) and St Louis (1926, 1930), and two attempts in Cleveland, also met with failure during the Twenties. Out West, things were different. Both Denver and Los Angeles approved partial city/county consolidations (McKelvey, 1968, p. 60).

 

The most interesting alternative, a real hot-button during the Twenties, was the “Federation”. The federation borrowed from the 1888 County of London (England) model. Within the County were 28 self-governing suburban boroughs responsible for services such as parks, streets, libraries, and recreation, while the County Council handled county-wide services such as sewage, planning and water. Suburbs could keep their identity and self-government and customize key services; whereas city/county consolidation made the County the government of the entire area, this was a dual-level federation.

 

Massachusetts debated it in the 1890’s, but by the Twenties California, Pittsburgh, Cleveland, Milwaukee, and St Louis were hot to trot. Vigorous campaigns were waged, and between 1928 and 1934 federation efforts were on the verge of victory in Pittsburgh and Cleveland. State legislatures, however, for various reasons proved an obstacle, and in the referendums wealthy suburbs supported the federation while the more numerous working and middle class suburbs voted against it—and the promising federation alternative rode off into the policy sunset.

 

J.C. Nichols: the Mall and the Country Club

The contemporary mythical image of the suburban subdivision had to come from somewhere—our candidate is J.C. Nichols and his Country Club District developed during the 1920’s. Jesse Nichols was an out-and-out Privatist, but not one that perfectly fits the stereotype. His earlier real estate experiences included building working class subdivisions and City Beautiful-like subdivisions. Taking a gamble in 1912, he started development on a new type of subdivision which borrowed heavily from a 1907-1908 Baltimore suburb, Roland Park, designed by Olmsted. With no utilities or streetcar, he visualized an upper-class suburb designed for, and dependent on, the automobile. Moreover, he was determined that it would remain a high-class district forever. Property values would never deteriorate, and the quality and beauty of the subdivision would last.

 

Homes on large lots, in park-like settings, were only the first step. Statues, fountains and all the “beauty” associated with the old City Beautiful were replicated. Broad streets, plenty of parking, lawns in the front. Open space between streets preserved a country-like atmosphere. The key to “forever” was near-perpetual deed restrictions which prescribed, indeed micromanaged, what could be done with the property: types of improvements, colors, gardens—and, of course, racial covenants. To enforce these restrictions and covenants, owners were required to join a “homeowners association” which contracted for the services necessary to maintain the subdivision and enforce its “standards”. Fire and police, as well as utilities, were contracted by the association as well. By 1915, these homeowner associations were converted into Missouri or Kansas nonprofits and built into state law. Homeowner associations wasted little time and became a sort of neighborhood government that lobbied city, schools, county and state for services and supportive legislation.

The Country Club District proved successful. In the early Twenties, Nichols upped the ante. Believing the future of shopping and consumer commerce did not lie with the congested downtown, he began his design of America’s “first extensively planned and architecturally homogenous shopping center” (Brown & Dorsett, 1978, p. 176) adjacent to his Country Club District—what today is called an outdoor mall. Up to this point “strip” shopping centers, hodge-podge clusters at intersections without parking, typified commercial development along the streetcar lines.

 

Nichols, again relying on the car, departed from the streetcar line and built on-site housing, a quarry, a brickyard, and a landfill. Nichols spent over $1 million to acquire the land; actual construction started in 1922. The shopping center was designed around the automobile, with ample parking a defining feature. Kansas City’s famous City Beautiful planner, George Kessler, was retained to design it, and an Olmsted student was brought in to assist. Architectural uniformity, built around a Spanish style, was incorporated into each building. The first stores opened in 1923, and soon after branches of downtown department and chain stores opened as well. Nichols believed in owner associations, and he set up a Plaza Merchants Association to coordinate business activities and programs. The Association in 1923 began its first Christmas display—an event that continues to the present. Apartments ringed the Plaza on three sides, creating traffic; by 1929, 5,000 lived in the six-block area (Brown & Dorsett, 1978, pp. 176-9).

 

Both the District and the Plaza were financially successful, and extremely popular. By the end of the Depression, nearly 10% of Kansas City’s metro area population lived in the ever-growing Country Club District. It had prospered even during the horrors of the Depression. Nichols’s model was adopted by developers across the nation. And so by the early 1920’s the now infamous, insipid, monotonous, tasteless mall had arrived on America’s suburban scene—Babbitt was happy, less so architecture students and planners. Privatists, on the other hand, were overjoyed.

 

 

Mid-Century Suburbs

As early as 1950, suburban growth rates were ten times that of the central city, and by 1954 nine million lost souls had moved to the suburbs. The twenty years following the War’s end were truly the breakout years of suburbia. Kenneth Jackson estimated that between 1950 and 1970 suburban population more than doubled, from 36 million to 70 million—seventy-four percent of the nation’s population growth (Jackson, 1985, p. 238). Average 1950 to 1970 population growth for seventeen metropolitan areas of the Northeast was 46%; their central cities declined by an average of 2%—meaning average suburban population growth for these seventeen metros was 107%. Boston’s low of 35% suburban growth was dwarfed by Minneapolis-St Paul’s 281%. New York’s suburbs increased by 4.26 million, Chicago’s by 1.77 million, Philadelphia’s by 1.33 million and Detroit’s by 1.82 million. Suburban homeownership left central cities with dramatically increased renter populations (McDonald, 2008, pp. 85-7). Contrary to the truism that Big Cities couldn’t chase their population, Milwaukee, Kansas City, Indianapolis and Columbus successfully annexed large amounts of land. For all the good it did—they still suburbanized.

 

Echoing James F. McDonald’s observation that “suburbanization is the most important economic and social trend of the second half of the twentieth century” (McDonald, 2008, p. 85), our principal focus is the suburb’s effect on sub-state economic development. Also, incorporating the Big City suburban exodus into our perspective makes it easier to understand the sky-rocketing concern with decentralization. The massive population flow after 1946 electrified postwar Big City census figures and awakened not unreasonable fears for the viability of central city hinterland dominance.

 

Our suburb “starting point” is that there is no “typical” suburb. Suburbs are not alike—they never have been, and we don’t intend to treat them as such here. Postwar, pre-1970 suburbs, labeled by Hayden as “sit-com suburbs”, by Hanlon as suburban homogeneity, and by Jackson as the “cultural home of the white middle-class family”, have been well described by zillions of academic studies and intellectual critiques. There is no doubt a race/class dimension between city and suburb exists during these years—but income class can obscure a lot of internal diversity. First-ring suburbs captured the lion’s share of expansion; they often reflected the spilling over of ethnic lower middle-class periphery neighborhoods across city boundaries, conforming to conventional neighborhood succession patterns. They may have looked much the same in the 1950’s, but their subsequent evolution and political/policy development produced notable variation.

 

 

Levittown

Postwar suburbanization symbolically began on May 7th, 1947, the opening day sale of Levittown, NY. By May 9th, developers had sold over 1000 housing units. Levittown built a variant of Wright’s rambler and modeled its subdivision on Broadacre. Built by non-union labor, with restrictive covenants, it included kitchen appliances for the first time. Using Ford’s assembly line technique, Levittown added 30 units daily–selling 4000 units by year’s end. Ultimately, the Levitt Brothers sold over 17,000 homes in the subdivision (by 1951). Levittown’s suburban business model spread through the East by the early 60’s.

 

Ticky-tacky Levittown “little boxes” (coined in 1962 by Malvina Reynolds, made famous by Pete Seeger’s 1963 ditty) provided grist for academics and writers (Updike, for example). Leave it to Beaver, Father Knows Best and Ozzie & Harriet imparted a visual image to the aspirational American Dream. Despite their inaccurate characterization of postwar families, they became the avatar for suburbs.

 

The suburban residential complex threw off substantial profits, created new occupations, and redefined the BLS-FIRE sector classification. In short, a “subdivision developer” suburban real estate industry exploded after the late 1940’s. This new sector reflected, and created, consumer demand through its understanding of the American dream, and by recognizing links between housing, car, and employment. Separating housing finance from housing construction, the subdivision-industry complex built cheap, small Capes outside a Big City unable to annex. The inability to annex differentiated Western from Eastern cities in this era (McDonald, 2008, p. 90).

 

The Litany and the Dilemma

There was a litany of explanations for the postwar suburban explosion. It is helpful, if not important, that the reader appreciate this litany; these explanations are critical to an understanding of suburbs and the “suburban paradigm” that emerged in future years. Suburbanization’s immediate postwar driver was a housing shortage, fueled by new GI households and housing obsolescence, and probably blight/racial change as well. The rapid household formation that followed demobilization left the greatest generation and their new-born baby boomers sleeping (or not) in the bedrooms of their less than delighted grandparents’ homes/apartments. Lured by (1944) GI Bill mortgages, recent car purchases, and newly-built freeways, millions of WWII veterans fled central cities (and their parents). The latest version of the American Dream, suburbs, commenced, to the relief of all three generations. Other reasons have been advanced as well.

 

Perhaps the most damning has been suburbia’s linkage to racism. Suburban in-migration was driven by racial change in the central city—which, at some level, is obviously accurate. Once there, white suburban residents closed the door to African-American in-migration through a variety of techniques, including racially-restrictive covenants, real estate steering, and exclusionary zoning. Income and racial segregation were seen as two sides of the same coin. The Supreme Court’s Shelley v. Kraemer decision (1948), successfully argued by NAACP attorney Thurgood Marshall, outlawed such covenants, but real estate practices circumvented much of the ruling. Exclusionary zoning persisted until checked by the 1975 Mount Laurel (New Jersey) decision.

 

It is also asserted the federal government stimulated/facilitated suburbanization in general, and postwar suburbanization in particular. New Deal residential mortgages (FHA), banking reforms, housing tax incentives, and the GI Bill rendered the federal government complicit in racial discrimination. Highways, built since World War I with federal encouragement, picked up considerable momentum in the postwar era. Most highways were state-funded and freeways locally-funded, however. Highways certainly facilitated residential mobility and the relocation of manufacturing; the decline of the CBD during this era was attributed to highways, which dispersed city population while congesting the downtown. The trouble is that Big Cities were building freeways and expressways to counter perceived decentralization problems. Highways reaching out to the farthest suburbs had been a long-standing component of Big City and metropolitan planning; indeed, suburban highways were a central feature of Burnham’s 1909 Chicago Plan and the City Beautiful. If so, then planner/economic developer paradigms were incorrect: in building highways, they thought they were helping to mitigate suburbanization. The federal government had its own agenda for its actions.

 

There is, one supposes, “blame” for suburbanization (i.e. many judge it “bad”). Maybe we shouldn’t be a polycentric metropolitan area—but we live in one today. Two factors suggest strongly that it is time to move on. First, at the time of this writing we are a majority suburban nation and have been for almost a half-century. Secondly, there is an inherent dilemma that complicates, frustrates and distorts our normative images of suburbia: many of us want to live there, even if we “shouldn’t”. This “dilemma” was evident from the beginning and is expressed in Robert C. Wood’s pioneering work on suburbia (1958). Wood (his daughter, Maggie Hassan, was Governor of New Hampshire in 2016), a Harvard-MIT scholar at the time, later served as HUD Under Secretary (1965-1969) and (temporary) Secretary. He was a principal author of LBJ’s Model Cities.

 

A final word is due my friends and neighbors in my suburb, Lincoln (Massachusetts). On balance the judgment of this book is not favorable to suburbia and they may wonder why I choose to live in a place I criticize so strongly. The answer is simply that my professional opinion should never be confused with my personal tastes, and the fact that I recommend a general philosophy and outlook as desirable does not mean that I have succeeded in living by it. Lincoln is undoubtedly an anachronism and it is probably obstructive to the larger purposes of the Boston region. But it is a very pleasant and hospitable anachronism, and while it exists, I am quite happy to indict myself (Wood, 1958, pp. vii-viii).

 

I might add, Wood not only lived in a suburb—he worked in one, Cambridge.

 

Suburbs exist, they are not going away in my lifetime, and some of them are active in ED policy-making. Suburbs have impacted our history greatly, and they form a prominent element of our contemporary physical landscape. They deserve a place in our history—and to the extent there is something distinctive about suburban ED, it should be noted.

 

The Selling of Suburbs

The South wasn’t the only product sold in the postwar era. Suburbs were sold as well. Many believe the “selling of the suburbs” by subdivision and mall developers is an important explanation for postwar suburbanization. Postwar suburbs, often unincorporated areas, developed around subdivisions and malls. Unlike the central city, suburbs did not grow outward from a core area (CBD); instead, they reversed the pattern by developing a CBD after initial subdivisions were in place—or never built one at all (unless a city hall/government office with a post office next door is a CBD?). Suburbs often developed around independent and semi-autonomous neighborhoods.

 

In one of this history’s more outlandish misadventures, I propose we consider subdivision/shopping center/malls and industrial/office/technology parks as specific economic development strategies characteristic of suburban economic development, at least in this era. If we do so, then real estate developers once again return to their role as private economic developers—a role they had played since the 1890’s (and, ironically, were playing in the halls of Congress, engaged in public housing, slum clearance and CBD urban renewal policy-making). Suburban private developers, however, were only part of a hybrid growth strategy, the other half being suburban municipal (county) planners/city managers. While not exactly the “odd couple” of public/private partnerships, this hybrid coalition was especially pervasive in early suburban economic development.

 

Weiss (1987) has constructed such a hybrid model, observing a post-World War I “working relationship” with the comprehensive planning movement. J. C. Nichols (our first suburban mall developer in the 1920’s), Weiss contends, convinced professional planners that substantial public involvement was required if suburban real estate developers, Weiss’s community builders, were to successfully develop a large subdivision. The nub of the problem was that the cost of land and the length of time required to sell units constructed on that land rendered financing either expensive or impossible. Lenders needed assurances beyond the security offered by the subdivision’s assets. Moreover, prospective owners purchasing units in newly developed subdivisions required assurances that subsequent owners/builders would not cheapen (undesirable uses) or adversely affect their property values. Such assurances were originally provided through a variety of restrictive covenants administered by a subdivision homeowners association.

 

But these covenants, discriminatory or otherwise, were only a partial solution—they could not, for instance, carry over to adjacent parcels of land outside the subdivision, nor could they ensure the necessary infrastructure, particularly streets and public services, would be available. For this set of assurances, public sector participation was required. Public participation also came in the form of the subdivision’s inclusion in the community’s comprehensive master plan, and critical specifications (lot size, setbacks, building codes, fire/safety regulations) included in zoning ordinances, codes, and other regulations approved by the jurisdiction. Infrastructure required additional commitments in the community capital budget, and street construction/maintenance required sustained public involvement. Subdivisions (malls, industrial/office parks), to be successful, required a detailed and sustained partnership between public and private actors. If so, subdivision and mall development was a public/private ED strategy long before Levittown put its first shovel into the ground.

 

In any case, subdivision and shopping center construction took off after the war; by 1957 ULI was publishing guidelines on shopping center development. During these years, retail matured into a pillar of the suburban economic base. Planned retail centers (all sizes) grew from fewer than 1,000 in 1950 to over 25,000 in 1984 (40% of all retail sales) (McKeever, 1977, p. 13). Eye-catching regional shopping centers became the skyscrapers of a suburban low-rise downtown. They sprang up across the nation: Northgate near Seattle, Shoppers’ World in Framingham near Boston, Northland near Detroit, and, in 1962, the Randhurst Center, “the largest shopping center under one roof”, near Chicago.

 

Shopping centers were designed to “tame the automobile”, accommodating its use through easy access to highways and an internal road system that compelled customers to park their cars and walk to pedestrian-only buildings. “Anchors” for larger suburban retail centers were department store(s); grocery stores anchored the smaller centers. By 1960 shopping centers included such diverse and varied uses as adjoining office parks and apartments, medical centers, movie theaters, food courts, and even light entertainment, fountains and mini-landmarks under their weather-controlled roofs. The ubiquitous role of the car, however, created a “cycle of dependence”–the car was how one got to the shopping center, and the truck how it was supplied; suburbs as initially built rested on the car and streets, requiring future users to travel by auto. The cycle of dependence became a built-in feature of suburban lifestyle.

 

Subdivision/mall growth strategy created occupations and businesses to service suburban housing and commercial/industrial development. This in turn drove suburban demographic and economic growth. Aside from conventional real estate sales practices and a “you build it, we will come” mentality, no municipal government promotion (“boosterism”) was required. Real estate, housing, commercial trade, personal services and residential banking developed into key elements of the suburb’s emerging economic base. Housing proved an excellent generator of small business with its network of supplier-contractor-logistics and consumer-service businesses. Personal services (restaurants and entertainment) added diversity; at a time of incipient deindustrialization, suburbs instinctively diversified.

 

One does not need public economic development in this environment. Accordingly, early growth-oriented suburban planning departments served double duty, working with local chambers that fostered small business growth and entrepreneurial social integration to handle run-of-the-mill economic development demands (liaison, regulation compliance). Economic development was handled by the subdivision/mall/planner nexus, supplemented by chambers that offered small-business support, social networking, and occasional advocacy. There was such a thing as “suburban economic development” after all.

 

A final suggestive thought. This section applies to “Big City suburbanization”, the residential suburb rather than the shopping mall suburb, and given that it centers on subdivisions, not manufacturing decentralization, it applies more to East Coast and Middle Atlantic suburb formation than to the Midwest, where manufacturing decentralization played a prominent role. Implied in this ED policy-making scenario is the policy system it is encased within. East Coast Big City suburbs (Upstate New York, New England) derived their origins from the Yankee Diaspora, formed towns in states that played a strong role in ED, sat at the heart of the American Progressive Movement, and grew up with “structural reform and city efficient” business elite values. In this cultural atmosphere one can expect community development, rather than mainstream economic development, to develop. These were the CBD and central city commuters whose principal goals were home and hearth, low taxes, and minimum public services–not growth. ED policy was not likely to be important or highly valued, nor was concern with the suburb’s jurisdictional economic base. Housing, traffic, education and other “people” services were highly valued. What ED there was focused on neighborhoods and people, and in this culture was often expressed through planning. In these communities/Towns ED’s overlap with planning is most visible.

 

Conversely, some suburbs, through planned attraction, war production decentralization, or simply because of innate location advantages developed a manufacturing economic base. John F. McDonald asserted:

 

The change in location pattern was the result of deaths of firms in the central city, births of firms in the suburbs, employment declines in firms in the city, and growth of firms in the suburbs. Only a relatively small amount of the net change can be explained by the direct relocation of firms from the central city to the suburbs (McDonald, 1984).

 

Manufacturing firms increased suburban production by taking advantage of new single-story, spread-out facilities near cheap, highway-accessible land, while their land-locked central city facilities produced as best they could so long as they were profitable. When profits declined, shutdowns followed. As suburban facilities increased production, employment and population growth followed. Through the fifties and early sixties, the hard truths were evident but hope for recovery still existed. McDonald believed the bottom fell out in 1968 (McDonald, 2008, p. 97), when riots prompted a rush to exit the central city. If this scenario is accurate, change in product demand and manufacturing/logistical technology drove suburban manufacturing growth. Suburbs did not need to “steal” central city manufacturing through public or private ED attraction programs.

 

Industrial parks fit well into the real estate/planning-based ED policy fabric. In the 1940’s and 1950’s industrial parks were just that. “Following World War II, the pace of change quickened and the modern industrial park, as an outgrowth of earlier industrial districts, emerged as the major new trend in industrial development” (Beyard, 1988, p. 18). Industrial parks, both private (the overwhelming majority) and public, went up by the hundreds across the nation, mostly in suburbs. Industrial parks were congruent with suburban economic development policy-making dominated by comprehensive planners; they were, however, transitioning from manufacturing into other sectors. During the 1970’s and 1980’s, as the national economy shifted noticeably from manufacturing to service and technology, industrial parks became “business parks”, moving on later to science or technology parks.

 

This line of thought provides a context to evaluate the various suburban types that emerged. It offers an explanation of why there is no single “suburban” style of ED within a metro area, or across the nation. Call it metropolitan pluralism, political fragmentation, multi-nodal nuclei, or sprawl (each describes the same phenomenon): postwar suburbanization, beyond electing state legislators to defend suburban autonomy, radically revolutionized the hinterland landscape. For Abbott (1981, p. 184) the “main political actors are the central city and the suburban governments that have rapidly been developing independent economic and political resources that enable them to treat the central city as a peer”. “Political lineups tend to shift from issue to issue“. But the foundation of metropolitan pluralism rests upon what we used to call “suburban autonomy”. In several ways, the various suburban types that developed during specific time periods closely mirror our onionization of EDO structural types.

 

It would take more than a half-century, however, before the Policy World appreciated the enormous municipal diversity inherent in suburbanization. In the postwar world, suburbs were a stereotype, and a negative one at that. In the fifties they were Hayden’s “sitcom suburbs”: residential, family-centered, small-town (Woods) “everybody knows your name” suburbs where parents lived boring Organization/Mad Men lives that drove their children away. My research strongly suggests that suburban policy systems were (and are) not “Big City writ small”. Local elites, small-town democracy, absentee-commuter suburbs, parochialism, and their own peculiar demographic footprint or jurisdictional economic base meant an amazing variety of policy systems developed in our Big City hinterlands.

 

As far as economic development was concerned, some suburbs wanted to grow; others wanted no part of growth. Some suburbs delegated economic development to the county, spending their “policy-time” in other policy areas such as schools. Gated communities and a phenomenon later to be discussed, “Privatopia”, decentralized suburban policy systems. Home and family dominated many a suburb, while others exhibited such population turnover that no one took serious interest in suburban policy making. Private city-builders (mostly in the South and West) like Del Webb (retirement communities) and “corporate” Woodlands, Texas (started in 1964) or Irvine, California created privatized policy systems. Postwar Rouse New Towns like Columbia, Maryland followed in the footsteps of Tugwell’s depression-era New Towns, creating a more “progressive” policy system. Suburbs, even in the postwar years, were never purely Republican bastions of insipid middle-class conservatism and neo-liberal Privatism. If one wants diversity, one can find it in suburbs and suburban policy systems.

 

Describing suburban economic development is going to be a challenge in future chapters. Given that 53% of Americans in 2015 lived in suburbs, and 21% in rural areas, hinterland economic development cannot be ignored. Economic development cannot be discussed solely in terms of Big Cities. The page has turned in our economic development history. To capture the transformation in the urban landscape that revolutionized postwar America, this history expands its definition of competitive urban hierarchy to include not only the competition among Big Cities, but competition within metropolitan areas (Florida, January 2013).

 

 

The Big Hinge

Not only was each city different in its own way, but the reader can see that UR served several gods (purposes or goals). UR ranked high on municipal policy agendas during this era. The future of the old order, the hegemony, seemed in question. In this highly intense policy environment, UR became linked with the viability of policy systems; elites and voters were willing to toss out old bums and bring in new ones. In the Sunbelt the goals and purposes behind UR were less a counter to destabilizing decentralization than a coming-out party for a new urban and regional competitive hierarchy, and it too was linked to change in policy systems. UR was driven by population migration, by the middle class “moving up” and by inter-regional generational migration. These led to explosive growth that did not respect jurisdictional boundaries.

 

UR included a community development approach as well as the better-known CBD business-led growth coalition. Cities did not have to choose one or the other; most blended the two approaches. Although CBD captured the headlines, neighborhood/housing-based UR was also fairly common. Generally, the two approaches involved different actors and certainly sought different goals. During this era there was a subtle struggle between housing, neighborhood-focused urban renewal and CBD, industrial park, and “eds and meds”-focused UR. A constant was the bureaucratic nature of the strategy, which required specialized expertise and a sophisticated planning, legal and project management professional corps that approached change from the top down. Neighborhoods, for example, may or may not have been involved in the “planning”, but UR was never their strategy of choice. For CBD and “eds and meds” UR, however, it was quite the reverse.

 

In the course of its implementation, new EDOs, tools, and programs will emerge. Indeed, by the end of the Age, a second economic development professional association will be born, and a number of new community development associations will dot our professional landscape. Whatever else it may be, UR is a professional “hinge” that closes one door and opens another. UR was the midwife for the birth of contemporary economic and community development. This notion of a professional hinge is an important underlying theme (and rationale) behind this chapter. To best understand our contemporary ED/CD, background on how each wing congealed and evolved is critical. UR connects pre-contemporary ED/CD to our present-day approaches. UR may be the Scarlet Letter no one (ED or CD) wishes to be held accountable for, but historically there is no denying the shared parentage.

 

Finally, as has been mentioned, UR displays a distinctive regional dimension. Big City UR is where the strategy originated, as a response to Big City dynamics, threats and policy actors. Big Cities drove the Washington connection until the 1960’s. Western cities took advantage of UR to build modern CBDs congruent with their image of their high-status arrival in the competitive urban hierarchy. Over time (in the Seventies) western UR confronted a rising neighborhood movement that wanted its share of the ED/CD pie, but that is a story left for another day and chapter. The South was more complex. Predictably, race played a large role, in a very surprising way, but southern cities did share with the West a need to conduct a “City Beautiful” UR to modernize their downtowns. Some cities, on the Pacific coast and Atlanta, overlapped somewhat with the Big City pattern, but Texas cities in particular predictably put CBD UR on steroids to build downtowns that fit their Texas ambitions. To lend some clarity, each region will be treated in its own section. For each region several cities are briefly discussed to offer flavor, examples and observations.

 

 

Won’t you play in my sandbox?

 

The rise of community development coincided with the implosion of hegemonic Big Cities and the early onslaught of a new immigration wave, affecting the rising Sunbelt the most. In this section the history focuses on the collapse of the northern Big Cities. Whether housing/slum clearance, highways or urban renewal ever stood a chance of combating suburbanization is doubtful at best. By the mid-1960s the question was moot. The sixties extinguished whatever hope remained that urban renewal would eliminate slums and blight and “stanch the flow of white, middle-class families to the suburbs” (Beauregard, 1993, p. 165). A sustained Great Migration, a massive suburban exodus and multi-year riots brought existing Big City policy systems to their knees. The Brown decision and busing were frosting on a “cake left out in the rain.” The media, being the media, trumpeted the extremes, and the Big City Policy World went into a deep and bitter funk, railing against suburbs and the South. The seventies proved to be the nadir of Big City central city ED. Big City policy systems had a “hard landing”, as much “psychological” as demographic, economic and fiscal. Big Cities lost faith in themselves.

 

The Facts: So help me God

Simply put, Big Cities stopped growing between 1960 and 1990.[ii] Between 1950 and 1970 St. Louis lost 27 percent of its central city population; Pittsburgh, Boston and Buffalo lost 20 percent; Detroit and Cleveland, 18 percent. Only Indianapolis (+74 percent), Columbus (+44 percent) and Kansas City (+11 percent) had grown, through annexing suburbs. In those three cities manufacturing employment grew 20 percent, 67 percent and 51 percent respectively. Big City suburbs, however, exploded.

The Washington metro led the pack with 93 percent suburban growth; Minneapolis and Columbus (80 percent), Kansas City (54 percent), Milwaukee (51 percent) and Detroit (49 percent) followed. Boston’s and Pittsburgh’s suburban growth was more muted (21 and 19 percent). Pittsburgh (-23 percent) and Buffalo (-18 percent), however, lost manufacturing employment, the earliest non-textile victims of the yet-unnamed manufacturing decline. NYC (-9 percent), Philadelphia, Boston, Detroit and Cleveland (-2 percent) also lost manufacturing jobs. Overall, surprisingly, in these years the 17 hegemonic Big Cities grew manufacturing employment by 18 percent.

At least part of the reason this all seemed so confusing at the time was that regional change was not yet understood, deindustrialization still unnamed, and even Big City growth and decline seemed somewhat uneven. Suburban decentralization of manufacturing made the waters even murkier. It was hindsight that produced today’s experts.[iii]

It didn’t get much better between 1970 and 1990. St. Louis lost another 37 percent of its population, Detroit and Cleveland 32 percent, Buffalo 29 percent. Only Columbus, of the 17 hegemonic Big Cities, increased (17 percent). Overall, in these two decades, hegemonic Big Cities lost over 17 percent of their central city population. For most, even metro (suburban) growth was muted compared to national averages. Big City metro growth was an anemic 5.6 percent. Leaving aside DC (32 percent), Minneapolis-St. Paul was highest (25 percent) and Columbus, KC, Baltimore and Indianapolis followed in line. But Buffalo (-12 percent), Pittsburgh (-11 percent) and Cleveland (-9 percent) metro areas lost population. Even NYC’s metro area lost more than 3 percent. While many academics were wailing about the power of central city business/real estate growth machines, the Big City central city was in a vicious population decline that lasted for 40 years.

All this was bad enough, but simply put, hegemonic Big City central cities lost much of their middle class. Home ownership rates plummeted in central cities and skyrocketed in the suburbs,[iv] and median family income did not keep pace with national averages—in some cities (1970 to 1990) it actually, and amazingly, declined for the metro area: Buffalo (-6 percent), Cleveland (-3 percent) and Detroit (-1.5 percent). Pittsburgh grew only 2 percent for the 20-year period. The Great Lakes chronic city syndrome had arrived on the ED scene.

 

Myrdal’s Vicious Circle: Race and Functions

McDonald described the 1970–1989 central city urban crisis as “the vicious circle in urban America“. His vicious circle asserted that “once a central city or part of a central city starts downhill, the negative social and economic features of that downhill slide reinforce each other. The start of the downhill slide can be caused by a variety of external forces, including deindustrialization, building an expressway leading to suburbanization of both jobs and people …. These forces act on the central city [as a whole], but typically the worst outcomes are confined to particular portions of the central city” (McDonald, 2008, p. 222). Long-term decline, perhaps surprisingly to current readers, became most apparent after the 1980 census, prompting many to think of the Eighties as “bottom”. Collapse, however, was clearly revealed in the 1970 Census. Detroit, Cleveland, Buffalo, Milwaukee, Chicago, Minneapolis and St. Louis (in descending order) declined most (McDonald, 2008, pp. 224-5, 243, Tables 13-1 & 13-2).

 

Whatever the problem identified during this period, it took on a racial dimension. A Fortune editorial wrote in 1968 that “the Negro Problem represents a crisis within a crisis, a specific and acute syndrome in a body already ill from more general disorders” (Ways, 1968, p. 133). Race was the bottom line, and race redefined slum into ghetto, making mainstream Chamber/UR economic development less relevant than place-based community development. Hidden from sight was the total rejection of the previous Age of UR’s physical paradigm. The paradigm gave way to social action/concerns and other policy areas: welfare, crime, education, and racial discrimination. The ghetto “served as the symbolic reference [to urban decline and] to the many social ills … associated with social disorganization” (Beauregard, 1993, pp. 172-3). If ghetto separatism/anti-colonialism infused community development, it also prompted a shift of political power to Blacks, i.e. major policy system change marked by the rise of black mayors/city councilors. In short, most postwar Big City municipal policy systems came tumbling down.

 

“New guys”, white mavericks or non-charismatic technicians, often replaced them (Rizzo or Beame, for example). A few mayors stand out (White in Boston), but Big City instability decapitated the old political establishment. Big City economic developers were on their own, making do as best they could with weaker mayors (Lipsky, 1980). City councils became more diverse and black mayors were elected. The first (1967) black mayor, Carl Stokes of Cleveland, was followed by Gary, Indiana’s Richard Hatcher, Detroit’s Coleman Young (1973), Atlanta’s Maynard Jackson (1973), Bradley (Los Angeles, 1973), and Walter Washington (Washington DC, 1975). In the Eighties Chicago’s Harold Washington (1983), Clarence Burns in Baltimore (1987) and NYC’s David Dinkins (taking office in 1990) followed. Significant central city policy system change resulted from the Great Society, urban riots, and the urban crisis aftermath.

 

In this atmosphere yet another issue appeared in Policy World publications: the issue of central city “purpose”, the so-called “functions” traditionally associated with Big City metropolitan hegemony that were now lost. It started with Norton Long’s “The City as Reservation” (with inmates and keepers, “economically dependent on intergovernmental transfer payments“). For Long, the only “function” retained by the “new” central city was to serve as a sort of Indian reservation “for the poor, the deviant, the unwanted, and for those who make a business or career managing them” (Long, 1971, p. 32).

 

Long was followed by scholar George Sternlieb, who pinpointed the problem: “the crisis of the city is not a crisis of race. Rather, the crisis of the city is a crisis of function. The major problem … of our cities is simply their lack of economic value … what is left to the city that it does better than someplace else”. The only function left, he observed, was housing Long’s unfortunates. The central city had become a “sandbox” where they could play and leave the rest of us alone (Sternlieb, 1971). True to form, these articles were found in urban textbooks for decades to follow. In 1976 central cities were likened to a “cemetery” (urban death) (Baer, 1976, pp. 18-9). Accordingly, planners, economic developers, academics and think tanks considered adopting “planned shrinkage”, “mothballing”, or abandoning parts of cities until they became attractive to business investment. This strategy entailed deliberately cutting back public services in the most deteriorated city areas and encouraging the population to leave.

 

At a 1977 Brookings Institution round table, Sternlieb, head of the Rutgers Center for Urban Policy Research, suggested founding a federal urban development bank: “[We should view] the city very much the way we viewed the development of a bomb shelter or fallout shelter program…. The question then, of public policy, is what is the least cost approach that is politically feasible to preserve an infrastructure so that if, and when there is a public recognition, desire, and necessity to reutilize them, there is something left to reutilize”.[v] In this 1970’s environment “land parcels cleared in renewal areas [will] remain vacant for long periods of time. Society can be viewed as ‘banking’ this land for potential future use whenever changed local conditions stimulate increased demand there”. Downs suggested a “modified form of triage… This strategy means there would be no large expenditures for upgrading efforts in most parts of much deteriorated areas…. Eventually, after the much deteriorated areas are almost totally vacant, they may be redeveloped with wholly different uses” (Downs, 1976).

 

The importance of “functions” to contemporary economic development cannot be overstated. The concept of functions started economic development down the long trail that, when combined with deindustrialization, led to job creation as the primary criterion, if not “goal”, of mainstream economic development. To future community developers fell the task of revitalizing the ghetto, facilitating residents’ control over their fate. CUED, holding a more optimistic perspective, advocated government assumption of chamber business assistance programs, including generalized strategies of retention and attraction relying on public lending, tax-exempt financing, tax abatement and industrial parks, coupled with “reformed” physical redevelopment targeting waterfronts, “eds and meds”, and mixed-use downtown redevelopment. One can see in CETA and JTPA a workforce approach that stressed continued and enhanced shared decision-making with business leadership in training and skills upgrading, along with youth training compatible with community development objectives. Also in this period one can see place-based, i.e. destination, tourism gathering steam (Faneuil Hall).

 

The issue of functions, however important, is an elusive topic. Where the list of functions originated is unknown; this author never got a copy. Apparently known only to the urban intelligentsia, there is such a list, and the Big City had served many key (and lesser) functions that upheld the “correct” metropolitan order (monocentric). To this day, these “functions” remain fundamental to the “legacy city” paradigm. In any case, in these years the questions arose: What should be done with central city functions? Were they lost forever? Could they be recaptured? Could new functions be found? These questions were, and are, important. Turned inside out they can, and will, serve as strategy goals for future central city economic developers.

 

The Fiscal Crisis

Central cities were broken fiscally as well as psychologically. The causes of fiscal stress are rather obvious: declining/stagnant tax base, residential abandonment, CBD retail/commercial decline (sales tax), and higher expenditures to house the nation’s poor. Revenues were flat or falling. The national economy suffered from record-breaking inflation and increasing energy costs; annual CPI inflation rose from 3% in 1970 to 11% by mid-decade. Central city payrolls increased for a variety of good, and not so good, reasons. Police departments expanded to cope with rising crime, gang and murder rates. Unions, especially militant during the decade, won increased wages and personnel. Productivity dipped. Only three central cities (Buffalo, Cleveland, and Pittsburgh) reduced payrolls during this period (Teaford, 1990, pp. 221-23). Taxes went up–to the delight of all. Budgets were cut where possible. Sale of municipal assets was common. Regionalization of expenditures through authorities/service districts (sewage, libraries, parks, transportation, zoos, museums, and power plants) was commonplace. Annual deficits were masked with one-time revenues and short-term debt. And still that wasn’t enough. St. Louis ran an illegal deficit in three of four years (1971-1974). Philadelphia (1975) ran a half-billion dollar deficit. Penn Central Railroad went bankrupt, gutting Buffalo’s property tax base.

 

Short-term debt was the real problem. Two or three year notes must be refinanced; in an inflationary period, each refinancing at higher interest rates cost more. New York was left in deep fiscal distress by Lindsay: in 1970 short-term debt was 20% of total revenue; when he left office at the end of 1973 it was 25%. NYC accounted for about one-quarter of the nation’s outstanding short-term state/local indebtedness. Boston, Cleveland, Baltimore, St. Louis and Philadelphia were deeply in hock as well. State legislatures reluctantly crawled into this out-of-control spending machine. By 1976 two-thirds of Baltimore’s budget originated from intergovernmental (state and federal) transfer payments; so did Buffalo’s. New York City got 50% of its revenue from intergovernmental transfers; Cincinnati, Pittsburgh, Boston, Detroit and Minneapolis got 40% or higher.
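The arithmetic of that squeeze is easy to sketch. The figures below are purely hypothetical, chosen only to illustrate the mechanism rather than any city’s actual books: the same principal, refinanced at successively higher rates, costs more to carry each cycle even though no new money is borrowed.

```python
# Illustrative only: hypothetical figures, not actual municipal budget data.
# Rolling over the same short-term principal as interest rates climb.

def rollover_interest(principal, rates):
    """Annual interest owed on an unchanged principal at each
    successive refinancing rate."""
    return [principal * r for r in rates]

principal = 1_000_000_000        # $1B of two-year notes (hypothetical)
rates = [0.04, 0.06, 0.08]       # rate at each successive rollover (hypothetical)

for cycle, cost in enumerate(rollover_interest(principal, rates), start=1):
    print(f"Rollover {cycle}: ${cost:,.0f} interest per year")
```

Doubling the rate doubles the carrying cost of debt that was never retired, which is why inflation turned a chronic annoyance into a crisis.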

 

The fiscal crisis made central cities wards of state and federal governments. Buffalo’s Mayor Stanley Makowski bemoaned: “We are creatures of the state–its children, if you will. If we cannot turn to our parent, where can we turn?” (Teaford, 1990, pp. 224-6). The delicate balance of power between Big Cities and state legislatures seemed upset, arguably permanently altered. Running a reservation and sandbox is expensive. Fiscal capacity, the lack of it to be precise, redefined Big City fiscal stability into a central city economic development objective. On top of recapturing lost “functions”, central city ED was entrusted with the task of “paying the central city’s bills”. The scramble to pay bills yielded at least one interesting innovation.

 

Tax-base sharing may be a partial answer to metropolitan tax incentive competition, and the Twin Cities in 1971 approved path-breaking commercial/industrial tax-base sharing legislation, the Metropolitan Redistribution (Fiscal Disparities) Act. An important element of the so-called “Minnesota Miracle”, the act became effective in 1975 after being upheld by the courts. The Act required all communities in a seven-county area to share 40% of the incremental growth of their industrial/commercial tax base, with the proceeds determined and allocated by a formula based on fiscal capacity. The Act developed from a 1969 Citizens League report, partly prompted by a bitter battle between Bloomington and another suburb to annex a high-tax-yielding power plant (Orfield & Wallace, 2007). Since passage, it has worked well and has been periodically updated to reflect new issues and concerns.
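The Act’s core mechanism can be sketched in simplified form. The community names and dollar values below are hypothetical, and the actual statute’s base-year and per-capita distribution rules are more involved than this illustration: each community contributes 40% of its commercial/industrial base growth to a regional pool, which is then allocated inversely to fiscal capacity.

```python
# Simplified sketch of Fiscal Disparities-style tax-base sharing.
# Community names and values are hypothetical; the real statute's
# base-year and distribution rules are more involved than shown here.

SHARE = 0.40  # 40% of incremental commercial/industrial (C/I) base is pooled

def pool_contributions(base_year, current):
    """Each community contributes 40% of its C/I tax-base growth
    since the base year (never a negative contribution)."""
    return {c: SHARE * max(current[c] - base_year[c], 0.0) for c in current}

def redistribute(pool_total, fiscal_capacity):
    """Allocate the pool inversely to fiscal capacity:
    lower-capacity communities receive proportionally larger shares."""
    weights = {c: 1.0 / fc for c, fc in fiscal_capacity.items()}
    total = sum(weights.values())
    return {c: pool_total * w / total for c, w in weights.items()}

# Hypothetical example with three communities (values in $ millions).
base = {"Bloomington": 100.0, "Suburb A": 50.0, "Central City": 80.0}
now = {"Bloomington": 180.0, "Suburb A": 60.0, "Central City": 85.0}
capacity = {"Bloomington": 2000.0, "Suburb A": 1000.0, "Central City": 500.0}

contrib = pool_contributions(base, now)   # fast-growing Bloomington pays most
shares = redistribute(sum(contrib.values()), capacity)
```

In this toy example the community whose commercial base grew fastest contributes the most to the pool, while the low-capacity central city draws the largest share back out; that inverse relationship is the Act’s anti-disparity point.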

 

Whose Default Was It? Drop Dead Big Cities!

The 1975 NYC financial collapse elevated central city debt into a national crisis. With deficits of $1.5 billion and $5.3 billion in refinanced short-term notes, the city was on the brink of default. Previously, stopgap remedies and one-time fixes had kept the city afloat, but by June banks flatly refused to refinance the short-term debt. NYC, in technical default, was shut out of the credit markets. The state legislature responded by creating the Municipal Assistance Corporation (“Big MAC”), which hired Felix Rohatyn to restructure city debt and impose operating efficiencies. The Emergency Financial Control Board, controlled by state appointees, assumed more direct control over City Hall administration. Substantial layoffs, increased public transportation fares, wage freezes, and an end to City University free tuition were recommended (Teaford, 1990, p. 227). President Ford initially said no; the headline read “Ford to City: Drop Dead“. In December 1975, however, Ford relented and the city secured congressional approval for up to $2.3 billion in short-term federal loans. The evolving federal position regarding “municipal bailouts” couldn’t be clearer! For the next four years New York City remained closed out of money markets and was a ward of the state.

 

Cleveland was next up. With the indomitable Dennis Kucinich newly elected and pledging no new taxes, Cleveland did its version of the short-term debt polka: suspension of bond ratings, trying to sell its Municipal Light system, and then refusing to do so. Finally, Cleveland defaulted outright in December 1978, “the first major municipality to do so since the Great Depression”. Fiscal instability “spread like an epidemic through the nation’s central cities”. In this environment, what was a good central city economic developer to do?

 

The first thing was to stop building expressways, highways, and freeways. Public spending, union negotiations, and tax increases resulted in brutal political contests with citizens, small business, unions and residents, and the city/mayor lost most of them. CBD and commercial/manufacturing/office redevelopment projects, the meat and potatoes of redevelopment, came under attack for nearly a decade. Worse, it was at this point that the attack on UR hit its highest level; it was painfully obvious many projects were not working. Many urban renewal agencies changed their names to community development (Teaford, 1990, pp. 230-6). The real purpose of economic development in these years was helping to “pay the bills”; the rest was just rhetoric.

 

The Second War Between the States: The Ships Collide

 

In May 1976 Business Week devoted a special issue to “The Second War Between the States”.[vi] The article rang out like a fire bell in the night. Regional competition surfaced to the top of America’s 1976 policy agenda. The first impulse was to place blame on whoever started the fire. Thoughtful observers, with a few years of hindsight, knew the problem was more complex than the so-called “southern branch recruitment strategy”, which by then was forty years long in the tooth. In the best of traditions, we shot first and thought later. The problem for our history, however, is that the first irrational shots became engrained in our professional and Policy World legacy.

How Economic Development Became Ground Zero

In the midst of the fiscal turmoil, a 1970 Times article about Kevin Phillips’s thesis (Phillips, 1969) popularized an alleged Nixonian “southern strategy” behind his presidential victory (somewhat dubious in that Nixon lost the majority of the South’s 1968 electoral votes to George Wallace). The southern strategy rested on conservative Republican Sunbelt voters. The impetus behind it was a repudiation of the Great Society (and the Civil Rights Movement) (Boyd, 1970). On its face, economic development was not a central player. Nevertheless, the 1972 landslide results (70% of the Deep South vote for Nixon, victory in every state but Massachusetts) proved Phillips more right than wrong, although Democrats retained control of Congress. The political rise of the South was a punch to Northern noses.

 

In the process, Big City fiscal collapse, suburban growth, and Big City policy system change all became "smushed," along with allegations of southern piracy (with federal help), into ED's institutional memory.

If regional change meant the rise of the West, that was one thing. If it meant the rise of the South, that was quite another. Regional change was perceived, albeit incorrectly, as mostly Southern aggression: a new Civil War fought not with bullets but with tax abatements, IRBs, right-to-work laws, and state and local incentive deals that lured highly visible automotive firms from the North and Midwest to the South. Arguably the opening salvo in this new North-South war was the 1976 New Stanton, Pennsylvania, Volkswagen plant bidding war, probably the first public/media "bidding for firms with tax abatements."

 

Pennsylvania competed for the plant against two sites in neighboring Ohio. Pennsylvania "won." The state bidding war culminated in the largest incentive package America had ever seen: a $71 million (1970s dollars) tax abatement, highway and rail improvements, and assorted (some say sordid) business incentives. Volkswagen invested nearly $250 million to produce the Volkswagen Rabbit, but simultaneously also purchased an American Motors plant in South Charleston, West Virginia, and an auto air-conditioning plant in Fort Worth, Texas, each with state incentives. The Southern plants captured the media attention and notoriety; the South somehow had won the bidding war. Ironically, by 1984 all three plants were either sold or closed. In 2016 the New Stanton plant sits closed and empty.

 

Only a scant year later, in 1977, Honda commenced successful negotiations for a massive incentive deal to build a motorcycle plant in Marysville, Ohio. That was the opening salvo in what would later be called the post-1980 "Auto Alley," stretching from the Great Lakes to the Gulf of Mexico (Klier & Rubenstein, 2010). In both instances the incentive war was initiated by foreign companies, with states competing for foreign direct investment (FDI). Yet Northern politicians (Moynihan, 1977), Policy World, and the media laid the blame for bidding wars on the doorstep of the rising and aggressive South. Highly publicized works such as Kirkpatrick Sale's popular 1975 book (Sale, 1975) were timed perfectly with the politically tumultuous post-Watergate years, and a flurry of Policy World books proclaimed the existence of regional change and regional competition (Goodman, 1979).

 

Who’s to Blame?

The widely quoted, somewhat sensational The Last Entrepreneurs in particular cemented the image of anti-union Southern states peddling tax abatements and just about every other ED subsidy to footloose, greedy capitalist firms. Goodman argued that southern states "are selling not merely climate or regional culture, but an ever-expanding package of tax breaks, subsidized job training, public financing, and an anti-labor, anti-environmental control climate." In this war for manufacturing, Southern states were labeled "the last entrepreneurs," a euphemism for pirates. With the exception of the last two incentives, however, the same could be said for Northern and Midwestern states as well. The reality was that an arms race had started long before the actual fight began.

 

In these pre-Deindustrialization years, southern states were perceived as the cause of the "increasingly virulent plague of plant closings in the industrial Northeast and Midwest" (Goodman, 1979, back cover). While not without its ideological and pro-labor baggage, The Last Entrepreneurs conveyed a dynamic that had developed after the Second World War: state and local governments, regardless of region, were engaged in a deadly serious competition for migrating business, each for their own reasons. But something had changed. Economic development had always been attacked for its "deal-making," incentives, bidding wars, and inter-jurisdictional competition. That was nothing new, but these nefarious activities were no longer confined to the municipal level. The states were now involved, in a very big way. And the competition captured heavy-duty media and Policy World attention.

 

The numbers were numbing, and the alleged implications dire for the future, and for the fate of political officials and career economic developers. When confined to municipalities, deal-making and incentives were tied to the urban competitive hierarchy; much less so when deal-making shifted to states. Within the profession, the rising importance and status of deal-makers changed the character, priority, and tone of economic development policy. If in the past retention as a strategy had held a slightly greater priority, in these years it began its all-too-rapid descent into a sideshow. Capturing big foreign companies was a matter of public prestige, and any loss a public shame wrapped with the ribbon of an irresponsible bidding war.

 

To make matters even more complicated (and persuasive), the Federal government, controlled by "Sunbelt" presidents (Johnson, Nixon) and southern Congressional committee chairs, had fed huge amounts of federal funds into game-changing new industries such as space travel and national defense (ACIR, 1977). A raft of serious policy and analytic literature documented the scale and impact of federal funds in the rise of the Sunbelt (Perry & Watkins, 1977). These works described a political-economic-social zero-sum game in which a negligent North was upstaged by the cagey Southern country bumpkin (Editorial, 1976). The task, from the Northern and Midwestern perspective, was to cut off the spigot of federal spending to the South and reroute it to the North and Midwest, to the central cities.

 

Federal tax and spending policies are causing a massive flow of wealth from the Northeast and Midwest to the fast-growing Southern and Western regions of the nation …. The states at the receiving end of high federal outlays also tend to be those that tax their own citizens least for state and local government services. On the other hand the balance of payment situation generally is adverse in the Northeast and Midwest, where population is stagnant or declining, where unemployment is the most severe, where relative personal income is falling and where the heaviest state and local tax burdens are imposed. (Editorial, 1976)

 

The region-building war production initiatives, and the heritage of military installations and defense spending that followed the war, were critical to the South; so were Cape Canaveral and the space industry. Federal grant-in-aid formulas overall were transformative. Make no mistake, federal spending had played a major role in the southern economic transformation. But it might also be noted that no one has counted the benefits reaped by hegemonic, Eastern, Big City corporations, their "Pittsburgh Plus" pricing and the like, and their effects on the more "colonial" regions of the nation. There are two sides to every coin.

 

Economic development had now become politicized, openly politicized, and politicians assumed leadership in the initial phases of the second war. The solution to the problem of federal spending was, not surprisingly, federal spending. While economic development was the alleged cause of the Second War, the real underlying tension was the Northern/Midwestern fear that irreversible regional economic change was in process and had to be curtailed or reversed. This demanded resources they did not have. The South (and West) sought the same resources to protect their newfound growth machine. The battleground of the second war would not only be the deal-making for mobile industry, but would also involve policy and funding found only in the Oval Office and the halls of Congress and the bureaucracy.

 

Hitherto the North had been in undisputed economic ascendancy. By 1976, with its central cities "hitting the bottom," it was likely that the North/Midwest's hegemony was threatened. The time had come to fight back. Viewed from the other side of the Mason-Dixon fence, the "worm had turned." The two-century pattern of Northern economic, and oft-times political, dominance was visibly changing in the South's favor. That regional change also included a sort of "coming out" of the West was noticed, but the rise of the South was truly galling, and unforgiveable, demanding special treatment. One may wonder whether something else was going on beyond simple regional change.

 

 
