Showing posts with label history. Show all posts

Wednesday, January 30, 2013

The Making of the Atomic Bomb

A really great book - a mix of the history of science and technology with personalities and socio-political forces. It reads like a detective story, with questions of morality raised but unanswered. The book is long – 886 pages – but it is definitely worth the effort. I’ve collected a few examples from the book that I really liked. I have not put them in quotes, but they all come straight from the book. In some cases I’ve added my own footnotes to help explain the selection.

I hope that someone writes a screenplay around the struggles the émigré scientists had with politicians and the military when they tried to convey the potential of the atomic and, later, thermonuclear bombs, and their impact on political systems.

This is a personal selection of excerpts that appealed to me as I read the book. It’s by no means complete. For an excellent review, read the New York Times book review, “The Men Who Made the Sun Rise” by William J. Broad (http://www.nytimes.com/books/99/09/19/specials/rhodes-making.html)

Sunday, January 27, 2013

Public Death vs. Man-Made Death


“Our societies are dedicated to the preservation and care of life .... Public death was first recognized as a matter of civilized concern in the nineteenth century, when some health workers decided that untimely death was a question between men and society, not between men and God. Infant mortality and endemic disease became matters of social responsibility. Since then, and for that reason, millions of lives have been saved. They are not saved by accident or goodwill. Human life is daily deliberately protected from nature by accepted practices of hygiene and medical care, by the control of living conditions and the guidance of human relationships. Mortality statistics are constantly examined to see if the causes of death reveal any areas needing special attention. Because of the success of these practices, the area of public death has, in advanced societies, been taken over by man-made death--once an insignificant or "merged" part of the spectrum, now almost the whole. 


When politicians, in tones of grave wonder, characterize our age as one of vast effort in saving human life, and enormous vigor in destroying it, they seem to feel they are indicating some mysterious paradox of the human spirit. 


There is no paradox and no mystery. The difference is that one area of public death has been tackled and secured by the forces of reason; the other has not. The pioneers of public health did not change nature, or men, but adjusted the active relationship of men to certain aspects of nature so that the relationship became one of watchful and healthy respect. In doing so they had to contend with and struggle against the suspicious opposition of those who believed that to interfere with nature was sinful, and even that disease and plague were the result of something sinful in the nature of man himself.”


Gil Elliot, Twentieth Century Book of the Dead

Friday, August 12, 2011

Complexity vs. Randomness

History as the battle between complexity and randomness. Insightful!



Backed by stunning illustrations, David Christian narrates a complete history of the universe, from the Big Bang to the Internet, in a riveting 18 minutes. This is "Big History": an enlightening, wide-angle look at complexity, life and humanity, set against our slim share of the cosmic timeline.

Big History

Sunday, January 2, 2011

How Radio and TV Became Centralized and Enclosed Instead of Decentralized and Open

From The Wealth of Networks by Yochai Benkler

This is very interesting to read, given the parallels between radio's origins and the Internet's, and how business and the Federal Government worked to alter the course of history.

“The introduction of radio was the next and only serious potential inflection point, prior to the emergence of the Internet, at which some portion of the public sphere could have developed away from the advertiser-supported mass-media model. In most of Europe, radio followed the path of state-controlled media, with variable degrees of freedom from the executive at different times and places. Britain developed the BBC, a public organization funded by government-imposed levies, but granted sufficient operational freedom to offer a genuine platform for a public sphere, as opposed to a reflection of the government's voice and agenda. While this model successfully developed what is perhaps the gold standard of broadcast journalism, it also grew as a largely elite institution throughout much of the twentieth century. The BBC model of state-based funding and monopoly with genuine editorial autonomy became the basis of the broadcast model in a number of former colonies: Canada and Australia adopted a hybrid model in the 1930s. This included a well-funded public broadcaster, but did not impose a monopoly in its favor, allowing commercial broadcasters to grow alongside it. Newly independent former colonies in the postwar era that became democracies, like India and Israel, adopted the model with monopoly, levy-based funding, and a degree of editorial independence. The most currently visible adoption of a hybrid model based on some state funding but with editorial freedom is Al Jazeera, the Arab satellite station partly funded by the Emir of Qatar, but apparently free to pursue its own editorial policy, whose coverage stands in sharp contrast to that of the state-run broadcasters in the region. In none of these BBC-like places did broadcast diverge from the basic centralized communications model of the mass media, but it followed a path distinct from the commercial mass media. 
Radio, and later television, was a more tightly controlled medium than was the printed press; its intake, filtering, and synthesis of public discourse were relatively insulated from the pressure of both markets, which typified the American model, and politics, which typified the state-owned broadcasters. These were instead controlled by the professional judgments of their management and journalists, and showed both the high professionalism that accompanied freedom along both those dimensions and the class and professional elite filters that typify those who control the media under that organizational model. The United States took a different path that eventually replicated, extended, and enhanced the commercial, advertiser-supported mass-media model originated in the printed press. This model was to become the template for the development of similar broadcasters alongside the state-owned and independent BBC-model channels adopted throughout much of the rest of the world, and of programming production for newer distribution technologies, like cable and satellite stations.

The birth of radio as a platform for the public sphere in the United States was on election night in 1920. Two stations broadcast the election returns as their launch pad for an entirely new medium - wireless broadcast to a wide audience. One was the Detroit News amateur station, 8MK, a broadcast that was framed and understood as an internal communication of a technical fraternity - the many amateurs who had been trained in radio communications for World War I and who then came to form a substantial and engaged technical community. The other was KDKA Pittsburgh, launched by Westinghouse as a bid to create demand for radio receivers of a kind that it had geared up to make during the war. Over the following four or five years, it was unclear which of these two models of communication would dominate the new medium. By 1926, however, the industrial structure that would lead radio to follow the path of commercial, advertiser-supported, concentrated mass media, dependent on government licensing and specializing in influencing its own regulatory oversight process was already in place.

Although this development had its roots in the industrial structure of radio production as it emerged from the first two decades of innovation and businesses in the twentieth century, it was shaped significantly by political-regulatory choices during the 1920s. At the turn of the twentieth century, radio was seen exclusively as a means of wireless telegraphy, emphasizing ship-to-shore and ship-to-ship communications. Although some amateurs experimented with voice programs, broadcast was a mode of point-to-point communications; entertainment was not seen as its function until the 1920s. The first decade and a half of radio in the United States saw rapid innovation and competition, followed by a series of patent suits aimed to consolidate control over the technology. By 1916, the ideal transmitter based on technology available at the time required licenses of patents held by Marconi, AT&T, General Electric (GE), and a few individuals. No licenses were in fact granted. The industry had reached stalemate. When the United States joined the war, however, the navy moved quickly to break the stalemate, effectively creating a compulsory cross-licensing scheme for war production, and brought in Westinghouse, the other major potential manufacturer of vacuum tubes alongside GE, as a participant in the industry. The two years following the war saw intervention by the U.S. government to assure that the American radio industry would not be controlled by British Marconi because of concerns in the navy that British control over radio would render the United States vulnerable to the same tactic Britain used against Germany at the start of the war - cutting off all transoceanic telegraph communications. The navy brokered a deal in 1919 whereby a new company was created - the Radio Corporation of America (RCA) - which bought Marconi's American business. 
By early 1920, RCA, GE, and AT&T entered into a patent cross-licensing model that would allow each to produce for a market segment: RCA would control transoceanic wireless telegraphy, while GE and AT&T's Western Electric subsidiary would make radio transmitters and sell them under the RCA brand. This left Westinghouse with production facilities developed for the war, but shut out of the existing equipment markets by the patent pool. Launching KDKA Pittsburgh was part of its response:

Westinghouse would create demand for small receivers that it could manufacture without access to the patents held by the pool. The other part of its strategy consisted of acquiring patents that, within a few months, enabled Westinghouse to force its inclusion in the patent pool, redrawing the market division map to give Westinghouse 40 percent of the receiving equipment market. The first part of Westinghouse's strategy, adoption of broadcasting to generate demand for receivers, proved highly successful and in the long run more important. Within two years, there were receivers in 10 percent of American homes. Throughout the 1920s, equipment sales were big business.

Radio stations, however, were not dominated by the equipment manufacturers, or by anyone else for that matter, in the first few years. While the equipment manufacturers did build powerful stations like KDKA Pittsburgh, WJZ Newark, KYW Chicago (Westinghouse), and WGY Schenectady (GE), they did not sell advertising, but rather made their money from equipment sales. These stations did not, in any meaningful sense of the word, dominate the radio sphere in the first few years of radio, as the networks would indeed come to do within a decade. In November 1921, the first five licenses were issued by the Department of Commerce under the new category of "broadcasting" of "news, lectures, entertainment, etc." Within eight months, the department had issued another 453 licenses. Many of these went to universities, churches, and unions, as well as local shops hoping to attract business with their broadcasts. Universities, seeing radio as a vehicle for broadening their role, began broadcasting lectures and educational programming. Seventy-four institutes of higher learning operated stations by the end of 1922. The University of Nebraska offered two-credit courses whose lectures were transmitted over the air. Churches, newspapers, and department stores each forayed into this new space, much as we saw the emergence of Web sites for every organization over the course of the mid-1990s. Thousands of amateurs were experimenting with technical and format innovations. While receivers were substantially cheaper than transmitters, it was still possible to assemble and sell relatively cheap transmitters, for local communications, at prices sufficiently low that thousands of individual amateurs could take to the air. At this point in time, then, it was not yet foreordained that radio would follow the mass-media model, with a small number of well-funded speakers and hordes of passive listeners. 
Within a short period, however, a combination of technology, business practices, and regulatory decisions did in fact settle on the model, comprised of a small number of advertiser-supported national networks, that came to typify the American broadcast system throughout most of the rest of the century and that became the template for television as well.

Herbert Hoover, then secretary of commerce, played a pivotal role in this development. Throughout the first few years after the war, Hoover had positioned himself as the champion of making control over radio a private market affair, allying himself both with commercial radio interests and with the amateurs against the navy and the postal service, each of which sought some form of nationalization of radio similar to what would happen more or less everywhere else in the world. In 1922, Hoover assembled the first of four annual radio conferences, representing radio manufacturers, broadcasters, and some engineers and amateurs. This forum became Hoover's primary stage. Over the next four years, he used its annual meeting to derive policy recommendations, legitimacy, and cooperation for his regulatory action, all 'without a hint' of authority under the Radio Act of 1912. Hoover relied heavily on the rhetoric of public interest and on the support of amateurs to justify his system of private broadcasting coordinated by the Department of Commerce. From 1922 on, however, he followed a pattern that would systematically benefit large commercial broadcasters over small ones; commercial broadcasters over educational and religious broadcasters; and the one-to-many broadcasts over the point-to-point, small-scale wireless telephony and telegraphy that the amateurs were developing. After January 1922, the department inserted a limitation on amateur licenses, excluding from their coverage the broadcast of "weather reports, market reports, music, concerts, or speeches, news or similar information or entertainment." This, together with a Department of Commerce order to all amateurs to stop broadcasting at 360 meters (the wave assigned to broadcasting), effectively limited amateurs to shortwave radio telephony and telegraphy in a set of frequencies then thought to be commercially insignificant. 
In the summer, the department assigned broadcasters, in addition to 360 meters, another band, at 400 meters. Licenses in this Class B category were reserved for transmitters operating at power levels of 500-1,000 watts, who did not use phonograph records. These limitations on Class B licenses made the newly created channel a feasible home only to broadcasters who could afford the much-more-expensive, high-powered transmitters and could arrange for live broadcasts, rather than simply play phonograph records. The success of this new frequency was not immediate, because many receivers could not tune out stations broadcasting at the two frequencies in order to listen to the other. Hoover, failing to move Congress to amend the radio law to provide him with the power necessary to regulate broadcasting, relied on the recommendations of the Second Radio Conference in 1923 as public support for adopting a new regime, and continued to act without legislative authority. He announced that the broadcast band would be divided in three: high-powered (500-1,000 watts) stations serving large areas would have no interference in those large areas, and would not share frequencies. They would transmit on frequencies between 300 and 545 meters. Medium-powered stations served smaller areas without interference, and would operate at assigned channels between 222 and 300 meters. The remaining low-powered stations would not be eliminated, as the bigger actors wanted, but would remain at 360 meters, with limited hours of operation and geographic reach. Many of these lower-powered broadcasters were educational and religious institutions that perceived Hoover's allocation as a preference for the RCA-GE-AT&T-Westinghouse alliance. 
Despite his protestations against commercial broadcasting ("If a speech by the President is to be used as the meat in a sandwich of two patent medicine advertisements, there will be no radio left"), Hoover consistently reserved clear channels and issued high-power licenses to commercial broadcasters. The final policy action based on the radio conferences came in 1925, when the Department of Commerce stopped issuing licenses. The result was a secondary market in licenses, in which some religious and educational stations were bought out by commercial concerns. These purchases further gravitated radio toward commercial ownership. The licensing preference for stations that could afford high-powered transmitters, long hours of operation, and compliance with high technical constraints continued after the Radio Act of 1927. As a practical matter, it led to assignment of twenty-one out of the twenty-four clear channel licenses created by the Federal Radio Commission to the newly created network-affiliated stations.

Over the course of this period, tensions also began to emerge within the patent alliance. The phenomenal success of receiver sales tempted Western Electric into that market. In the meantime, AT&T, almost by mistake, began to challenge GE, Westinghouse, and RCA in broadcasting as an outgrowth of its attempt to create a broadcast common-carriage facility. Despite the successes of broadcast and receiver sales, it was not clear in 1922-1923 how the cost of setting up and maintaining stations would be paid for. In England, a tax was levied on radio sets, and its revenue used to fund the BBC. No such proposal was considered in the United States, but the editor of Radio Broadcast proposed a national endowed fund, like those that support public libraries and museums, and in 1924, a committee of New York businessmen solicited public donations to fund broadcasters (the response was so pitiful that the funds were returned to their donors). AT&T was the only company to offer a solution. Building on its telephone service experience, it offered radio telephony to the public for a fee. Genuine wireless telephony, even mobile telephony, had been the subject of experimentation since the second decade of radio, but that was not what AT&T offered. In February 1922, AT&T established WEAF in New York, a broadcast station over which AT&T was to provide no programming of its own, but instead would enable the public or program providers to pay on a per-time basis. AT&T treated this service as a form of wireless telephony so that it would fall, under the patent alliance agreements of 1920, under the exclusive control of AT&T.

RCA, Westinghouse, and GE could not compete in this area. "Toll broadcasting" was not a success by its own terms. There was insufficient demand for communicating with the public to sustain a full schedule that would interest listeners tuning into the station. As a result, AT&T produced its own programming. In order to increase the potential audience for its transmissions while using its advantage in wired facilities, AT&T experimented with remote transmissions, such as live reports from sports events, and with simultaneous transmissions of its broadcasts by other stations, connected to its New York feed by cable. In its effort to launch toll broadcasting, AT&T found itself by mid-1923 with the first functioning precursor to an advertiser-supported broadcast network.

The alliance members now threatened each other: AT&T threatened to enter into receiver manufacturing and broadcast, and the RCA alliance, with its powerful stations, threatened to adopt "toll broadcasting," or advertiser-supported radio. The patent allies submitted their dispute to an arbitrator, who was to interpret the 1920 agreements, reached at a time of wireless telegraphy, to divide the spoils of the broadcast world of 1924. In late 1924, the arbitrator found for RCA-GE-Westinghouse on almost all issues. Capitalizing on RCA's difficulties with the antitrust authorities and congressional hearings over aggressive monopolization practices in the receiving set market, however, AT&T countered that if the 1920 agreements meant what the arbitrator said they meant, they were a combination in restraint of trade to which AT&T would not adhere. Bargaining in the shadow of the mutual threats of contract and antitrust actions, the former allies reached a solution that formed the basis of future radio broadcasting. AT&T would leave broadcasting. A new company, owned by RCA, GE, and Westinghouse would be formed, and would purchase AT&T's stations. The new company would enter into a long-term contract with AT&T to provide the long-distance communications necessary to set up the broadcast network that David Sarnoff envisioned as the future of broadcast. This new entity would, in 1926, become the National Broadcasting Company (NBC). AT&T's WEAF station would become the center of one of NBC's two networks, and the division arrived at would thereafter form the basis of the broadcast system in the United States.
By the middle of 1926, then, the institutional and organizational elements that became the American broadcast system were, to a great extent, in place. The idea of government monopoly over broadcasting, which became dominant in Great Britain, Europe, and their former colonies, was forever abandoned. The idea of a private-property regime in spectrum, which had been advocated by commercial broadcasters to spur investment in broadcast, was rejected on the backdrop of other battles over conservation of federal resources. The Radio Act of 1927, passed by Congress in record speed a few months after a court invalidated Hoover's entire regulatory edifice as lacking legal foundation, enacted this framework as the basic structure of American broadcast. A relatively small group of commercial broadcasters and equipment manufacturers took the lead in broadcast development. A governmental regulatory agency, using a standard of "the public good," allocated frequency, time, and power assignments to minimize interference and to resolve conflicts. The public good, by and large, correlated to the needs of commercial broadcasters and their listeners. Later, the broadcast networks supplanted the patent alliance as the primary force to which the Federal Radio Commission paid heed. The early 1930s still saw battles over the degree of freedom that these networks had to pursue their own commercial interests, free of regulation (studied in Robert McChesney's work). By that point, however, the power of the broadcasters was already too great to be seriously challenged. Interests like those of the amateurs, whose romantic pioneering mantle still held strong purchase on the process, educational institutions, and religious organizations continued to exercise some force on the allocation and management of the spectrum. 
However, they were addressed on the periphery of the broadcast platform, leaving the public sphere to be largely mediated by a tiny number of commercial entities running a controlled, advertiser-supported platform of mass media. Following the settlement around radio, there were no more genuine inflection points in the structure of mass media. Television followed radio, and was even more concentrated. Cable networks and satellite networks varied to some extent, but retained the basic advertiser-supported model, oriented toward luring the widest possible audience to view the advertising that paid for the programming.”

The Wealth of Networks: How Social Production Transforms Markets and Freedom, Yochai Benkler, Yale University Press, 2006

Wednesday, September 1, 2010

Wealth Gap Yawns - and So Do the Media

"Last March, the Insight Center for Community Economic Development released the revelatory report "Lifting as We Climb: Women of Color, Wealth, and America's Future," which measures wealth gaps according to gender and race. The results are a national embarrassment, but it's a good guess that you missed the story--because almost no one deemed the data newsworthy."

This article in the Utne Reader, September-October 2010, is well worth reading. As a matter of fact, the whole edition is full of valuable information for anyone interested in what's happening to the U.S.

A number of years ago, I got interested in this topic because I perceived that I had just lived through the greatest redistribution of wealth in my lifetime. And I had just learned about a new tool for displaying data, the Motion Chart created by Hans Rosling. Google has the official public version of the tool in its Google Gadgets.

I posted this on my Ning site earlier, but lost it when Ning started charging. I got the data from the Federal Reserve Board. I had the link to the report, but the report is no longer there (curious). I had hoped I could update the data beyond 2004. There is some data and analysis on Wikipedia, but it also ends in 2004.

Personal assets are plotted as a function of personal income, by percentile of income. Data was available from 1989 to 2004. You can increase the size of the chart by clicking on the expand symbol in the bottom right-hand corner of the video.
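For anyone who wants to rebuild a chart like this, the key is the data shape a Motion Chart expects: one row per (group, year) pair, which the tool then animates over time. Here is a minimal sketch in Python; the percentile groups and dollar figures are placeholders I made up for illustration, not the Federal Reserve numbers discussed above.

```python
# Motion-Chart-style data: one record per (income percentile group, year).
# All figures below are HYPOTHETICAL placeholders, not real survey data.
records = [
    # (percentile group, year, income $k, net worth $k)
    ("0-25th",   1989,  15,   5),
    ("0-25th",   2004,  17,   7),
    ("75-100th", 1989,  90, 200),
    ("75-100th", 2004, 120, 350),
]

def frame(year):
    """One animation 'frame': map each group to its (income, assets) point."""
    return {g: (inc, assets) for g, y, inc, assets in records if y == year}

def growth(group):
    """Ratio of net worth between the last and first observed years."""
    years = sorted(y for g, y, _, _ in records if g == group)
    first = frame(years[0])[group][1]
    last = frame(years[-1])[group][1]
    return last / first
```

With real Federal Reserve data loaded into `records`, comparing `growth()` across groups makes the post's point quantitative: the top percentile's wealth grows much faster than the bottom's.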




What this Motion Chart shows is how much faster wealth and income have grown for the upper percentiles of U.S. citizens. What the article and the Wikipedia entry talk about is how this is accentuated for women and people of color.

You can download a copy of the video here. By the way, the Google Gadget provides an embeddable flash file, but Blogger wouldn't accept it.

Tuesday, March 24, 2009

History of the Decline and Recreation of Town Centers

Factors Contributing To Town Center Decline
Town centers in the United States over the past fifty (50) years have been under attack by a series of events and concepts that have affected the economic lifeblood of the town center -- its retail, restaurant, service and entertainment businesses. Federal and State governments, driven by the desire to improve the productivity of interstate and intrastate commerce, focused on highway throughput, and as a result highways bypassed many town centers in order to improve the speed of transport vehicles through the area. In every case of a bypass, regardless of the size of the city, the downtown was affected negatively. Since most small town centers (less than 10,000 population) are vitally dependent on traffic-generated business, they were affected the most. In larger towns and cities there was usually enough economic movement to the area of the bypass to eventually recover some of the economic impact. However, the downtowns of these towns and cities were never the same.

The advent of the shopping mall was an attempt to recreate the retail experience of a town center. Malls were created first with an outdoor design, but then rather quickly moved to indoor, climate-controlled facilities in an attempt to "always have a good day to shop." These shopping malls first affected the town centers of the larger cities. The concept has continued to develop into "mega" malls (e.g., Katy Mills Mall) and outlet centers (e.g., San Marcos) that have become destinations in their own right, affecting all towns, both large and small, within a one- to three-hundred-mile radius. Meanwhile, small town centers were more directly affected by strip shopping malls and franchise restaurants that developed all along the bypasses.

As population grew and national retailing became more sophisticated, "big box" stores were developed to feed the retail frenzy. Whereas big box stores such as Target and K-Mart stayed in urban and suburban environments, Wal-Mart purposefully developed a strategy of going to small-town and more rural environments. The impact of having a Wal-Mart move into any small town was devastating to the businesses of that town's center.

The impacts of other retailing innovations, such as "power centers" and E-commerce, are not yet fully known for small town centers. Because of the population densities required to support a power center, the development of a power center on the outskirts of a town with a population of less than 50,000 has not yet occurred. The impact of E-commerce on small town centers can be either positive or negative, depending upon how the town's businesses exploit the new technology. Traditionally, rural residents have been large supporters of catalogue sales, so it is expected that as Internet access becomes more widespread among rural residents, E-commerce activities will follow.

Attempts To Revitalize Town Centers
Over the years many approaches and attempts have been made to counter the effects of bypasses, malls, big box stores and more on town centers across the United States. Three significant attempts are: Urban Renewal; the Pedestrian Mall; and Historical/Heritage Preservation/Restoration.

One of the first methods for the revitalization of town centers was Urban Renewal. It was developed first for large cities with urban decay, then applied to former industrial towns, and later to more rural environments. For the most part this approach failed, because it actually destroyed what fabric of the town center existed by creating single-use regions that were not viable. During the process of urban renewal, many historically significant buildings were razed, robbing the community of an important part of its heritage.

The next method applied to the revitalization of town centers was the concept of the Pedestrian Mall. Vehicle traffic was viewed as negative, and therefore the idea was to close off some set of the town center's streets to all vehicle traffic, thus opening the entire area up to pedestrian traffic. For the most part, these concepts were applied to the town centers of small to medium-sized towns that had in most instances been missed by urban renewal. The results were mixed.

The most recent approach to the revitalization of town centers is to build on the existing historic buildings to preserve the town center's unique heritage and recreate an environment that encourages economic development. This approach was formalized in 1980 in the National Main Street Program. Since its inception, approximately 1,400 towns and cities have taken the step of becoming Main Street participants. In addition, twenty states, including Texas, have taken the step of creating a statewide Main Street program.

National Main Street Program
Overview

Since 1980 the National Main Street Center of the National Trust for Historic Preservation has been working with communities across the nation to revitalize their historic or traditional commercial areas. Rooted in historic preservation, the Main Street approach was developed to save historic commercial architecture and the fabric of American communities' built environment, but it has become a powerful economic development tool as well.

The Main Street program is designed to improve all aspects of the downtown or central business district (CBD), producing both tangible and intangible benefits. Improving economic management, strengthening public participation, and making downtown a fun place to visit are as critical to Main Street's future as recruiting new businesses, rehabilitating buildings and expanding parking. Building on downtown's inherent assets -- rich architecture, personal service, traditional values and, most of all, a sense of place -- the Main Street approach has rekindled entrepreneurship, downtown cooperation, and civic concern. It has earned national recognition as a practical strategy appropriately scaled to a community's local resources and conditions. And because it is a locally driven program, all initiative stems from local issues and concerns.

Approach

The Main Street Four Point Approach is:

• Design -- Enhancing the physical appearance of the commercial district by rehabilitating historic buildings, encouraging supportive new construction, developing sensitive design management systems, and engaging in long-term planning.
• Organization -- Building consensus and cooperation among the many groups and individuals who have a role in the revitalization process.
• Promotion -- Marketing the traditional commercial districts' assets to customers, potential investors, new businesses, local citizens and visitors.
• Economic Restructuring -- Strengthening the district's existing economic base while finding ways to expand it to meet new opportunities and challenges from outlying development.

Principles

The Main Street Program is based on eight principles:

• Comprehensive -- A single project cannot revitalize a downtown or commercial neighborhood. An ongoing series of initiatives is vital to build community support and create lasting progress.
• Incremental -- Small projects make a big difference. They demonstrate that "things are happening" on Main Street, and hone the skills and confidence the program will need to tackle more complex problems.
• Self-help -- Although the National Main Street Center can provide valuable direction and hands-on technical assistance, only local leadership can produce long-term success by fostering and demonstrating community involvement and commitment to the revitalization effort.
• Public/Private Partnership -- Every local Main Street Program needs the support and expertise of both the public and private sectors. For an effective partnership, each must recognize the strengths and weaknesses of the other.
• Identifying and Capitalizing on Existing Assets -- One of the National Main Street Center's key goals is to help communities recognize and make the best use of their unique offerings. Local assets provide the solid foundation for a successful Main Street initiative.
• Quality -- From storefront design to promotional campaigns to special events, quality must be the main goal.
• Change -- Changing community attitudes and habits is essential to bring about a commercial district renaissance. A carefully planned Main Street Program will help shift public perceptions and practices to support and sustain the revitalization process.
• Action Oriented -- Frequent, visible changes in the look and activities of the commercial district will reinforce the perception of positive change. Small but dramatic improvements early in the process will remind the community that the revitalization effort is underway.

National Main Street Program Summary

Since the National Main Street Program was initiated:

• 1,400 cities and towns have had Main Street Programs
• $10.9 billion invested by public and private sources
• An average investment of $5.1 million per town or city
• For every dollar invested in the operation of a Main Street Program, $35 is generated for investment
• 174,000 new jobs
• 47,000 new businesses
• 60,900 buildings rehabilitated
• Programs last on average 5.6 years


National Main Street Trends

There are approximately 1,200 communities actively involved in revitalizing their historic downtowns and neighborhood commercial districts. Over 400 communities participated in a 1999 survey of the economic impacts of Main Street Programs. Among the survey's major findings are:

• Retail Sales Are Increasing -- 65% reported increases in retail sales. Only 3% reported a decrease.
• Ground Floor Occupancy Rates Are Up -- 57% reported higher ground floor occupancy in 1999 compared to 1998.
• Upper Floor Occupancy Rates Are Climbing -- 33% reported higher upper floor occupancy rates.
• Number of Retail Businesses Increase -- 58% reported more retail businesses in 1999 than in 1998.
• The Number of Main Street Businesses Using the Internet Is Growing Dramatically -- 84% reported there were more businesses using the Internet in 1999 than in 1998.
• "Location Neutral" Businesses Continue to Move Into Main Street Districts -- 24% reported an increase in the number of businesses whose trade area is not confined geographically.
• More People Are Living On Main Street -- 33% reported an increase in housing units in Main Street Districts.
• The Number of People Attending Events Is Increasing -- 83% reported that the number of people attending festivals and special events in Main Street Districts increased in 1999.
• More Locally Owned Businesses -- 50% reported more locally owned businesses in their Main Street Districts in 1999 than in 1998.
• Property Values Are Increasing -- 67% reported that property values were higher in 1999 in Main Street Districts than in 1998.
• Smaller communities reported more dramatic increases in the numbers of personal service businesses and of businesses using the Internet, both of which underscore changes in retailing in small towns. Retail businesses in small downtowns are finding that the Internet provides a mechanism for reaching larger numbers of customers. Businesses using the Internet fall into three broad categories:
• Provide Better Service to Their Existing Local Customers -- The Front Street Pub, Greenville, AL, uses its web site to list its schedule of live music, information on the Pub's ongoing billiards tournament and to sponsor a chat room for customers. Osborn Drugs, Osborn, OK, lets customers refill prescriptions from its web site and provides links to other pharmaceutical web sites.
• Augment Sales in Their Stores or Offices -- Footwise, Corvallis, OR, specializing in Birkenstock shoes and sandals, offers the largest selection of Birkenstocks on the Internet, attracting customers from throughout the world. Whitestone, Livermore, CA, is a bookstore utilizing a similar strategy.
• Almost Exclusively Internet Based, With Few Local Customers -- Kringle Kottage, Scottsbluff, NE, now sells most of its collectible ornaments and figurines through the auction site eBay. RJB-The Diner Store, Mundelein, IL, sells jukeboxes, diner fixtures and other 1950s/60s diner-related nostalgia items to customers throughout the world.
• Many survey respondents listed high-tech companies among those moving into their historic commercial districts, again expanding a trend that has emerged in the National Main Street Trend Survey every year since 1996.

Examples Of Main Street Programs

There are presently seventy-nine Main Street cities with web sites listed on the National Main Street Program web site. As can be seen in the graph below, the majority of these cities have populations between 10,000 and 30,000. Twenty-three of the Main Street cities on the web have populations of less than 10,000 people. Of these cities with populations under 10,000:
• An average of two years was required to get Main Street designation
• The average age of the program is five years
• An average of four new businesses per year were created

As each Main Street Program is tailored to the needs of its city, within the loose structure of the national and state Main Street programs, the tools each city uses to effect change in its town center differ. All of the city Main Street programs had a web site, and all had a formal board or committee and paid staff; these are basic requirements of the Main Street Program. Beyond those, however, there was little agreement as to which tools were the most important for success. Listed below, in order of frequency of use, are some examples of tools mentioned two or more times:

• Events in the Town Center
• A Main Street Program newsletter
• Coordinated, thematic dress up of Town Center
• Consulting & training for business owners
• Master land use plan for Town Center
• Improvements of sidewalks
• Creation of historic district
• Development of resource library
• Farmers/crafts market in Town Center
• Landscaping & beautification

In addition, other tools mentioned were: improved window displays, façade grants, design grants, renovation grants, renovation loans, façade loans, street improvements, development of a river walk, more parking lots, restored railroad station, murals on building walls, tax abatements, frequent shopper programs, creation of an assessment district to fund program, recognition awards for businesses, signage program, town center directory & map, volunteer handbook, sign grant program, sales tax exemption on building materials, memorabilia and novelties, and a live mascot (a very friendly cat that lived in the Main Street Office).

The Texas Main Street Program
The Texas Main Street Program is part of the Texas Historical Commission's Community Heritage Development Division. It helps Texas cities revitalize their historic downtowns and neighborhood commercial districts by utilizing preservation and economic development strategies.

Each year the Texas Historical Commission typically selects up to five Texas cities and urban areas as official Texas Main Street cities. The 1999 Texas Main Street cities were Gatesville, Gladewater, Shiner, Taylor, and Whitewright. In 2000, however, sixteen cities were selected as Texas Main Street participants from a pool of seventy-six applicants: Beaumont, Breckenridge, Celina, Clifton, Denton, Elgin, Fort Stockton, Garland, Gilmer, Goliad, LaGrange, Nacogdoches, New Braunfels, Rusk, San Marcos, and Seguin.

Selected cities are eligible to receive:

• Training For Main Street Managers and Board Members
• Training In Successful Economic Development Approaches
• On Site Evaluation (3-day) and Full Report With Recommendations
• Identification and Assistance with Architectural Elements, such as Façade Drawings and Education of Business Owners in Proper Maintenance Techniques
• Consultation with Downtown Merchants About Visual Merchandising and Window Displays
• Advice on Heritage Tourism and Marketing

The Texas Main Street Program, affiliated with the National Main Street Program, was begun in 1981. It is one of the most successful downtown revitalization programs in the nation, having assisted 125 Texas cities since its inception. The program has resulted in:

• Reinvestment of More Than $582 Million in Texas Downtowns and Neighborhood Commercial Districts
• Creation of More Than 14,000 Jobs
• Establishment of More Than 3,600 New Businesses

Texas cities with historic commercial buildings in their downtowns and neighborhood business districts may apply for Texas Main Street designation. Applications must be received by the last working day of July each year for the following program year. To be eligible to apply, cities with populations under 5,000 must make a three-year commitment of staffing and funding. It is recommended that a full-time Main Street manager be hired; however, to be eligible for Texas Main Street designation, at least a half-time manager is required.

Economic Transformation
The world is in the middle of an economic transformation, driven by information technologies, changes in social norms, a political trend toward capitalism, an increase in the number of young people rivaling the baby boom, and an unprecedented increase in the number of people over the age of 65. These driving forces are coupled with a current situation in the US where unemployment is at its lowest point in decades and Congress has removed the barriers to earnings for people collecting Social Security. Small towns are just beginning to take advantage of these forces to transform their town centers into twenty-first-century economic engines.

Given this new reality, a small town can no longer rely solely on Historic Preservation and Restoration for its economic salvation.

"In the past generation, American communities and local governments have tried a long list of strategies in an effort to revive their downtown commercial corridors. Most of them have been failures, from the massive urban renewal projects of the 1950s to the pedestrian shopping malls of the 1960s and 1970s and the hotel/convention center projects financed by Federal subsidies in the 1980s. Planners have tried tearing down older shopping blocks and replacing them with suburban style downtown malls; they have even, in a few cases, bulldozed entire downtowns and built malls and parking lots on the empty grounds. This approach, too, has nearly always failed.

During the 1990s, an increasing number of communities have switched to a strategy of historic preservation, which has been demonstrably more successful. Towns and cities that considered their Victorian shopping districts to be eyesores a decade ago are now promoting them as tourist attractions and drawing large weekend crowds. Preservation is a powerful economic development tool, but its potential has yet to be realized in countless other communities around the country.

In the end, though, it is not physical preservation or any special feature at all that brings an urban retail corridor to health. It is return of a commerce based on human interaction, on stable relationships, on the small comforts that derive from the intercourse of buyer and seller, professional and client, week after week and year after year, during all the seasons of ordinary life. Those relationships have eroded in recent times, but they are starting to return, for the simple reason that people realize what has been lost."

---Douglas Merriam, Preservation (July/August 1999)

The ingredients that are necessary for a small town to take advantage of the economic transformation are:

• A Rich Texture -- To fulfill the human need for a sensory experience, a diversified set of aesthetically pleasing sounds, smells, and visual and taste stimuli is required. Historic preservation can provide visual texture.
• Human Scaled -- The ability to walk around with convenience and safety is highly valued. This requires a visually interesting physical environment that encourages people to get out of their cars and spend time; it must be pedestrian friendly.
• Interaction -- Putting humanity back into the daily transactions of life encourages people to enjoy the cultivation of new relationships. This requires places of interaction, such as a coffee shop or pub, and the development of a caring approach and interest in the customer by the merchant.
• Mixed Use -- The appropriate mix of retail, commercial, entertainment, restaurants, government, parks and residential that allows continuous utilization of the properties involved.
• Freedom and Choice -- Not just the freedom to sip espresso and order fresh salmon, but the freedom to do business anywhere on the globe, to communicate with London or Tokyo in a matter of seconds, and to live in a safe environment without making the economic or cultural sacrifices that such a choice would have entailed a generation ago.
• A Commitment to Enabling Technologies -- Making businesses more efficient and effective and thus assuring competitiveness in both local and a world marketplace. This requires the city to focus on a communication infrastructure. It also requires that individual businesses exploit information technologies to improve efficiencies of day to day operations, improve communications with suppliers, improve service to existing customers and to improve marketing efforts.
• A Passion for Continuing Education -- In this economy a business's greatest asset is its people. The way to sustain competitive advantage is through the care and nurturing of the brains in the business. This means not only higher education accessibility, but also available ongoing training of all types to keep everyone knowledgeable and current.
• A Desire to Positively Affect the Future -- It is important to maintain roots while not getting stuck in the past. People need to positively embrace change while being able to discern what elements of the past to hold onto. It is necessary to have a shared vision of what the town can be in the future to assure that the town doesn't get pulled apart.
• Local Capital -- It is necessary to have some form of local capital available for investment in the community. If the people and organizations in the local community don't commit to helping in investment for growth, it is next to impossible to get outside groups interested. Moreover, the nature of the investment often times will not "pass muster" on a global scale, so it has to be supported locally by people who know the people making it happen. Capital can be provided by individuals, successful local businesses, financial institutions, community foundations, designated city or county tax revenue and investment organizations.
• Civic Capacity -- Organizations and individuals who provide the means, capability, and leadership to move the community forward.
• A Majority of Locally Owned Business -- "Locally owned and operated" is an imperative to get local money flowing for investment, and the business owners need to be voting members of the community so that they have a say in city policy decisions. The transformation will require long term commitment, and only those with significant stakes in the outcome will be willing to see it through.
• A Strong Identity -- This is a two-sided coin. Looking inward, it is the sense of belonging and pride. Looking outward, it makes it easier to market and differentiate (brand) the town and win "share of mind" with the visitor or tourist.

The Benefits Of Town Center Revitalization
The most important benefits of Town Center Revitalization include:

• Best Utilization of Existing Infrastructure -- Making use of existing infrastructure (water, sewer, roads and sidewalks) negates the need to build new more expensive infrastructure elsewhere. Cities typically develop outside their city limits, requiring the extension and sometimes development of new infrastructure to meet development needs. Focusing on the core of the city can, when coupled with a comprehensive town center development program, show better return on investment for the city.
• Increased Tax Revenue -- Both sales and ad valorem tax revenues are increased because of the higher revenue brought in by businesses and the increased property values.
• Asset Appreciation -- The value of the buildings and land increase, thus increasing wealth in the community.
• Higher Productivity in Businesses -- More revenue enables businesses to invest more in themselves. Increased revenue is the result of increased customer traffic because the town center is viewed as a destination. Both cooperation and competition increase as business owners as a whole see merit in increased efficiency and effectiveness.
• Higher Wages/More Jobs -- As the businesses become more successful they pay better wages and create more jobs.
• Efficient Use of Land -- Economic forces created by the Town Center drive the highest and best use of land.
• Enables Building Rehabilitation -- By making the Town Center a retail destination, there is economic incentive to rehabilitate buildings within the Town Center.
• Residents Save Time & Money -- Residents can take advantage of the retail, entertainment, restaurants and services of a Town Center, reducing the amount of time and money spent traveling to other destinations.
• Reduce Leakage Out of Local Economy -- As residents spend more of their money in the Town Center, they spend less in other destinations thereby reducing leakage from the local economy.

Tuesday, October 14, 2008

The Big Switch: Rewiring the World, from Edison to Google

Nicholas Carr, author of Does IT Matter?, has written a provocative and insightful book about the big switch we are undergoing in information technology infrastructure. He provides a persuasive historical analogy arguing that computing utilities will replace in-house computing facilities.

Carr sums up the basic premise in this way, “Why has computing progressed in such a seemingly dysfunctional way? Why has the personalization of computers been accompanied by such complexity and waste? The reason is fairly simple. It comes down to two laws. The first and most famous was formulated in 1965 by the brilliant Intel engineer Gordon Moore. Moore's Law says that the power of microprocessors doubles every year or two. The second was proposed in the 1990s by Moore's equally distinguished colleague Andy Grove. Grove's Law says that telecommunications bandwidth doubles only every century. Grove intended his "law" more as a criticism of what he considered a moribund telephone industry than as a statement of technological fact, but it nevertheless expresses a basic truth: throughout the history of computing, processing power has expanded far more rapidly than the capacity of communication networks. This discrepancy has meant that a company can only reap the benefits of advanced computers if it installs them in its own offices and hooks them into its own local network. As with electricity in the time of direct-current systems, there's been no practical way to transport computing power efficiently over great distances.”
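Carr's point about the growth-rate gap can be made concrete with a little compound-interest arithmetic. This is a back-of-the-envelope sketch of my own (the 40-year window and exact doubling periods are my assumptions, not the book's):

```python
# Illustrative sketch: compound growth under Moore's Law (processing
# power doubling every ~2 years) versus Grove's tongue-in-cheek
# bandwidth "law" (doubling every century).
def growth_factor(years: float, doubling_period: float) -> float:
    """Total growth after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Over the 40 years from 1965 (Moore's paper) to 2005:
cpu = growth_factor(40, 2)          # 2**20, roughly a millionfold
bandwidth = growth_factor(40, 100)  # 2**0.4, roughly 1.3x
print(f"processing: {cpu:,.0f}x   bandwidth: {bandwidth:.2f}x")
```

However rough the assumptions, a gap of five to six orders of magnitude is what Carr means by the discrepancy: until networks caught up, computing power had to live next to its users, just as direct-current electricity did.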

“’The next sea change is upon us.’ Those words appeared in an extraordinary memorandum that Bill Gates sent to Microsoft’s top managers and engineers on October 30, 2005. Belying its bland title, ‘Internet Software Services,’ the memo was intended to sound an alarm, to warn the company that the rise of utility computing threatened to destroy its traditional business.”

Microsoft has dominated the PC desktop, but what was emerging was a totally different kind of business – software as a service, something we now call SaaS. This new way of looking at software will be very disruptive.

In 2005, Google began work on its Dalles, OR computing utility facility. “The town's remoteness would make it easier for Google to keep the facility secure-and harder for its employees to be lured away by competitors. More important, the town had ready access to the two resources most critical to the data center's efficient operation: cheap electricity and plentiful bandwidth. Google would be able to power its computers with the electricity produced by the many hydroelectric dams along the Columbia, particularly the nearby The Dalles dam with its 1.8-gigawatt generating station. It would also be able to temper its demand for electricity by tapping the river's icy waters to help cool its machines. As for bandwidth, the town had invested in building a large fiber-optic data network with a direct link to an international Internet hub in nearby Harbour Pointe, Washington. The network provided the rich connection to the Internet that Google needed to deliver its services to the world's Web surfers.” Carr speculates that in the future we may look back on this endeavor much as we now look back on Insull’s early electricity generating plants.

One of the first really successful implementations of SaaS was SalesForce, a CRM application. Marc Benioff, who left Oracle to found SalesForce, proclaimed the end of software as we know it. “As it turned out, the idea of software-as-a-service caught on even more quickly than Benioff expected. In 2002, the firm's sales hit $50 million. Just five years later, they had jumped tenfold, to $500 million. It wasn't just small companies that were buying its service, though they had constituted the bulk of the earliest subscribers. Big companies like SunTrust, Merrill Lynch, Dow Jones, and Perkin-Elmer had also begun to sign up, often abandoning their old in-house systems in the process. Benioff's audacious gamble, like Insull's a century earlier, had panned out. As for the once mighty Siebel Systems, it had gone out of business as a stand-alone company. After suffering a string of deep losses in the early years of the decade, it was bought up in early 2006 by Benioff's old company, Oracle.”

Amazon launched the first utility computing service in March 2006, allowing customers to store data on Amazon’s systems for a few cents per gigabyte per month.

One of the things I really like about the way Carr writes is that he mixes so many different perspectives and insights. “In the early decades of the twentieth century, as punch-card tabulators and other computing machines gained sophistication, mathematicians and businessmen began to realize that, in the words of one historian, ‘information is a commodity that can be processed by a machine.’ Although it now sounds obvious, it was a revolutionary insight, one that fueled the growth and set the course of the entire computer industry, particularly the software end of it, and that is now transforming many other industries and reshaping much of the world's economy. As the price of computing and bandwidth has plunged, it has become economical to transform more and more physical objects into purely digital goods, processing them with computers and transporting and trading them over networks.”

And, later, “Until recently, most information goods were also subject to diminishing returns because they had to be distributed in physical form. Words had to be printed on paper, moving pictures had to be captured on film, software code had to be etched onto disks. But because the Internet frees information goods from their physical form, turning them into entirely intangible strings of ones and zeroes, it also frees them from the law of diminishing returns. A digital good can be replicated endlessly for essentially no cost-its producer does not have to increase its purchases of inputs as its business expands. Moreover, through a phenomenon called the network effect, digital goods often become more valuable as more people use them. Every new member that signs up for Skype, puts an ad on Craigslist, or posts a profile on PlentyOfFish increases the value of the service to every other member. Returns keep growing as sales or use expands-without limit.”
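One common formalization of the network effect Carr describes is Metcalfe's law, which values a network by its number of possible pairwise connections. Carr doesn't cite this formula; it is just one rough model, offered here as an illustration:

```python
# Illustrative sketch: Metcalfe's law as a rough model of the network
# effect. A network of n members has n*(n-1)/2 possible pairwise links,
# so "value" grows roughly with the square of membership.
def pairwise_links(n: int) -> int:
    """Number of distinct pairs among n members."""
    return n * (n - 1) // 2

# Doubling membership roughly quadruples the number of connections:
print(pairwise_links(1_000))  # 499,500 possible links
print(pairwise_links(2_000))  # 1,999,000 possible links
```

Under any model of this shape, each new Skype or Craigslist member raises the value of the service for every existing member, which is the increasing-returns dynamic Carr contrasts with physical goods and diminishing returns.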

One of the big factors that is contributing to the rising productivity of computers is social production (the social web and collaboration). “Whereas industrialization in general and electrification in particular created many new office jobs even as they made factories more efficient, computerization is not creating a broad new class of jobs to take the place of those it destroys. As Autor, Levy, and Murnane write, computerization ‘marks an important reversal. Previous generations of high technology capital sharply increased demand for human input of routine information-processing tasks, as seen in the rapid rise of the clerking occupation in the nineteenth century. Like these technologies, computerization augments demand for clerical and information-processing tasks. But in contrast to [its] predecessors, it permits these tasks to be automated.’ Computerization creates new work, but it's work that can be done by machines. People aren't necessary.

That doesn't mean that computers can take over all the jobs traditionally done by white-collar workers. As the scholars note, ‘Tasks demanding flexibility, creativity, generalized problem-solving and complex communications-what we call nonroutine cognitive tasks-do not (yet) lend themselves to computerization.’ That parenthetical ‘yet,’ though, should give us pause. As the power and usefulness of networked computers have advanced during the few years since they wrote their paper, we've seen not only the expansion of software's capabilities but the flowering of a new phenomenon that is further reducing companies' need for workers. Commonly termed ‘social production,’ the phenomenon is reshaping the economics of the media, entertainment, and software industries, among others. In essence, it allows many of those ‘nonroutine cognitive tasks’ that require ‘flexibility, creativity, generalized problem-solving and complex communications’ to be carried out for free-not by computers on the network but by people on the network.”

An example familiar to everyone is YouTube: the users provide the content, and catalogue and rate its value. Wikipedia is another example.

Why do people contribute? Carr lists several reasons:

* They contribute without knowing it (e.g., to search engines)
* Self-interest (e.g., free tools they use for themselves whose results are shared, like del.icio.us)
* Competition or status seeking (e.g., Wikipedia)
* Enjoyment

And, I would add, altruism.

Ubiquitous, inexpensive computing and communications, together with a constant flow of new software applications, are fueling this phenomenon. “In his book The Wealth of Networks, Yale law professor Yochai Benkler traces the recent explosion in social production to three technological advances. ‘First, the physical machinery necessary to participate in information and cultural production is almost universally distributed in the population of the advanced economies,’ he writes. ‘Second, the primary raw materials in the information economy, unlike the physical economy, are [freely available] public goods-existing information, knowledge, and culture.’ Finally, the Internet provides a platform for distributed, modular production that ‘allows many diversely motivated people to act for a wide range of reasons that, in combination, cohere into new useful information, knowledge, and cultural goods.’”

One of the reasons all of this works is the connection between people and the community it creates. “Richard Barbrook, of the University of Westminster in London, expressed this view well in his 1998 essay 'The Hi-Tech Gift Economy.' He wrote of Internet users:

‘Unrestricted by physical distance, they collaborate with each other without the direct mediation of money or politics. Unconcerned about copyright, they give and receive information without thought of payment. In the absence of states or markets to mediate social bonds, network communities are instead formed through the mutual obligations created by gifts of time and ideas.’"

This is a book to be read and discussed.

The Big Switch: Rewiring the World, from Edison to Google
Nicholas Carr
WW Norton & Company, NY, 2008, 278 pp

Tuesday, September 16, 2008

Adam Smith 2.0

By Irving Wladawsky-Berger, AlwaysOn

The other day, I came across an interesting story about Adam Smith in The Economist. It appears that Adam Smith - the 18th century philosopher and economist, who is generally considered the father of free-market, free-trade capitalism - has been treated with remarkable indifference in his native Scotland. The 17th-century house where he spent the last years of his life has only a small, tarnished bronze plaque mentioning his name. His grave was overgrown until recently, and is still not easy for visitors to find.

The Economist story attributes this indifference to one of Scotland's best known sons to modern politics and historical ignorance. "Smith's most famous work, The Wealth of Nations," the article says, "which describes wealth creation in a competitive commercial economy dominated by the market's invisible hand, has long been appropriated by right-wingers and anathema in left-leaning Scotland."

Driven by their narrow political ideology, some people seem to think of open markets as reflecting a kind of survival of the fittest competition in which anything goes. But such people, I believe, have totally misrepresented not just Adam Smith and open markets, but the principles governing evolution and natural selection, especially as it applies to social animals like us humans.

The Economist story goes on to say that, - led by Prime Minister Gordon Brown, himself a Scot, - people are discovering that Adam Smith is not the right-wing ideologue he has been misunderstood to be. "Leftists much prefer Smith's other big work, The Theory of Moral Sentiments," it says. "Its deeply Scottish Presbyterian fulminations against materialistic desires for trinkets of frivolous utility, and lofty observation that man has some principles which interest him in the fortune of others, and render their happiness necessary to him, though he derives nothing from it, can be made to sound almost socialist."

I don't think that Adam Smith had socialism in mind, but something much deeper - sympathy, that is, the very human ability to have a strong feeling of concern for another person. Experts generally agree that Smith advocated both the self-interest of Wealth of Nations, and the sympathy of Theory of Moral Sentiments, with no contradiction between these two positions. In his view, "individuals in society find it in their self-interest to develop sympathy as they seek approval of what he calls the impartial spectator. The self-interest he speaks of is not a narrow selfishness but something that involves sympathy.”

Read the entire blog by clicking here.