• Music

    Random Thoughts on Leonard Cohen

    Thought Organization Status – Ramble

    After hearing one of his songs on the show Your Honor, I looked him up recently and discovered that he had a great many albums in his late period (some live and compilations, but still) that I managed to miss during my bluegrass period. That led me to this article on his love lives, which I found quite interesting. I found the line “He had many adulthoods” quite gripping, though not quite accurate.

    Reflecting on his personal life and career in the back of my head while doing a lot of rote cut-and-paste work, I had the thought: “Leonard Cohen was a machine who converted people into art” (not meant as a compliment).

    The album cover is one of my favorite photographs.

  • Culture, Uncategorized

    Towards a unified theory of fringe groups in the great culture war

    One odd thing I’ve noticed about our current times is how massively illiterate the movements are. Communism in the 1930s actually turned people away, Woody Guthrie prominent among them. There seem to be no foundational texts amongst any of the fringe movements (Defund, the Q people, the Proud Boys, whatever passes for anarchism these days, the very woke, etc., etc.).

    I first noticed this trend a few years ago in this interview with this… person.

    Oddly, I came across this post I wrote a while back:

    Ideology as the usable consensus of extreme personality types – see my “Let’s Kill Hitler” book idea. Basically the ideology evolves not as the continuation of first principles, but as a series of compromises on the part of the extreme personalities involved – the ideology is whatever allows a certain collection of extreme personalities to work together. Cooperation is the important thing – not the consequences. An extreme ideology will be composed of extreme members and so forth. See the alt-right and modern wokeness.

    I’m reminded of the Utah Phillips line “common sense of degradation”. The modern unifying feature would seem to be a common sense of alienation from visible society.

    Update – Feb 13th, 2023

    I was going to update the post with a clarification, but it is probably better off as a new paragraph or two.

    I think what I’m trying to say is that books are no longer Schelling points. You could always point to some habit or event as not in keeping with the laws of the Old Testament or Das Kapital, but can you really point to a new habit or event as something that conflicts with the outrage over George Floyd or gas pipelines? Instead the movement just evolves with the changing preferences and day-to-day hatreds of the people drawn to the original Schelling point. Which basically means that the alienated fringe will be both dynamic and dysfunctional. Not a big deal really (what else would they be doing?), but with internet and mobile technology they are unified and strangely influential on mass culture. Instead of the most alienated 5% of the population moving in a thousand different directions, they will move in 2 or 3. The lack of textual constraint allows them to keep up with current fashions, trends and technologies. And an active, motivated, unified (same people, shifting goals and language) 5% of people with very strong preferences is a meaningful marketing and voting bloc.

    I suppose in a way this is a rediscovery of BJ Campbell’s “auto update” feature of the culture war, but with the lack of foundational texts and Schelling points added to the explanation.

    Update – Feb 14th, 2023

    Thinking about this again – I realize that I’m underemphasizing the role of fashion and the ease of coverage by the modern media. “Wokeism” (the Q people not as much) is very, very, very easy and cheap to cover from a home office – all you have to do is weave a bunch of screenshots from Twitter into an existing story and there you go. Gresham’s law, improperly formulated, strikes again – bad stories will drive out good stories based purely on price. Currently the ease of coverage and woke fashion overlap rather well, although crime does seem to be taking away from this a bit.

    An aside – it would be great if ChatGPT (or whatever AI is current at the time you read this) came up with an approximate price for each news story. Media articles in whatever form are usually not presented in terms of cost – but it would be great if they were. Say this is what your news source looked like:

    • Celebrity A says Celebrity B is washed up and has had bad plastic surgery ($35, warning, Tweets)
    • See our in depth, on scene coverage of the Syrian Civil War ($17,500)

    If an AI browser plugin created something like that, along with a “Hide under $1000” checkbox, I would be eternally grateful.
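
    To make the idea concrete, the plugin boils down to a filter over a list of stories. Here is a rough sketch in TypeScript; everything in it is hypothetical (estimateStoryCost is a made-up stand-in for whatever AI call would do the guessing, and the numbers are just the examples above).

        // Hypothetical sketch of the "price each story, hide the cheap ones" idea.
        interface Story {
          headline: string;
          body: string;
          estimatedCostUsd?: number; // filled in by the AI, e.g. 35 or 17500
        }

        // Stand-in for the AI call; a real version would ask an LLM to guess the
        // reporting cost from the text (screenshots of tweets vs. on-scene coverage).
        async function estimateStoryCost(story: Story): Promise<number> {
          return story.body.toLowerCase().includes("tweet") ? 35 : 17500; // toy heuristic
        }

        // The "Hide under $1000" checkbox, expressed as a filter over the feed.
        async function hideCheapStories(stories: Story[], minCostUsd = 1000): Promise<Story[]> {
          const priced = await Promise.all(
            stories.map(async (s) => ({ ...s, estimatedCostUsd: await estimateStoryCost(s) }))
          );
          return priced.filter((s) => (s.estimatedCostUsd ?? 0) >= minCostUsd);
        }

    The checkbox itself would just toggle whether hideCheapStories runs before the feed is rendered; the hard part, obviously, is getting the cost estimate anywhere near right.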

  • Books, Uncategorized

    The Land of Enterprise: A Business History of the United States – A review

    From my Notion template

    The Book in 3 Sentences

    1. A surface-level history of business in America. There is nothing terribly new or interesting in this book. Other books, most notably American Republics, do a better job of presenting information.

    Impressions

    It is a very surface-level economic history of the United States. Granted, I’m a tough crowd – I’ve read several books on the topic, taken several classes, etc., etc. – but there wasn’t that much new in the book.

    How I Discovered It

    The Amazon Algorithm

    Who Should Read It?

    Someone looking for a very rudimentary overview of US economic history

    How the Book Changed Me

    Nothing changed

    My Top 3 Quotes

    • Approximately 80 percent of the English migrants to Virginia between 1607 and 1624, or close to five thousand, were dead by 1625. Hemorrhaging money and unable to attract new investors, the Virginia Company failed in 1624, when the English government declared Jamestown a royal colony.
    • The push for national independence grew strongest in the parts of the British empire that could envision their economies operating without the British army present.
    • In fact, by the mid-19th century, approximately two hundred thousand slaves worked in industrial settings. At the outbreak of the Civil War, more than sixty worked for, and were owned by, William Weaver, a native Philadelphian who moved to Virginia in 1814 to establish an iron forge with two charcoal blast furnaces in the Shenandoah Valley.

    All Quotes

    When historians use the term feudalism, they are attempting to describe an economic system in which power relations among people formed the building blocks of society.

    Monarchs in Spain, France, and England grew wealthier through trade, which spread from the Mediterranean to coastal West Africa, and then to the Americas. In the process, they consolidated military power at the expense of local lords.

    At the same time, these mercantilist exploits brought a major economic downside. The huge amounts of silver shipped back to Spain flooded the currency market, sparking a bout of inflation that lasted a century and crippled the Spanish economy. In spite of its large land-holdings in the Americas, Spain would never recover the economic power it wielded in the early 16th century.

    Most of those early migrants lived along the Atlantic coast in former Indian towns that had been abandoned during the plague epidemic that forced survivors inland from the coast in the late 16th century.

    In fact, the first two successful English colonies in what would become the United States—Virginia (1607) and Massachusetts Bay (1620)—were themselves private companies.

    Joint-stock companies, the forerunners of today’s publicly owned corporations, pooled private sources of capital under the official protection of the crown, funding ventures that were too expensive or risky for an individual person. Drawing on a system of legal contracts developed in Italy centuries earlier, 16th-century English monarchs pioneered the practice of issuing corporate charters that granted an exclusive right to trade in a certain area to a particular group of subjects. In addition to creating a helpful monopoly, these charters created legal entities whose ownership was spread among several investors. These people purchased shares, or stock, to make up the whole company, which they owned jointly. Hence, “joint-stock company.”

    Approximately 80 percent of the English migrants to Virginia between 1607 and 1624, or close to five thousand, were dead by 1625. Hemorrhaging money and unable to attract new investors, the Virginia Company failed in 1624, when the English government declared Jamestown a royal colony.

    Three years later, approximately one hundred people—a combination of the Separatists who had bought the patent and others who purchased their own passage directly—landed by accident far to the north of Jamestown in a former Massasoit Indian town, which they renamed Plymouth.

    (We start to refer to “Britain” instead of “England” after the Acts of Union, passed in 1706 and 1707 by the English and Scottish Parliaments, unified those two countries into the United Kingdom of Great Britain.)

    The American slave population became self-sustaining in the early 18th century, so even as the international trade declined, the population of enslaved people grew. By the 1770s, nearly seven hundred thousand people, or 15 percent of the total non-Indian population of the United States, were enslaved.

    Almost 95 percent of all enslaved people in the United States at the time of the Founding lived in Delaware, Maryland, Virginia, the Carolinas, and Georgia. One-third of the population of those southern colonies was enslaved, and approximately one-third of all southern households owned slaves.

    The push for national independence grew strongest in the parts of the British empire that could envision their economies operating without the British army present.

    On the other hand, Europeans on the periphery of the British empire depended greatly on the mother country. In present-day Canada, which Britain acquired from France after the Seven Years War in 1763, ongoing conflicts between the substantial native population and far-flung European fur traders and fishers meant that colonists depended greatly on British military support. In the slave societies of the West Indies, native inhabitants had been almost entirely annihilated, and small numbers of English colonists owned massive sugar plantations farmed by African slaves, whose numbers eclipsed those of their white owners by as much as ten to one in 1780. Landowners relied on brutal violence, sanctioned and backed by British law, and the strength of the British military, further cementing their ties to the crown.

    Beginning in the early 15th century, merchants from kingdoms and city-states along the west coast of Africa established commercial relationships with Portuguese merchants, trading gold and spices for European metals and textiles. From the beginning, African-European commerce included the trade in human beings.

    In 1807, both the British Parliament and the U.S. Congress outlawed the international trade of slaves. (The Constitution of 1787, in an effort to forge a compromise between slave-owning interests and antislavery advocates, had included a clause prohibiting any move to ban the trade for twenty years.) By 1820, all other major European powers had as well.

    Large plantations certainly wielded disproportionate economic power, but most southern whites were not slave-owners. Historians estimate that, by the time of the Civil War, about 385,000 out of a total of 1.5 million white households in the South owned slaves. (African Americans and Native Americans did not own slaves in significant numbers, and were usually legally barred from doing so.) About half of these slave-owning households owned between one and five slaves; another 38 percent owned between six and twenty. Although they held a vastly disproportionate level of wealth, the remaining 12 percent of slave-owners (those who had twenty-plus slaves) represented only 3 percent of all white households.

    In fact, by the mid-19th century, approximately two hundred thousand slaves worked in industrial settings. At the outbreak of the Civil War, more than sixty worked for, and were owned by, William Weaver, a native Philadelphian who moved to Virginia in 1814 to establish an iron forge with two charcoal blast furnaces in the Shenandoah Valley.

    A significant number of enslaved people lived in urban areas such as Charleston and Baltimore. There, some slaves labored for, and often alongside, their owners in workshops, but many were owned by urban professionals—doctors, bankers, and lawyers who kept slaves as investment property. Some performed domestic duties, but more often they were hired out to work for private companies or to perform public works projects, such as digging canals or dredging harbors. Slave-owners received hourly pay for their slaves’ labor, and in many cases the enslaved people themselves brought home those wages in cash. In both cases, urban slaves often labored alongside free workers, both black and white.

    By the eve of the Civil War, historians estimate that the total cash value of the 4 million slaves in the American South was $3.5 billion in 1860 money. At more than 80 percent of the country’s total economic output, that figure would be roughly $13.8 trillion today. Understood in that way, enslaved people were capital assets worth more than the country’s entire productive capacity from manufacturing, trade, and railroads combined.

    In 1776, when Thomas Jefferson declared the “self-evident” truth that “all men are created equal,” nearly 15 percent of the 4 million non-Indian inhabitants of the United States were enslaved. Although slavery remained legal in all states, almost 95 percent of enslaved people lived south of Pennsylvania, and the highest concentration was in Virginia.

    By 1804, every state north of Delaware had legally abolished the practice, and new midwestern states and territories that joined the nation in the decades to come likewise prohibited it.

    Slavery in the North died out because of the organizational power of antislavery activists combined with the lack of large-scale commercial agriculture in the region.

    Evidence suggests that many, if not most, white northerners had no moral problem with slavery, but few powerful interests had much to gain by defending it.

    In 1794, a twenty-eight-year-old Yale-educated New Englander named Eli Whitney, engaged as a tutor for the children of a plantation owner in South Carolina, patented a machine that mechanically separated cotton fibers from cotton seed. According to the traditional story, Whitney invented this “cotton gin” (gin was short for “engine”) after observing enslaved people slowly and painfully removing seeds from cotton balls.

    The amount of cotton an individual enslaved person could prepare for export rose as use of the mechanical devices spread. By some estimates, the per-slave cotton yield increased 700 percent.

    The results for the cotton industry were astounding. Southern planters produced around 3,000 bales of cotton per year in the early 1790s. By 1820—by which time domestic textile manufacturing had spread considerably—that number approached 450,000. By the eve of the Civil War in 1860, the South grew and exported (either domestically or abroad) nearly 5.5 million bales of cotton per year.

    Karl Marx, who was simultaneously capitalism’s fiercest critic and its most trenchant analyst, viewed slavery and capitalism as incompatible.

    Just as slavery drove the southern economy, manufacturing became increasingly important to the economies of the Northeast and, by the middle of the 19th century, the Midwest. And just as slavery’s social and economic reach extended far beyond the South, so, too, did industrialization exert a powerful influence on all Americans.

    The Boston Associates engaged so-called “mill girls” to perform the difficult and monotonous work of textile production. Primarily the daughters of white Protestant farmers, these workers encountered a paternalistic social system at the mills, designed to “protect” their feminine virtue and convince their parents to allow them the social independence to live away from home. Lowell provided dormitories for workers as well as churches, libraries, and stores.

    In 1790, only 5 percent of Americans lived in urban settings; by 1860, 20 percent did.

    Unlike roads, waterways allowed merchants to move large quantities of textiles, iron, slaves, and foodstuffs over significant distances. According to one estimate, the amount of money it took to ship a ton of goods from Europe to an American port city would only get the same cargo about thirty miles inland pulled by a wagon.

    These trenches—a few feet deep, a few dozen feet across, but sometimes hundreds of miles long—represented a tremendous engineering challenge. They were designed so that draft animals could walk parallel to the water, dragging nonmotorized barges laden with goods.

    In most cases, municipal governments saw a positive return on their investments, not only in direct payments but also through the tremendous economic growth generated by the new system of canals—three thousand miles’ worth by the 1840s, linking the Atlantic seaboard with midwestern cities such as Terre Haute, Indiana, and Cincinnati, Ohio.

    while canals increased the ease with which large quantities of goods could be moved from the interior to the seaboard, ultimately those goods moved only as quickly as the oxen dragging the barges.

    The American foray into rail began in 1828, when the state of Maryland chartered the Baltimore and Ohio Railroad company, which laid tracks to the west to create an alternative to canal traffic. The first steam-powered locomotive to travel those rails moved slower than a horse, but within two decades, the technology improved. The railroad boom took off in the late 1840s, and the number of miles of tracks multiplied. Americans laid more than twenty-one thousand miles of railroad track in the 1850s. By the eve of the Civil War, a New Yorker could reach Chicago in two days, a trip that would have taken three weeks in 1830.

    The U.S. Post Office, which was granted a special license and responsibility by the Constitution to deliver the country’s mail, expanded from seventy-five branches in 1790 to more than eighteen thousand by 1850.

    In the 1840s, a group of investors formed a rapid-delivery service that charged customers high fees to move parcels by stagecoach westward from the East Coast. Within ten years, that original partnership broke up into several specialized companies, including Wells Fargo and American Express.

    The first telegraph was created by the French government in the 1790s to allow communication from Paris to twenty-nine cities up to five hundred miles away. But those original telegraph networks were optical, not electronic. To make them work, trained operators staffed towers spaced ten to twenty miles apart, from which they sent coded signals by shifting the positions of specialized panels. Although nothing in the United States matched the complexity of the French system, smaller networks of optical telegraphs emerged along the Atlantic Coast in the first decade of the 1800s, and others connected New York and Boston to their outlying farming communities in the 1810s.

    By erecting poles alongside railroad tracks, telegraph companies made all parts of their network accessible, so they could perform maintenance and protect against the elements and sabotage. Thus, as railroads spread across the continent in the mid-19th century, the electronic telegraph went with them.

    Yet before 1800, corporate charters were far from common, and almost no business enterprises were incorporated. Because charters had to be granted by the sovereign—the king or Parliament in colonial times; the state or federal legislature after independence—the few corporations extant were almost exclusively public operations, such as turnpikes, bridges, churches, and cities, including New York. During the entire 18th century, charters were issued to only 335 businesses—and much more than half of those were issued in the last four years of the century.

    Unlike today, when incorporation is granted in perpetuity, most antebellum corporate charters were limited in time, set to expire after a fixed period of ten, twenty, or thirty years. Nonetheless, having a distinct legal existence separate from their owners made corporations appear more stable and predictable, and made them more attractive to investors.

    Slowly, states turned to a new model known as general incorporation, granting corporate charters administratively, rather than legislatively. In 1811, New York became the first state to enact such a law for manufacturing firms. In 1837, Connecticut became the first state to allow general incorporation for any kind of business.

    A landowning Virginian, Jefferson believed that self-reliant and small-scale family farms, not impersonal factories (or, ironically, large slave-labor plantations like his), provided a bulwark against tyranny and ensured the future of self-governance.

    In the years to come, the Jeffersonian Republicans completed the rout of the Federalists at nearly all levels—by the War of 1812, the Federalist Party barely clung on in remote and far-flung corners of state and local politics, but had largely disappeared as a national force.

    These three pillars—tariffs, internal improvements, and a national bank—formed the essence of the American System.

    In addition, the Bank’s corporate structure reinforced the privileged place of the wealthy: The federal government itself owned 20 percent of the corporate stock, while the other 80 percent was sold to wealthy Americans. Yet this structure was exactly as Hamilton intended. By catering to elite merchants, the Bank wrapped up their financial interests in federal institutions and thus guaranteed that they would continue to lend political, moral, and economic support to the Constitution and its government.

    Jackson’s coalition, heirs to the Jeffersonian Republican tradition, renamed themselves as Democrats during the fight. Beginning in 1833, their opponents identified as Whigs, taking the name of the British party that had historically challenged the authority of the king. (The “king,” to these American Whigs, was Jackson himself.) America’s second party system was born.

    Hoping to mollify southern planters, Congress passed, and President Jackson signed, a law to lower tariff rates in the summer of 1832. Enraged politicians in South Carolina still insisted that the rates were too high. In the fall of 1832, South Carolina’s legislature passed a law nullifying the federal tariff. In response, a vengeful Andrew Jackson asked Congress for the right to use military force to collect tariffs in that state. Calhoun resigned the vice presidency and declared that any use of force by Jackson would give South Carolina a just cause to declare its independence from the United States. For a few months, the possibility of armed conflict appeared real. Only skillful diplomacy defused the crisis. Congress passed a compromise tariff that lowered rates, on the condition that South Carolina repeal its nullification statute. Yet the battle lines that formed over the Nullification Crisis, as well as the constitutional and legal theories about the relationship between the federal government and the states, established a powerful precedent.

    The first “land grant” law, passed in 1850, designated a line—which would become the Illinois Central Railroad—between Mobile, Alabama, and Chicago, Illinois. The law created a series of six-mile-square land parcels along each side of the proposed track; in an alternating, checkerboard pattern, the federal government bequeathed every other parcel to the states of Illinois, Alabama, and Mississippi, and sold the others off to farmers.

    Although economic recession in the mid-1870s slowed the juggernaut somewhat, Americans laid up to 8,000 miles of track per year through the 1880s. By 1890, the country boasted 166,000 miles; by the early 20th century, there would be 254,000 miles of tracks.

    Railroads became the first “Big Business” because they combined the unique scale and scope of their industry and the deliberate choices by their leaders to adopt what we now recognize as a modern system of management.

    Educated and skilled office workers, they would—along with other professionals such as doctors, lawyers, and accountants—form the heart of a new urban middle class in the modern American economy. Keeping their company in business for years to come meant job security, so professional managers tended to promote stable and less risky business practices.

    In 1856, he borrowed $600 from a personal mentor (who was also his boss) and bought stock in a transport company that soon paid him his first return: a check for $10. By 1863, still a manager at the Penn Railroad, the $45,000 he made per year from his stock investments far outpaced his salary.

    In 1867, he and a handful of partners launched an oil refinery in Cleveland, just as the commercial petroleum industry was beginning to grow.

    A 19th-century “trust” resembled what we would call a holding company today. As a legal entity distinct from any of the member companies, the Standard Oil Trust controlled all the stock of those corporations, centralizing control over prices, distribution schedules, and other business decisions. By the 1890s, more than 90 percent of the oil produced in the United States was refined through Standard Oil.

    As a percentage of gross domestic product, which at the turn of the last century was about $21 billion, the merger that birthed U.S. Steel would be worth about $1 trillion today.

    The largest and fastest-growing corporations in the decades after the Civil War were typically more capital-intensive than labor-intensive.

    Morgan had an extremely conservative disposition toward risk, even by the standards of bankers, who were traditionally averse to excessive gambles.

    The disastrous financial Panic of 1893 created new opportunities for Morgan to put his vision of corporate control into action. The economic collapse, precipitated by overspeculation in railroads, crippled the nation. More than fifteen thousand companies, including six hundred banks, failed in what became the worst economic depression to that point in U.S. history. As countless railroad companies teetered on the verge of bankruptcy, Morgan and a handful of his partners engineered a series of takeovers and mergers. Shareholders in those failing companies surrendered their stocks in exchange for “trust certificates,” and the House of Morgan took control of the companies’ assets. In the aftermath of the Panic of 1893, approximately thirty-three thousand miles of railway track (one-sixth of the total) was “Morganized.”

    The Great Railroad Strike of 1877—the first major industrial strike in American history—began in July, when workers on the Baltimore and Ohio Railroad in West Virginia walked off the job. Their specific grievances were local—a series of sharp pay cuts as the B&O struggled through the protracted economic depression that followed the Panic of 1873—but their fury echoed across the industrial heartland.

    Men, women, and children by the thousands toiled in dirty, dark, dangerous environments in factories and mills, quarries and mines, rigs and rail yards. The spread of mechanization and chemical technologies made work itself more boring and, simultaneously, more dangerous. Booming industry drew rural Americans away from farms and into cities, where they competed with a massive influx of European immigrants in a flooded labor market.

    One of the earliest and most dramatic manifestations of class tension between laborers and economic elites was the creation of a national labor union in 1869 called the Knights of Labor. Officially called “The Noble and Holy Order of the Knights of Labor” and first organized as a secret society, the Knights of Labor grew into a major voice for the wholesale reform of the industrial system. Unlike most trade unions, the Knights welcomed both skilled and unskilled workers from the craft, retail, and manufacturing sectors, and, quite notable for their day, they encouraged membership by both African Americans and women. Central to the Knights’ social vision was the notion of the producer. So long as you made something, they reasoned, you served a social good, regardless of your race, sex, or relationship to the means of production. The only people the group actively excluded were “nonproducers”—liquor dealers, gamblers, lawyers, and bankers, for example.

    A horrific incident in Chicago in the spring of 1886 helped cement the link between the Knights of Labor and radical, often violent, socialism in the minds of many business leaders. Amid a labor protest in Haymarket Square, someone threw a bomb that killed ten people. Eight suspects, all loosely affiliated with the Knights and variously described as anarchists, were convicted of murder. The Knights themselves were not involved, but their public image never recovered. Membership peaked in 1886, and the group declined in size and influence thereafter.

    Within weeks, the strikers attracted the support of the burgeoning American Railway Union (ARU), the first industry-specific nationwide union. The ARU had been founded the previous year by Eugene V. Debs, a labor organizer from Indiana who would later—after his imprisonment for leading the Pullman Strike—become the country’s most prominent socialist politician and activist.

    Troops killed several dozen strikers in clashes before the strike ended. Eugene Debs served six months in jail for violating a federal injunction to allow rail traffic to resume. During his imprisonment, he became a committed Marxist and later converted his American Railway Union into a socialist political party.

    Yet the most distinctive aspect of the farmers’ political program, and the issue with which Bryan launched his career, was their attack on eastern banks and the influence of financiers over the national government. Monetary policy, in particular the question of the free coinage of silver, was their primary focus. The “silver question” often strikes history students as esoteric, obscure, and technical, yet it was one of the single most important political issues of the late 19th century. The struggle split the country between those who favored minting coins only in gold—monometallists—and those who wanted to use both silver and gold—the bimetallists. As a political rallying cry, the silver debate proved instrumental to a larger critique of corporate capitalism.

    As with everything in the history of big business, the story of regulation begins with the railroads.

    Another strategy, which took aim at the rise of corporate monopolies more explicitly, led to the passage of the Sherman Antitrust Act in 1890. If the ICC had represented an effort to regulate monopolistic behavior, the antitrust movement endeavored to disband monopolistic companies entirely. Named after its chief proponent, Ohio senator John Sherman (brother of Union general William Tecumseh Sherman), the act sought to preserve the benefits of free competition by cracking down hard on anticompetitive behavior. It criminalized “every contract, combination in the form of trust or otherwise, or conspiracy, in restraint of trade or commerce.” Instead of prescribing rules and procedures to mitigate corporate power, as the ICC did, the law required the criminal prosecution of “every person who shall monopolize, or attempt to monopolize, or combine or conspire with any other person or persons, to monopolize any part of the trade or commerce.”

    For the antimonopoly forces, U.S. Steel’s survival exposed the limitations of the antitrust movement. The Sherman Act successfully attacked cartels and price-fixing schemes, but because it banned restraint of trade, not market dominance in general, it did nothing to curb corporate mergers.

    Historians of the period have long used the phrase “Progressive Era” to describe the years between the turn of the century and the onset of World War I, defined by a political and intellectual response to the rapid rise of industrialized society. Rooted neither in radical socialism nor in unfettered laissez-faire economics, Progressivism sought to mitigate capitalism’s excesses while retaining its benefits. In the process, the Progressive period both reaffirmed classical elements of the American political tradition and established new institutions, government agencies, and expectations about the promise of democracy.

    The 1908 Model T cost $850, but by the early 1920s, the price had fallen to under $300.

    As one journalist put it in 1924: “The American citizen has more comforts and conveniences than kings had 200 years ago.”

    Henry Ford didn’t invent the car—there were already six different models on display at Chicago’s Columbian Exposition in 1893—but his devotion to the Model T starting in 1908 revolutionized the industry. Standardization was key: Ford simplified the design of his “Tin Lizzie” and used a bare minimum of parts (about five thousand).

    To entice his workers to remain, Ford also pioneered labor policies that appeared progressive to many. In 1914, the company introduced the “five-dollar day,” when two dollars a day was more typical. In the next few years, Ford reduced the workday to eight hours and the workweek from six to five days, goals long championed by the labor, populist, and socialist movements. His business success, his personal austerity (especially when compared to the flamboyant wealth of men like J. P. Morgan), and his public devotion to the ideal that industrial workers should be able to afford the fruit of their labors—that a car should be inexpensive enough for the masses—contributed to Ford’s personal popularity around the world.

    Before the rise of industrial capitalism, however, retail outfits were only found in cities and mostly sold specialty items such as books and furs. The idea of shopping for a variety of grocery or household items—or the notion that stores themselves could be big businesses—didn’t develop until the late 19th century.

    In 1872, a Chicago-based traveling salesman named A. Montgomery Ward launched the country’s first mail-order firm. He published an illustrated catalog and mailed it, free of charge, to small-town farmers, who could then order products at lower prices than what local merchants charged. In 1886, another Chicagoan—a twenty-three-year-old man named Richard Sears—imitated Montgomery Ward’s success and started selling pocket watches by mail. Within a few years, Sears partnered with Alvah Roebuck, a watch repair specialist, providing both sales and maintenance services, all remotely. The pair broadened their offerings to compete directly with Montgomery Ward—their catalog became known as “the Farmer’s Bible,” and their Chicago warehouse filled orders from around the country.

    Incorporated in 1893, Sears, Roebuck supplanted Montgomery Ward and became the country’s biggest mail-order company by 1900.

    The earliest and most famous pioneers of this model were F.W. Woolworth’s, which operated from 1878 to 1997, and the Great Atlantic and Pacific Tea Company, which survived until 2015 as the A&P supermarket. Formed in 1859, the A&P opened its doors in New York as a discount purveyor of teas and coffees, which its founders purchased in bulk straight from ships (allowing them to offer cheaper prices). Within twenty years, the A&P offered a wide variety of grocery products and owned stores in more than one hundred locations, stretching from Minnesota to Virginia. Combining efficient distribution channels, inventory management, and low costs—the hallmarks of Taylorism—grocery chains like the A&P grew prominent in the early 20th century. The success of the retail revolution, and the chain store model in particular, changed the way Americans identified as consumers, but it came at a cost.

    Invented in the mid-19th century, Listerine was an alcohol-based chemical designed as a powerful antiseptic for use during surgery. In 1920, advertising copywriters for Lambert launched a marketing campaign that proposed a new use for this old product—as a solution to bad breath (when taken in small quantities and not swallowed!). In its ads, Lambert introduced Americans to the word “halitosis,” an obscure but clinical-sounding, scientific word for “bad breath,” giving the impression that Listerine addressed a pressing medical problem. By 1927, Lambert’s profits, driven by Listerine sales, had skyrocketed from one hundred thousand dollars a year to more than $4 million, in the process changing the daily ablution habits of millions of people.

    General Motors was founded in 1904 when William Durant, a carriage maker in Flint, Michigan—just seventy miles northwest of Henry Ford’s headquarters in Detroit—took over a small and failing car company called Buick. Over the next several years, Durant expanded his production of Buicks and absorbed dozens of other car manufacturers under his corporate umbrella, following the model of horizontal integration pioneered by large extractive companies such as Standard Oil. That rapid growth created organizational and managerial confusion, because the various constituent companies (Oldsmobile, Cadillac, and so on) each had their own internal structure, products, and corporate culture. Many of General Motors’s cars targeted the same type of consumer, leading to a frustrating internal competition that hurt profits. In the mid-1910s, Durant set his company on the road to resolving these problems by launching an important collaboration with executives from the DuPont Corporation.

    Only after the death of Henry Ford in the late 1940s would the Ford Motor Company start to catch on to the modern strategy for marketing cars.

    What made the years framed by the Roosevelt presidencies so pivotal for business history was not the flamboyant rhetoric, but the long-term dance between two emerging giants of the 20th century: the massive integrated corporation and the administrative, bureaucratic state, which developed an essentially associational relationship with each other.

    During World War I, income taxes provided vital revenue for the government, but the tax regime was steeply progressive, applying only to the top earners. Only approximately 15 percent of American households paid any income taxes at all in 1918; the richest 1 percent contributed about 80 percent of all revenue and paid effective tax rates of about 15 percent of their total income.

    During the Coolidge administration (1923–29), Mellon achieved many of his goals, and the top rate paid by individuals declined from 73 to 25 percent.

    Total corporate profits fell from $10 billion to $1 billion, a drop of 90 percent.

    By the time the market bottomed out in 1933, nominal gross domestic product was nearly half what it had been in 1929.

    and gestures such as the “Hoover flag,” an empty pocket turned inside out.

    As he put it: “We put those payroll contributions there so as to give the contributors a legal, moral, and political right to collect their pensions and their unemployment benefits. With those taxes there, no damn politician can ever scrap my social security program.” Social Security was highly popular (rising from a 68 percent approval rating in 1936 to 96 percent in 1944), but many large corporations and business associations recoiled at the new expense employers faced.

    All told, the United States government spent approximately $320 billion (in 1940s money) on World War II, about half of it borrowed from the public through bond sales and the other half raised in taxes. That spending provided a massive boost to the gross national product, which shot up from $88.6 billion in 1939 to $135 billion in 1945.

    By 1944, unemployment had fallen to just over 1 percent (remember that the official rate hit 25 percent in 1933). Within the span of eleven years, in other words, the country had seen both the highest and lowest levels of joblessness of the century.

    In 1929, prescription drugs accounted for only 32 percent of all medicines purchased in the United States (by cost); by 1969, that figure reached 83 percent.

    In 1954, however, General Electric became the first private firm to own a mainframe computer when it bought a UNIVAC.

    Roughly defined, a conglomerate is a corporation that conducts business in a wide range of markets and industries that have little or no relationship to one another. Berkshire Hathaway, the company founded by billionaire investor Warren Buffett, provides a familiar example of the form today—it acts as a holding company that owns and operates an array of disparate businesses, from GEICO insurance to Jordan’s Furniture to Fruit of the Loom.

    The rise of the conglomerate form reshaped managerial culture. Conglomerate builders such as Charles Bluhdorn succeeded, at least for a time, because they were experts at managing their company as an investment portfolio, not as a productive entity.

    From the 1960s onward, boards of directors increasingly sought to hire men (and let’s not forget that occupying the corner office was a nearly exclusively male privilege) who were experts not in a particular industry or niche, but in business management itself. Versatile generalists, holding degrees from the newfangled business schools mushrooming throughout the country, could adapt their broad understanding of business principles to any specific managerial problem they encountered. Conglomerate executives in particular often bragged that they could manage their companies through financial controls and measurements, remaining disconnected from the actual product or service the company provided.

    Bluhdorn’s successor (he died of a heart attack in 1983) renamed the company Paramount Communications in 1989 to take advantage of one of its highest-profile holdings, Paramount Pictures. The entire operation became part of the media giant Viacom in 1994.

    In short, the dominant regulatory trend had been economic regulation. In contrast, the trend in the 1960s and 1970s was toward social regulations, rules that, by design, targeted aspects of business behavior not traditionally considered “economic”—public health and safety and, quite literally, the downstream consequences of companies’ production processes. There had been earlier examples of social regulation, including the Pure Food and Drug Act of 1906, which led to the creation of the FDA to improve the safety and quality of food and medicine. Yet the scale and scope of this new type of regulation exploded in the late 1960s, reinforcing a cultural and political distinction between protecting the economy and protecting people from business.

    In the early 1970s, Congress overhauled the laws governing campaign finance contributions. The federal government had regulated campaign giving to various degrees since the Tillman Act of 1907, which barred corporations and unions from donating to political campaigns on the rather explicit grounds that they were not humans.

    Instead, with minor exceptions, businesspeople preferred other, less official ways to skirt the campaign finance laws. Executives, for example, routinely arranged for special bonuses to top managers, with the clear expectation that those managers would donate their windfall to the candidate of the corporation’s choice.

    In 1975, the FEC clarified that political action committees were legally legitimate, and an explosion in corporate-backed political action committees followed. In the four years between 1974 and 1979, the number of business PACs increased tenfold, from 89 to 950, while the number of labor PACs barely budged, rising only from 201 to 226. The number of corporate PACs continued to soar, peaking around 1,800 in the late 1980s before declining slightly and largely leveling off. In the winter of 2016, the Federal Election Commission counted 1,621 political action committees affiliated with businesses, and 278 for labor.

    Not satisfied with running a traditional restaurant, the McDonalds spent the 1940s searching for a way to simplify. They wanted a food item that they could perfect and sell at a constant, affordable, and profitable price. They settled on the hamburger.

    In 1955, the American automobile giant General Motors topped Fortune magazine’s list of global companies ranked by annual revenue. For the remainder of the century, GM held that crown. Yet in 2002, it fell to second place, bested by a company that had barely been known outside of Arkansas in 1980 but exploded onto the international stage thereafter: Walmart.

    The son of a farmer-turned-debt-collector, a teenage Sam Walton spent the Great Depression with his father foreclosing on delinquent farms in Missouri.

    Rural America had traditionally been a hotbed of populist opposition to unfettered capitalism, from the anti-chain-store movement to opposition to the gold standard and eastern finance. Yet by the late 20th century, conservative politicians found greater success linking evangelical Christianity with free market economics.

    When Gates stepped down from Microsoft in 2014 (having reduced his role since 2000), he was the wealthiest person on Earth.

    At the beginning of 2008, five venerable and highly respected investment banks—the descendants of the “House of Morgan”—sat atop American financial capitalism. By the fall of that year, none of them existed. Two, Bear Stearns and Merrill Lynch, avoided bankruptcy through emergency mergers (engineered to a significant degree by government officials and the Federal Reserve) with J.P. Morgan and Bank of America, respectively. The 150-year-old Lehman Brothers was not so lucky. After its leaders failed to convince government regulators to offer either a direct bailout or a “shotgun marriage” to another financial institution, Lehman entered the largest bankruptcy in history on September 15. The remaining two, Goldman Sachs and Morgan Stanley, surrendered their status as investment banks and transformed themselves legally into traditional bank holding companies, which faced far greater government regulation in exchange for easier access to government loans. As the former chairman of the Federal Deposit Insurance Corporation put it, it was “the end of Wall Street as we have known it.”

    That political shift proved to be a guiding force behind the movement for widespread deregulation, which often garnered the support of groups that otherwise opposed each other politically. In 1978, President Jimmy Carter signed the Airline Deregulation Act, a law spearheaded by liberal politicians such as Ted Kennedy (D-MA) and consumer activist Ralph Nader, as well as free-market conservatives like the economist Milton Friedman. The opponents to airline regulation argued that a more market-driven airline industry would face greater competition to cut rates and, eventually, provide better service.

    By the mid-1990s, having stripped away most of its functions, Congress finally dissolved the Interstate Commerce Commission.

    The redirection of capital to financial pursuits—the hallmark of the process of financialization—led to a growing number of deals known as “leveraged buy-outs” (LBOs)—mergers that depended on tremendous amounts of borrowed money.

    Before the mid-20th century, just under 50 percent of American households owned their own home. Excluding farmers, who owned land at disproportionate rates, the rate of homeownership was below 40 percent. Between 1940 and 1970, American homeownership rose steadily to about 65 percent,

    After hovering around 65 percent for several decades, homeownership rates rose quickly between the late 1990s and 2007, reaching 69 percent.

    either renegotiate the loan or resell the home later at a profit, convinced many people that homeownership was a foolproof investment. “They’re not making any more land,” went a common refrain.

    This unfounded faith in the never-falling value of houses was perpetuated by mortgage lenders and the real estate business, which profited from every loan made and every home purchased. (Economic historians showed that the misunderstanding of historical home price values came from the simplest of oversights: Once you account for the overall increase in prices over time, the real—noninflationary—price of homes remained remarkably stable throughout the entire 20th century.)
