-
Towards a unified theory of fringe groups in the great culture war
One odd thing I’ve noticed about our current times is how massively illiterate the movements are. Communism in the 1930s actually turned people into readers, Woody Guthrie prominent among them. There seem to be no foundational texts amongst any of the fringe movements (Defund, the Q people, the Proud Boys, whatever passes for anarchism these days, the very woke, etc, etc).
I first noticed this trend a few years ago in this interview with this… person.
Oddly, I came across this post I wrote a while back:
Ideology as the usable consensus of extreme personality types – see my “Let’s Kill Hitler” book idea. Basically the ideology evolves not as the continuation of first principles, but as a series of compromises on the part of the extreme personalities involved – basically the ideology is whatever allows a certain collection of extreme personalities to work together. Cooperation is the important thing – not the consequences. An extreme ideology will be composed of extreme members and so forth. See the alt-right and modern wokeness.
I’m reminded of the Utah Phillips line “common sense of degradation”. The modern unifying feature would seem to be a common sense of alienation from visible society.
Update – Feb 13th, 2023
I was going to update the post as a clarifier, but it is probably better off as a new paragraph(s).
I think what I’m trying to say is that books are no longer Schelling points. You could always point to some habit or event as not in keeping with the laws of the Old Testament or Das Kapital, but can you really point to a new habit or event as something that conflicts with the outrage over George Floyd or gas pipelines? Instead the movement just evolves with the changing preferences and day-to-day hatreds of the people drawn to the original Schelling point. Which basically means that the alienated fringe will be both dynamic and dysfunctional. Not a big deal really (what else would they be doing?), but with internet and mobile technology they are unified and strangely influential on mass culture. Instead of the top 5% of the population in alienation moving in a thousand different directions, they will move in 2 or 3. The lack of textual constraint allows them to keep up with current fashions, trends and technologies. And an active, motivated, unified (same people, shifting goals and language) 5% of people with very strong preferences is a meaningful marketing and voting bloc.
I suppose in a way this is a rediscovery of BJ Campbell’s “auto update” feature of the culture war, but with lack of foundational texts as well as Schelling points added to the explanation.
Update Feb 14th, 2023
Thinking about this again – I realize that I’m underemphasizing the role of fashion and ease of coverage by the modern media. “Wokeism” (the Q people not as much) is very, very, very easy and cheap to cover from a home office – all you have to do is weave a bunch of screenshots from Twitter into an existing story and there you go. Gresham’s law, improperly formulated, strikes again – bad stories will drive out good stories based purely on price. Currently the ease of coverage and woke fashion overlap rather well, although crime does seem to be taking away from this a bit.
An aside – it would be great if ChatGPT (or whatever AI is current at the time you read this) came up with an approximate price for each news story. Media articles in whatever form are usually not presented in terms of cost – but it would be great if they were. Say this is what your news source looked like:
- Celebrity A says Celebrity B is washed up and has had bad plastic surgery ($35, warning, Tweets)
- See our in depth, on scene coverage of the Syrian Civil War ($17,500)
If an AI browser plugin created something like that, along with a “Hide under $1000” checkbox I would be eternally grateful.
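As a toy sketch of what that plugin’s filtering logic might look like (all names, story examples, and cost figures here are hypothetical illustrations, with the cost estimates assumed to come from some AI classifier upstream):

```python
from dataclasses import dataclass


@dataclass
class Story:
    """A news item tagged with an estimated production cost."""
    headline: str
    est_cost_usd: int  # hypothetical AI-estimated cost to produce the story


def hide_cheap(stories, floor=1000):
    """The 'Hide under $1000' checkbox: keep only stories whose
    estimated production cost meets the floor."""
    return [s for s in stories if s.est_cost_usd >= floor]


feed = [
    Story("Celebrity A says Celebrity B is washed up", 35),
    Story("On-scene coverage of the Syrian Civil War", 17500),
]

for s in hide_cheap(feed):
    # prints: On-scene coverage of the Syrian Civil War ($17,500)
    print(f"{s.headline} (${s.est_cost_usd:,})")
```

The hard part, of course, is the cost estimate itself, not the filter – but once each story carries a price tag, hiding the cheap stuff is a one-liner.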
-
The Land of Enterprise: A Business History of the United States – A review
From my Notion template
The Book in 3 Sentences
- A surface level history of business in America. There is nothing terribly new or interesting in this book. Other books, most notably American Republics, do a better job of presenting the information.
Impressions
It is a very surface level economic history of the United States. Granted, I’m a tough crowd – I’ve read several books on the topic, taken several classes, etc, etc – but there wasn’t that much new in the book.
How I Discovered It
The Amazon Algorithm
Who Should Read It?
Someone looking for a very rudimentary overview of US economic history
How the Book Changed Me
Nothing changed
My Top 3 Quotes
- Approximately 80 percent of the English migrants to Virginia between 1607 and 1624, or close to five thousand, were dead by 1625. Hemorrhaging money and unable to attract new investors, the Virginia Company failed in 1624, when the English government declared Jamestown a royal colony.
- The push for national independence grew strongest in the parts of the British empire that could envision their economies operating without the British army present.
- In fact, by the mid-19th century, approximately two hundred thousand slaves worked in industrial settings. At the outbreak of the Civil War, more than sixty worked for, and were owned by, William Weaver, a native Philadelphian who moved to Virginia in 1814 to establish an iron forge with two charcoal blast furnaces in the Shenandoah Valley.
All Quotes
When historians use the term feudalism, they are attempting to describe an economic system in which power relations among people formed the building blocks of society.
Monarchs in Spain, France, and England grew wealthier through trade, which spread from the Mediterranean to coastal West Africa, and then to the Americas. In the process, they consolidated military power at the expense of local lords.
At the same time, these mercantilist exploits brought a major economic downside. The huge amounts of silver shipped back to Spain flooded the currency market, sparking a bout of inflation that lasted a century and crippled the Spanish economy. In spite of its large land-holdings in the Americas, Spain would never recover the economic power it wielded in the early 16th century.
Most of those early migrants lived along the Atlantic coast in former Indian towns that had been abandoned during the plague epidemic that forced survivors inland from the coast in the late 16th century.
In fact, the first two successful English colonies in what would become the United States—Virginia (1607) and Massachusetts Bay (1620)—were themselves private companies.
Joint-stock companies, the forerunners of today’s publicly owned corporations, pooled private sources of capital under the official protection of the crown, funding ventures that were too expensive or risky for an individual person. Drawing on a system of legal contracts developed in Italy centuries earlier, 16th-century English monarchs pioneered the practice of issuing corporate charters that granted an exclusive right to trade in a certain area to a particular group of subjects. In addition to creating a helpful monopoly, these charters created legal entities whose ownership was spread among several investors. These people purchased shares, or stock, to make up the whole company, which they owned jointly. Hence, “joint-stock company.”
Approximately 80 percent of the English migrants to Virginia between 1607 and 1624, or close to five thousand, were dead by 1625. Hemorrhaging money and unable to attract new investors, the Virginia Company failed in 1624, when the English government declared Jamestown a royal colony.
Three years later, approximately one hundred people—a combination of the Separatists who had bought the patent and others who purchased their own passage directly—landed by accident far to the north of Jamestown in a former Massasoit Indian town, which they renamed Plymouth.
(We start to refer to “Britain” instead of “England” after the Acts of Union in 1706 and 1707 by the English and Scottish Parliaments unified those two countries into the United Kingdom of Great Britain.)
The American slave population became self-sustaining in the early 18th century, so even as the international trade declined, the population of enslaved people grew. By the 1770s, nearly seven hundred thousand people, or 15 percent of the total non-Indian population of the United States, were enslaved.
Almost 95 percent of all enslaved people in the United States at the time of the Founding lived in Delaware, Maryland, Virginia, the Carolinas, and Georgia. One-third of the population of those southern colonies was enslaved, and approximately one-third of all southern households owned slaves.
The push for national independence grew strongest in the parts of the British empire that could envision their economies operating without the British army present.
On the other hand, Europeans on the periphery of the British empire depended greatly on the mother country. In present-day Canada, which Britain acquired from France after the Seven Years War in 1763, ongoing conflicts between the substantial native population and far-flung European fur traders and fishers meant that colonists depended greatly on British military support. In the slave societies of the West Indies, native inhabitants had been almost entirely annihilated, and small numbers of English colonists owned massive sugar plantations farmed by African slaves, whose numbers eclipsed those of their white owners by as much as ten to one in 1780. Landowners relied on brutal violence, sanctioned and backed by British law, and the strength of the British military, further cementing their ties to the crown.
Beginning in the early 15th century, merchants from kingdoms and city-states along the west coast of Africa established commercial relationships with Portuguese merchants, trading gold and spices for European metals and textiles. From the beginning, African-European commerce included the trade in human beings.
In 1807, both the British Parliament and the U.S. Congress outlawed the international trade of slaves. (The Constitution of 1787, in an effort to forge a compromise between slave-owning interests and antislavery advocates, had included a clause prohibiting any move to ban the trade for twenty years.) By 1820, all other major European powers had as well.
Large plantations certainly wielded disproportionate economic power, but most southern whites were not slave-owners. Historians estimate that, by the time of the Civil War, about 385,000 out of a total of 1.5 million white households in the South owned slaves. (African Americans and Native Americans did not own slaves in significant numbers, and were usually legally barred from doing so.) About half of these slave-owning households owned between one and five slaves; another 38 percent owned between six and twenty. Although they held a vastly disproportionate level of wealth, the remaining 12 percent of slave-owners (those who had twenty-plus slaves) represented only 3 percent of all white households.
In fact, by the mid-19th century, approximately two hundred thousand slaves worked in industrial settings. At the outbreak of the Civil War, more than sixty worked for, and were owned by, William Weaver, a native Philadelphian who moved to Virginia in 1814 to establish an iron forge with two charcoal blast furnaces in the Shenandoah Valley.
A significant number of enslaved people lived in urban areas such as Charleston and Baltimore. There, some slaves labored for, and often alongside, their owners in workshops, but many were owned by urban professionals—doctors, bankers, and lawyers who kept slaves as investment property. Some performed domestic duties, but more often they were hired out to work for private companies or to perform public works projects, such as digging canals or dredging harbors. Slave-owners received hourly pay for their slaves’ labor, and in many cases the enslaved people themselves brought home those wages in cash. In both cases, urban slaves often labored alongside free workers, both black and white.
By the eve of the Civil War, historians estimate that the total cash value of the 4 million slaves in the American South was $3.5 billion in 1860 money. At more than 80 percent of the country’s total economic output, that figure would be roughly $13.8 trillion today. Understood in that way, enslaved people were capital assets worth more than the country’s entire productive capacity from manufacturing, trade, and railroads combined.
In 1776, when Thomas Jefferson declared the “self-evident” truth that “all men are created equal,” nearly 15 percent of the 4 million non-Indian inhabitants of the United States were enslaved. Although slavery remained legal in all states, almost 95 percent of enslaved people lived south of Pennsylvania, and the highest concentration was in Virginia.
By 1804, every state north of Delaware had legally abolished the practice, and new midwestern states and territories that joined the nation in the decades to come likewise prohibited it.
Slavery in the North died out because of the organizational power of antislavery activists combined with the lack of large-scale commercial agriculture in the region.
Evidence suggests that many, if not most, white northerners had no moral problem with slavery, but few powerful interests had much to gain by defending it.
In 1794, a twenty-eight-year-old Yale-educated New Englander named Eli Whitney, engaged as a tutor for the children of a plantation owner in South Carolina, patented a machine that mechanically separated cotton fibers from cotton seed. According to the traditional story, Whitney invented this “cotton gin” (gin was short for “engine”) after observing enslaved people slowly and painfully removing seeds from cotton bolls.
The amount of cotton an individual enslaved person could prepare for export rose as use of the mechanical devices spread. By some estimates, the per-slave cotton yield increased 700 percent.
The results for the cotton industry were astounding. Southern planters produced around 3,000 bales of cotton per year in the early 1790s. By 1820—by which time domestic textile manufacturing had spread considerably—that number approached 450,000. By the eve of the Civil War in 1860, the South grew and exported (either domestically or abroad) nearly 5.5 million bales of cotton per year.
Karl Marx, who was simultaneously capitalism’s fiercest critic and its most trenchant analyst, viewed slavery and capitalism as incompatible.
Just as slavery drove the southern economy, manufacturing became increasingly important to the economies of the Northeast and, by the middle of the 19th century, the Midwest. And just as slavery’s social and economic reach extended far beyond the South, so, too, did industrialization exert a powerful influence on all Americans.
The Boston Associates engaged so-called “mill girls” to perform the difficult and monotonous work of textile production. Primarily the daughters of white Protestant farmers, these workers encountered a paternalistic social system at the mills, designed to “protect” their feminine virtue and convince their parents to allow them the social independence to live away from home. Lowell provided dormitories for workers as well as churches, libraries, and stores.
In 1790, only 5 percent of Americans lived in urban settings; by 1860, 20 percent did.
Unlike roads, waterways allowed merchants to move large quantities of textiles, iron, slaves, and foodstuffs over significant distances. According to one estimate, the amount of money it took to ship a ton of goods from Europe to an American port city would only get the same cargo about thirty miles inland pulled by a wagon.
These trenches—a few feet deep, a few dozen feet across, but sometimes hundreds of miles long—represented a tremendous engineering challenge. They were designed so that draft animals could walk parallel to the water, dragging nonmotorized barges laden with goods.
In most cases, municipal governments saw a positive return on their investments, not only in direct payments but also through the tremendous economic growth generated by the new system of canals—three thousand miles’ worth by the 1840s, linking the Atlantic seaboard with midwestern cities such as Terre Haute, Indiana, and Cincinnati, Ohio.
While canals increased the ease with which large quantities of goods could be moved from the interior to the seaboard, ultimately those goods moved only as quickly as the oxen dragging the barges.
The American foray into rail began in 1828, when the state of Maryland chartered the Baltimore and Ohio Railroad company, which laid tracks to the west to create an alternative to canal traffic. The first steam-powered locomotive to travel those rails moved slower than a horse, but within two decades, the technology improved. The railroad boom took off in the late 1840s, and the number of miles of tracks multiplied. Americans laid more than twenty-one thousand miles of railroad track in the 1850s. By the eve of the Civil War, a New Yorker could reach Chicago in two days, a trip that would have taken three weeks in 1830.
The U.S. Post Office, which was granted a special license and responsibility by the Constitution to deliver the country’s mail, expanded from seventy-five branches in 1790 to more than eighteen thousand by 1850.
In the 1840s, a group of investors formed a rapid-delivery service that charged customers high fees to move parcels by stagecoach westward from the East Coast. Within ten years, that original partnership broke up into several specialized companies, including Wells Fargo and American Express.
The first telegraph was created by the French government in the 1790s to allow communication from Paris to twenty-nine cities up to five hundred miles away. But those original telegraph networks were optical, not electronic. To make them work, trained operators staffed towers spaced ten to twenty miles apart, from which they sent coded signals by shifting the positions of specialized panels. Although nothing in the United States matched the complexity of the French system, smaller networks of optical telegraphs emerged along the Atlantic Coast in the first decade of the 1800s, and others connected New York and Boston to their outlying farming communities in the 1810s. By erecting poles alongside railroad tracks, telegraph companies made all parts of their network accessible, so they could perform maintenance and protect against the elements and sabotage. Thus, as railroads spread across the continent in the mid-19th century, the electronic telegraph went with them.
Yet before 1800, corporate charters were far from common, and almost no business enterprises were incorporated. Because charters had to be granted by the sovereign—the king or Parliament in colonial times; the state or federal legislature after independence—the few corporations extant were almost exclusively public operations, such as turnpikes, bridges, churches, and cities, including New York. During the entire 18th century, charters were issued to only 335 businesses—and much more than half of those were issued in the last four years of the century.
Unlike today, when incorporation is granted in perpetuity, most antebellum corporate charters were limited in time, set to expire after a fixed period of ten, twenty, or thirty years. Nonetheless, having a distinct legal existence separate from their owners made corporations appear more stable and predictable, and made them more attractive to investors.
Slowly, states turned to a new model known as general incorporation, granting corporate charters administratively, rather than legislatively. In 1811, New York became the first state to enact such a law for manufacturing firms. In 1837, Connecticut became the first state to allow general incorporation for any kind of business.
A landowning Virginian, Jefferson believed that self-reliant and small-scale family farms, not impersonal factories (or, ironically, large slave-labor plantations like his), provided a bulwark against tyranny and ensured the future of self-governance.
In the years to come, the Jeffersonian Republicans completed the rout of the Federalists at nearly all levels—by the War of 1812, the Federalist Party barely clung on in remote and far-flung corners of state and local politics, but had largely disappeared as a national force.
These three pillars—tariffs, internal improvements, and a national bank—formed the essence of the American System.
In addition, the Bank’s corporate structure reinforced the privileged place of the wealthy: The federal government itself owned 20 percent of the corporate stock, while the other 80 percent was sold to wealthy Americans. Yet this structure was exactly as Hamilton intended. By catering to elite merchants, the Bank wrapped up their financial interests in federal institutions and thus guaranteed that they would continue to lend political, moral, and economic support to the Constitution and its government.
Jackson’s coalition, heirs to the Jeffersonian Republican tradition, renamed themselves as Democrats during the fight. Beginning in 1833, their opponents identified as Whigs, taking the name of the British party that had historically challenged the authority of the king. (The “king,” to these American Whigs, was Jackson himself.) America’s second party system was born.
Hoping to mollify southern planters, Congress passed, and President Jackson signed, a law to lower tariff rates in the summer of 1832. Enraged politicians in South Carolina still insisted that the rates were too high. In the fall of 1832, South Carolina’s legislature passed a law nullifying the federal tariff. In response, a vengeful Andrew Jackson asked Congress for the right to use military force to collect tariffs in that state. Calhoun resigned the vice presidency and declared that any use of force by Jackson would give South Carolina a just cause to declare its independence from the United States. For a few months, the possibility of armed conflict appeared real. Only skillful diplomacy defused the crisis. Congress passed a compromise tariff that lowered rates, on the condition that South Carolina repeal its nullification statute. Yet the battle lines that formed over the Nullification Crisis, as well as the constitutional and legal theories about the relationship between the federal government and the states, established a powerful precedent.
The first “land grant” law, passed in 1850, designated a line—which would become the Illinois Central Railroad—between Mobile, Alabama, and Chicago, Illinois. The law created a series of six-mile-square land parcels along each side of the proposed track; in an alternating, checkerboard pattern, the federal government bequeathed every other parcel to the states of Illinois, Alabama, and Mississippi, and sold the others off to farmers.
Although economic recession in the mid-1870s slowed the juggernaut somewhat, Americans laid up to 8,000 miles of track per year through the 1880s. By 1890, the country boasted 166,000 miles; by the early 20th century, there would be 254,000 miles of tracks.
Railroads became the first “Big Business” because they combined the unique scale and scope of their industry and the deliberate choices by their leaders to adopt what we now recognize as a modern system of management.
Educated and skilled office workers, they would—along with other professionals such as doctors, lawyers, and accountants—form the heart of a new urban middle class in the modern American economy. Keeping their company in business for years to come meant job security, so professional managers tended to promote stable and less risky business practices.
In 1856, he borrowed $600 from a personal mentor (who was also his boss) and bought stock in a transport company that soon paid him his first return: a check for $10. By 1863, still a manager at the Penn Railroad, the $45,000 he made per year from his stock investments far outpaced his salary.
In 1867, he and a handful of partners launched an oil refinery in Cleveland, just as the commercial petroleum industry was beginning to grow.
A 19th-century “trust” resembled what we would call a holding company today. As a legal entity distinct from any of the member companies, the Standard Oil Trust controlled all the stock of those corporations, centralizing control over prices, distribution schedules, and other business decisions. By the 1890s, more than 90 percent of the oil produced in the United States was refined through Standard Oil.
As a percentage of gross domestic product, which at the turn of the last century was about $21 billion, the merger that birthed U.S. Steel would be worth about $1 trillion today.
The largest and fastest-growing corporations in the decades after the Civil War were typically more capital-intensive than labor-intensive.
Morgan had an extremely conservative disposition toward risk, even by the standards of bankers, who were traditionally averse to excessive gambles.
The disastrous financial Panic of 1893 created new opportunities for Morgan to put his vision of corporate control into action. The economic collapse, precipitated by overspeculation in railroads, crippled the nation. More than fifteen thousand companies, including six hundred banks, failed in what became the worst economic depression to that point in U.S. history. As countless railroad companies teetered on the verge of bankruptcy, Morgan and a handful of his partners engineered a series of takeovers and mergers. Shareholders in those failing companies surrendered their stocks in exchange for “trust certificates,” and the House of Morgan took control of the companies’ assets. In the aftermath of the Panic of 1893, approximately thirty-three thousand miles of railway track (one-sixth of the total) was “Morganized.”
The Great Railroad Strike of 1877—the first major industrial strike in American history—began in July, when workers on the Baltimore and Ohio Railroad in West Virginia walked off the job. Their specific grievances were local—a series of sharp pay cuts as the B&O struggled through the protracted economic depression that followed the Panic of 1873—but their fury echoed across the industrial heartland.
Men, women, and children by the thousands toiled in dirty, dark, dangerous environments in factories and mills, quarries and mines, rigs and rail yards. The spread of mechanization and chemical technologies made work itself more boring and, simultaneously, more dangerous. Booming industry drew rural Americans away from farms and into cities, where they competed with a massive influx of European immigrants in a flooded labor market.
One of the earliest and most dramatic manifestations of class tension between laborers and economic elites was the creation of a national labor union in 1869 called the Knights of Labor. Officially called “The Noble and Holy Order of the Knights of Labor” and first organized as a secret society, the Knights of Labor grew into a major voice for the wholesale reform of the industrial system. Unlike most trade unions, the Knights welcomed both skilled and unskilled workers from the craft, retail, and manufacturing sectors, and, quite notable for their day, they encouraged membership by both African Americans and women. Central to the Knights’ social vision was the notion of the producer. So long as you made something, they reasoned, you served a social good, regardless of your race, sex, or relationship to the means of production. The only people the group actively excluded were “nonproducers”—liquor dealers, gamblers, lawyers, and bankers, for example.
A horrific incident in Chicago in the spring of 1886 helped cement the link between the Knights of Labor and radical, often violent, socialism in the minds of many business leaders. Amid a labor protest in Haymarket Square, someone threw a bomb that killed ten people. Eight suspects, all loosely affiliated with the Knights and variously described as anarchists, were convicted of murder. The Knights themselves were not involved, but their public image never recovered. Membership peaked in 1886, and the group declined in size and influence thereafter.
Within weeks, the strikers attracted the support of the burgeoning American Railway Union (ARU), the first industry-specific nationwide union. The ARU had been founded the previous year by Eugene V. Debs, a labor organizer from Indiana who would later—after his imprisonment for leading the Pullman Strike—become the country’s most prominent socialist politician and activist.
Troops killed several dozen strikers in clashes before the strike ended. Eugene Debs served six months in jail for violating a federal injunction to allow rail traffic to resume. During his imprisonment, he became a committed Marxist and later converted his American Railway Union into a socialist political party.
Yet the most distinctive aspect of the farmers’ political program, and the issue with which Bryan launched his career, was their attack on eastern banks and the influence of financiers over the national government. Monetary policy, in particular the question of the free coinage of silver, was their primary focus. The “silver question” often strikes history students as esoteric, obscure, and technical, yet it was one of the single most important political issues of the late 19th century. The struggle split the country between those who favored minting coins only in gold—monometallists—and those who wanted to use both silver and gold—the bimetallists. As a political rallying cry, the silver debate proved instrumental to a larger critique of corporate capitalism.
As with everything in the history of big business, the story of regulation begins with the railroads.
Another strategy, which took aim at the rise of corporate monopolies more explicitly, led to the passage of the Sherman Antitrust Act in 1890. If the ICC had represented an effort to regulate monopolistic behavior, the antitrust movement endeavored to disband monopolistic companies entirely. Named after its chief proponent, Ohio senator John Sherman (brother of Union general William Tecumseh Sherman), the act sought to preserve the benefits of free competition by cracking down hard on anticompetitive behavior. It criminalized “every contract, combination in the form of trust or otherwise, or conspiracy, in restraint of trade or commerce.” Instead of prescribing rules and procedures to mitigate corporate power, as the ICC did, the law required the criminal prosecution of “every person who shall monopolize, or attempt to monopolize, or combine or conspire with any other person or persons, to monopolize any part of the trade or commerce.”
For the antimonopoly forces, U.S. Steel’s survival exposed the limitations of the antitrust movement. The Sherman Act successfully attacked cartels and price-fixing schemes, but because it banned restraint of trade, not market dominance in general, it did nothing to curb corporate mergers.
Historians of the period have long used the phrase “Progressive Era” to describe the years between the turn of the century and the onset of World War I, defined by a political and intellectual response to the rapid rise of industrialized society. Rooted neither in radical socialism nor in unfettered laissez-faire economics, Progressivism sought to mitigate capitalism’s excesses while retaining its benefits. In the process, the Progressive period both reaffirmed classical elements of the American political tradition and established new institutions, government agencies, and expectations about the promise of democracy.
The 1908 Model T cost $850, but by the early 1920s, the price had fallen to under $300.
As one journalist put it in 1924: “The American citizen has more comforts and conveniences than kings had 200 years ago.”
Henry Ford didn’t invent the car—there were already six different models on display at Chicago’s Columbian Exposition in 1893—but his devotion to the Model T starting in 1908 revolutionized the industry. Standardization was key: Ford simplified the design of his “Tin Lizzie” and used a bare minimum of parts (about five thousand).
To entice his workers to remain, Ford also pioneered labor policies that appeared progressive to many. In 1914, the company introduced the “five-dollar day,” when two dollars a day was more typical.9 In the next few years, Ford reduced the workday to eight hours and the workweek from six to five days, goals long championed by the labor, populist, and socialist movements. His business success, his personal austerity (especially when compared to the flamboyant wealth of men like J. P. Morgan), and his public devotion to the ideal that industrial workers should be able to afford the fruit of their labors—that a car should be inexpensive enough for the masses—contributed to Ford’s personal popularity around the world.
Before the rise of industrial capitalism, however, retail outfits were only found in cities and mostly sold specialty items such as books and furs. The idea of shopping for a variety of grocery or household items—or the notion that stores themselves could be big businesses—didn’t develop until the late 19th century.
In 1872, a Chicago-based traveling salesman named A. Montgomery Ward launched the country’s first mail-order firm. He published an illustrated catalog and mailed it, free of charge, to small-town farmers, who could then order products at lower prices than what local merchants charged. In 1886, another Chicagoan—a twenty-three-year-old man named Richard Sears—imitated Montgomery Ward’s success and started selling pocket watches by mail. Within a few years, Sears partnered with Alvah Roebuck, a watch repair specialist, providing both sales and maintenance services, all remotely. The pair broadened their offerings to compete directly with Montgomery Ward—their catalog became known as “the Farmer’s Bible,” and their Chicago warehouse filled orders from around the country.
Incorporated in 1893, Sears, Roebuck supplanted Montgomery Ward and became the country’s biggest mail-order company by 1900.13
The earliest and most famous pioneers of this model were F.W. Woolworth’s, which operated from 1878 to 1997, and the Great Atlantic and Pacific Tea Company, which survived until 2015 as the A&P supermarket. Formed in 1859, the A&P opened its doors in New York as a discount purveyor of teas and coffees, which its founders purchased in bulk straight from ships (allowing them to offer cheaper prices). Within twenty years, the A&P offered a wide variety of grocery products and owned stores in more than one hundred locations, stretching from Minnesota to Virginia. Combining efficient distribution channels, inventory management, and low costs—the hallmarks of Taylorism—grocery chains like the A&P grew prominent in the early 20th century.14 The success of the retail revolution, and the chain store model in particular, changed the way Americans identified as consumers, but it came at a cost.
Invented in the mid-19th century, Listerine was an alcohol-based chemical designed as a powerful antiseptic for use during surgery. In 1920, advertising copywriters for Lambert launched a marketing campaign that proposed a new use for this old product—as a solution to bad breath (when taken in small quantities and not swallowed!). In its ads, Lambert introduced Americans to the word “halitosis,” an obscure but clinical-sounding, scientific word for “bad breath,” giving the impression that Listerine addressed a pressing medical problem. By 1927, Lambert’s profits, driven by Listerine sales, had skyrocketed from one hundred thousand dollars a year to more than $4 million, in the process changing the daily ablution habits of millions of people.18
General Motors had its origins in 1904, when William Durant, a carriage maker in Flint, Michigan—just seventy miles northwest of Henry Ford’s headquarters in Detroit—took over a small and failing car company called Buick. Over the next several years, Durant expanded his production of Buicks and absorbed dozens of other car manufacturers under his corporate umbrella, following the model of horizontal integration pioneered by large extractive companies such as Standard Oil. That rapid growth created organizational and managerial confusion, because the various constituent companies (Oldsmobile, Cadillac, and so on) each had their own internal structure, products, and corporate culture. Many of General Motors’s cars targeted the same type of consumer, leading to a frustrating internal competition that hurt profits. In the mid-1910s, Durant set his company on the road to resolving these problems by launching an important collaboration with executives from the DuPont Corporation.
Only after the death of Henry Ford in the late 1940s would the Ford Motor Company start to catch on to the modern strategy for marketing cars.
What made the years framed by the Roosevelt presidencies so pivotal for business history was not the flamboyant rhetoric, but the long-term dance between two emerging giants of the 20th century: the massive integrated corporation and the administrative, bureaucratic state, which developed an essentially associational relationship with each other.
During World War I, income taxes provided vital revenue for the government, but the tax regime was steeply progressive, applying only to the top earners. Only approximately 15 percent of American households paid any income taxes at all in 1918; the richest 1 percent contributed about 80 percent of all revenue and paid effective tax rates of about 15 percent of their total income.
During the Coolidge administration (1923–29), Mellon achieved many of his goals, and the top rate paid by individuals declined from 73 to 25 percent.
Total corporate profits fell from $10 billion to $1 billion, a drop of 90 percent.
By the time the market bottomed out in 1933, nominal gross domestic product was nearly half what it had been in 1929.
and gestures such as the “Hoover flag,” an empty pocket turned inside out.
As he put it: “We put those payroll contributions there so as to give the contributors a legal, moral, and political right to collect their pensions and their unemployment benefits. With those taxes there, no damn politician can ever scrap my social security program.”29 Social Security was highly popular (rising from a 68 percent approval rating in 1936 to 96 percent in 1944), but many large corporations and business associations recoiled at the new expense employers faced.
All told, the United States government spent approximately $320 billion (in 1940s money) on World War II, about half of it borrowed from the public through bond sales and the other half raised in taxes. That spending provided a massive boost to the gross national product, which shot up from $88.6 billion in 1939 to $135 billion in 1945.
By 1944, unemployment had fallen to just over 1 percent (remember that the official rate hit 25 percent in 1933). Within the span of eleven years, in other words, the country had seen both the highest and lowest levels of joblessness of the century.
In 1929, prescription drugs accounted for only 32 percent of all medicines purchased in the United States (by cost); by 1969, that figure reached 83 percent.
In 1954, however, General Electric became the first private firm to own a mainframe computer when it bought a UNIVAC.
Roughly defined, a conglomerate is a corporation that conducts business in a wide range of markets and industries that have little or no relationship to one another. Berkshire Hathaway, the company founded by billionaire investor Warren Buffett, provides a familiar example of the form today—it acts as a holding company that owns and operates an array of disparate businesses, from GEICO insurance to Jordan’s Furniture to Fruit of the Loom.
The rise of the conglomerate form reshaped managerial culture. Conglomerate builders such as Charles Bluhdorn succeeded, at least for a time, because they were experts at managing their company as an investment portfolio, not as a productive entity.
From the 1960s onward, boards of directors increasingly sought to hire men (and let’s not forget that occupying the corner office was a nearly exclusively male privilege) who were experts not in a particular industry or niche, but in business management itself. Versatile generalists, holding degrees from the newfangled business schools mushrooming throughout the country, could adapt their broad understanding of business principles to any specific managerial problem they encountered. Conglomerate executives in particular often bragged that they could manage their companies through financial controls and measurements, remaining disconnected from the actual product or service the company provided.
Bluhdorn’s successor (he died of a heart attack in 1983) renamed the company Paramount Communications in 1989 to take advantage of one of its highest-profile holdings, Paramount Pictures. The entire operation became part of the media giant Viacom in 1994.
In short, the dominant regulatory trend had been economic regulation. In contrast, the trend in the 1960s and 1970s was toward social regulations, rules that, by design, targeted aspects of business behavior not traditionally considered “economic”—public health and safety and, quite literally, the downstream consequences of companies’ production processes. There had been earlier examples of social regulation, including the Pure Food and Drug Act of 1906, which led to the creation of the FDA to improve the safety and quality of food and medicine. Yet the scale and scope of this new type of regulation exploded in the late 1960s, reinforcing a cultural and political distinction between protecting the economy and protecting people from business.
In the early 1970s, Congress overhauled the laws governing campaign finance contributions. The federal government had regulated campaign giving to various degrees since the Tillman Act of 1907, which barred corporations (and, under later laws, unions) from donating to political campaigns on the rather explicit grounds that they were not humans.
Instead, with minor exceptions, businesspeople preferred other, less official ways to skirt the campaign finance laws. Executives, for example, routinely arranged for special bonuses to top managers, with the clear expectation that those managers would donate their windfall to the candidate of the corporation’s choice.
In 1975, the FEC clarified that political action committees were legally legitimate, and an explosion in corporate-backed political action committees followed. In the four years between 1974 and 1979, the number of business PACs increased tenfold, from 89 to 950, while the number of labor PACs barely budged, rising only from 201 to 226. The number of corporate PACs continued to soar, peaking around 1,800 in the late 1980s before declining slightly and largely leveling off. In the winter of 2016, the Federal Election Commission counted 1,621 political action committees affiliated with businesses, and 278 for labor.
Not satisfied with running a traditional restaurant, the McDonalds spent the 1940s searching for a way to simplify. They wanted a food item that they could perfect and sell at a constant, affordable, and profitable price. They settled on the hamburger.
In 1955, the American automobile giant General Motors topped Fortune magazine’s list of global companies ranked by annual revenue. For the remainder of the century, GM held that crown. Yet in 2002, it fell to second place, bested by a company that had barely been known outside of Arkansas in 1980 but exploded onto the international stage thereafter: Walmart.
The son of a farmer-turned-debt-collector, a teenage Sam Walton spent the Great Depression with his father foreclosing on delinquent farms in Missouri.
Rural America had traditionally been a hotbed of populist opposition to unfettered capitalism, from the anti-chain-store movement to opposition to the gold standard and eastern finance. Yet by the late 20th century, conservative politicians found greater success linking evangelical Christianity with free market economics.
When Gates stepped down from Microsoft in 2014 (having reduced his role since 2000), he was the wealthiest person on Earth.
At the beginning of 2008, five venerable and highly respected investment banks—the descendants of the “House of Morgan”—sat atop American financial capitalism. By the fall of that year, none of them existed. Two, Bear Stearns and Merrill Lynch, avoided bankruptcy through emergency mergers (engineered to a significant degree by government officials and the Federal Reserve) with J.P. Morgan and Bank of America, respectively. The 150-year-old Lehman Brothers was not so lucky. After its leaders failed to convince government regulators to offer either a direct bailout or a “shotgun marriage” to another financial institution, Lehman entered the largest bankruptcy in history on September 15. The remaining two, Goldman Sachs and Morgan Stanley, surrendered their status as investment banks and transformed themselves legally into traditional bank holding companies, which faced far greater government regulation in exchange for easier access to government loans. As the former chairman of the Federal Deposit Insurance Corporation put it, it was “the end of Wall Street as we have known it.”
That political shift proved to be a guiding force behind the movement for widespread deregulation, which often garnered the support of groups that otherwise opposed each other politically. In 1978, President Jimmy Carter signed the Airline Deregulation Act, a law spearheaded by liberal politicians such as Ted Kennedy (D-MA) and consumer activist Ralph Nader, as well as free-market conservatives like the economist Milton Friedman. Opponents of airline regulation argued that a more market-driven airline industry would face greater competition to cut rates and, eventually, provide better service.8
By the mid-1990s, having stripped away most of its functions, Congress finally dissolved the Interstate Commerce Commission.
The redirection of capital to financial pursuits—the hallmark of the process of financialization—led to a growing number of deals known as “leveraged buy-outs” (LBOs)—mergers that depended on tremendous amounts of borrowed money.
Before the mid-20th century, just under 50 percent of American households owned their own home. Excluding farmers, who owned land at disproportionate rates, the rate of homeownership was below 40 percent. Between 1940 and 1970, American homeownership rose steadily to about 65 percent,
After hovering around 65 percent for several decades, homeownership rates rose quickly between the late 1990s and 2007, reaching 69 percent.
either renegotiate the loan or resell the home later at a profit, convinced many people that homeownership was a foolproof investment. “They’re not making any more land,” went a common refrain.
This unfounded faith in the never-falling value of houses was perpetuated by mortgage lenders and the real estate business, which profited from every loan made and every home purchased. (Economic historians showed that the misunderstanding of historical home price values came from the simplest of oversights: once you account for the overall increase in prices over time, the real, inflation-adjusted price of homes remained remarkably stable throughout the entire 20th century.)
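The oversight described above is just a missing inflation adjustment, which is easy to sketch. The numbers and index values below are illustrative, not actual historical data:

```python
# A minimal sketch of deflating nominal prices into "real" (inflation-
# adjusted) prices. All figures here are hypothetical, for illustration only.

def real_price(nominal, price_index, base_index=100.0):
    """Convert a nominal price into base-year dollars using a price index."""
    return nominal * base_index / price_index

# Hypothetical example: a house whose sticker price triples over a period
# in which the overall price level also triples has gained no real value.
print(real_price(10_000, 100.0))  # 10000.0 in base-year dollars
print(real_price(30_000, 300.0))  # still 10000.0 -- no real appreciation
```

In practice one would deflate each year's price by something like that year's consumer price index; the point is only that a flat real-price series can masquerade as a steadily rising nominal one.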
-
A response to Henry George and the Lunar Society
I recently completed listening to this long and good interview of Lars Doucet on the Lunar Society Podcast – it aroused many strong feelings, so I thought I would share.
In no particular order
- The problem with Georgism is the Georgists. To mix adages, the Georgist argument would be more convincing if they had more skin in the game and climbed Chesterton’s Fence. Why has the single tax on land been on the shelf for the past 100 years? Why do we have these other taxes? Explain!
- A common problem in rationalism (and adjacent ideologies) is the inability to effectively understand and address the concerns of people ten years younger or older than the rationalist.
- The soul of man does not contain a hole in the shape of high density living. People like the benefits, but density for the sake of density is not an inherent drive. A lot of Georgism seems to take that for granted.
- Doucet presumes effective urban governance. Would Georgism make sense with 1970s-level urban decay and state capacity?
- The use of video game evidence creates a strong counter reaction in me.
- Doucet takes the notion of “Rent” as a given, and provides no evidence that landlords do not compete the rent away in some form or fashion, or lose the rent to maintenance and improvements. Or if he did present compelling evidence, I missed it.
- The notion that the rentiers (to use Piketty’s term) do not contribute anything seems wrong. Why won’t the land owners just coordinate to increase the value of the property and increase overall prosperity via network effects? Seemingly the landowners would coordinate to bring productive labor and capital to their properties.
- Why has no one, in any country, ever tried a single tax on land? Sorry, but Norwegian oil isn’t similar enough. Come to think of it – how similar is it to land valuation in the age of serfdom?
- Doucet does not provide any reason that a tax on land would necessarily displace ANY other form of taxation. He presents the land tax as just an additional tax.
- Doucet should provide some explanation for why he should tax land rent and not educational/human capital rent. He alludes to the concept early in the conversation and does not adequately answer the question.
The list makes it seem like I’m more critical than I am – the topic fully engaged me and I will be reading the book at some point in the near future.
-
Dan Bern at Eddie’s Attic was a bust
It would seem that an “All Ages” music show does not equal an “Age Appropriate” music show (it was explicitly advertised as all ages).
I’ve only seen this guy once before (pesky pandemic) but the whole family likes him. His previous show (that I saw) was much cleaner. All of his online shows were much cleaner. The two kids I brought were the only kids there, maybe he didn’t see them. Oh well.
As a show it was nowhere near as good as the other one, or even his average online show. The show did improve markedly as it went on, but sadly so did the inappropriateness. We left halfway through.
I imagine I will go see him again, but alone. A lesson learned I suppose.
-
Awaiting Dan Bern
-
Evidence would suggest that I rule
-
On Vietnam, addiction, and substitutions
I saw the “During the Vietnam war American troops used lots of opiates, but when the troops came home very few of them continued using” line again.
I saw it in this substack post. Go ahead and read the entire post for more detail on their theories, but proponents of this theory usually mention community and environment as inherent to the addiction process.
That could be the case, but one GIANT missing part of this story is that all of the troops went from a place where opiates, in the form of heroin, were common (Vietnam) to a place where alcohol was common (America). The human body substitutes opiates and alcohol easily. In and of itself, a drop-off in opiate use proves nothing. The troops could simply switch from heroin to the culturally celebrated and easily accessible alcohol. People use this example over and over without ever considering substitution.
-
Dogs in the parlor
-
If it gets measured, it gets managed
And arguably improved – the January-to-December results show “limited” progress, but the raw numbers don’t capture the real progress. I was advancing far too quickly, and the lifts were becoming feats of strength rather than refinements of technique, which is dangerous with kettlebells.
I went to a lower weight early in the year and have been moving back up with better technique ever since.
I’m going to wait until February before I bring the 80 lb kettlebell back. I could probably do it now, but there is no need to rush. Also – my shoulders feel great, which was not the case this time last year.
-
Eric Hoffer: The Longshoreman Philosopher by Thomas Bethell
From my notion book review template
The Book in 3 Sentences
- This book is an honest account of the life of Eric Hoffer. It is a summary built on actual independent research, not just Hoffer’s version of his life. Bethell focuses his attention on Hoffer’s life and not his political views.
Impressions
I liked it a lot – the book did actual research, dispelled much of the Eric Hoffer self-created legend, and made him a much more interesting and mysterious character. This is one of the rare cases where more detail adds mystery instead of taking it away.
How I Discovered It
An Amazon recommendation
Who Should Read It?
Eric Hoffer Fans
How the Book Changed Me
How my life / behaviour / thoughts / ideas have changed as a result of reading the book.
I don’t think it will change anything – perhaps it increases my willingness to disbelieve weird life origin stories in favor of even weirder life origin stories.
My Top 6 Quotes
- Quite possibly, he was born in Germany and never became a legal resident of the United States.
- It seems extraordinary, then, that no one from Hoffer’s early life should ever have shown up. Possibly—just possibly—he actually came to America for the first time across the Mexican border in 1934, the year after the El Centro camp was opened. Perhaps he walked to San Diego and was by then every bit as hungry as he said he was, ate some cabbage “cow fashion,” and found the truck driver who took him to El Centro.
- It’s understandable that Hoffer might have concealed his background if he were indeed undocumented. If born abroad he was not an American citizen, for he never went through any naturalization ceremony. Congress severely restricted immigration to the United States in 1924 and by the 1930s, when jobs were scarce, U.S. residents found to be here illegally were deported without due process. Some were minor children born in the United States. In one report, “between 1929 and 1935 some 164,000 people were deported for being here illegally, about 20 percent of them Mexican.”32 Others estimate that between 1929 and 1939 as many as a million people were unceremoniously repatriated, many of them to Mexico. If Hoffer himself was in the United States illegally, he was wise to keep quiet about it.
- Hoffer’s blindness has functioned in all accounts as an alibi, explaining why he didn’t go to school, didn’t have friends, spoke with a German accent, had “shadowy” recollections, and so on. How reliable is his blindness story?
- Much later, Hoffer decided that “the social scientist is no more a scientist than a Christian scientist is a scientist.”
- An ideal environment for him, he said, was one in which he was surrounded by people and yet not part of them.
Summary + Notes
Highlights
As for Hoffer, Selden said: “All his conclusions are wrong—every one of them. But he writes beautifully and he asks the right questions.” They remained on good terms, and when Eric Hoffer died two years later, in the room where we had met, Selden was with him. His date of birth is uncertain, often given as 1902 but more likely 1898. And the account he often gave of losing his sight at an early age and then regaining it several years later doesn’t fit with some Quite possibly, he was born in Germany and never became a legal resident of the United States. Over the next thirty-three years she knew him better than anyone in the world. But, she said: “I never met anyone who knew Eric in his earlier life.” One day, when he was six, she fell down a flight of stairs while she was carrying him. Two years later, she died and Hoffer went blind. His blindness lasted for eight years. When asked, “Did the fall cause those things?” he responded, “I don’t know.”5 Hoffer also didn’t remember the fall itself, nor could he recall whether his sight returned suddenly or gradually. In an early account he said that he went “practically blind,” followed by a “gradual improvement.” Martha Bauer was a “Bavarian peasant” and his German accent came from It seems extraordinary, then, that no one from Hoffer’s early life should ever have shown up. Possibly—just possibly—he actually came to America for the first time across the Mexican border in 1934, the year after the El Centro camp was opened. Perhaps he walked to San Diego and was by then every bit as hungry as he said he was, ate some cabbage “cow fashion,” and found the truck driver who took him to El Centro. and during those fifteen years Cole saw Hoffer almost every week. His account coincides with Lili’s: “I never met a single person who knew him before he worked on the waterfront.” It’s understandable that Hoffer might have concealed his background if he were indeed undocumented. 
If born abroad he was not an American citizen, for he never went through any naturalization ceremony. Congress severely restricted immigration to the United States in 1924 and by the 1930s, when jobs were scarce, U.S. residents found to be here illegally were deported without due process. Some were minor children born in the United States. In one report, “between 1929 and 1935 some 164,000 people were deported for being here illegally, about 20 percent of them Mexican.”32 Others estimate that between 1929 and 1939 as many as a million people were unceremoniously repatriated, many of them to Mexico. If Hoffer himself was in the United States illegally, he was wise to keep quiet about it. Hoffer also spoke German and did so fluently. Hoffer’s blindness has functioned in all accounts as an alibi, explaining why he didn’t go to school, didn’t have friends, spoke with a German accent, had “shadowy” recollections, and so on. How reliable is his blindness story? There is no Martha, and in this account he clearly lived with this aunt for a year after his father died, thus accounting for the gap between his father’s 1920 death and his 1922 departure for Los Angeles. Hoffer’s later and oft-repeated account of a $300 legacy from his father’s guild is also contradicted. “Martha had often consoled him with the advice: ‘Don’t worry Eric. You come from a short-lived family. You will die before you are forty. Your troubles will not last long.’ ” These thoughts All attempts to locate Hoffer or his parents, Knut and Elsa, in the Bronx, either through census data or Ancestry.com, have drawn a blank. He was almost forty years old before he acquired a definite street address. What may be more likely is that Hoffer came to America as a teenager or young adult and never did live in New York. It’s easy to understand why Hoffer would make up an American background if he was eager to avoid questions about his citizenship, but why so elaborate a ruse? 
Hoffer was a great storyteller, and he insisted that a writer should entertain as well as inform his audience. He was also a master at diverting attention from his own background. Finally, he did provide a few hints that his story shouldn’t be taken too seriously. Much later, Hoffer decided that “the social scientist is no more a scientist than a Christian scientist is a scientist.” But Although they were white Anglo-Americans, Starr writes, and often fleeing from the Dustbowl in Oklahoma, Texas, and elsewhere, they were regarded as a despised racial minority by much of white California. In 1935 California had 4.7 percent of the nation’s population but triple that percentage of its dependent transients. worked.” He described Hoffer as a natural loner; in fact, all his life he wanted to be left alone. For many years his relations with women were therefore confined, with one exception, to prostitutes.19 Koerner adds that Hoffer was “terrifically lusty”: The earliest documentary record of Hoffer’s existence is a photostat of his application for a Social Security account, filled out on June 10, 1937. He said at the time that he was thirty-eight years old, having been born in New York City on July 25, 1898. If so, of course, he was four years older than he claimed at other times. identified himself as the son of Knut Hoffer and Elsa Goebel, and gave his address as 101 Eye Street, Sacramento. His employer at the time was the U.S. Forest Service in Placerville, California. It is the only documentary evidence of his life to be found in the archives before he moved permanently to San Francisco. Vigorous walking seems to ease the flow of words; and The feeling of being a stranger in this world is probably the result of some organic disorder. It is strongest in me when I’m hungry or tired. But even when nothing is wrong I sometimes find it easy to look at the world around me as if I saw it for the first time. 
The war, the nationwide draft, and a labor shortage on the docks made it possible for him to become a longshoreman at the age of forty-five. There were many accidents. In 1943 a five-ton crate crashed to the wharf and just missed him, but it destroyed his right thumb. He was in the hospital for months as a new one was reconstructed from his own thigh. It was little more than a stump. my case conditions seem ideal. I average about 40 hours a week, which is more than enough to live on. And all I have to do is put in 20 hours of actual work. It’s a racket and I love it. Selden became a “diet faddist,” and Hoffer noticed that, too. How true is it, he wondered, “that true believers have an affinity for diet cults? You attain immortality either by embracing an eternal cause or by living forever.” Selden told Eric that when he ate, he methodically chewed so many times on one side, so many times on the other. “It would be hard to find another occupation with so suitable a combination of freedom, exercise, leisure and income,” he wrote to Margaret Anderson in 1949. “By working only Saturday and Sunday (eighteen hours at pay and a half) I can earn 40–50 dollars a week. This to me is rolling in dough.”10 But in a 1944 notebook he recorded that creative thought was incompatible with hard physical work. An ideal environment for him, he said, was one in which he was surrounded by people and yet not part of them. But routine work was compatible with an active mind. On the other hand a highly eventful life could be mentally exhausting and drain all creative energy. He cited John Milton, who wrote political pamphlets throughout the Puritan agitation, and postponed Paradise Lost until his life was more peaceful. Clumsiness, he concluded, is inconspicuous for those who are not on their home turf. Similarly, the cultural avant-garde attracts people without real talent, “whether as writers or artists.” Why? Because everybody expects innovators to be clumsy. 
“They are probably people without real talent,” he decided. But those who experiment with a new form have a built-in excuse.11 and it began with this issue. The union “was run by nobodies,” just like America, Hoffer said. “It did not occur to the intellectuals,” Hoffer commented, “that in this country nobodies perform tasks which in other countries are reserved for elites.” It was one of his favorite reflections. Financial records show that Hoffer made $4,100 as a longshoreman and $1,095 in True Believer royalties in 1953. The examples of Lenin, Mussolini and Hitler, where intellectually undistinguished men made themselves through faith and single-minded dedication into shapers of history is a challenge to every mediocrity hungering for power and capable of self delusion. During the day it occurred to me that if it were true that all my life I have had but a single train of thought then it must be the problem of the uniqueness of man. Most days he set off for a “five mile hike in the Golden Gate Park,” he wrote, and there he found that he could “think according to schedule”: I have done it every day for weeks. Each day I took a problem to the park and returned with a more or less satisfactory solution . . . The book was written in complete intellectual isolation. I have not discussed one idea with any human being, and have not mentioned the book to anyone but A visiting reporter, Sheila K. Johnson of the Los Angeles Times, said of this apartment: “There are no pictures on the walls, no easy chair, no floor lamps, no television set, no radio, no phonograph. There are in short no distractions.” Hoffer himself received his retirement papers from the longshoremen’s union in 1966. He may have already received that news when he accompanied Tomkins to the docks later that Here is a case where a genuine belief in God would make a difference. He is obviously drifting to an unmarked grave in a godforsaken graveyard. 
In lucid intervals he drifts back to San Francisco but does not stay long.2

“He wanted to change the world, and he wanted to change it alone,” Lili recalled. “He made a single convert—his mother.” Years after his death, reflecting on her former husband’s impractical nature, Lili still seemed amazed. “The idea that he chose to express his ideas was by leaflets,” she said with an emphasis that conveyed her frustration.

Reflecting on Hoffer’s account of his early life, and the implausibility of his claim that as a large child he was carried downstairs by a small woman who tumbled and then died, Gladstone said: “I don’t believe a word of it.”

In 1979, Eric moved to Alaska, became a fisherman, married, and had a family. He lives in western Alaska to this day. At the San Francisco reception following his mother’s funeral in October 2010, Eric (by now the father of six) was receptive to the idea that Hoffer’s account of his early life didn’t quite add up. He thought Hoffer’s case might be comparable to that of B. Traven, the mysterious German author of The Treasure of the Sierra Madre. (B. Traven was a pen name for a German novelist whose actual identity, nationality, and date and place of birth are still unknown. The book, published in Germany in 1927, then in English in 1935, was made into the famous movie of the same name in 1948.) Of the paternity question, Stephen said, “Has there been a DNA test? No. But Eric suspected that Hoffer [which he pronounced Hoafer] was his father. He asked my mother and she said yes.”

I have been generous with myself and my money and the truth is that Selden did not love Lili and felt my invasion as a liberation. He told me yesterday that my intrusion enriched the children’s life and whatever I have saved will be theirs when I am gone. My attachment to Lili after 33 years is undiminished.

In Lili’s hand beneath she wrote: “Dear, dear Eric!
Always beloved.”

Some of these fanatics act out of the weakness of their personalities, the reviewer added; some out of the strength. But by the end of the book Hoffer had brought “the fanatical leader and the fanatical follower into a single natural species.”

True believers don’t start mass movements, Hoffer wrote. That is achieved by “men of words.” But the true believers do energize those movements. Hoffer’s model of the relationship between true believers and mass movements was Hitler’s relationship to the Nazi Party. The German Workers Party (its name was later changed) was founded in 1919. Hitler soon joined it and ousted the founder, Anton Drexler, in 1921. With all the zeal of the true believer, Hitler infused it with fanaticism, and Nazism became a mass movement. Hoffer did not make this Hitler relationship explicit in his book, but it was his unstated guide. “The preoccupation with the book is with theories—right or wrong. I cannot get excited about anything unless I have a theory about . . .”

For a movement to prevail, the existing order must first be discredited. And that “is the deliberate work of men of words with a grievance.” If they lack a grievance, the prevailing dispensation may persist indefinitely.8 Sometimes a regime in power may survive by co-opting the intellectuals. The partnership between the Roman rulers and the Greek men of words allowed the Roman Empire to last for as long as it did.

As Hoffer saw it, then, men of words laid the groundwork for mass movements by creating receptivity to a new faith. This could be done only by men who were first and foremost talkers or writers, recognized as such by all. If that ground had not been prepared, the masses wouldn’t listen. True believers could move in and take charge only after the prevailing order had been discredited and had lost the allegiance of the masses.11

Mass movements are not equally good or bad, Hoffer wrote.
“The tomato and the nightshade are of the same family, the Solanaceae,” and have many traits in common. But one is nutritious and the other poisonous.12 In adding this he was probably responding to another caution from Fischer, who wrote that some in-house readers “. . . got the impression that Hoffer is implying that all mass movements are equally good or bad, that the ideas on which they are based are always predominantly irrational, and that from the standpoint of value judgments there is not much distinction between, say, the Nazi movement, Christianity, and the Gandhi movement in India.”

The Harper contract to publish the book was sent to Hoffer in June 1950. Harper scheduled the book for publication and, not surprisingly, wanted some independent report about this mysterious author who was unreachable by phone, worked on the docks, had never gone to school, and yet wrote so well.

After publication, some reviewers, including the New York Times’s Orville Prescott, also called the work cynical—“as cynical about human motives as Machiavelli.”14 The libertarian author Murray Rothbard, writing for Faith and Freedom under the pen name Jonathan Randolph, was also highly critical. “Hoffer may be anti-Communist,” he wrote, “but only because he sneers at all moral and political principles.”

Hoffer later became openly political, attacking Stalin, Communism, and leftist intellectuals en masse. He reflected that he had “a savage heart” and “could have been a true believer myself.”17 America and Israel were to become his great causes. But the neutrality of The True Believer contributed to its critical success.
Fischer also pointed out that the book would be more readable “if the author would make greater use of examples and illustrations.” Readers of The True Believer do indeed encounter a sea of abstractions—fanaticism, enthusiasm, substitution, conversion, frustration, unification—and many will have scanned its pages, often in vain, looking for the tall masts and capital letters of a proper name.

As a historical assessment, nonetheless, Hoffer’s treatment was questionable on several fronts. Longevity was just one: Nazism lasted for twelve years, Communism’s span was measured in decades, while Christianity has endured for two thousand years and shows no sign of disappearing.

The historian Richard Pipes has great admiration for Hoffer and assigned The True Believer to his Harvard class. “Mass movements do occasionally occur,” he added, “but my feeling is that most such movements are organized and directed by minorities simply because the ‘masses,’ especially in agrarian societies, have to get back to work to milk the cows and mow the hay. They don’t make revolutions: they make a living.”

Communism resembled a religion, but it was the faith of disaffected Western intellectuals, not of the masses. After the immediate revolutionary fervor cooled it was sustained, in Russia and everywhere else, by coercion and terror. Communism never did bring about a release of human energies—or if so, only for a short time.
The explosive component in the contemporary scene, Hoffer wrote, was not “the clamor of the masses but the self-righteous claims of a multitude of graduates from schools and universities.” An “army of scribes” was working to achieve a society “in which planning, regulation and supervision are paramount, and the prerogative of the educated.”

In its May 22, 1983, obituary on Hoffer, the Washington Post said that The True Believer is “difficult to summarize [but] easy to admire.” In contemplating the mystery of Eric Hoffer, Lili Osborne would ask herself how a self-educated laborer came to write so abstract a work. His early manuscripts had shown that he was a polished writer before he (apparently) had much experience of writing anything. His comments to Margaret Anderson give one or two clues. Looking back over his earlier notebooks, he was surprised to find how hard it had been for him to reach insights “which now seem to me trite.” The key was that “the inspiration that counts is the one that comes from uninterrupted application.” Sitting around waiting for lightning to strike got one nowhere. His rewritten drafts of The True Believer showed how much he owed to perseverance. His self-assurance and stylistic mastery were remarkable coming from someone who had not yet published anything. But if his success with The True Believer were to be attributed to any single quality, it would be his capacity to concentrate and persevere. His ability to exercise these talents also explained his self-confidence. Still, the mystery never quite goes away.

He warned that woe betide a society that reaches a turning point and does not turn. He worried that if workers’ skills were no longer needed they might become “a dangerously volatile element in a totally new kind of American society.” America itself might be undermined—no longer shaped by “the masses” but by the intellectuals. Hoffer increasingly saw them emerging as villains in the continuing American drama.
The culmination of the industrial revolution should enable the mass of people to recapture the rhythm, the fullness and the variety of pre-industrial times.

By now Hoffer’s life story was fixed. The KQED version became, in effect, the canonical account. In later interviews—by Tomkins, James Koerner, Eric Sevareid, and others—Hoffer stuck to the same script, sometimes almost word for word. He told the same anecdotes with no new details. The inconsistencies in his earlier accounts were gone. It was as though by 1963 he had settled on the story of his life and he no longer deviated from it.

Later, the FBI heard that Ted Kaczynski, the Unabomber, an assistant professor at Berkeley at that time, might have visited the class; at one point, agents combed through Hoffer’s papers at Hoover. “Anyone could drop in to Hoffer’s class,” Cole said. “But they never established that Ted Kaczynski was there. Lili asked me if I remembered him. I didn’t.”

So he taught himself Hebrew, “and his pronunciation was wonderful.” Cole heard Hoffer “more than a few times say something in Hebrew. He had such a great ear.” Hoffer told another interviewer that he had learned Hebrew while on skid row in Los Angeles. “I think I mastered it. I can speak it, but I cannot make out the text,” he said.

He memorialized this appeal to brevity by funding the Lili Fabilli and Eric Hoffer Essay Prize at UC–Berkeley. It is awarded each year for the best essays of 500 words or less on a topic chosen by the Committee on Prizes. At least one of these columns was read by Pauline Phillips, the author of the “Dear Abby” column, who had been friends with Hoffer for some time.

At the time of the anti-Soviet revolt in East Germany in 1953, Hoffer recognized the Communist evil. He noticed, too, that the West held Communism in awe.9 But he was also impressed by what Communism had apparently achieved.
Stalin had shown unbounded contempt for human beings, but he could justify it by pointing to “the breathtaking results of sheer coercion.” Cruelty worked, in other words. “Idealism, courage, tremendous achievements both cultural and material, faith and loyalty unto death can be achieved by relentless, persistent coercion.” That industrial production had in fact collapsed following the Bolshevik Revolution, and had made only a faltering recovery, was not appreciated for decades. Led by U.S. government agencies that took Soviet statistics at face value, policy analysts and economics textbooks continued making the same mistake right up to 1989.

As always, he was aiming for the widest generalization. An enduring problem was that Hoffer was not interested in economics and paid little attention to political institutions. He either took private property and the rule of law for granted, or thought them unimportant. “Far more important than the structure of a governmental system is the make-up of the men who operate it,” he wrote in 1952.

He persisted, surely, because his underlying argument—that mass movements had animated societies by releasing pent-up energies—came from The True Believer.13 Abandon this search, then, and his argument about the role of mass movements might collapse. He had referred to his new book as “vol. 2.” His prolonged difficulty with that unwritten book was rooted in “vol. 1,” on which his reputation was largely based. Later on, Hoffer was inclined to ignore and even to disparage mass movements.

He had no wife and no debts, and his rent was as low as rents in San Francisco ever get. His expenses were minimal and his frugality ingrained. Pen, paper, and books from the public library were for him the key ingredients of contentment.
A related theme was often found in his notebooks: “What tires us most is work left undone.” He kept insisting that he was not a writer, but to continue functioning he had to keep on writing.

He also saw reasons for believing that “Russia’s day of judgment will come sometime in the 1990s.” (The Soviet Union was always “Russia” in Hoffer’s lexicon.) “And when the day comes everyone will wonder that few people foresaw the inevitability of the end.”

There will be no peace in this land for decades. The journalists have had a taste of history-making and have become man-eating tigers. Life will become a succession of crises . . . What will political life be like when history is made by journalists?

As a symptom of aging, he noted what many in retirement have reported: he felt hurried though no one was pursuing him. Working on an essay about the old, he knew that “to function well the old need praise, deference, special treatment—even when they have not done anything to deserve it. Old age is not a rumor.” They say that on his deathbed Voltaire, asked to renounce the devil, said: “This is no time to be making new enemies.”

This talk of living a life of quiet desperation is the blown-up twaddle of juveniles, and if it hits the mark it does so with empty people. I have no daemon in me; never had. There is a murderous savagery against people I have never met; a potential malice which is not realized because of a lack of social intercourse.

In the usual sense of the word, Hoffer himself was an intellectual. He read books and wrote them. But he had no desire to teach others, he said, and this made him “a non-intellectual.” For the intellectual is someone who “considers it his God-given right to tell others what to [do].” Another correspondent was the community organizer Saul Alinsky.

“The language is cryptic because the idea is not clear.” He viewed intellectuals as a dangerous species. They scorn profit and worship power; they aim to make history, not money.
Their abiding dissatisfaction is with “things as they are.” They want to rule by coercion and yet retain our admiration. They see in the common criminal “a fellow militant in the effort to destroy the existing system.” Societies where the common people are relatively prosperous displease them, because intellectuals know that their leadership will be rejected in the absence of a widespread grievance. The cockiness and independence of common folk offend their aristocratic outlook. The free-market system renders their leadership superfluous. Their quest for influence and status is always uppermost.

A free society is as much a threat to the intellectual’s sense of worth as an automated economy is a threat to the worker’s sense of worth. Any social order, however just and noble, which can function well with a minimum of leadership will be anathema to the intellectual.

The intellectual regards the masses much as a colonial official views the natives. Hoffer thought it plausible that the British Empire, by exporting many of its intellectuals, had played a counter-revolutionary role at home. Employment and status abroad for a large portion of the educated class may have “served as a preventive of revolution.” All intellectuals are homesick for the Middle Ages, Hoffer wrote. It was “the El Dorado of the clerks”—a time when “the masses knew their place and did not trespass from their low estate.”

Eric Osborne recalled one humorous incident: “Once Eric Hoffer was talking and a rabbi was in the audience; or maybe Hoffer was talking to a bunch of rabbis, and he was telling them that there is no God. One rabbi said, ‘Mr. Hoffer, there is no God and you are His prophet.’”

Yet he continued to ponder the nature of God. It was speculation without faith—more philosophy than religion—but it was never far from his mind. In his notebooks he often wrote as though God was a reality whether he believed in Him or not. And he did (sometimes) capitalize the pronoun.
Sometimes you think how much better a world it would be if Judaism, Christianity and Islam, with their driving vehemence, had never happened. Then you think of all the misery and boundless cruelty practiced in lands that never heard of Jehovah, his son and his messenger.

Hoffer’s ideas about the uniqueness of man, and the great error of trying to assimilate man into nature—a key dogma of modernity—were perhaps his most original venture into philosophy. Hoffer was strongly opposed to the modern tendency to see science and religion as antagonists. On the contrary, religious ideas about the Creator had inspired the early scientists: they tried to work out how God had created the world, and science emerged from this study. He believed Israel revealed that history is not a mere process but an unfolding drama.

The insights and thoughts that survive and endure are those that can be put into everyday words. They are like the enduring seed—compact, plain-looking and made for endurance. La Rochefoucauld, in his maxims, delighted Hoffer with his brevity and wit, sometimes bordering on cynicism (“We are always strong enough to bear the misfortunes of others”).

Philosophers, on the other hand, had little to boast about. Why was this? Russell concluded: “As soon as definite knowledge concerning any subject becomes possible this subject ceases to be called philosophy and becomes a separate science. The whole study of the heavens, which now belongs to astronomy, was once included in philosophy; Newton’s great work was called ‘The Mathematical Principles of Natural Philosophy.’ Similarly, the study of the human mind, which was a part of philosophy, has now become separated from philosophy and has become the science of psychology. . . .
[Only those questions] to which, at present, no definite answer can be given, remain to form the residue which is called philosophy.”1

He once wrote that “the trouble with the Germans is that they are trying to express in prose what could only be expressed in music.” There was “a German desire for murkiness,” Hoffer argued, “a fear of the lucid and tangible.” Worse, “the German disease of making things difficult” had conquered the world.

The less we know of motives, the better off we are. Worse than having unseemly motives is the conviction that our motives are all good. The proclamation of a noble motive can be an alibi for doing things that are not noble. Other people are much better judges of our motives than we are ourselves. And their judgment, however malicious, is probably correct. I would rather be judged by my deeds than by my motives. It is indecent to read other people’s minds. As for reading our own minds, its only worthwhile purpose is to fill us with humility.

Nowhere is freedom more cherished than in a non-free society. “An affluent free society invents imaginary grievances and decries plenty as a pig heaven.” As for deciphering others, the only real key is our self. And considering how obscure that is, “the use of it as a key in deciphering others is like using hieroglyphs to decipher hieroglyphs.”

Sophistication is for juveniles and the birds. For the essence of naivety is to see the familiar as if it were new, and maybe also the capacity to recognize the familiar in the unprecedentedly new. There can be no genuine acceptance of the brotherhood of men without naivety.

The most intense insecurity comes from standing alone. We are not alone when we imitate. So, too, when we follow a trail blazed by others, even a deer trail.

At times he felt euphoric, and he wondered how that arose. He came to believe that “the uninterrupted performance of some tasks” was the key to happiness.
It was not the quality of the task, which could be trivial or even futile. “What counts is the completion of the circuit—the uninterrupted flow between conception and completion. Each such completion generates a sense of fulfillment.”

Whenever conditions are so favorable that struggle becomes meaningless, man goes to the dogs. All through the ages there were wise men who had an inkling of this disconcerting truth. . . . There is apparently no correspondence between what man wants and what is good for him.

Flaubert and Nietzsche have emphasized the importance of standing up and walking in the process of thinking. The peripatetics were perhaps motivated by the same awareness. Yet purposeful walking—what we call marching—is an enemy of thought and is used as a powerful instrument for the suppression of independent thought and the inculcation of unquestioned obedience.

Originality is not something continuous but something intermittent—a flash of the briefest duration. One must have the time, and be watchful (be attuned), to catch the flash and fix it. One must know how to preserve these scant flakes of gold sluiced out of the sand and rocks of everyday life. Originality does not come nugget-size.

Like Hoffer, Montaigne almost never mentioned his mother, who came from a family of Sephardic Jews. Hoffer said that when he read Montaigne’s essays in 1936 he felt “all the time that he was writing about me. I recognize myself on every page.” Overall, however, he found it remarkable “how little we worry about the things that are sure to happen to us, like old age and death, and how quick we are to worry ourselves sick about things which never come to pass.” Montaigne said something very similar: his life had been full of “terrible misfortunes,” he said, “most of which never happened.”

Theorizing in the future, he predicted, would tend to regard humanity “as unchangeable and unreformable.”

“I shall not welcome death,” Hoffer wrote.
“But the passage to nothingness seems neither strange nor frightful. I shall be joining an endless and most ancient caravan. Death would be a weary thing had I believed in heaven and life beyond.”

September 27, 1981. How does a man die? Does he know when death approaches? Friday night (25th) I vomited for the first time in my life. The vomit was dark and bitter. The new experience of vomiting gave me the feeling that I was entering the realm of the unknown.

As they lay there in the dark, Selden once again heard Eric’s heavy breathing. Reassured, Selden went back to sleep. But when he woke up again, perhaps an hour or two later, Eric’s breathing could be heard no more. He was gone—you could say that he didn’t say goodbye to anyone. He was buried at the Holy Cross Cemetery in Colma, just outside San Francisco. Lili Osborne’s grave is next to his.

The well-off will no longer be able to derive a sense of uniqueness from riches. In an affluent society the rich and their children become radicalized. They decry the values of a materialist society and clamor for change. They will occupy positions of power in the universities, the media, and public life. In some affluent societies the children of the rich will savor power by forming bands of terrorists.

Bacon touches upon two crucial differences between Judeo-Christianity and other religions. In a monotheistic universe nature is stripped of divine qualities—a downgrading of nature. At the same time, in a monotheistic universe, man is wholly unique, unlike any living thing. It would have gone against Bacon’s aristocratic grain to point out that the monotheistic God, unlike the gods of other religions, is not an aristocrat but a worker, a skilled engineer. Bacon could have predicted the coming of a machine age by suggesting that if God made man in his own image, he made him in the image of a machine-making engineer.

An aphorism states a half truth and hints at a larger truth. To an aphorist all facts are perishable.
His aim is to entertain and stimulate. Instruction means the stuffing of people with perishable facts. And since in human affairs the truth is usually paradoxical, aphoristic writing is likely to prove helpful.

The French Revolution and its Napoleonic aftermath were the first instances of history on a large scale made by nobodies.

The intellectuals loathe democracy because democracy creates a political climate without deference and worship. In a democracy the intellectual is without an unquestioned sense of superiority and a sense of social usefulness. He is not listened to and not taken seriously. The sheer possession of power does not satisfy the intellectual. He wants to be worshipped.

. . . years of pauseless killing of the First World War. This tangibility of death created a climate inhospitable to illusion.

But it is probably true that from the beginning of time talents have been wasted on an enormous scale. It is the duty of a society to create a milieu optimal for the realization of talents. Such a society will preach self-development as a duty—a holy duty to finish God’s work.

Where the creative live together they live the lives of witches.