Tom's Musings

  • Orthogonal Thinking.

    July 22nd, 2024

    I probably should write about the fact that President Biden recently announced he is dropping out of the race. But everyone will be chatting about that. I prefer to focus on deeper (though not necessarily more important) issues.

    So, let me meander into the way we too often think about public issues or, perhaps more accurately, how we fail to think about them with much accuracy. I am occasionally struck by how often consensual beliefs operate (or thrive) independently of experience and evidence. Of course, there are a host of well-known psychological phenomena (e.g., confirmation bias) that help explain why erroneous beliefs remain impervious to correction. Still, it is baffling that numerous false perceptions thrive despite contradictory history and fact.

    For example, most Americans probably believe that Democrats long have been the liberal political party, the defenders of the oppressed and supporters of minorities. Republicans, on the other hand, have been cast as the defenders of big business, of hard money, and of white privilege. Those views, however, are way too simple if we take even a cursory tour of our past.

    The Republican Party was formed in Wisconsin to oppose the spread of slavery to future western states and to use government as a proactive arm in creating economic expansion and social opportunity. Everything from land-grant colleges, to the investment in the transcontinental railroad, to the creation of a national currency and a proto-income tax, came from those early Republicans … the party of Lincoln. The Dems remained the party of ‘states’ rights’ and backward thinking for a long time. Even Democratic President Woodrow Wilson, a highly educated southerner, was a resolute racist.

    It was during the Great Depression that the contemporary roles of the major parties began to evolve. FDR’s activism, his support for labor and unions (e.g., the NLRA), and his spouse’s public advocacy for the oppressed began to break the Republican hold on Black Americans and other minorities. Roosevelt’s issuance of an executive order to end discriminatory hiring in defense industries during WWII proved a pivotal turning point. Harry Truman’s executive order integrating the military in the late 1940s cemented this new political alignment.

    In the mid-part of the 20th century, the two parties were substantively similar in many respects despite differing rhetorical styles. Republican Dwight Eisenhower (who had spent his professional life in the military) was imbued with the need for public investment if the country were to remain strong and an international leader. Having been stuck stateside during WWI, he volunteered for a tortuous cross-country military caravan in 1919. From that experience, he intuitively understood that only the federal government had the resources and perspective to move the country forward. So, when he became President three decades later, rather than undoing the New Deal during the 1950s, he poured significant government resources into research and development (multiplying such investments severalfold), invested in the largest infrastructure project to date (the interstate highway system), and kept marginal tax rates at their wartime levels of around 90 percent on the wealthy. He even sent federal troops into Little Rock, Arkansas to support the integration of public schools after the 1954 Brown v. Board of Education SCOTUS decision. Ironically, it was Democratic President JFK who started lowering the top marginal tax rates on the most affluent Americans.

    Not long after, another Republican President, Richard Nixon, would speak with a decidedly conservative tongue while acting like a radical liberal. He created new federal agencies like the Environmental Protection Agency (EPA), federalized several welfare programs for the aged and disabled, introduced an automatic COLA provision for Social Security recipients, and almost enacted a universal income floor. Not even today’s most liberal politicians could pull this semi-socialist agenda off.

    Yet, the public image of the two parties had already been set in an odd kind of orthodoxy. The Republicans were the party of fiscal prudence, sound business, and strong family values. That other party was seen as weak on national defense, anti-capitalist, and given to big spending. Once these emotional images were imprinted in the electorate’s DNA, they remained relatively impervious to change.

    However, let’s look at a few numbers. Between 1933 and 2020, we have had 7 GOP executives and 7 Democratic leaders. The economy has grown an average of 4.6 percent under the (so-called) anti-business Dems and only 2.4 percent under the pro-business GOP. While 10 of the past significant recessions began under the GOP, only 1 began under a Dem. The only balanced budgets of the past 60 years came under Democrats … Johnson’s final budget and the Clinton surpluses … so much for GOP fiscal prudence. If not for the Reagan, Bush, and Trump tax cuts (skewed to the wealthy), and the unnecessary Bush wars in support of neo-conservative delusions, our national debt would be closer to 0 than to the actual figure of over $30 trillion. Finally, job and wage growth has been better, and consistently so, under Democratic administrations.

    To be accurate, a significant normative divergence between the two parties only became obvious in 1980 with the Reagan revolution. At that point, the two camps clearly moved in distinct directions. The ideological sorting that began in the 1960s was complete by 1990. Our politics were fully polarized by then and, with Newt Gingrich’s 1994 Congressional takeover, virtually all inter-party collaboration and bipartisanship ceased. Governance in the public interest became a rare event, replaced by gotcha moments and personal attacks. These tactics were not new, just more virulent and universal.

    The odd thing about our current state of affairs is the continuing disconnect between image and reality. Republicans are seen as the ‘good for the economy’ party despite their poor performance in this arena over the decades. They have, however, been good for those at the top of the pyramid. Under the general supply-side principles that have exerted a stranglehold on economic policy since 1980, the top 1 percent earned 26.3 percent of AGI (Adjusted Gross Income) in 2021, up from less than 10 percent in 1979. The wealthiest Americans recently paid less proportionally in income taxes than working stiffs, something unthinkable during the Eisenhower years, when progressive taxation was still considered both fair and good public policy. Astoundingly, many working class Americans firmly believe that the GOP has their best interests at heart. Unbelievable!

    It might be noted that it was during the post-WWII period of high taxes and growing public spending that a robust middle class grew and social inequality fell. Economists call this era the Great Compression, when income and wealth disparities fell dramatically as the rich contributed their fair share to the public good. Astonishingly, when compared to today’s world, some top business executives (e.g., Paul Hoffman and George Romney, father of Mitt) argued for a cap on CEO salaries as being good for America. They suggested $250 thousand back in the 1950s (about $2 million today). I cannot see Elon Musk or Jeff Bezos doing something like that today.

    But nowhere has our political thinking been more askew than when it comes to so-called family values. I can remember when Republican Nelson Rockefeller was shunned by his party for getting a divorce. Now, the undisputed leader of today’s GOP is a serial sinner, an inveterate misogynist, and a total degenerate who somehow remains the darling of evangelical extremists and the GOP base. He rejects all of the sober fiscal tenets to which his party once paid lip service (e.g., balanced budgets and free trade), endorsing big tariffs and trade wars instead. Moreover, he sucks up to an array of dictators and international thugs that older Republicans would have reviled and repudiated without question. And he would trash our revered Constitution in favor of authoritarian, strongman rule, abandon the Western alliance that has defended liberal democracies since WWII, and take the country back into a form of head-in-the-sand isolationism in a global economy. Talk about orthogonal thinking.

    There has always been a political disconnect between what is and what is believed. That has always fascinated me and raised a question in my mind. Can a democracy last if there is no relationship between what is and what the electorate believes exists? For the American experiment in a vigorous republic to succeed in the future, we must find a way to match perception with reality. The average American must be educated to the point where they can connect the most obvious dots.

    Otherwise, welcome to 1933 Germany. We might well end our fragile democracy and all because we did not pay attention to what was real and what was mere illusion.

  • Returning to the cultural divide.

    July 19th, 2024

    If there is one aspect of our political situation that is remarkable at present, it is that the cultural divide between political parties and between normative positions among our people is beyond measure and resistant to any easy remedy. We cannot communicate across that divide. We cannot begin to understand those on the other side, even those within our own families. I find myself selecting (or blocking) Facebook friends based upon their expressed or presumed political values. I am doing the one thing I thought I’d never do … creating my own normative and intellectual bubble.

    I do this reluctantly and out of self-protection. Of course, I can generate reasons why individuals make decisions that baffle me. They may have less education, have had more limited life experiences, believe their life choices are diminished, sense threat from others who look or believe or behave differently, and the list could go on and on.

    Still, at the end of the day, I cannot even begin to comprehend, nor forgive, those who adore and worship the most disgusting and depraved American public figure I’ve seen in my 80 years on this earth. It remains incomprehensible that this poster boy for the Biblical Anti-Christ (Donald Trump) is the darling of those who most vocally praise Jesus Christ as their inspiration. It is like living in a bad episode of the Twilight Zone.

    Yet, it sometimes helps me to realize we have had prior periods where communications among Americans were equally fraught and incomprehensible. The antebellum period before our great blood-letting in the 1860s is a prime case in point. Varina Davis, wife of Confederate President Jefferson Davis, offered to make a family friend an admiral in the Secessionist Navy while saying, “You will join us, we are going to have a glorious monarchy.” She implicitly conveyed a foundational premise of the aspiring new country located in our South … she believed that the democracy on which America had been founded would be replaced by an institutionalized social hierarchy based on privilege and an inherent right to rule by those naturally born to such a role.

    No matter what kinds of legalistic arguments were employed to justify tearing the Republic apart, states’ rights for example, a more primitive impulse simmered below the surface … one hardly more defensible than the caste system in India. In seceding, South Carolina officials wrote that “our position is thoroughly identified with the institution of slavery…the greatest material interest of the world.” Their declaration of independence went on to say, “it is not a matter of choice, but of necessity.”

    The one thing that the Southern States most resented was ‘the inalterable fact that the North, like the rest of the modern world, condemned slavery as a fundamental evil. In doing so, abolitionists and their allies impugned the honor of the entire Southern race for if slavery was indeed evil, then the South was evil, and its echelons of gentlemen, the chivalry, were nothing more than moral felons.’ Those outside the South were not just impugning an economic system but the moral basis upon which their society had been erected.

    This notion of a set of hierarchical positions in society, pre-ordained and beyond questioning, seems quaint and dated to most of us today. For a whole segment of the country (in those days at least), it was unquestioned orthodoxy. When U.S. Army colonel Robert Garnett of Virginia resigned his commission as an officer in the U.S. military to join the Confederacy, he confided to a British reporter as follows. “I deride the notion that all men were born equal in the sense that all men have equal rights. Some men were born to be slaves … others to follow useful mechanical arts … the rest were born to rule and own their fellow men.” That some men (women were second class citizens as a matter of course) were seen as inherently inferior struck many as being so obvious that the question deserved no debate. It was merely the way God had ordered the world. The Southern diarist Mary Chesnut wrote, “God is on our side.” When asked why, she replied, “Of course he hates the Yankees.”

    Those in the North, for the most part, were equally rooted in their own vision of the good society. William Seward, a member of Lincoln’s cabinet, voiced a widespread sense of the cultural gap between North and South: “… in the North all was life, enterprise, industry, mechanical skill. In the South, there was dependence on black labour, and an idle extravagance which was mistaken for elegant luxury.” While prejudice and exclusion could be found anywhere (witness how the Irish initially were treated in Boston as they fled the potato blight back home), social improvement was feasible for most. Merit and ambition mattered in the more industrial and urban North. There was no pretense to, nor romantic idealization of, a past feudal world that the South yet yearned to sustain, or at least recreate.

    By the time London reporter William Russell completed his investigatory trip through the South, he had come to a dire conclusion. “The utter contempt and loathing for the venerated Stars and Stripes, the abhorrence of the very words United States, the intense hatred of the Yankees on the part of these (Southern) people, cannot be conceived by anyone who has not seen them.” This more or less ‘objective’ observer was appalled at the chasm that had grown among Americans. He saw the break as irreconcilable.

    This was no dispute across mere political parties. This was more than a vigorous contest for dominance over resources or advantage. The gap in 1850s America was visceral and fundamental. It centered on such things as the meaning of virtue, on the very principles for organizing society, and on the definition of who was fully human and who was not. These were dimensions of the human condition that could be neither compromised nor negotiated. Only force of arms could determine who might prevail.

    Many would claim that the 700,000 or so who died in our Civil War settled which vision would prevail. It didn’t. Within a generation after the cessation of hostilities, the noble ideals on which the conflict was based faded in the face of political realities. De facto apartheid replaced the prior de jure form of institutionalized oppression and exploitation. We recreated a form of exclusion and domination by another name.

    In 1875, then President (and the most successful military commander of Union troops) U.S. Grant may well have sensed our future problems. He said, “if we are to have another contest … of our national existence, I predict that the dividing line will not be Mason and Dixon’s (separating the North and South) but between patriotism and intelligence on the one side and superstition, ambition, and ignorance on the other.”

    Our Civil War patched together those perspectives that had torn the country apart. The conflict, however, never settled the underlying tensions and differing value systems that created the initial disputes. They remained percolating under the surface. They would occasionally erupt in race riots and civil unrest only to be tamped down for the time being.

    Now, however, we may well be back in the 1850s. Today, the Republican Party has assumed the role played by the southern-based Democratic Party back then (the defenders of white privilege). The current Democratic Party more or less has become the defender of inclusion, meritocracy, and opportunity for all. What is most concerning is that we have a major political party in this country which, for the first time in my living memory, has embraced the vision of a hierarchical society based on pre-ordained roles. We have a GOP that is dedicated to thwarting Constitutional protections and, instead, instituting authoritarian rule. They wish to end the American experiment in democracy mostly in a misguided attachment to white, male nationalism.

    The most frightening aspect of all this is that this time around, they just might succeed.

  • The Quant Conundrum.

    July 16th, 2024

    I spent most of my professional life in academic settings. It is not surprising, then, that I’ve partially been indoctrinated into the quantitative perspective. Nothing matters unless it can be reduced to observable numbers. These representations of reality seem to tap irreducible truth in some unbiased manner. Forget the inconvenient reality that quantitative figures can be manipulated in so many arcane ways and remain inscrutable to the mere mortals who constitute the hoi polloi. They seem inviolable.

    I was thinking about this recently as I reflected on how some of the early titans of social media (Zuckerberg, Dorsey, Sandberg, and Thiel) have so easily been sucked into the quant perspective. It is a path easily followed. I did it early in my career as I became enamored of catchy planning tactics like Management by Objectives and a host of derivative tools. Pick the right goal, formulate that end as a defined metric, then measure it to death. Running your company or government then becomes an easy task, or appears to at least. You then manipulate things to optimize that outcome.

    I like to think of the adherents to this perspective as the brittle bright crowd. They are great at solving complex engineering and computer programming challenges. At the same time, they fall short when attempting to achieve the status of Plato’s philosopher kings. The lure of rigorous analytics fails to embrace the human dimension. Alas, technical acumen is not the same thing as wisdom. How I wish it were.

    I am pushed to consider this topic by several books in which I’ve immersed myself of late. They tap a growing concern that the social media empire (Facebook, Twitter or X, TikTok, Snapchat, Instagram, et al.) has led society into a dystopian nightmare where our youth are anxious, detached, depressed, and even suicidal; our politics are polarized and divided beyond repair; and our social fabric and communal identity are being ripped asunder.

    How did that happen? Like many questions, at least one answer is easy to discern. As smart phones have taken over our lives, the social media platforms they provide us 24/7 now exert unparalleled influence over how we see the world about us. Moreover, there is a common business model behind all of these connectivity instruments and technologies.

    The prime outcome of interest is engagement … the amount of time a consumer spends on a social media site. Learning, bonding, communicating, and all other possible positive outcomes implicitly associated with these technologies mean nothing. Only engagement counts since that factor appeals to those purchasing advertising. The more people on a site, and the longer they stay, the greater the opportunity for the sellers of nonsense to peddle their wares. And when a platform is dealing with upwards of $80 billion in annual advertising, all else, including any concern with perverse consequences, quickly falls by the wayside.

    Of course, these contemporary entrepreneurial gurus did not invent modern management. As far back as I can recall, there has been a push to introduce more rigor into government and the private sector. My own career was a testament to both the allure of the quant approach to things and the inherent pitfalls associated with it. This will be a mercifully quick tour.

    Some of you will remember Robert McNamara and the body counts in Vietnam. Robert had a reputation as a whiz kid, a new kind of corporate manager in the auto industry who would rely upon numbers and not ‘feel’ or ‘experience’ when making key decisions. When he took over as Defense Secretary under John Kennedy and then Lyndon Johnson in the 1960s, he brought his corporate culture with him to Washington. His primary challenge at the time involved choosing a key outcome to assess success in the emerging Vietnam war, a conflict with no front lines and where winning battles had little meaning. But you could measure dead bodies. That seemed easily quantifiable. If you killed more of them than they did of you, you would win … no? Poor Robert eventually would admit his errors and that the war had been a tragic mistake. His misguided early hubris haunted him for the remainder of his life.

    My early experiences in state government taught me several lifelong lessons, though on a less tragic level. One of my first responsibilities was to serve as the analyst for the state of Wisconsin’s Quality Control system (QC) for cash assistance for vulnerable families. Basically, samples of welfare cases were drawn, and the accuracy of eligibility and payments were assessed. The state was to employ these results to improve the accuracy of welfare decisions through various corrective actions. Sounds perfect as an example of the new and hard-headed approach to governing.

    What happened, though, was that I soon stumbled on an easy way to reduce welfare errors. Forget about complex corrective action strategies, just simplify the rules. Cash welfare for families had been tailored to individual situations. The workers tried to adjust individual grants to the fiscal peculiarities a family faced. This was a noble attempt at case equity. But now, to reduce errors, we began to radically simplify those rules. Eventually, we would wind up with a flat grant where only family size mattered. Our rationale was simple: fewer decision points meant fewer chances for error.

    The other byword of the new management was efficiency. That would be another lure for us young Turks of that era. So, I quickly concluded we had to automate government (back in the early 1970s). Until I left for the University, another obsessive project of mine (and like-minded peers) was to conceptualize and develop what was known as the Computer Reporting Network or CRN. When up and running, workers in Wisconsin’s 72 counties would merely collect data from an applicant for assistance and enter it into a remote terminal. The actual decisions would be made by a computer in Madison for the major assistance programs … cash assistance, (then) Food Stamps, and Medicaid. It was extremely efficient since only one application needed to be completed, and human input (including error) was largely eliminated.

    But one quickly learns that unintended consequences lurk everywhere. Automating welfare decisions meant that ALL discretionary decisions had to be eliminated. Every vague and opaque decision point had to be transformed into a binary choice. It was either this or that. No gray permitted.

    Between QC and CRN, we would take out all the human contributions associated with dealing with vulnerable families. The new system had all the hallmarks of modern management. It was efficient, accurate, and resulted in minimal maintenance costs. But, as with all innovations, there were costs.

    Horizontal equity was sacrificed: there was little relationship between what a family needed (given unique circumstances) and what they got. Also, the human element was removed. Agency workers became data collectors and little else. They no longer helped clients with non-financial needs, which often were the important stuff. Finally, a uniform or flat grant became much easier to cut over time since benefits were less obviously tied to actual needs. Sure enough, benefit guarantees began falling over time. When I went to Washington to work on Clinton’s plan in the early 90s, I oft joked that ‘we had better reform welfare soon or there would be nothing left to reform.’

    Let us segue back to the titans of social media. They are clearly members of the brittle-bright group, excellent at analytics but not so great on the wisdom gained through broad experience. In fact, Silicon Valley was known to be populated by twenty-somethings during the heady days of the rise of social media.

    These kids had an engineering mentality. They saw a problem and went after THE solution. Hard numbers were always at the heart of these solutions, and optimizing equations and algorithms paved the way. You know the drill … pick an outcome and make choices that maximize that number. In the end, they radically simplified their world. Work up the emotional state of the customer by leading them deeper into controversial (even dishonest) content since that enhances engagement and increases advertising revenues (i.e., profits).

    Many of these titans had fooled themselves, believing they were contributing to a new world based on universal communication. This was a world first intimated by Pierre Teilhard de Chardin some 75 years ago. In truth, they were trapped in a classic version of the prisoner’s dilemma associated with any investment initiative involving venture capitalists. They had to keep pleasing advertisers and thus increasing revenue streams. If they did the right thing for the public good, competitors would catch up. There was no way out of the trap they were in, no matter the costs to people and society.

    Those costs are proving to be extraordinarily high and show no sign of abating. A dystopian future seems inevitable at the moment. But their intentions were good, as were mine, back in my early days.

  • Belated Independence Day musings.

    July 10th, 2024

    Roughly a week ago, we celebrated our independence from Britain. Of course, we screwed up the significance of that holiday … something we do with surprising regularity. At the time, John Adams predicted that future generations would celebrate our independence on July 2nd. It was on that day in 1776 that the colonial representatives gathered in Philadelphia voted to break with the mother country. That was considered the major event in the moment. The 4th was a relatively minor afterthought when the formality of affixing their signatures to the final version of the Declaration of Independence took place. But the vote two days earlier was the critical moment.

    The motivations behind this revolution are murky and not always uplifting. We all have heard the ‘taxation without representation’ rationale. While there is some merit to this argument, the case is not compelling. After all, the Crown had emptied its treasury during the Seven Years’ War (aka the French and Indian War) to defend these (as it turned out) ungrateful colonists. This conflict preserved the colonists’ sovereignty over the Eastern seaboard from incursions by the French and by Native Americans who now realized the enormity of the mistake they had made by not pushing the first Europeans back into the sea. In effect, the Crown had made it possible for the early settlers to solidify their hold on the land and to develop lucrative trade and commerce relations.

    The Native American story in New England was particularly sad. The first Europeans touched on these shores before 1620 but were easily run off by the indigenous tribes who had superior numbers. Still, physical contact had been made, a disastrous interaction for the original inhabitants of this land. Diseases for which they had no immunity spread quickly, reducing the coastal population in an extreme manner. Future probes along the New England shores by various explorers found largely depopulated villages along the coast with local tribes who could no longer effectively defend their lands.

    Fast forward a century and a half and consider the situation in the 1760s from the British perspective. They thought, not without reason, that the British subjects living in North America ought to share in the costs of protecting the still vulnerable colonies from future external threats. But Parliament’s efforts to impose even modest taxes were met with fierce opposition. This was to remain an American trait that still dominates our national character … we want public services but despise paying for them. We want a free ride.

    This is a form of extreme selfishness, though it yet remains problematic to separate principles from profits. The Boston Tea Party has been sold as a noble measure of civil disobedience. At the same time, all this British tea waiting to flood the market would have depressed prices and cut into the profits enjoyed by rich merchants like John Hancock. Perhaps principle and profit are two sides of the same coin.

    Still, the bravery of the early revolutionaries cannot be questioned. When the British marched out of Boston in April 1775, they were looking for stored arms in Lexington and for several revolutionary agitators who might well have been hanged as traitors. During the eight-year struggle, it was (until the defeat of Cornwallis in 1781) a close-run thing. The top revolutionaries clearly had committed treason against the Crown. The first few battles (after Bunker Hill and the British evacuation of Boston) showed the colonists to be no match against the most powerful military in the world at that time.

    America’s forces were first cornered in Brooklyn. Only a miraculous turn of weather enabled Washington to escape the vise-like trap laid by the Brits. Brisk winds kept the fleet from sailing up the East River to close the ring around the beleaguered patriots. Then, a fog rolled in to enable Washington to escape to Manhattan during the night and eventually to get out of town. He was fleeing with the Brits close on his heels.

    That reprieve was short-lived. By the end of that first year, the Continental Army had dwindled to a skeleton force with many enlistments ending on December 31st. It was then that the so-called Father of our country pulled a rabbit out of the hat. His forces crossed the Delaware River at three points on the night of December 25 to attack the Hessians based at Trenton. The logistics were a nightmare, and many of his troops never got across. Moreover, it took longer than anticipated. Worse still, a Tory spy saw what was happening and raced to Trenton to tell the Hessian commander about what was unfolding. Had the element of surprise been eliminated, Washington’s last gamble would have been doomed. They could not have defeated a prepared, professional Hessian force.

    A note from the spy was delivered to the Hessian commander. Unfortunately, he was busy drinking and gambling in celebration of Christmas that night. He wasn’t concerned. After all, this ragtag American army was all but defeated. He put the warning in his pocket without looking at it, an act that would cost him his life a few hours later. Thus, it turned out to be a total surprise attack and an astounding success. But it so easily could have turned out differently.

    I have always thought that the most relevant example in my lifetime of a similar underdog coming out on top is that of the Vietnamese nationalists. They fought longer than the Colonists, well over four decades, and suffered far more than did our ancestors. But one thing is the same: they took on some of the greatest military machines of their era against all odds. And they won! These nationalists, in turn, took on the Japanese, then the French, and then the Americans. Their odds of prevailing probably seemed about the same as the Colonists’ back in the early days of our revolution.

    In similar fashion, the Colonists were sharply divided, much like the Vietnamese were. Some estimates put the proportion seeking independence at no more than a third of the population, with another third remaining loyal to the Crown and the remainder indifferent. Benjamin Franklin’s own son remained a loyal Tory and British official, thus ending any positive relationship with his dad. My late wife’s ancestors (on her dad’s side) can trace their lineage to pre-revolutionary times in Massachusetts. However, they left the Boston area around that time. My guess? They had remained loyal to the Crown and were forced to move west. In about three generations, they settled for good in the Twin Cities.

    In the longer sweep of history, independence would have come in due course. Britain, at one time, exerted direct or indirect control over a quarter of the globe’s population. Truly, the sun never set on their empire by the latter part of Queen Victoria’s reign. Yet, slowly, it all fell away. The business model of extracting resources and rents from others through force proved unworkable (and too costly) in the end. America was the first to break away, but eventually, all the others followed.

    Before ending these musings, let us revisit whether some uplifting principle might be found in America’s revolution. While mercenary motives cannot be discounted, I do find at least one uplifting purpose. It is the rejection of authoritarian rule. Enlightenment thought was circulating among the intellectuals of Europe. Those ideas would seep into the thinking of the American founding fathers. These notions would, in turn, soon rebound back across the Atlantic to invigorate the French Revolution.

    Ironically, that 1789 seismic rupture in the old world was a direct result of the American revolution. After the Battle of Saratoga in 1777, France finally succumbed to the entreaties of Franklin and Adams to support the uprising of the colonies. This cost the French monarchy much treasure, eventually leading the king to call the Estates-General into session in order to raise additional revenue. That proved to be an unwise move in the short run. His reign and his life soon were forfeit.

    These two intertwined revolutions, the one in America and the one in France, initiated the slow and uncertain evolution toward democracy and governing principles that embraced citizen participation and involvement in public decisions. Progress, however, was slow. It took America some 170 years to achieve more or less universal suffrage. France bounced back and forth between republics and authoritarian rulers for many decades before a more republican approach stuck.

    Now, in 2024, where are we? France just underwent a troubling election where the hard right seemed on the verge of taking control. That had not happened since World War II, when the Nazis installed a Vichy puppet government. And in America, most now believe that the electorate will hand over the reins of power this November to a pathological narcissist and degenerate sociopath who vows to reinstate an authoritarian regime. There is no crisis demanding such a bold and dangerous action. It would appear that there are a sufficient number of Americans fully prepared to cede control of government to a maniac who will dismantle the rule of law and our constitutional protections. Wow! Their cognitive processes defy logic.

    I’ve noted this before. Some acute observers assert that the American experiment did not start with the Declaration of Independence (1776) nor the Treaty of Paris (1783) nor the ratification of the U.S. Constitution (1788). No, the real American revolution occurred in 1801. It happened when John Adams looked at the electoral results and realized that he had lost his bid for reelection against his bitter political rival (Thomas Jefferson). He did not call out the troops. He did not claim foul and refuse to leave the White House. When the time came, he merely got in his carriage and started the long ride back to Boston. His actions were light years away from Trump’s utterly narcissistic actions on January 6, 2021.

    Is the experiment in American democracy over? Has all this been in vain? Time will tell, but I am not optimistic.

  • Losing a generation … a further note.

    July 3rd, 2024

    As I was driving north to a getaway at lovely Green Lake, Wisconsin, I began to muse on my recent blog. In it, I waxed on about the damage that cell phones, social media, and the loss of ‘play-based’ childhoods were having on the Gen Z generation, those born after 1994. Essentially, childhoods lost on apps where kids and adolescents spend their young lives scrolling through endless posts, usually comparing themselves to others. This mindless activity has led to extensive amounts of anxiety, depression, and mental health issues. As a result, this next generation has few opportunities to interact with other kids outside of structured situations with adult supervision. As such, they are bereft of the interpersonal skills essential to negotiating a complex and nuanced world.

    Before moving on, I want to at least touch on another side of the issue. Developmental biologists tell us that our brains 🧠 are about their adult size by age 5. After that, experiences and nurturing shape the brain in various specific ways. Some pathways are strengthened. Many neurons are lost due to inactivity. Along some pathways, a myelination process results in faster relays of electrical impulses. In short, our brain is plastic and goes through a drastic restructuring in our tender years. Ages 10 through 14 are critical here. But even after our fully formed brains are set (in our early to mid 20s), further change is possible, as has been demonstrated by the Dalai Lama and his Tibetan Buddhist monks.

    That got me thinking about a topic I haven’t considered since high school. Back then, I read the works of Pierre Teilhard de Chardin … a Jesuit Priest who spent his life in pre-Communist China doing archeological work in addition to his pastoral and missionary duties. His thinking on evolution, which he tried to integrate with Catholic dogma, kept raising the ire of Vatican authorities. But for thinking Catholic youth, he was a hero, opening us up through his writings to newer ways of looking at the world. He was a bit of a rock star to some of us.

    I vaguely recall (this was, after all, some 60 to 70 years ago) him developing the notion of the noosphere. His thinking on evolution had us on the cusp of another transformative moment in time. Increased connectedness among people, along with the communication revolution, would increasingly bind people together. It was merely a matter of time before a global consciousness would arise. Heady stuff for a working class kid stuck in a narrow, religious bubble.

    Of course, Teilhard could know nothing about the cyber revolution that was decades down the road, much less imagine the artificial intelligence (AI) earthquake that is upon us. Even we who are living through it cannot imagine where it will take us.

    However, let us do a thought experiment. The great rewiring (as Jonathan Haidt calls it) that is reshaping Gen Z youth in disastrous ways (in his opinion) might be rather salubrious from a broader perspective. Perhaps the cyber connections being formed among the young (and old) are similar to the pruning and reshaping of individual brains during youth and adolescence. Just maybe we are in the infancy of the creation of a global consciousness (Chardin’s noosphere). All this connectedness, seemingly harmful in the short run, is a necessary step toward the next stage in the evolutionary story.

    The algorithms used by social media companies now are developing the basic pathways, just as the individual’s early experiences help shape their personal brains. One day, we may wake up to realize that we need to take charge of this rewiring process. We cannot leave the rewiring of a global connectedness to chance. Our broader connectedness should be shaped in a more deliberate manner, since short-term corporate profit is neither a grand nor an uplifting purpose.

    If, however, there is something to this thought experiment, might we not labor to create the future we want? I’m not sure I want such an important matter to be left, by default, to the baser instincts of a capitalist society.

  • Losing a generation.

    June 28th, 2024

    I recall one particular delusion I had toward the beginning of the digital age. It was common to presume that we were on the cusp of a golden age. We would communicate far more easily, connect with others more intimately, and have the world’s knowledge at our fingertips. That Pollyannaish perspective has been dashed by a cold reality. Our youth are depressed, anxious, and disconnected. Our politics are polarized in the extreme. And American democracy is legitimately threatened for the first time in my long life. Moreover, we might be on the precipice of losing our very humanity to forms of digitally superior machines (the artificial intelligence revolution). I am so glad I’m old.

    Let me retreat from a generalized sense of the apocalypse to a more manageable fear. We are losing our young people in rather disturbing ways. In the most recent international hedonic survey (a study of happiness among major nations), the U.S. fell from 15th to 23rd place in average happiness. While we never have been among the leaders, we also had never fallen out of the top 20. Our recent sharp decline has been attributed to a growing sense of anomie and despair among our young. Older geezers, like me, have remained quite content, according to survey results (good for us).

    This negative trend among youth has occurred despite a sound economy, a robust recovery from Covid, and a cessation of overseas military involvements. Our young are responding to something new and perhaps beyond our capacity to remedy, at least not easily. It is as if an alien and pernicious disease has struck the young.

    The signs are everywhere. Kids, including college-age youth, are reporting high levels of anxiety, depression, and disturbing thought patterns. Girls are particularly vulnerable, with depression rates bordering on one-third and more than one in five attempting serious acts of self-harm. Generalized feelings of detachment and lack of purpose and direction are at the highest levels recorded since such metrics began to be measured.

    The daughter of a good friend is married to a Brit and lives in the U.K. She does therapy with children through their NHS health care system. She sees first-hand a disturbing trend that is especially evident in English-speaking countries. The demand for child mental health services has exploded and cannot be satisfied, as evidenced by ballooning wait lists for help. It is disheartening and frightening.

    One explanation for a generation verging on nihilism points to the pernicious effects of social media. Facebook, Snapchat, TikTok, et al. rely upon perverse algorithms to enhance their bottom lines. As I’ve noted before, profits are based on engagement. That, in turn, is premised on getting consumers to become engaged and stay engaged on their sites. Such levels of engagement require consumers to click on their offerings and to stay on longer. Those are the metrics that sell advertising and bring in huge amounts of revenue.

    This is not rocket science. Those running these multi-billion dollar empires know that emotionally engaging material is the way to keep consumers hooked. Their algorithms, in particular, lead vulnerable kids deeper into a labyrinth of content where dopamine and other hormonal stimulants are effectively excited. For the young, this often involves continuous assessments of how they are doing compared to their peers and key influencers. Establishing one’s comparative status at this age is a priority need.

    As they enter the critical teen years, appropriate reference sources (e.g., parents and adult figures) are replaced by a multitude of so-called influencers on the other end of their smartphones. Real and meaningful peers, close physical friends, and appropriate role models are lost. Virtual substitutes take their place. While males have seen sharp rises in mental health issues in the 2010s and beyond, females have been harder hit; the negative impacts on their self-images and identities have been more severe. The top officials know that this is happening. But the logic of the marketplace is severe. They just don’t care if they are scarring a generation of kids.

    Jonathan Haidt, in his provocative work titled The Anxious Generation, has extended my thinking on such matters. He argues that the sharp uptick in youth anxiety and mental health challenges can be traced to the widespread availability of smartphones around 2010. After that point in time, even preteens had 24/7 access to social media platforms and the cyber world, almost always outside effective parental control (despite adults’ best efforts). This technological revolution has exerted the greatest impact on the Gen Z cohort, those born after 1994.

    Soon, our youth were spending (on average) close to 5 hours per day on their phones. True, some of this time might be employed searching for useful information for educational purposes. On the other hand, too much time was spent comparing themselves with peers, exchanging gossip, doing endless self-comparisons with impossibly ideal images, and seeking affirmation and validation from remote others in impersonal ways … how many likes one is getting. This is one of the least effective ways of generating a positive sense of personal identity or of mastering appropriate adult behaviors.

    Haidt also argues that technology (as a cause of this youthful crisis) has been abetted by at least one key change in parenting behavior. Beginning about 1980, and accelerating in the 1990s, parents began to restrict the freedoms permitted children. I don’t know how many times my cohort has traded the following refrain: Hell, when I was a kid, my parents kicked me out of the house and told me not to come home until the street lights came on. I surely can recall spending hours on the streets each day, exploring the world with an assortment of mischievous kids from our poorer working class environment with absolutely no obvious adult supervision. We discovered the world on our own.

    Arguably, the ages 10 to 15 are key to our adult personalities. Essentially, the most important developmental time is from 0-3 years, with 90 percent of our brain size being attained by age 5. Yet, size is not all that critical. There are other mammals with larger brains than ours. Nurturing, what we learn and experience, is critical to which neurons and connections are maintained and strengthened (we actually lose connections over time even as we get smarter) and which connections are myelinated, or developed in a way that speeds electrical impulses. While our brains are not fully developed until sometime in our 20s, persistent patterns are formed and set by the end of high school. That is, those early teen years are not to be ignored.

    There is too much to cover here but one concept struck me as quite critical. As the brain develops during this critical period, each youth can have a preponderance of discovery experiences or defensive experiences. Boiled down, ‘discovery’ experiences involve some threat but permit the individual to exercise and develop coping strategies, some better than others. These are essential steps toward adulthood.

    After all, being grown up means coping with stress and challenges on your own. When we aren’t allowed to develop healthy responses to challenging experiences on our own (like when we are protected from the real world), we might well be left with ‘defensive’ modes of reacting where we find life a threatening place and where we are left with inappropriate coping tactics … anxiety and depression for example. At the extreme, one might even become a paranoid Republican.

    Here is the problem. Today’s kids are being coddled to an extreme. Parents and adult figures are hell-bent on removing all threats from a child’s experience, both physical and emotional. Protection is one thing; sticking a child in a figurative, yet nonetheless effective, prison is another. Threats always existed, but now we are paranoid about them. There have been incidents where neighbors, seeing a younger child playing outside without obvious parental supervision, have called the police to report child neglect. What had been normal in my day now borders on felonious behavior.

    I am reminded of a story a friend told that shocked several of us. Her daughter holds a high position in a nationally recognized corporation that hires many highly skilled technical workers. Today, according to her, it is not unusual for recent graduates (from the best universities) to bring their parents to their job interviews. Their parents, for crying out loud!

    I picked my college out on my own (and somehow paid my own way through). Upon graduation, I went off to spend two years in rural India while living in very harsh conditions. Then, I continued my education before embarking on an exciting academic and policy career. I guess my unsupervised childhood had some upside after all.

    I don’t have a remedy on hand. But I am tempted to put this social challenge up there with climate change and AI on the A list of challenges that must be addressed. At a minimum, we ought to bring it forward to the front row of any national dialogue.

  • Remembering.

    June 22nd, 2024

    I’ve lapsed as a blog writer. Not exactly sure why. Perhaps I burnt myself out as a writer by pushing out so many lengthy works in a short period of time. Then, when I stopped developing full-length books, I entered into a frenzy of blog writing, cranking out about one a day for several months.

    The explosion of words probably was destined to end. Yet, that creative process was both cathartic and healing. Writing, as I’ve noted, had been a dream since childhood. More to the point, it proved an anesthetic and a diversion during a difficult period. Perhaps a dozen years ago, my spouse entered a period where her early-onset dementia could no longer be hidden. She would slowly decline, moved into a memory care facility in 2018, and passed away … two years ago on this day.

    And so, it is a day of memories.

    Above is a picture taken at Blackhawk Country Club, probably at the onset of her disease, but before the decline became debilitating and all too obvious. It would prove to be the last of the good days. Observers are right in that Alzheimer’s is ‘the long goodbye.’ You lose your loved one day by day and piece by piece. You are never quite sure about what has been lost until it is gone.

    It all started in 1972. We met in graduate school, had one date, and I moved in with her. Apparently, though I generally was an undeniable doofus, I could see a good deal when it whacked me upside the head. For once in my life, I didn’t screw things up.

    Above was our wedding day at the local courthouse. We were married by a young judge who had no problem with a ceremony that failed to include any of the usual rot. No promises of obedience were made. She did not take my name. Neither of us wore a wedding ring. We were going to do things our own way. Our two witnesses were work colleagues who could walk over to the courthouse with us.

    But here’s the thing. If you are compatible, share common values, and can compromise, a long relationship is likely, perhaps even certain. There is no need for big rituals, expensive nuptials, and exotic honeymoons. We simply drove to her parents’ home in the Twin Cities where we informed her family that yes, in fact, we had tied the knot earlier that day. We would remain partners for over 50 years.

    I can’t possibly summarize a lifetime in a few sentences. We both had rewarding careers, supported one another, and gave the other space to grow and achieve. That sentiment we did include in our vows, the freedom to grow. We would remain supportive partners through the good times and bad. She was there when I almost destroyed it all through an alcohol addiction. And we had many good times, like the trip to Yugoslavia and Greece captured in the photo above.

    The only things she loved more than me were her two dogs, Ernie and Rascal. Below, she is seated with Rascal at Brookdale Memory Care, a facility which took good care of her for the last four years of her life. I am comforted by the knowledge that she was well-supported in her final days. She had slipped into that place where a full and harsh consciousness had abated.

    This is hard to admit but, when she did pass on June 22, 2022, there was a sense of relief. At the end, she was no longer the Mary I knew … more a shadow of what she had been. I could not shake the sense that she had been released from the grip of an insidious disease that inexorably robs an individual of their cognitive functions, and thus their identity.

    Even though we did not live together for the final four years of her life, she remained a living presence to me for a long time. I would wake at night, getting up carefully so as not to wake her … then realizing that she was not there. She would never be there again. Her only failure in our long partnership was a failure to beat the wit out of me, though she tried mightily. I kept telling her … you can’t beat out what God put in.

    In September of 2022, I went up to Burntside Lake with members of her family. Her father had built a cabin up there in the 1950s. Mary and I spent much time in that northern paradise. We poured her ashes in the lake in front of where that cabin had stood.

    It was a beautiful day. It was a day good for remembering.

  • A Fundamental Shift Backward.

    June 3rd, 2024

    I suspect that many of us have read or heard the commencement address by documentary genius Ken Burns at Brandeis University. It was, as expected, eloquent and profound. What touched me was his allusion to what a young man said during a public speech in Springfield, Illinois in 1838. That tall, lanky young man told the assemblage that the American Republic would not be destroyed by any army from a foreign shore. No, any serious attack on our institutions and our tenets of self-governance would come from within. We are our own worst enemy.

    Of course, Ken was referencing a young Abe Lincoln who later would lead this country during a contest in which his dire prediction almost came to pass. Over 600,000 lost their lives in a civil war to test whether ‘this nation, or any nation so conceived’ could be sustained. I suspect that most have concluded that this long-ago conflagration, horrific as it was, did preserve the union for all time. If only that were so.

    Back then, our political landscape was a mirror image of today’s divisions and discontents. The Republican Party, born in the 1850s as the Whig Party disintegrated, would have been seen as the liberal party of that day. Republicans, in general, opposed the extension of slavery, favored public investments in infrastructure, leaned toward a stronger national government, and endorsed initiatives to improve the human capital and educational status of the nation’s citizens. Even as the Civil War raged, Republicans passed legislation to establish land grant colleges, extend railroads across the country, and create a standardized national currency.

    The party of Jefferson and Jackson had a distinctly separate vision for America. Democrats of that era saw nothing wrong with slavery. They disliked any (or at least most) forms of centralized control. Thus, they were strong proponents of states’ rights and local authority. It also might be argued that the Dems back then were suspicious of democracy, believing that wealth and privilege were signs denoting those who were graced by Providence to lead and govern over their less favored brethren.

    Democratic strongholds, not surprisingly, were found in Southern states (a pattern that would generally remain until the partisan realignment of the 1960s Civil Rights era). The Confederacy overwhelmingly leaned to the Democratic Party. The political disposition of the Confederacy did not serve their cause well. Each state was seen as a sovereign entity, much as the original Articles of Confederation had treated the states … an arrangement which proved a disaster. Thus, the central government in Richmond could only petition the several governors to help in the rebellion. Their core value of decentralized authority hindered any long-term pursuit of a coordinated strategy in this massive conflict of arms. Lincoln’s powers were more substantive and effective over time.

    I offer the brief historical view above to suggest that while much has changed over the past 160 or so years, many things have not. The two major political parties have switched positions. The Republicans are now seen as representing conservative values that favor limited government (even though they are not consistent in this), while the Democrats are perceived as the liberal party that favors a proactive government. While this simple distinction is accurate on the surface, it fails to capture the essential differences (namely the radically distinct perceptions of society) each side embraces.

    Even as the party roles reversed over time, the two major political parties continued to embrace radically different conceptions of the good society. James Henry Hammond, a Democratic U.S. Senator, gave a speech in 1858 laying out his ‘mudsill’ theory. It was a succinct summary of the philosophy underlying the principles of his party and a justification for the exploitation of others. According to his philosophy, every society needs a docile and less educated population to do the drudge work of society. Not surprisingly, he and his compatriots saw slavery as the natural order of things. The white paid laborers in the North represented something of a violation of this natural order and posed a threat to the existing equilibrium.

    In a larger sense, the then Democratic Party (like the Republican Party now) felt more comfortable in a society where social and economic classes were rigid and hierarchical. An educated elite was rewarded for its efforts by enjoying material wealth and, by God’s graces, was destined to rule. The lower classes, while enjoying privileges that a white skin might bestow on some, could never rise above their station. That would not be natural. This view was not all that different from older, feudal organizations of European society, nor the caste system found in Asia, nor the elitist system found in 20th-century Britain. Society was hierarchical, rigid, and ordained from above. Governance was authoritarian and, at best, paternalistic. Every person was relegated to their preordained place in this ordered world.

    The other side saw a different vision. A person’s success would be determined by their innate skills and efforts. Ideally, everyone would have an opportunity for advancement and success. Any individual ought to have an opportunity to participate in the governance of society. Thus, basic educational opportunities spread more quickly in the North, while suffrage was more acceptable when it finally came to those groups who had been ignored for so long. Social mobility, the proverbial American Dream, was something to be valued and preserved and even fostered through public efforts.

    Today, those two competing visions once again are vying for dominance, just as they were in the antebellum era and as they have simmered in softly spoken, yet undeniable, ways throughout our history.

    Think about the underlying message we hear from the Republican base and the wannabe despots they support. They appear quite satisfied with the hyper-inequality that has increased since the Reagan revolution in 1980. They have no problem with declining social mobility as educational opportunities are priced out of reach of the less fortunate and public educational institutions are starved of resources. They attack most forms of a liberal (in the classic sense) education as they strive to turn post-secondary schools into high-tech vocational institutions … learn to do but not to think. They whitewash history, sanitizing the past by eliminating anything that makes kids uncomfortable.

    Worse, the new Republican Party has renewed the age-old attack on universal suffrage. They have remade gerrymandering into an art form. In Wisconsin, voters statewide vote Democratic, but almost two-thirds of the Assembly seats are held by Republicans due to skewed voting districts. (Note: the new State Supreme Court has addressed this farce.) Where they can, Republicans continue to purge minorities from the voting rolls while making it harder to vote in general. It is difficult to sustain the majority support essential to electoral success when you consistently vote against the interests of the vast majority. They can only retain their offices either by scaring the public with bogus fears (an alien invasion) or by engaging in classic forms of emotional misdirection … abortion or gun confiscation.

    Ultimately, they know that retaining power over the long run will be difficult. So, the insiders have formulated more drastic measures, one being the so-called Project 2025. If Trump gains the White House again, the hard right leaves no doubt about its intentions. They intend to fabricate an excuse to turn America from a government of laws into one run by a narcissistic strongman. They will do what they accused the Democrats of doing … accusations leveled without a shred of evidence.

    Their new America will be based on fear, retribution, and unleashed power. Their model will be the Nazi usurpation of complete control in the weeks after Hitler was named Chancellor by President Hindenburg in 1933. Some crisis (the border, rigged elections, or China) will serve as a convenient excuse to concentrate more power at the center. The justice system will be weaponized to go after enemies, the civil service will be turned into a supporting cast of incompetent lackeys, and constitutional protections for average Americans will be eviscerated.

    The new Republican Party has no truck with traditional democratic checks and balances. The Trump cult wants a strongman in charge who will attack all the people they despise without constraint. It will be a society where the powerful control and dominate the weak and the defenseless, where the vulnerable will be crushed and forced to serve the elite according to the ‘mudsill’ perspective of society. They seek nothing less than a return to the 1850s South or even further back in time.

    We often hear that the next election is the most important in history. This time, that assertion just may be true.

  • Older than dirt!

    May 26th, 2024

    It is official. I am older than dirt. I am an octogenarian, have been for almost a week now. 🙃 Or, to be technically accurate, I am now in my ninth freaking decade. Put that way, there may be trees in the petrified forest younger than me. The only way to accurately measure my age is through carbon dating.

    I can remember far back in the previous century wondering if I could possibly live long enough to usher in a new millennium. After all, I would be 56 in the year 2000. Oh my god, that struck me as impossibly old. No way could I last that long. After all, I came of age when a popular mantra was ‘don’t trust anyone over 30.’ But my 30th birthday came and went. Then five more decades came and went, along with the Y2K fears that our digitally dependent civilization would collapse when midnight struck and ushered in a new century on January 1, 2000. I even watched as Sydney first celebrated that moment, and realized that the world really had survived 🥳, as had I.

    So, how do I feel about being so ancient? Pretty good, actually. My condo neighbors regularly gather to chat and to dump on Trump. While the Association is not deed-restricted by age, it might as well be. We are all old. And we are all highly educated and successful. We are retired academics, doctors, lawyers, engineers, and other assorted professionals. Many graduated from the best universities in the land. Yet, none of us seem eager to turn back our personal clocks. We had good lives but are desperately concerned about the future given the various looming threats to our climate, our political system and American democracy, and the uncertainties emerging from the specter of artificial intelligence (to name a few).

    Most of my geezer friends and I realize that the world we entered back in the 1930s through the 1950s was not a happy place. There was war, economic disaster, famine, genocide, a host of totalitarian regimes, and a looming nuclear age where an atomic holocaust seemed more than likely. No, it was not a pretty place globally.

    Even America faced horrendous problems. We still had legal apartheid that relegated many minorities to lives of isolation and oppression. Women were generally treated as second-class citizens. I still recall a colleague from the Wisconsin Law School. She graduated (in the 1950s) at the top of her law school class, one of only two women. The Dean at the time told her that no major law firm in Wisconsin would possibly consider her. She somehow managed to get a position on the law faculty at Wisconsin and became a towering force in her profession.

    Certainly, we were not living in anything close to the ideal worlds presented on TV family sitcoms. In my situation (as a child), we lived in a cold-water flat with no central heating (I could see my breath at night). We had an ice box, no car, and a party-line telephone shared with three other families, among other privations. But no one else had much either, so such challenges never bothered us. I started earning money by delivering papers early on and never stopped working. Even after retirement, I continued writing books.

    Despite all our challenges back then, there remained a pervasive sense of hope and opportunity. I could work and scramble my way through a competitive high school and a decent (private) university on my own (no financial help from family), then head off to India in the Peace Corps. I experienced little anxiety about my future, assuming that this really was a land of opportunity. There would be challenges, of course, but I had faith that things would work out. And I was correct: America (back then) had become a real land of opportunity … a moment of promise that began to evaporate with the Reagan revolution of the 1980s.

    Mostly, though, surviving this long has permitted my cohort to see many amazing things. As a youngster in grade school, I was assigned the honored position of inkwell filler. That’s right, we had pens that had to be continuously refilled with ink. At home, we finally got a TV after most other homes had them. Still, you had to struggle with the vertical and horizontal controls to get a picture. Many a time, I accompanied my dad to a store where we got the vacuum tubes checked. And believe it or not, to change the channel (we had only four at most), we had to walk all the way to the set and turn a clunky knob. Worse, the stations would cease broadcasting around midnight, at which time they would sign off with the national anthem.

    But there was other entertainment in the busy streets. The milk man, the coal man, the ice man (who delivered blocks of ice to our ‘ice box’), and a variety of other vendors were frequent visitors to our hood, which was overrun with kids (large Catholic families). I can still recall running after the ice or milk truck to purloin chunks of ice on hot summer days.

    You could get your knives sharpened or buy used rags and so much more. I have a picture (somewhere) taken of me sitting on a horse in front of our flat … yet another itinerant vendor wandering through. In the late 1950s, when I was sick as a dog for several days, my folks broke down and called a doctor (a rarity indeed since they cost money). He came to our flat to check me out … a freaking house call 🏠 if you can believe it. That was a good thing. My appendix had burst, and I was near death. I was rushed to the nearby hospital and directly into surgery.

    Without cell phones and social media, we were raised the old-fashioned way. My parents told me to get out of the damn house and not to return until the street lights came on. Kids then were not kidnapped. If any were abused, we never heard about it, nor did our parents seem to worry. (After all, the sign my parents made me wear saying ‘please take this brat‘ did them little good.)

    Often, as I marauded the streets with a gang of neighborhood ruffians, my long-suffering folks would change the locks or quickly move to an undisclosed location. But they couldn’t lose me no matter how hard they tried. When I got a bit older, my uncle gave me a set of golf clubs. They were so ancient that the shafts of the so-called irons were made of wood. A couple of my buddies and I would walk several miles (uphill at the end) toting our clubs to a local 9-hole course. We could play all day for one buck before trudging home as the sun set. We were tough.

    I could go on, but you get the picture. The pace of change has been dizzying and accelerating. We did live during an epochal period of historical transformation. The angst we see, particularly among the young, may well be the result of this unprecedented pace of change. Many are ill-equipped to handle it. My suspicion is that this unnerving period of exponential change has disoriented those incapable of dealing with uncertainty. They seek authoritarian voices who will calm their fears. Unfortunately, that is like seeking a magician who can hold back the tide. Ain’t going to happen.

    These eight decades have been one helluva ride. However, I’m glad it is coming to an end. I really am getting way too old for this shit.

  • A Lifelong Pursuit.

    May 19th, 2024

    Previously, I touched on what it meant to articulate a moral center, how difficult that endeavor is … if you take the challenge seriously, that is. And there’s the rub. How many of us approach the task of centering our world view in a thoughtful and serious fashion? Or how many of us simply accept what we are given and go through life spewing forth scripted clichés?

    I can still remember my mother once again disapproving of my behavior, or the way I looked, or what I believed. If I questioned her ‘wisdom,’ she would give me her patented look of exasperation and thinly veiled disgust. ‘Everyone believes (or thinks or does) this,‘ remained her final argument.

    It mattered not what the issue might be since we were touching upon a universal truth according to her worldview. Inevitably, I would scrunch up my face. ‘Everyone in the whole world … really? Do you mean kids in remote China believe and act this way?’ My small acts of rebellion would never win the day. No, they merely resulted in a rising level of exasperation on my mother’s part. Her world was one of absolutes.

    And so was my early world. It was Catholic, ethnic, working class, a world refined in a constrained petri dish of struggling tribalism…an us versus them mentality. Stereotypes and prejudices were omnipresent. It was a stark world of good and evil, of light and dark. Shades of gray, of nuanced or complicated thinking, were discouraged.

    I apologize if I’ve shared this too many times in the past, but I can recall sitting in my high school religion classes arguing (only inside my own head, of course) with the strictures being passed on as absolute truth. In that dogmatic world, children were cast into some unthinkable eternity simply because they had not been baptized. Silly birth control prohibitions were imposed when, even then, we could see that growing populations would create huge societal issues. Ancient gender roles remained within the institutional arrangements of the Church, even as the wider world offered women a taste of equality and opportunity. The infallibility of the Pope (in matters of faith and morals) continued even as the history of the Papacy was rife with atrocious scandals. The list was endless.

    Yet, cultural bubbles are immersive and confining. When you are inside, you cannot easily detect the character of the prison in which you are incarcerated. It is, in the end, your world, much like the fish in the fish bowl. Contradictory input is something to be ignored or actively refuted. Most of us have difficulty accommodating diverse thought and contradictory ideas, with some of us having more difficulty than others in breaking free from established precepts. After all, cracking open one’s worldview can be traumatic. It is the framework within which we organize the amazing array of stimuli that represents that complex world out there. We fail to see that what we understand as reality is, in fact, our creation of reality. We assume all see the same reality even as our individual world is partly a social artifact, something created through the prism of our personal filters.

    I can recall confronting my own core set of assumptions and beliefs. That started in high school and accelerated in college. Early on, it was more a process of questioning the religious scripts imposed on me, starting with the precepts contained in the Baltimore Catechism in which the Catholic authorities indoctrinated us. At that point, it was too early for me to do any major restructuring. I merely pushed some beliefs to the side, dismissing them as being illogical or contravening common sense while maintaining the essential architecture of my childhood version of reality.

    Later on, in college, the challenges to my personal orthodoxy became too numerous. It soon was impossible to patch up my existing edifice. I had to articulate a new structure or rationale for my approach to things. At first glance, that might seem a significant undertaking. Yet, two factors rendered the process doable.

    First, I was now operating outside my bubble. The Catholic Church, throughout its history but particularly after the Reformation, spent enormous resources and energy on developing and nurturing separate institutions in education, health, and so many other areas. All these efforts were designed to separate and protect the flock from competing views and influences. Freed from that enclosure, I could create my own philosophy of life.

    Second, I found that I didn’t have to jettison my core or fundamental dispositions. Upon deep reflection, it turned out that my new moral center, while not based on any formal religious or established institutional dogma, reflected Christ’s teachings as well as those of many other major spiritual gurus. Essentially, most ancient moral teachers of note preach some form of civility, compassion, and community, with love and acceptance operating as the key bonding agents. In short, I did not have to transform my world view. I merely had to base my belief system on a more intrinsic rationale as opposed to extrinsic mandates and fear of punishment. The good is remarkably self-evident.

    Eventually, I sensed that my moral center was going to wind up where it did no matter what. I was predisposed to certain foundational sentiments … equality over superiority, acceptance over division, love over rejection, peace over violence, and kindness over the alternative. I was destined to be a do-gooder (most of the time). Perhaps that is why, even as a teen, I sought work that was aimed at helping others … hospital work, helping disadvantaged kids, making a hopeless stab at becoming a priest, and then a stint in the Peace Corps. It might help explain how swiftly I embraced progressive politics and my profession of teaching and doing public policy work related to poverty and helping the poor.

    And yet, I always remained somewhat detached. I did policy and not politics. I mostly focused on programs and institutions, and not people themselves. I circled problems from 30,000 feet as opposed to getting into the trenches. In some ways, I never got involved.

    Is that wrong? I’m not sure. But I have always felt a little guilty about that. I’ve had to accept that, while I’m a decent schmoozer, I am not instinctively comfortable around people. Perhaps, if I were to do it all over again, I would step out of my comfort zone to be a better people person. Perhaps!

    While I have more to say on this topic, I realize that the process of settling on how one should center one’s life never ceases. It is an ongoing process. In a way, it is like science. The rigors of analysis seldom give us final answers but rather the most defensible answer available based on existing evidence and assessment. However, new evidence continually arrives. Long ago, the best minds saw the earth about us as the center of everything. A century ago, our understanding of the scale of the universe was confined to the Milky Way, our own galaxy, in which we were minor players. Today, we know that there are at least two trillion galaxies out there. Our existence is peripheral indeed.

    We must keep reexamining everything, changing our minds based on our continuing sifting and winnowing of the world about us. Our belief systems are similar … we arrive at a set of conclusions only to question them once again. At least we should keep questioning them. It is the journey itself that makes the most sense, not easy answers grabbed onto as lifelines. Such a never-ending journey makes life harder, even uncertain. However, I personally would have it no other way. Struggling to understand the majesty and mystery of what is out there is an exhilarating endeavor. I know it is hopeless. Yet that is the very quality that makes it such a seductive aspiration.
