
Tom's Musings

  • What Might Have Been.

    August 20th, 2024

    I’m still musing a bit about the old days. That’s what you do when you become an octogenarian, if you get that far. And there’s the rub. You look back in wonder … just how in the world did I survive? Given my lack of skills and talent, coupled with a propensity to make disastrous decisions, longevity hardly seemed a sure thing. But, as a famous sports announcer always said, that’s why they play the game. You never know the outcome until the contest is completed.

    I started out my personal memoir, A Clueless Rebel, with recollections of various moments when I realized that yet another vocational path was above my pay grade. You know, things like almost slicing my ear off in my junior high shop class. No career for me working with my hands. Or losing money on my paper route, which takes considerable talent. Better scratch business tycoon off my list of potential vocations. Professional athlete? That delusion ended when I actually stole second base one day. Then, I was tagged out when I inexplicably started back to first base assuming the batter MUST have fouled the ball off. After all, how else could I have beaten the throw to second? I still cringe at that memory.

    There were many other such disasters. The list of potential future vocations was dwindling to zero. Eventually, by the process of elimination, I concluded I had better try learning something in school. My prospects there were hardly bright, but marginally better than the zero which I had assigned to all other possibilities.

    I started out my schooling in a run-down elementary school in a struggling part of my hometown … Worcester, Massachusetts. I do remember crying when my mother left me on the first day of kindergarten … not exactly an auspicious start in the academy. Even at this point, I showed little promise, though I did excel in the cookies and milk they gave you in kindergarten. I do recall getting unsatisfactory warnings in penmanship and comportment (whatever that is). Somehow, when I entered junior high, I was put in the advanced class (must have been a clerical error). Only 5 boys made the cut while there were some 20 girls. All the girls seemed smarter than me though, in truth, I cannot recall any of them by name or image. But I knew where I ranked among the five boys … tied for third place. Only one boy struck me as intellectually slower. That was a sad showing indeed.

    Nevertheless, I took the exam for a selective Catholic boys high school. Not sure why I did so since I was hardly sanguine about my prospects. Perhaps my parents pushed me, though I do not recall any such action on their part. Shockingly, I was not only accepted but placed in the top class … one’s class assignment was based on how well one did on the entrance exam. Again, I assumed a clerical error had been made. And once again, my actual classroom performance showed little intellectual promise … managing to settle somewhere in the undistinguished middle of the pack.

    Given my lackluster performance in the classroom, my feelings of concern about the future seemed warranted. I remember wondering how I would survive as an adult. Who would hire me? What skills and talents could I employ to secure a paid position? I thought my future prospects bleak at best. Facing life on the streets as an unemployed bum, I figured my best hope lay in the possibility that the army would take me. They took anyone, no? Then again, I seriously doubted I could make it out of basic training without shooting myself.

    After high school, I entered a (college-level) seminary that prepared priests to do foreign missionary work. That seemed like a kind of army to me. Later, after that ill-fated effort to become a man of God, I stumbled into college. I went to Clark University in my hometown because it accepted second-semester applications, which the top local Catholic college, Holy Cross, did not.

    Going away to school was out of the question. There was no money for that since any discretionary funds available to the family went for beer, cigarettes, and gambling. However, this was before Republican orthodoxy exerted a stranglehold on the national perspective. The American dream still was alive and well. While higher education was still a stretch given that I received virtually no financial help from my folks, I could easily (with scholarships, loans, and working 11-7) make it through a decent private school. After that, eventually getting a master’s and a doctorate was relatively easy.

    As I detail in A Clueless Rebel, I blossomed in college. I did well in the Catholic Seminary I attended for a year plus, once again being assigned to an advanced academic status at the beginning of my second year by the officials in charge. I could never quite figure out why people kept concluding that I was smart. In my own head, I was average at best, though I did have a special fondness for English Literature and any kind of history. Anything involving math, however, was the kiss of death. Besides, I dismissed the seminary as not a real college, even though its academic reputation was quite strong.

    In any case, Clark University took me in the Spring Semester of 1964. In the working class Catholic cocoon in which I had been embedded to this point, Clark was known as a den of atheists and Communists. I’ve written about what happened there elsewhere (likely more than once) but the musings of a blog beg for some redundancy.

    The bottom line is this: my days at Clark were transformational. I entered as one person and left several years later as someone quite different. Some good friends recently pointed out that Clark was mentioned in a book about exceptional colleges as a place that takes in B-level students and turns them into intellectuals who later survive and even prosper at elite research universities. That would have perfectly described my experience. [NOTE: Eventually, I ended up as a senior scientist at the University of Wisconsin and the administrator of a nationally recognized research entity.]

    But here is the thing. I never planned on anything. I never had a goal, other than not to starve and to keep out of the cold. I never saw myself as an academic or a university teacher or a policy wonk who would be at the center of the national welfare reform battles that raged toward the end of the last century. It all just happened, as if by serendipity. Then again, there seemed to be more opportunities back in my day. I do feel very fortunate to have come of age when I did.

    Many a time I have mused on the seemingly random circumstances that shaped the trajectory of my life. What if I had not detoured into the Seminary after high school? I surely would have gone to Holy Cross and stayed within my Catholic cultural cocoon. What if I hadn’t stumbled into Clark almost by accident? What if others hadn’t kept insisting that I was smart in spite of my own personal view and my lackluster class performances? What if I hadn’t come of age in the 1960s, at a moment when encrusted cultural and normative beliefs were being questioned? Most important of all, what if I had shown any skill at anything as a young boy? Perhaps I would have led a conventional life. Perhaps I never would have enjoyed a life of intellectual exploration and the pursuit of social change … a life that came to me by default.

    Wow, what powerful words … what if?

  • Continuing the Grand Adventure …

    August 15th, 2024

    In the previous blog, I explored the zeitgeist of that era in which the Peace Corps volunteer experiment emerged. Freed from pressing economic want, and spurred on by a lessening of both financial insecurity and inequality, the youth of America (a reasonable number at least) turned their minds and hearts to nobler aspirations. With that peculiar and illusory optimism only found in the young, they (we) believed an even better world was possible.

    After all our training and preparation, I can still recall my first moments in Delhi. The July heat was like a blast furnace. But we were able to ignore that momentarily in light of our recent brush with death. The cowling of one engine fell off our Air India flight as we entered Indian airspace. This unsettling incident warranted a news article in the next day’s Times of India. What we were less able to ignore was the cacophony of sights and sounds that assaulted us. You can be told over and over what to expect. Reality, however, is always much more real.

    We spent a few more weeks trying to become farming experts. My group (India 44-B) was initially trained to be poultry experts. Then, halfway through, we were switched to agriculture. That should have been clue number one that the planning for our exciting adventure might have been a tiny bit flawed. Despite the rather obvious confusion at the top, our training staff was superb. The preparation we received was intense, innovative, and remarkably immersive. In the end, we all learned an extraordinary amount about who we were and what we might accomplish in our service and, more importantly, beyond. I think, all things considered, we experienced far more change than anything we effected in India or the sites in which we were placed.

    Some 50 years after the formation of the Peace Corps, there was an anniversary celebration in Washington. The Indian Embassy had an event for all former PC volunteers in town. The top Embassy official noted that the PC contribution to India (which came to an end in the mid-1970s) was less in terms of any technical improvements and much more in terms of cultural connections. We learned from our hosts, and they learned from us. Despite our shortcomings and mistakes, we showed Americans and America in a better light. Well, that’s my story, and I’m sticking with it.

    For example, before heading off to our villages, we played several games against a team from the local Udaipur college. (See pic below). Arguably, we held up the honor of American round ball before enthusiastic local crowds until they brought in some ringers from a military unit. On that day, we lost big time, both on the scoreboard and in the physical punishment we endured. The locals were delighted at our being thrashed, but we all ended on good terms. In fact, the local ball players invited me (along with two others from my group) to join the Udaipur team to compete in the all-India tournament to be held in Jaipur. The local team erroneously concluded that adding the hot shot Americans would improve their tournament chances. Alas, not to be. Soon enough, we found out that the rest of India could play the game at a high level (we were out quickly). But we all had a great time.

    I can’t relate the actual Peace Corps experience in a blog or two. Only a few impressions can be touched upon (or you can get a copy of Our Grand Adventure). In the pic below, you can get a feel for our situation in the field. I was stationed about 45 to 50 miles south of Udaipur in what can only be described as a desert wasteland. I lived in government housing attached to the local Panchayat Samiti (government development office). All the other officials chose to live in town, for reasons obvious to me. I never had running water, electricity was only installed after 6 months of promises, and I crapped in a hole located in a room attached to my humble abode.

    The experience was challenging in many ways. We all were stricken with a variety of physical ailments … some in my group became very ill (either hospitalized or medically discharged). I had giardia, ringworm, and lost so much weight that my mother (seeing pictures of me) was tempted to call our Congressperson to have me rescued. But health-wise, I escaped relatively unharmed, encountering relatively few bouts of dysentery and no serious ailments. I did live in fear of the dreaded guinea worm. These started out as cysts ingested from local well water that grew into long worms which eventually would pop out of an arm or leg. All these years later, I still have nightmares.

    The real challenges were the isolation, the loneliness, the heat, and the demands associated with an inscrutable culture. There was no way to negotiate the complexity of the social rules there without error. At the least, one had to consciously think about what to say and how to behave when in public. You have no idea how hard that is. Normally, we rely upon well-worn social scripts and rules that are well understood. It was not until we were back in the West on our way home that we comprehended the cultural pressures we had experienced.

    In addition, the isolation seemed crushing over time. There were no cell phones or visual media or easy ways of contacting the outside world. There was a telephone or two at the government facilities, but one could not escape the sense of being totally on one’s own. I recall realizing how much I missed the college sweetheart I left behind (being quite commitment-phobic). I wrote often, but she wound up marrying a post-doc at Harvard where she was working at the time. Who could blame her? Decades later, we reconnected. It was only then that we realized we had loved one another back then but were too damaged to recognize that fact. She speculated how different things might have been had there been modern communication technologies. Roads not taken and all.

    Most disconcerting is that we volunteers never felt confident in our primary role. You can’t become a farming expert in a few weeks (especially when expected to learn a new language, cultural nuances, and the complexities of village life). Most of us never escaped that nagging sense that we were frauds. What rendered that sense particularly critical was the awareness that our mistakes might have profound consequences.

    On the other hand, we got to experience something unique. The town located about a mile from our isolated accommodations was called Salumbar. It was of decent size (though still a comparatively small town), situated amidst a harsh and unyielding desert. While some affluence was to be found, many farmers in the area eked out a marginal existence on tiny plots of land. It was a level of want and vulnerability I had never witnessed in my young life, nor since.

    I could never quite escape the sense that I was living in the past. To me, Salumbar looked like a western frontier town of the 1870s. On occasion, farmers drove their water buffalo through the streets or rough-looking men would ride through on camels while sporting rifles and wearing cartridge belts across their chests as if off to war. Was Pancho Villa preparing a raid?

    But mostly we did try to ply our trade as best we could, trying out several ideas we thought might prove useful. We erected a chicken project demo, raised money from my college back in the States to restock the local school library, and developed a home garden demo, among other initiatives. Mostly, though, we tried to convince local farmers to try new forms of hybrid seeds that would greatly enhance yields. This was part of the ‘green revolution’ launched in the third world by Norman Borlaug. It was seen as a possible rescue for an India facing exponential population growth with constrained resources.

    We did have successes. You can see in the pics above a successful demo plot along with me tending our demo home garden. Frankly, being a well-known klutz, I cannot believe I erected a chicken coop on top of our home. Astonishingly, I successfully raised several vegetables. Who was that guy?

    Still, the challenges were many, too many to recount here. One will suffice. These new hybrid seeds demanded that the farmer adhere to a set of strict protocols including how to plant the seeds correctly, how to fertilize the crop, when to water, and so forth. In the past, these marginal farmers could throw out some seed left over from last year’s yield and (usually) get a crop that would enable the family to survive.

    These new hybrid seeds we were hawking held the promise of previously unheard-of yields. To obtain this bounty, however, the local farmer had to make an upfront investment of money and follow a set of practices that demanded unusual care and attention to detail. Even then, success could not be guaranteed. For example, the promised wonder seeds might have been adulterated along the way (good seed stolen to be replaced by crap). Corruption was endemic.

    That left us volunteers in a dilemma. Should we work with the poorest of the poor? I wanted to. But they spoke a local dialect (Mewari, not the Hindi I had learned). The chances of miscommunication were great. Worse, if something were to go wrong, there was no plan B. There was no crop insurance nor a government safety net. Which of us could bear the guilt of pushing a marginal farmer (and his family) over the financial edge? So, we tended to work with the better-educated farmers, hoping that change eventually would trickle down to all. In the meantime, we risked exacerbating local inequalities.

    Did we do any good? Who knows? I will say that India was a grain-importing nation in 1967 when we started our tour. It was exporting grain by the early 1970s. Of course, that turnaround was more likely due to the return of the life-giving monsoons than to our poor efforts.

    I can recall feeling very pessimistic about the future of the country back then. I could not imagine how these marginal farms would survive over the next generation or two. Most farmers had large families. In the past, they needed many offspring so that there would be surviving males to support them in the future. But with improved public health, many more children were surviving to adulthood. The existing family farms were too small to split any further. How would these children survive? What would they do?

    Was an apocalypse in the making? Fortunately, that did not happen. The bottom half of the picture of our local home above presents what that same place (our government site) looks like today. The desert has been replaced by growth, new buildings and developments, and green fields brought to life by irrigation. India is taking its place among the more developed countries in the world.

    Above is a photo of most of the survivors from India 44 (A and B), taken at a reunion some 40 years after our return in 1969. By way of explanation, India 44-A was a public health group (mostly female) who were assigned to the State of Maharashtra (near Mumbai, Bombay back in the day). We guys in India 44-B were the so-called ag experts. We were dropped in the arid land around Udaipur in southern Rajasthan. But we all trained together and formed a common bond.

    I doubt whether our brave band who ventured forth in 1967 can take much credit for any observed successes. But we all know that each of us took so much away from that experience. As mentioned, the accomplishments of these volunteers have been remarkable. It is hard for me to decide whether the Peace Corps did a remarkable job in selecting talent or whether something about the experience added value to our subsequent lives. Perhaps someone else can solve that conundrum.

    I can yet recall sitting on top of our house at night; the roof could only be accessed via a rickety ladder. Nothing but desert could be seen from our vantage point. The evening would be most pleasant after another scorching day with temps oft pushing 110 or higher. But the night air was refreshing, and the sky would be blazing with stars. The pace of life about us was glacial … so much time to read and think and simply appreciate a world that we (so busy in our western, adult lives) tend to ignore. All in all, it truly was a time out of time.

    It is too bad that more young people do not have access to such an opportunity.

  • Our Grand Adventure!

    August 13th, 2024

    In June of 1966, close to 100 eager young college students gathered on the campus of the University of Wisconsin-Milwaukee. They were part of an experimental Peace Corps program that would expose them to an enhanced preparation regimen. These wanna-be volunteers would tackle a rigorous training ordeal that summer, return to finish their degrees, then be subject to more preparation the following summer before heading to India for two years. Many, actually most, would never make it. Some would be asked to leave. Others self-selected out during training. Still others withered under the challenges offered by India or succumbed to illness or disease. By July of 1969, at the end of our tour in one of the Peace Corps’ most challenging environments, around two dozen tested volunteers would finish up their service. For these survivors, it was to be a Grand Adventure. (Check out the book below for the full story.)

    In the picture below is a partial shot of the young and eager college kids who signed up to spend two years in far off India. Little did they know what awaited them. If you have any interest, I am the geeky-looking guy with glasses in the second row on the far left. As I look back at these faces, I am reminded of how different the world was back then. This all happened when the very concept of overseas volunteering was still fresh and a largely unknown venture. Peace Corps, as a concept, was yet evolving through trial and error. Those early years would later be called the wild west era of the program.

    There was a sense of institutionalized naiveté that permeated that early program. It was assumed that smart American kids could be recruited off college campuses, given some training, and then dropped into remote sites without any resources other than their wits and good intentions. Oddly enough, in hindsight, officials actually assumed we kids could change the world. Some of us did … a little at least. But any small accomplishments were accompanied by frustration, disappointment, and often a sense of personal failure. India, for many reasons, was widely known to be one of the most difficult Peace Corps sites. Of course, we did not know that on the day this photo was shot.

    This really was a special time in many ways. America was in the midst of such dramatic change. The conformist, conventional 50s had begun to crack open in the early part of the 1960s. The Civil Rights movement, with scenes of Blacks being persecuted for seeking an end to racial apartheid, was followed by Kennedy’s assassination and then the escalation of a far-off conflict in Southeast Asia. Such shocks drove many of us out of our personal torpor.

    While I was quite political by the time this photo was snapped, many of the others undoubtedly were at the onset of profound personal changes. I was well along in my own transformational journey. Just three or so years earlier, I had been studying for the Catholic Priesthood (a foreign missionary order) after being raised in a conservative working class environment where independent thinking was frowned upon and divisive prejudicial attitudes the norm. God and country were paramount. Not surprisingly, I entered college as a product of my cultural cocoon. Change did not take long. By 1966, I was a veteran of antiwar marches who led what amounted to the leftist organization on my campus. I even joined the far-left Students for a Democratic Society (SDS), at least before it went over the radical edge.

    Each of those kids pictured above had their own stories. Many (like me) were first-generation college students. They benefitted from a booming economy and a virtual explosion of new opportunities. A number went to the best schools … Berkeley, Columbia, Yale, and so forth, virtually all on scholarships or with easily accessible financial assistance. The costs of higher education in that period were laughably cheap. I could easily afford a private college with work, scholarships, and loans that were not back-breaking. Others came from dirt-poor backgrounds, like sharecropping families, and would make their way successfully into mainstream society. As I look back, the talent found in this group was amazing … the accomplishments of the ones I know about have been astounding. One could hardly have assembled such a talented and accomplished group by random selection.

    Many of us talk about the 60s with a kind of hushed reverence. In truth, it was a special moment in time. Not only did African Americans wrest a degree of dignity from a reluctant majority, but several other ‘rights’ movements emerged … for women, for migrant workers and Latinos, for Native Americans, and for the disabled (along with movements for the environment and nuclear disarmament, etc.). The young questioned our military adventures abroad. Our national government declared a war on poverty. The arts went through an explosion of innovation. Everything was questioned. Most importantly, the search for a fair and just society was afoot. Many in this pic traced their decision to apply to PC back to President John Kennedy (JFK) and his iconic challenge … ask not what your country can do for you, ask what you can do for your country.

    The cultural zeitgeist back then was decidedly different. When college students were asked what was most important to them, developing a personal philosophy topped the list. Making money was much lower down. Some one-third of Yale law students tried to volunteer for crusader Ralph Nader’s consumer interest organization. The SDS organization I mentioned above only declined into nihilistic self-destruction in later years. It started out when a few University of Michigan students sought a more meaningful future (see the Port Huron Statement). They were searching for a just cause around which to structure their futures.

    That search for larger meaning is reflected in the early response to the Peace Corps concept. As I’ve noted elsewhere, the concept was first publicly presented by Candidate JFK during a late-night stop in Michigan after a TV debate with his opponent … Richard M. Nixon. Dick had gotten under JFK’s skin by asserting that the Dems had been the war party while the GOP favored peace. In response, Jack stewed in the plane ride to his first post-debate stop, Ann Arbor, Michigan. He challenged the mostly student crowd (it was near midnight when he landed) to be willing to sacrifice a year or two abroad to make the world a better place. It was a throw-away line at best, totally off-the-cuff.

    Jack could never have imagined what happened next. That suggestion roared through the imaginations of the youth in that audience. In those days of primitive communications, word spread of this new volunteer opportunity … by posters on college kiosks, via early telephones, and through face-to-face conversations. Still, the word spread like wildfire from campus to campus. Even while the campaign was in full throttle, Kennedy’s staff was overwhelmed with requests from students and young people trying to sign up for this new program that did not exist. The response was so overwhelming that, when elected, JFK felt compelled to create the Peace Corps by executive order on March 1, 1961.

    That enthusiasm had not abated when I sent in my application in late 1965. The D.C. central staff were still being inundated with applications, way more than they could possibly accept. The youth of that era (my era), while making many poor decisions in retrospect, were generally driven by the better angels of their natures. Many truly wanted to leave the world a better place. Those were the prime sentiments that drove me. That was why I tried to become a missionary priest, why I worked with poor and disadvantaged kids, and why I became an orderly on the 11-7 shift in a large, urban hospital. Even during my college years, I wanted to make a difference.

    Sure, we lived during a golden economic era. Poverty and inequality were falling. The possibilities, even for relatively poor working-class youth like myself, were getting better. While we worried about nuclear incineration for sure, we somehow believed a better world was in the offing. We simply had to ensure that it happened.

    If you had been there that day in June of 1966, you would have felt that faith, that optimism. I taught college students for many years. Sure, many impressed me with their idealism, intelligence, and commitment. But nothing matched those heady days when we were eager to set off to the ‘other side of the world’ to make things just a little better. During that era, there existed a philosophical leitmotif that was quite special. Moreover, I suspect we were among the best of that unique generation.

    I will pick up this theme in future blogs. Stay tuned!

  • Public Service … a declining ideal.

    August 12th, 2024

    I just finished the excellent memoir by Dr. Anthony Fauci. Most will know whom I’m talking about, but not all. Tony, as his friends call him, was a top doc in charge of a key federal agency responsible for keeping us safe from viral and other biological threats. Though he achieved a certain amount of notoriety himself (especially during the Covid crisis), he is one of the (normally) unsung heroes who usually labor in obscurity to ensure our safety during uncertain times. Because of unique circumstances, though, he did become a celebrity.

    Tony grew up in New York in an immigrant Italian family of very modest means but with strong values. As a kid, he loved both basketball and his studies with equal passion. He was intellectually prepared by the Jesuits in high school before leaving for the Jesuit-run College of the Holy Cross in Worcester, Mass., since, according to him, it had a great pre-med curriculum and gave him an enticing scholarship. As an aside, I could see Holy Cross from my back porch as a youth, and would have gone there myself were it not for a detour into the Catholic Seminary after high school.

    From there, Tony matriculated at the Cornell Medical School, from which he graduated first in his class. My eye doc, who was on the University of Wisconsin medical school faculty, once told me that medical students were generally organized into thirds. The top third often went into research and teaching (combined with clinical practice). The middle third made the best clinicians (you want them as your own doc). The bottom third made the most money, since that is what drew them into the field in the first place, though their skills and motives might be questioned.

    Tony was the top of the top. Back then, most medical school graduates owed Uncle Sam some time (like patching up bodies in Vietnam). The brightest often were assigned to the Centers for Disease Control (CDC) or the National Institutes of Health (NIH). My good neighbor (Dennis), like Tony, was at the top of his medical school class back in the 60s. He thought he might go into surgery but was assigned to the CDC, and that experience led him to become a top infectious disease doc. Dennis is still working long hours well into his 80s. He is utterly devoted to his craft.

    Over some five decades, Tony remained on the front lines of the nation’s ongoing fight against various insidious medical threats. He was there when the AIDS epidemic burst on the scene in 1981 and still at his post four decades later when the Covid pandemic encircled the globe. In between, there were a series of real threats including SARS, Ebola, anthrax, smallpox, and biological terrorism after 9/11 (among numerous other threats). The ongoing battle against a largely unseen and constantly mutating enemy composed of viral and bacterial foes never ends.

    Tony, it must be said, was the kind of person whom I admired and who populated government back in the days when I first entered government service and later academia. They were people motivated, not by money, but rather by the ideal of service to others and to the public good. His talents were remarkable, and he rose quickly in the public service hierarchy. One day, President George Bush (the dad) called him into the White House to offer him the position of head of the National Institutes of Health (NIH). This is a pinnacle position for a scientist, one that most would give their right arm to get. Tony turned him down. Power and money held little charm for him. He wanted to stay involved with both the hands-on research and clinical experiences that he felt saved lives.

    On exiting the Oval Office, Bush’s chief of staff muttered to Tony, ‘Why you son-of-a-bitch, no one says no to the President.’ But Tony did. Apparently, President Bush didn’t mind. Once, before a national audience, the President was asked whom he admired, who his role models were. After a couple of familiar names, he prefaced his next offering by noting that few would recognize the name of his next idol. Then he mentioned Dr. Anthony Fauci, who epitomized the ideals of public service for the President.

    In his memoir, Tony spoke kindly of virtually all the Presidents under whom he served, perhaps with the exception of Reagan (largely by omission) and (of course) Trump, whom he called a complicated man, to say the least. Tony tried to stay above partisan politics and always looked to the vital tasks before him. But things were going askew around him late in his career. He relates that Republicans began rejecting essential public health resources merely because the other side (Obama) asked for them, something that seldom happened in the past (though requests were sometimes revised downward for substantive reasons).

    During the Trump years, government functions virtually ceased to exist despite the most glaring public health disaster since the 1918 Spanish Flu, a long-ago pandemic that killed more than died on the WWI battlefields. Science, however, came to the rescue since quality people like Tony remained at their posts. But they spent too much time fighting the worst possible politics, where disinformation and outright dangerous actions took place for the most transactional partisan advantages. It was a nightmare for any principled man. So many amenable deaths occurred due to political meddling and pure spite, an unacceptable outcome to public servants motivated by high ideals.

    The costs to Tony and his family were enormous. Trump’s acolytes continuously vilified and threatened him personally, his wife, and his offspring without mercy. They were subject to the most vicious attacks imaginable, including sexual threats to his spouse and daughters. He himself was called a ‘mass murderer’ and is still threatened with jail for specious allegations by Republican members of Congress. Their hate knows no bounds.

    This raises a serious question. Why would anyone enter public service these days? Surely, Tony could have made loads of money being a top doc. He could have enjoyed a comfortable personal life. As an aside, I recall Joseph Califano saying the following after accepting the position of Secretary of the (then) U.S. Department of Health, Education, and Welfare back in the late 1970s … “I’ve taken a position that pays me a fraction of what I make as a private attorney and which leaves me no time for my family.” Like Califano, Tony devoted his life to doing what he thought was the right thing. His career was totally consuming, leaving no time for the pleasures typically associated with success. He was always working 14- to 16-hour days (or more) as he confronted one health emergency after another. He made such sacrifices for his love of healing and from a sense of duty … not for power or gold.

    I can remember when the best and brightest went into public service as a matter of pride and patriotism. They wanted to make a difference. In my arena, social policy, the office of the Assistant Secretary for Planning and Evaluation (ASPE) in HHS once attracted top Ph.D.s in economics and related disciplines. The staff in the 1970s could compete with the intellectual firepower of a top research university, even though they had attractive alternative career paths. When I spent a year at ASPE during the start of the Clinton administration (on leave from UW), they had some very smart people on staff but no longer could attract intellectual luminaries.

    What happened? Conservatives had spent years denigrating civil servants with withering attacks. As partisanship and polarization increased, it became increasingly difficult to attract top talent. That became especially true when available alternatives paid a lot more money and offered less hassle. Why put up with the grief for so little reward?

    Who wants to work around the clock for less money and be accused of atrocious crimes for their efforts? No one can make a firm estimate, but several Presidents (from both parties) have intimated that Tony’s professional work may well have saved millions of lives, not just with the big crises like global AIDS and Covid but by responding quickly to other emerging threats before they got out of control. His thanks at the end was a mix of much praise and honors along with the grossest vilification imaginable, including threats to himself and his family.

    We need people like Tony in government as we look to the future. We need smart young people to enter public service based on a desire to serve the common good as opposed to seeking private gain. If we attack such people, and demean their professional choices, where in God’s name will we find them? Who will step up to save us?

  • A Tiny Rant.

    July 31st, 2024

    Some aspects of American life disappoint me. Now, that is rather an astonishing understatement. Often, I am furious. One aspect of our contemporary scene that pisses me off to no end involves our slow abandonment of progressivity in our tax system. Since the Reagan revolution, we have shifted the burden of paying for public goods from those who easily can afford the bill to those middle and working class folk who cannot easily carry this fiscal burden. My cynical view is that that trend was quite intentional, designed to erode support for government among the middle classes. And what has this erosion of progressivity bought us … hyper-inequality and internal divisions.

    If you broke the U.S. population into quintiles, you would roughly get the following distribution of household incomes. NOTE: the brackets to the right are the proportion of survey respondents who typically claim that they are 1 (upper class), 2 (upper middle class), 3 (middle class), 4 (working class), and 5 (lower class).

    Top 20% $153,001 and above [2%]

    2nd 20% $94,001 to $153,000 [14%]

    3rd 20% $58,001 to $94,000 [38%]

    4th 20% $30,001 to $58,000 [35%]

    5th 20% $0 to $30,000 [11%]

    A disproportionate number of those surveyed lump themselves in the center as either middle or working class types, where working class is a polite way of saying lower middle class. The median household income usually is pegged somewhere around $70,000 to $75,000 (it varies marginally from year to year). The average household income is in the six-figure range, but that estimate is skewed by the outliers at the very top and thus tells us little. The median figure is more helpful since it represents the 50th percentile, with half the population above and half below that income figure.

    Now, somewhere above the median income figure lies an important psychological point. It estimates the perceived income needed to meet basic needs … a subjective but important line. While that figure differs across surveys, the typical estimate usually is set at a few thousand dollars above the actual median figure.

    This point in the distribution is important. Below that line, the so-called marginal utility of each extra dollar earned is seen as quite important to the family. Above that line, this utility at the margin (or for next dollar earned) begins to fall. Thus, an extra hundred bucks to someone making $200,000 per year is less consequential than that same hundred bucks to someone making $50 grand annually. What benefit is the next billion dollars when you already have $100 billion or more?

    Now, let’s go back to the difference between median and average estimates of central tendencies. Averages respecting income distributions are typically much higher since those at the tip of the pyramid really are outliers. Their income and wealth figures are astronomical, thus tending to disproportionately distort any averages calculated. And America is one of the most economically unequal nations across the globe. Our averages are inflated by the outrageous disparities associated with those at the very top.
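
    To make that skew concrete, here is a minimal sketch in Python (the five incomes are invented toy numbers for illustration, not actual survey data) showing how a single outlier at the top drags the mean far above the median:

        import statistics

        # Five hypothetical household incomes; the last is a billionaire-sized outlier.
        incomes = [30_000, 55_000, 70_000, 95_000, 1_000_000_000]

        print(f"median: {statistics.median(incomes):,.0f}")  # median: 70,000
        print(f"mean:   {statistics.mean(incomes):,.0f}")    # mean:   200,050,000

    The median barely registers the outlier, which is why it is the more honest summary of a lopsided distribution.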

    It turns out that some 800 American billionaires hold 3.8 percent of all U.S. wealth while the entire bottom half of the population only controls some 2.5 percent of the entire pot. The higher one goes in the distribution, the more skewed things get. The bottom quintile has a typical income of $16,000 annually. The top quintile sees $278,000 per year. That figure jumps to $336,000 for the top 5 percent of earners, then to $820,000 for the top 1 percent, and then to $3.3 million for the top tenth of one percent. Small steps up the distributional pyramid translate into huge financial windfalls.

    Wealth is even more unequally distributed than income. The top 1 percent of Americans control 26 percent of the nation’s wealth while the bottom quintile controls a mere 3 percent. This makes even more sense when you realize that the same 1 percent at the top control 54 percent of all stocks and mutual funds. The problem with such a gross maldistribution of the nation’s treasures is that power follows the gold. As the oft-cited play on the golden rule goes, he who has the gold makes the rules.

    Absent some unknown countervailing force, only the national government has sufficient leverage to correct the tendency for unfettered free markets to inevitably spiral toward more and more inequality. Quite clearly, the rules are continually tweaked (in the pursuit of self-interest) to reward those already at the top. There are few incentives, and few natural mechanisms, to diminish such ravenous avarice.

    It is axiomatic that the government can help here. It has, in the past, redistributed resources from one group of Americans to another. The right inevitably screamed that it was unfair to those who were successful in life when their taxes would go to less fortunate citizens. But that older image of redistribution from rich to poor was more appropriately associated with the pre-Reagan era. Since 1980, the distributional flow has been more from the middle of the pyramid to the top. Thus, it should surprise no one that those in the top quintile generally receive a larger return from the federal government in terms of subsidies, credits, and other benefits than they pay in taxes. In fact, as I’ve noted elsewhere, the rich now pay proportionally less in taxes than working class stiffs … a far cry from the post-WWII era when the top marginal income tax rates ranged from 70 to over 90 percent, and corporate taxes were much higher.

    We did a lot of good during that older era during which the affluent paid their share, if not more. Among other things, we built our internal infrastructure (highways and airways), invested in science and higher education, protected our democratic institutions here and abroad, and built up our middle class by expanding opportunities and supporting labor.

    With the dawn of neo-liberalism and the ascendancy of the hard right, we now have new definitions of basic societal constructs such as fairness and opportunity. It is now a winner-take-all playing field based on the ethics of a no-holds-barred Darwinian struggle for survival and supremacy. Some level of struggle may well be advantageous to progress, but excess competition ultimately erodes the communal fabric essential to the level of collaboration fundamentally necessary in a modern society. If we don’t work together, we perish separately.

    The U.S. has fallen out of the top 20 countries in surveys of national happiness. In part, this is due to the stresses and despair felt among our youth. Epidemiologists point out the spike of ‘deaths of despair’ as opioid use skyrockets along with suicides. The happiest nations, no surprise here, are found among those that tax their citizens at higher levels while providing extensive public services to support families and smooth out the uncertainties of life. Unceasing struggle with only a tattered safety net available to the non-winners eventually takes a toll.

    When I did policy work, we often noted that budgets were values-oriented documents. They reflected the kind of world we wished to have. Perhaps it is time to reexamine that question. Just what kind of future world do we want? What type of world do we want to leave to our grandchildren? I surely hope we decide to leave a better world than the one we currently have.

  • The Wayward Academic …

    July 29th, 2024

    I’ve been focusing on political topics of late. Perhaps it is time to drift back to more personal musings. And no, I have not overlooked the exciting news in the Presidential race. I’ll just let that percolate for a few more days.

    I noticed the book I wrote a number of years ago titled A Wayward Academic: Reflections from the policy trenches. It is a memoir about my professional career, mostly as a policy wonk and fake academic. When I noticed this page-turner lying on a coffee table amid a pile of my other great works, it got me musing about my past. To be totally honest, it doesn’t take much to activate a dialogue in my head, this hyperactive place where the discussions are so bright and stimulating … at least to me. Self-delusion is one of my enduring strengths.

    In this instance, I was struck by a rather odd aspect of my life. I was, by conventional measures, a spectacular professional failure, at least according to the ordinary rules by which academics are evaluated. On the other hand, I believe I was a remarkable success by my own rather idiosyncratic standards as well as by the feedback from peers and colleagues in the social sciences. In some ways, I lived in a professional no-man’s-land by functioning as a policy wonk while remaining in a scholarly institutional setting, even though most peers (especially the economists) really seemed to respect what I did in my job.

    While never being naturally disposed to a scholarly life, I loved being in the academy. It gave me several inestimable gifts: the freedom to pursue what caught my interest, a ready-made patina of respect and ersatz authority, an environment populated by super-bright people from across the country, an institutional framework devoid of ordinary hierarchy, ready access to experts and authorities in related institutions, and an opportunity to share my thoughts with the next generation of students as well as with the audience of greatest interest to me … the policy world. I often pinched myself at my good fortune.

    Still, being in the academic world (but not of the academic world) took some finesse to negotiate and did incur some personal costs. It wasn’t always easy. This bifurcated sense of failure and success perhaps can be traced to a disconnect embedded within my internal wiring. I was a person who ostensibly played by the rules while stubbornly rejecting conventional norms.

    As a young Catholic, working class kid, I embraced my religion seriously (studying to be a priest for a while) while arguing internally against so much of the dogma which that religious institution fed to me. Later, I would be a college student who appeared buttoned down and conventional while organizing the left-wing organization on campus and (for a while) joining the more radical Students for a Democratic Society (SDS). Finally, as an adult, I functioned within the higher reaches of the academy (even managing a nationally renowned research entity at the respected University of Wisconsin) while never becoming a formal academic. At each stage, I appeared to conform while quietly rebelling and doing things my own way.

    Irving Piliavin was a UW professor who recruited me from a state service job (my first professional position as a social services research analyst for the State of Wisconsin). He lured me to the University of Wisconsin’s Institute for Research on Poverty, the nationally recognized think tank on social policy questions created as part of President Johnson’s War on Poverty. (He needed my knowledge of how government worked for his research.)

    He liked me personally, but I know that I frustrated him terribly. He could never understand why I didn’t want to be a scholar like him. He would suggest topical areas in which I might become a recognized intellectual authority within the academy. Pick one substantive area? That very notion appalled me as stultifying. At one point, he told me that I had trouble with authority, with convention, and with the accepted rules. Perhaps, but my read was that I simply had to do things my own way.

    I never started out to be so rebellious. I mostly thought of myself as someone who just wanted to get along and be liked. But there is this old saying … you can’t take out what God put in. Some elemental dispositions, I suspect, are wired into your DNA. I could never stop questioning the tenets of my culture and religion. I kept embracing thoughts (e.g., for racial justice and for a global government) at a time when NO ONE around me had such convictions. As a student, I was always asking questions and pushing my professors, a trait I finally realized could be irritating when, years later, I was the one in charge at the front of a university classroom. When I entered the work world, I seldom just did my job but kept pushing the envelope. I was never satisfied with the notion that we’ve always done it this way.

    Most of these early experiences are included in the memoir of my earlier years, titled A Clueless Rebel. Between this work and A Wayward Academic, all my skeletons are laid out in detail, and with considerable humor. If you can’t laugh at yourself (and the world about you), what is the point of living?

    At the end of the day, my direction in life was driven by some rather simple ends, nothing really out of the ordinary. First, I wanted to make a contribution to the world about me. The same impulse that pushed me initially toward being a Catholic Priest remained with me after I left the Seminary. It always struck me that a person’s life cannot be worth much if he or she doesn’t at least try to leave the world a bit better in some way.

    Second, I craved novelty. It turns out that I had the attention span of a gnat. I was always looking for a new issue or problem to attack. And over my career, I touched on just about every major social policy issue feasible. As I said in my professional memoir, my career was like playing in a policy candy store. I always wondered what delights were in the next aisle. Success in an academic culture, on the other hand, normally demands drilling deep into a given topical area. That would be intellectual death to me.

    Third, I craved challenge. I responded to those issues that were inherently difficult. Not surprisingly, I became heavily involved in welfare reform and related social questions at a time when they were front-page news and political hot buttons. I was always being called by the press and often in hot water with politicians. Great fun, but not for the weak of heart. It was oft said (back in my day) that welfare reform was the Mideast of domestic policy, meaning that, like Mideast foreign policy questions, there were no easy answers. While many academic foci are technically difficult, the policy issues I engaged involved normative conflicts which rendered them especially problematic.

    I was attracted to questions that were difficult partly because of my innate curiosity about how society worked. Another way of thinking about this is that I wanted to attack questions in ways that would lead to a deeper understanding of the world about me. This intellectual curiosity drove me through life. I felt blessed to be able to, as I oft said, fly about the country to work with the best and brightest on problems that appeared unsolvable. And better yet, I was paid to do this. Not much, to be sure, but enough to be comfortable. Besides, money never really mattered to me.

    Finally, I hoped to do no harm. My greatest worry was that, in trying to make things better, I would make them worse. I had few illusions that I would solve the bigger social issues of my day. But at least I might contribute something without imposing some net harm on society. That would be good enough.

    What I discovered is that I could best balance these diverse goals by finding my own niche as a professional. I simply needed to find a way to remain within the academic world while not being trapped by the scholarly culture which was a bit mind-numbing to my tastes. Now, that is a tightrope that is difficult to traverse.

    It turned out that I could do most of the expected academic tasks with relative ease. I was an excellent and popular teacher. I was good at raising research money (I was mostly self-supporting throughout my career). I was in demand as a consultant throughout the U.S. and in D.C. I was popular as a giver of talks and in serving on panels and at conferences. I was respected in the collateral worlds of my content area … with think tanks, the philanthropic institutions, with evaluation firms. And I was a damn good administrator who helped keep IRP alive during some dark days.

    What I didn’t like doing was the very thing that academics were supposed to dedicate their lives to … publishing peer-reviewed research articles. Sure, I get the value of that process, the rigorous and anonymous reviews of research methods and content. Those protocols are invaluable to science. For me, though, they meant a kind of intellectual death. The process of focusing on peer-reviewed publications, in my estimation, led one in the direction of narrow and provincial topics that were approached via overly quantitative methods (in a display of technical virtuosity) and written up in ritualistic and formulaic ways. Moreover, they were directed at an audience of lesser interest to me … other academics. That feature of the academic culture (impressing this narrow audience of peers) struck me as overly incestuous. I wanted to reach a broader audience.

    Of course, I did publish in peer-reviewed journals but almost always in co-authored works. I typically did so to help out a colleague. At the same time, I know I influenced thinking in my profession and in the policy world, which was always my primary audience. I did that through policy-oriented outlets and through my many public talks. That gave me great satisfaction. What I wrote and talked about impacted not only my fellow colleagues but also the legions trying to solve our social problems. The cost of my professional choices … I never was fully admitted to the academic club.

    If you are interested, I did collect some of my thinking in one final collective work titled Confessions of an Accidental Scholar. Through my teaching, my consulting, writings for broader audiences, and my many talks, I left a small legacy. Better yet, I did little harm while having a great time. It is not really a job if you would do it for free.

  • Orthogonal Thinking.

    July 22nd, 2024

    I probably should write about the fact that President Biden recently announced he is dropping out of the race. But everyone will be chatting about that. I prefer to focus on deeper (though not necessarily more important) issues.

    So, let me meander into the way we too often think about public issues or, perhaps more accurately, how we fail to think about them with much accuracy. I am occasionally struck by how often consensual beliefs operate (or thrive) independent of experience and evidence. Of course, there are a host of well-known psychological phenomena (e.g., confirmation bias) that help explain why erroneous beliefs remain impervious to correction. Still, it is baffling that numerous false perceptions thrive despite contradictory history and fact.

    For example, most Americans probably believe that Democrats long have been the liberal political party, the defenders of the oppressed and supporters of minorities. Republicans, on the other hand, have been cast as the defenders of big business, of hard money, and of white privilege. Those views, however, are way too simple if we take even a cursory tour of our past.

    The Republican Party was formed in Wisconsin to oppose the spread of slavery to future western states and to use government as a proactive arm in creating economic expansion and social opportunity. Everything from land-grant colleges, to the investment in the transcontinental railroad, to the creation of a national currency and a proto-income tax, came from those early Republicans … the party of Lincoln. The Dems remained the party of ‘states’ rights’ and backward thinking for a long time. Even Democratic President Woodrow Wilson, a highly educated southerner, was a resolute racist.

    It was during the Great Depression that the contemporary roles of the major parties began to evolve. FDR’s activism, his support for labor and unions (e.g., the NLRA), and his spouse’s public advocacy for the oppressed began to break the Republican hold on Black Americans and other minorities. Roosevelt’s issuance of an executive order to end discriminatory hiring in defense industries during WWII proved a pivotal turning point. Harry Truman’s executive order integrating the military in the late 1940s cemented this new political alignment.

    In the mid-part of the 20th century, the two parties were substantively similar in many respects despite differing rhetorical styles. Republican Dwight Eisenhower (who had spent his professional life in the military) was imbued with the need for public investment if the country were to remain strong and an international leader. Having been stuck stateside during WWI, he volunteered for a tortuous cross-country military caravan in 1919. From that experience, he intuitively understood that only the federal government had the resources and perspective to move the country forward. So, when he became President three decades later, rather than undoing the New Deal during the 1950s, he poured significant government resources into research and development (multiplying such investments severalfold), invested in the largest infrastructure project to date (the interstate highway system), and kept marginal tax rates on the wealthy at their wartime levels of around 90 percent. He even sent federal troops into Little Rock, Arkansas to support the integration of public schools after the 1954 Brown v. Board of Education SCOTUS decision. Ironically, it was Democratic President JFK who started lowering the top marginal tax rates on the most affluent Americans.

    Not long after, another Republican President, Richard Nixon, would speak with a decidedly conservative tongue while acting like a radical liberal. He created new federal agencies like the Environmental Protection Agency (EPA), federalized several welfare programs for the aged and disabled, introduced an automatic COLA provision for Social Security recipients, and almost enacted a universal income floor. Not even today’s most liberal politicians could pull this semi-socialist agenda off.

    Yet, the public image of the two parties had already been set in an odd kind of orthodoxy. The Republicans were the party of fiscal prudence, sound business, and strong family values. That other party was weak on national defense, anti-capitalist, and full of big spenders. Once these emotional images were imprinted in the electorate’s DNA, they remained relatively impervious to change.

    However, let’s look at a few numbers. Between 1933 and 2020, we had 7 GOP executives and 7 Democratic leaders. The economy grew an average of 4.6 percent a year under the (so-called) anti-business Dems and only 2.4 percent under the pro-business GOP. While 10 of the significant recessions in that span began under the GOP, only 1 began under a Dem. The only balanced budgets of the past 60 years came under Democratic leaders … Carter, Clinton, and Obama … so much for GOP fiscal prudence. If not for the Reagan, Bush, and Trump tax cuts (skewed to the wealthy), and the unnecessary Bush wars in support of neo-conservative delusions, our national debt would be closer to 0 than to the actual figure of over $30 trillion. Finally, job and wage growth has been better, and consistently so, under Democratic administrations.

    To be accurate, a significant normative divergence between the two parties only became obvious in 1980 with the Reagan revolution. At that point, the two camps clearly moved in distinct directions. The ideological sorting that began in the 1960s was complete by 1990. Our politics were fully polarized by then and, with Gingrich’s 1994 Congressional takeover, virtually all inter-party collaboration and bipartisanship ceased. Governance in the public interest became a rare event, replaced by gotcha moments and personal attacks. These tactics were not new, just more virulent and universal.

    The odd thing about our current state of affairs is the continuing disconnect between image and reality. Republicans are seen as the ‘good for the economy’ party despite their poor performance in this arena over the decades. They have, however, been good for those at the top of the pyramid. Under the supply-side principles that have exerted a stranglehold on economic policy since 1980, the top 1 percent earned 26.3 percent of AGI (Adjusted Gross Income) in 2021, up from less than 10 percent in 1979. The wealthiest Americans recently paid proportionally less in income taxes than working stiffs, something unthinkable during the Eisenhower years, when progressive taxation was still considered both fair and good public policy. Astoundingly, many working-class Americans firmly believe that the GOP has their best interests at heart. Unbelievable!

    It might be noted that it was during the post-WWII period of high taxes and growing public spending that a robust middle class grew and social inequality fell. Economists call this era the Great Compression, when income and wealth disparities fell dramatically as the rich contributed their fair share to the public good. Astonishingly, when compared to today’s world, some top business executives (e.g., Paul Hoffman and George Romney, father of Mitt) argued for a cap on CEO salaries as being good for America. They suggested $250 thousand back in the 1950s (about $2 million today). I cannot see Elon Musk or Jeff Bezos doing something like that today.

    But nowhere has our political thinking been more askew than when it comes to so-called family values. I can remember when Republican Nelson Rockefeller was shunned by his party for getting a divorce. Now, the undisputed leader of today’s GOP is a serial sinner, an inveterate misogynist, and a total degenerate who somehow remains the darling of evangelical extremists and the GOP base. He rejects all of the sober fiscal tenets to which his party once paid lip service (e.g., balanced budgets and free trade), endorsing big tariffs and trade wars instead. Moreover, he sucks up to an array of dictators and international thugs that older Republicans would have reviled and repudiated without question. And he would trash our revered Constitution in favor of authoritarian, strongman rule, abandon the Western alliance that has defended liberal democracies since WWII, and take the country back into a form of head-in-the-sand isolationism in a global economy. Talk about orthogonal thinking.

    There has always been a political disconnect between what is and what is believed. That has always fascinated me and raised a question in my mind. Can a democracy last if there is no relationship between what is and what the electorate believes exists? For the American experiment in a vigorous republic to succeed in the future, we must find a way to match perception with reality. The average American must be educated to the point where they can connect the most obvious dots.

    Otherwise, welcome to 1933 Germany. We might well end our fragile democracy, all because we did not pay attention to what was real and what was mere illusion.

  • Returning to the cultural divide.

    July 19th, 2024

    If there is one aspect of our political situation that is remarkable at present, it is that the cultural divide between the political parties, and between normative positions among our people, is beyond measure and resistant to any easy remedy. We cannot communicate across that divide. We cannot begin to understand those on the other side, even those within our own families. I find myself selecting (or blocking) Facebook friends based upon their expressed or presumed political values. I am doing the one thing I thought I’d never do … creating my own normative and intellectual bubble.

    I do this reluctantly and out of self-protection. Of course, I can generate reasons why individuals make decisions that baffle me. They may have less education, have had more limited life experiences, believe their life choices are diminished, sense threat from others who look or believe or behave differently, and the list could go on and on.

    Still, at the end of the day, I cannot even begin to comprehend, nor forgive, those who adore and worship the most disgusting and depraved American public figure I’ve seen in my 80 years on this earth. It remains incomprehensible that this poster boy for the Biblical Anti-Christ (Donald Trump) is the darling of those who most vocally praise Jesus Christ as their inspiration. It is like living in a bad episode of the Twilight Zone.

    Yet, it sometimes helps me to realize we have had prior periods where communications among Americans were as fraught and incomprehensible. The antebellum period before our great blood-letting in the 1860s is a prime case in point. Varina Davis, wife of Confederate President Jefferson Davis, offered to make a family friend an admiral in the Secessionist Navy while saying, “You will join us, we are going to have a glorious monarchy.” She implicitly conveyed a foundational premise of the aspiring new country in our South … she believed that the democracy on which America had been founded would be replaced by an institutionalized social hierarchy based on privilege and an inherent right to rule by those naturally born to such a role.

    No matter what kinds of legalistic arguments were employed to justify tearing the Republic apart, states’ rights for example, a more primitive impulse simmered below the surface … one hardly more defensible than the caste system in India. In seceding, South Carolina officials wrote that “our position is thoroughly identified with the institution of slavery…the greatest material interest of the world.” Their declaration of independence went on to say, “it is not a matter of choice, but of necessity.”

    The one thing that the Southern States most resented was ‘the inalterable fact that the North, like the rest of the modern world, condemned slavery as a fundamental evil. In doing so, abolitionists and their allies impugned the honor of the entire Southern race, for if slavery was indeed evil, then the South was evil, and its echelons of gentlemen, the chivalry, were nothing more than moral felons.’ Those outside the South were impugning not just an economic system but the moral basis upon which Southern society had been erected.

    This notion of a set of hierarchical positions in society, pre-ordained and beyond questioning, seems quaint and dated to most of us today. For a whole segment of the country (in those days at least), it was unquestioned orthodoxy. When U.S. Army colonel Robert Garnett of Virginia resigned his commission as an officer in the U.S. military to join the Confederacy, he confided to a British reporter: “I deride the notion that all men were born equal in the sense that all men have equal rights. Some men were born to be slaves … others to follow useful mechanical arts … the rest were born to rule and own their fellow men.” That some men (women were second-class citizens as a matter of course) were seen as inherently inferior struck many as so obvious that the question deserved no debate. It was merely the way God had ordered the world. The Southern diarist Mary Chesnut wrote, “God is on our side.” When asked why, she replied, “Of course he hates the Yankees.”

    Those in the North, for the most part, were equally rooted in their own vision of the good society. William Seward, a member of Lincoln’s cabinet, expressed a widespread feeling about the cultural gap between North and South: “… in the North all was life, enterprise, industry, mechanical skill. In the South, there was dependence on black labour, and an idle extravagance which was mistaken for elegant luxury.” While prejudice and exclusion could be found anywhere (witness how the Irish initially were treated in Boston as they fled the potato blight back home), social improvement was feasible for most. Merit and ambition mattered in the more industrial and urban North. There was no pretense to, nor romantic idealization of, a past feudal world that the South yet yearned to sustain, or at least recreate.

    By the time London reporter William Russell completed his investigatory trip through the South, he had come to a dire conclusion. “The utter contempt and loathing for the venerated Stars and Stripes, the abhorrence of the very words United States, the intense hatred of the Yankees on the part of these (Southern) people, cannot be conceived by anyone who has not seen them.” This more or less ‘objective’ observer was appalled at the chasm that had grown among Americans. He saw the break as irreconcilable.

    This was no dispute across mere political parties. This was more than a vigorous contest for resources or advantage. The gap in 1850s America was visceral and fundamental. It centered on such things as the meaning of virtue, on the very principles for organizing society, and on the definition of who was fully human and who was not. These were dimensions of the human condition that could be neither compromised nor negotiated. Only force of arms could determine who might prevail.

    Many would claim that the 700,000 or so who died in our Civil War settled which vision would prevail. It didn’t. Within a generation after the cessation of hostilities, the noble ideals on which the conflict was based faded in the face of political realities. De facto apartheid replaced the prior de jure form of institutionalized oppression and exploitation. We recreated a form of exclusion and domination by another name.

    In 1875, then-President (and the most successful military commander of Union troops) U.S. Grant may well have sensed our future problems. He said, “if we are to have another contest … of our national existence, I predict that the dividing line will not be Mason and Dixon’s (separating the North and South) but between patriotism and intelligence on the one side and superstition, ambition, and ignorance on the other.”

    Our Civil War patched over the divisions that had torn the country apart. The conflict, however, never settled the underlying tensions and differing value systems that created the initial disputes. They remained percolating under the surface. They would occasionally erupt in race riots and civil unrest, only to be tamped down for the time being.

    Now, however, we may well be back in the 1850s. Today, the Republican Party has assumed the role played by the southern-based Democratic Party back then (the defenders of white privilege). The current Democratic Party more or less has become the defender of inclusion, meritocracy, and opportunity for all. What is most concerning is that we have a major political party in this country which, for the first time in my living memory, has embraced the vision of a hierarchical society based on pre-ordained roles. We have a GOP that is dedicated to thwarting Constitutional protections and, instead, instituting authoritarian rule. They wish to end the American experiment in democracy, mostly in a misguided attachment to white, male nationalism.

    The most frightening aspect of all this is that this time around, they just might succeed.

  • The Quant Conundrum.

    July 16th, 2024

    I spent most of my professional life in academic settings. It is not surprising, then, that I’ve been partially indoctrinated into the quantitative perspective. Nothing matters unless it can be reduced to observable numbers. These representations of reality seem to tap irreducible truth in some unbiased manner. Forget the inconvenient reality that quantitative figures can be manipulated in so many arcane ways and remain inscrutable to the mere mortals who constitute the hoi polloi. They seem inviolable.

    I was thinking about this recently as I reflected on how some of the early titans of social media (Zuckerberg, Dorsey, Sandberg, and Thiel) have so easily been sucked into the quant perspective. It is a path easily followed. I did it early in my career as I became enamored of catchy planning tactics like Management by Objectives and a host of derivative tools. Pick the right goal, formulate that end as a defined metric, then measure it to death. Running your company or government then becomes an easy task, or appears to at least. You then manipulate things to optimize that outcome.

    I like to think of the adherents to this perspective as the brittle-bright crowd. They are great at solving complex engineering and computer programming challenges. At the same time, they fall short when attempting to achieve the status of Plato’s philosopher kings. The lure of rigorous analytics fails to embrace the human dimension. Alas, technical acumen is not the same thing as wisdom. How I wish it were.

    I am pushed to consider this topic by several books in which I’ve immersed myself of late. They tap a growing concern that the social media empire (Facebook, Twitter or X, TikTok, Snapchat, Instagram, et al.) has led society into a dystopian nightmare where our youth are anxious, detached, depressed, and even suicidal; our politics are polarized and divided beyond repair; and our social fabric and communal identity are being ripped asunder.

    How did that happen? Like many questions, at least one answer is easy to discern. As smart phones have taken over our lives, the social media platforms they provide us 24/7 now exert unparalleled influence over how we see the world about us. Moreover, there is a common business model behind all of these connectivity instruments and technologies.

    The prime outcome of interest is engagement … the amount of time a consumer spends on a social media site. Learning, bonding, communicating, and all other possible positive outcomes implicitly associated with these technologies mean nothing. Only engagement counts since that factor appeals to those purchasing advertising. The more people on a site, and the longer they stay, the greater opportunity for the sellers of nonsense to peddle their wares. And when a platform is dealing with upwards of $80 billion in annual advertising, all else, including any concern with perverse consequences, quickly falls by the wayside.

    Of course, these contemporary entrepreneurial gurus did not invent modern management. As far back as I can recall, there had been a push to introduce more rigor into government and the private sector. My own career was a testament to both the allure of the quant approach to things and the inherent pitfalls associated with it. This will be a mercifully quick tour.

    Some of you will remember Robert McNamara and the body counts in Vietnam. McNamara had a reputation as a whiz kid, a new kind of corporate manager in the auto industry who would rely upon numbers and not ‘feel’ or ‘experience’ when making key decisions. When he took over as Defense Secretary in the 1960s, serving Kennedy and then Lyndon Johnson, he brought his corporate culture with him to Washington. His primary challenge at the time involved choosing a key outcome to assess success in the emerging Vietnam war, a conflict with no front lines and where winning battles had little meaning. But you could measure dead bodies. That seemed easily quantifiable. If you killed more of them than they did of you, you would win … no? Poor Robert eventually would admit his errors and concede that the war had been a tragic mistake. His misguided early hubris haunted him for the remainder of his life.

    My early experiences in state government taught me several lifelong lessons, though on a less tragic level. One of my first responsibilities was to serve as the analyst for the state of Wisconsin’s Quality Control (QC) system for cash assistance to vulnerable families. Basically, samples of welfare cases were drawn, and the accuracy of eligibility and payment decisions was assessed. The state was to employ these results to improve the accuracy of welfare decisions through various corrective actions. Sounds perfect as an example of the new and hard-headed approach to governing.

    What happened, though, was that I soon stumbled on an easy way to reduce welfare errors. Forget about complex corrective action strategies; just simplify the rules. Cash welfare for families had been tailored to individual situations. The workers tried to adjust individual grants to the fiscal peculiarities a family faced. This was a noble attempt at case equity. But now, to reduce errors, we began to radically simplify those rules. Eventually, we would wind up with a flat grant where only family size mattered. Our rationale was simple: fewer decision points meant fewer chances for error.
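    To make that rationale concrete, here is a minimal sketch in Python. The rule names and dollar figures are entirely hypothetical (I no longer recall the actual ones); the point is simply that every conditional a worker must evaluate is another chance to get it wrong.

```python
# A toy illustration of why flat grants cut error rates.
# All rule names and dollar amounts are invented for illustration.

def tailored_grant(family):
    """Old-style grant: many judgment calls, each one a chance for error."""
    grant = 150 * family["size"]                  # base allowance per person
    grant += family["rent"]                       # actual shelter cost
    if family["has_heating_costs"]:               # decision point 1
        grant += 40
    if family["special_diet"]:                    # decision point 2
        grant += 25
    if family["work_expenses"] > 0:               # decision point 3
        grant += min(family["work_expenses"], 60)
    return grant

def flat_grant(family):
    """Reformed grant: only family size matters -- one decision point."""
    return 300 * family["size"]

family = {"size": 3, "rent": 180, "has_heating_costs": True,
          "special_diet": False, "work_expenses": 45}
print(tailored_grant(family))  # 715 -- every input above must be verified
print(flat_grant(family))      # 900 -- only 'size' can be wrong
```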

    The other byword of the new management was efficiency. That would be another lure for us young Turks of that era. So, I quickly concluded we had to automate government (back in the early 1970s). Until I left for the University, another obsessive project of mine (and like-minded peers) was to conceptualize and develop what was known as the Computer Reporting Network or CRN. When up and running, workers in Wisconsin’s 72 counties would merely collect data from an applicant for assistance and enter it into a remote terminal. The actual decisions would be made by a computer in Madison for the major assistance programs … cash assistance, (then) Food Stamps, and Medicaid. It was extremely efficient since only one application needed to be completed, and human input (including error) was largely eliminated.

    But one quickly learns that unintended consequences lurk everywhere. Automating welfare decisions meant that ALL discretionary decisions had to be eliminated. Every vague and opaque decision point had to be transformed into a binary choice. It was either this or that. No gray permitted.
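    In the same illustrative spirit, here is a sketch of what that binarization amounted to. The specific rules and thresholds below are invented, not the actual CRN criteria; what matters is that every question resolves to a hard yes or no, leaving no field for caseworker judgment.

```python
# Illustrative only: automation forces every eligibility question
# into a yes/no test with no room for discretion.

RULES = [
    ("resident_of_state",  lambda a: a["state"] == "WI"),
    ("income_under_limit", lambda a: a["monthly_income"] <= 650),
    ("assets_under_limit", lambda a: a["countable_assets"] <= 1000),
    ("has_dependents",     lambda a: a["num_children"] >= 1),
]

def eligible(applicant):
    """Return (decision, failed_rules). No 'maybe' is possible."""
    failed = [name for name, test in RULES if not test(applicant)]
    return (len(failed) == 0, failed)

applicant = {"state": "WI", "monthly_income": 600,
             "countable_assets": 800, "num_children": 2}
print(eligible(applicant))  # (True, [])
```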

    Between QC and CRN, we took out all the human contributions associated with dealing with vulnerable families. The new system had all the hallmarks of modern management. It was efficient, accurate, and resulted in minimal maintenance costs. But, as with all innovations, there were costs.

    Horizontal equity was sacrificed: there was little relationship between what a family needed (given unique circumstances) and what they got. Also, the human element was removed. Agency workers became data collectors and little else. They no longer helped clients with non-financial needs, which often were the important stuff. Finally, a uniform or flat grant became much easier to cut over time since benefits were less obviously tied to actual needs. Sure enough, benefit guarantees began falling over time. When I went to Washington to work on Clinton’s plan in the early 90s, I oft joked that ‘we better reform welfare soon or there would be nothing left to reform.’

    Let us segue back to the titans of social media. They are clearly members of the brittle-bright group, excellent at analytics but not so great on the wisdom gained through broad experience. In fact, Silicon Valley was known to be populated by twenty-somethings during the heady days of the rise of social media.

    These kids had an engineering mentality. They saw a problem and went after THE solution. Hard numbers were always at the heart of these solutions, and optimizing equations and algorithms paved the way. You know the drill … pick an outcome and make choices that maximize that number. In the end, they greatly simplified their world. Work up the emotional state of the customer by leading them deeper into controversial (even dishonest) content, since that enhances engagement and increases advertising revenues (i.e., profits).
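    How mechanically simple is that drill? A toy sketch follows. No platform publishes its actual ranking code, so everything here, the posts, the scores, and the single-minded objective, is my own invention for illustration:

```python
# Toy feed ranker: the only objective is predicted time-on-site.
# Post data and engagement estimates are invented.

posts = [
    {"id": 1, "topic": "cat video",         "predicted_minutes": 1.2},
    {"id": 2, "topic": "outrage politics",  "predicted_minutes": 4.7},
    {"id": 3, "topic": "local news",        "predicted_minutes": 0.8},
    {"id": 4, "topic": "conspiracy thread", "predicted_minutes": 6.3},
]

# Rank purely by predicted engagement; accuracy, civility, and user
# well-being simply never enter the objective function.
feed = sorted(posts, key=lambda p: p["predicted_minutes"], reverse=True)
for post in feed:
    print(post["topic"], post["predicted_minutes"])

# The inflammatory content floats to the top -- not by malice,
# but because the metric rewards it.
```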

    Many of these titans had fooled themselves, believing they were contributing to a new world based on universal communication, a world first intimated by Pierre Teilhard de Chardin some 75 years ago. In truth, they were trapped in a classic version of the prisoner’s dilemma associated with any investment initiative involving venture capitalists. They had to keep pleasing advertisers and thus increasing revenue streams. If they did the right thing for the public good, less scrupulous competitors would overtake them. There was no way out of the trap they were in, no matter the costs to people and society.
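    The structure of that trap can be written down as a textbook two-player game. In the sketch below the payoff numbers are invented; only their ordering matters, and it is that ordering which makes chasing engagement the dominant strategy:

```python
# Two platforms each choose: 'moderate' (protect the public good) or
# 'maximize' (chase engagement). Payoffs are illustrative revenue units.

PAYOFFS = {  # (A's choice, B's choice) -> (A's payoff, B's payoff)
    ("moderate", "moderate"): (3, 3),   # both healthy, both solvent
    ("moderate", "maximize"): (0, 5),   # the good actor loses its users
    ("maximize", "moderate"): (5, 0),
    ("maximize", "maximize"): (1, 1),   # race to the bottom
}

# Whatever the rival does, maximizing pays more (5 > 3 and 1 > 0),
# so 'maximize' strictly dominates -- hence the dystopian equilibrium.
for b_choice in ("moderate", "maximize"):
    a_if_moderate = PAYOFFS[("moderate", b_choice)][0]
    a_if_maximize = PAYOFFS[("maximize", b_choice)][0]
    print(f"If the rival plays {b_choice}: "
          f"moderating pays {a_if_moderate}, maximizing pays {a_if_maximize}")
```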

    Those costs are proving to be extraordinarily high and show no sign of abating. A dystopian future seems inevitable at the moment. But their intentions were good, as were mine, back in my early days.

  • Belated Independence Day musings.

    July 10th, 2024

    Roughly a week ago, we celebrated our independence from Britain. Of course, we screwed up the significance of that holiday … something we do with surprising regularity. At the time, John Adams predicted that future generations would celebrate our independence on July 2nd. It was on that day in 1776 that the colonial representatives gathered in Philadelphia voted to break with the mother country. That was considered the major event in the moment. The 4th, when the final text of the Declaration of Independence was adopted, was a relatively minor afterthought. But the vote two days earlier was the critical moment.

    The motivations behind this revolution are murky and not always uplifting. We all have heard the ‘taxation without representation’ rationale. While there is some merit to this argument, the case is not compelling. After all, the Crown had emptied its treasury during the Seven Years’ War (aka the French and Indian War) to defend these (as it turned out) ungrateful colonists. This conflict preserved the colonists’ sovereignty over the Eastern seaboard from incursions by the French and by Native Americans, who by then realized the enormity of the mistake they had made by not pushing the first Europeans back into the sea. In effect, the Crown had made it possible for the early settlers to solidify their hold on the land and to develop lucrative trade and commerce relations.

    The Native American story in New England was particularly sad. The first Europeans touched on these shores before 1620 but were easily run off by the indigenous tribes, who had superior numbers. Still, physical contact had been made, a disastrous interaction for the original inhabitants of this land. Diseases for which they had no immunity spread quickly, drastically reducing the coastal population. Later probes along the New England shores by various explorers found largely depopulated villages and local tribes who could no longer effectively defend their lands.

    Fast-forwarding a century and a half, let us consider the situation in the 1760s from the British perspective. They thought, not without reason, that the British subjects living in North America ought to share in the costs of protecting the still vulnerable colonies from future external threats. But Parliament’s efforts to impose even modest taxes were met with fierce opposition. This was to remain an American trait that still dominates our national character … we want public services but despise paying for them. We want a free ride.

    This is a form of extreme selfishness, though it remains difficult to separate principles from profits. The Boston Tea Party has been sold as a noble act of civil disobedience. At the same time, all that British tea waiting to flood the market would have depressed prices and cut into the profits enjoyed by rich merchants like John Hancock. Perhaps principle and profit are two sides of the same coin.

    Still, the bravery of the early revolutionaries cannot be questioned. When the British marched out of Boston in April 1775, they were looking for arms stored at Concord and for several revolutionary agitators at Lexington who might well have been hanged as traitors. During the eight-year struggle, it was (until the defeat of Cornwallis in 1781) a close-run thing. The top revolutionaries clearly had committed treason against the Crown. The first few battles (after Bunker Hill and the British evacuation of Boston) showed the colonists to be no match for the most powerful military in the world at that time.

    America’s forces were first cornered in Brooklyn. Only a miraculous turn of weather enabled Washington to escape the vise-like trap laid by the Brits. Brisk winds kept the fleet from sailing up the East River to close the ring around the beleaguered patriots. Then, a fog rolled in, enabling Washington to escape to Manhattan during the night and eventually to get out of town. He fled with the Brits close on his heels.

    That reprieve was short-lived. By the end of that first year, the Continental Army had dwindled to a skeleton force, with many enlistments ending on December 31st. It was then that the so-called Father of our country pulled a rabbit out of the hat. His forces crossed the Delaware River at three points on the night of December 25 to attack the Hessians based at Trenton. The logistics were a nightmare, and many of his troops never got across. Moreover, it took longer than anticipated. Worse still, a Tory spy saw what was happening and raced to Trenton to tell the Hessian commander what was unfolding. Had the element of surprise been lost, Washington’s last gamble would have been doomed. His men could not have defeated a prepared, professional Hessian force.

    A note from the spy was delivered to the Hessian commander. Unfortunately, he was busy drinking and gambling in celebration of Christmas that night. He wasn’t concerned. After all, this ragtag American army was all but defeated. He put the warning in his pocket without looking at it, an act that would cost him his life a few hours later. Thus, it turned out to be a total surprise attack and an astounding success. But it so easily could have turned out differently.

    I have always thought that the most relevant example in my lifetime of a similar underdog coming out on top is the Vietnamese nationalists. They fought longer than the Colonists, well over four decades, and suffered far more than did our ancestors. But one thing is the same: they took on some of the greatest military machines of their era against all odds. And they won! These nationalists took on the Japanese, then the French, and then the Americans. Their odds of prevailing probably seemed about the same as the Colonists’ back in the early days of our revolution.

    The Colonists, like the Vietnamese, were sharply divided. Some estimates put the proportion seeking independence at no more than a third of the population, with another third remaining loyal to the Crown and the remainder indifferent. Benjamin Franklin’s own son remained a loyal Tory and British official, thus ending any positive relationship with his dad. My late wife’s ancestors (on her dad’s side) can trace their lineage to pre-revolutionary times in Massachusetts. However, they left the Boston area around that time. My guess? They had remained loyal to the Crown and were forced to move west. In about three generations, they settled for good in the Twin Cities.

    In any event, independence would have come in due course. Britain, at one time, exerted direct or indirect control over a quarter of the globe’s population. Truly, the sun never set on their empire by the latter part of Queen Victoria’s reign. Yet, slowly, it all fell away. The business model of extracting resources and rents from others through force proved unworkable (and too costly) in the end. America was the first to break away, but eventually, all the others followed.

    Before ending these musings, let us revisit whether some uplifting principle might be found in America’s revolution. While mercenary motives cannot be discounted, I do find at least one uplifting purpose. It is the rejection of authoritarian rule. Enlightenment thought was circulating among the intellectuals of Europe. Those ideas would seep into the thinking of the American founding fathers. These notions would, in turn, soon rebound back across the Atlantic to invigorate the French Revolution.

    Ironically, that 1789 seismic rupture in the old world was a direct result of the American revolution. In 1778, after the Battle of Saratoga, France finally succumbed to the entreaties of Franklin and Adams to support the uprising of the colonies. This cost the French monarchy much treasure, eventually leading the king to summon the Estates General into session in order to raise additional revenue. That proved to be an unwise move in the short run. His reign and his life soon were forfeit.

    These two intertwined revolutions, the one in America and the one in France, initiated the slow and uncertain evolution toward democracy and governing principles that embraced citizen participation in public decisions. Progress, however, was slow. It took America some 170 years to achieve more or less universal suffrage. France bounced back and forth between republics and authoritarian rulers for many decades before a republican approach finally stuck.

    Now, in 2024, where are we? France just underwent a troubling election where the hard right seemed on the verge of taking control. That had not happened since World War II, when the Nazis installed a Vichy puppet government. And in America, most now believe that the electorate will hand over the reins of power this November to a pathological narcissist and degenerate sociopath who vows to install an authoritarian regime. There is no crisis demanding such a bold and dangerous action. It would appear that there are a sufficient number of Americans fully prepared to cede control of government to a maniac who will dismantle the rule of law and our constitutional protections. Wow! Their cognitive processes defy logic.

    I’ve noted this before. Some acute observers assert that the American experiment did not start with the Declaration of Independence (1776), nor the Treaty of Paris (1783), nor the ratification of the U.S. Constitution (1788). No, the real American revolution occurred in 1801. It happened when John Adams looked at the electoral results and realized that he had lost his bid for reelection to his bitter political rival, Thomas Jefferson. He did not call out the troops. He did not claim foul and refuse to leave the White House. When the time came, he merely got in his carriage and started the long ride back to Boston. His actions were light years away from Trump’s utterly narcissistic actions on January 6, 2021.

    Is the experiment in American democracy over? Has all this been in vain? Time will tell, but I am not optimistic.
