Tom's Musings

  • The notion of progress … linear or not!

    December 19th, 2023

    Kailash Kanoria recently posted a blog that summarized a set of transitional points during which the world as we knew it advanced in a sharp and unexpected manner, at least for a while and in selected locations. For example, the Renaissance turned western mankind away from an obsession with past ‘golden ages’ toward a more optimistic and human-focused future. The subsequent ‘age of exploration’ opened isolated societies to new possibilities of shared cultural experiences and ideas. The onset of the ‘scientific age’ further developed methods for understanding the world around us rather than relying upon divine or given truths. Each of these (and other) human transition points appeared to build on prior epiphanies and lead to the next set of insights. If true, can we assume progress is now linear and inexorable? Might we indeed anticipate an ever brighter future?

    Let us think about that prospect for a moment. If we look back over recorded history, we can find a number of moments and places where humans demonstrated considerable ingenuity and (relatively speaking) extraordinary insight. Such periods might be found in the Fertile Crescent (between the Tigris and Euphrates rivers), in ancient Egypt, in the Indus Valley, in the river valleys of China, and in Meso-America. In such places, civilization established conditions that enabled some members of society to consider questions beyond mere survival. A few members could focus on the arts and the more esoteric imponderables in ways that might explain the mysterious world about them. Ancient stone monolithic arrangements often speak to early efforts to understand and control the natural world.

    Somewhat more recently, we had the blossoming of philosophical and political thought during the Hellenic age in Greece, in the Macedonian and Persian empires, and then in the Roman Empire, which extended from Britain to the Middle East and was soon followed by the Byzantine Empire after the collapse of the Roman West. Even then, the world did not fall into a total dark age. The Islamic Golden Age, centered in Baghdad, lasted for at least five centuries (the 7th to the 12th) during which thinkers from all the known cultures of the world were encouraged to collaborate and advance human knowledge. In that age, there were many breakthroughs in mathematics (algebra was invented), astronomy (many stars were named during this period), human physiology, optics, poetry and the arts, and so forth. It was a kind of Renaissance in the Middle East.

    The Mongol horde eventually sacked Baghdad, the center of this cultural revolution in science and thought. While Genghis Khan was a fierce, if illiterate, warrior, he was no dummy. His empire stretched from Eastern Europe to the Pacific Ocean. While it lasted, he introduced many reforms and innovations that anticipated the modern world. He instituted ingenious concepts in trade, communications, governance, and financing that later proved critical to sustaining more modern forms of enduring nation states.

    Still, we often think of the modern world as emerging during the European Renaissance of the 14th century. That explosion of new thinking and new ways of viewing the world was succeeded by the ‘age of discovery,’ the ‘age of early science,’ the ‘age of enlightenment,’ and finally our modern world. As noted, Kailash Kanoria recently posted a blog surveying these more recent Western ‘ages.’

    I think about this past and wonder. While we had many moments when civilization had enough going for it to spur a brief explosion of new thought and innovation, none of these endured over time. Some imploded due to natural causes or cultural disasters. Others were smothered and extinguished by more aggressive (if culturally inferior) neighbors. Even in our contemporary world, we have seen some nations with free thought and opportunities for human advancement. But that situation has never been universal. We inevitably see other parts of the world remain dark and mired in mythical or backward thinking. The world of science and progress and advancement might well be fragile and temporary.

    Here is my point, and my question. Despite the ephemeral nature of past explosions of learning, is this one likely to last? Are we finally on the path of linear and lasting progress? After all, the ‘singularity’ is supposed to be just years away. Humans then will join with machines so that progress will be institutionalized in some permanent form where technology and human ingenuity combine in a synergistic and continuously evolving manner. A comforting thought, don’t you think?

    Then again, we have all these signs that science itself, the foundation of our modern world, is now under attack. Politicians call for new forms of Christian nationalism to replace our secular democracy. The worst scenarios of the dystopian, yet prophetic, novel by George Orwell (1984) appear to be gaining traction. Even medical science is discredited as the wild assertions of wanna-be totalitarian types become the new truth. Demagogues, like Trump, rile up their conservative followers with unfounded fears of new threats everywhere, then suggest that only they can save them.

    Many believe the 2024 election is fundamental to the American experiment. Will Western society continue to evolve under a regime based on rationality and evidence, or will we once again descend into darkness? Consider the following. The scientists in 10th century Baghdad who were perfecting mathematics and looking out at the world with increasing rigor probably saw no end to their relatively advanced view of reality and the natural world. Soon, however, it would all disappear, with mere fragments of their advances resurfacing in Europe several centuries later.

    It is hard to overlook the prognosis of famed astronomer Carl Sagan, which he made shortly before his death in the 1990s:

    “I have a foreboding of an America … when technological powers are in the hands of a few, and no one representing the public interest grasps the issues; when people have lost the ability to set their own agendas or knowledgeably question those in authority; when, clutching our crystals and nervously consulting our horoscopes, our critical faculties are in decline, unable to distinguish between what feels good and what is true. We slide, almost without noticing, back into superstition and darkness.”

    He was talking about a future in his children or maybe grandchildren’s lifetimes. He was talking about today. I hope he was wrong, but I cannot say with certainty that he was.

  • Higher education … a thought or two on ‘doing policy.’

    December 15th, 2023

    Not long ago, a former colleague sent me a video clip that she needed my permission to use. In it, I expounded on the difficulties of doing what we call ‘public policy’ or the making of decisions impacting the broader good. In the clip, I repeated a firm belief of mine … while doing policy may look easy, it definitely is not for the faint of heart. I even shared a favorite mantra of mine … the solutions of one generation are the scandals of the next. Problems we thought solved, or at least solvable, inevitably resurface in even more intractable and diabolical forms.

    So-called ‘wicked’ policy issues are the worst. Such challenges involve confusing or contrary goals, disputed or nonexistent evidence, competing or incomplete theories, and proposed solutions that reek of contradiction and difficulty. Even where solutions are forthcoming, and an apparent consensus achieved, it is not long before shortcomings are realized and unintended consequences emerge. No one can anticipate all the effects attached to a policy change. We cannot calculate all the intricate interactions that emerge with time or sense all the subtle, unexpected impacts that eventually manifest themselves.

    Such epiphanies were thrust upon me during the many years I was immersed in various welfare reform battles. In that contentious arena, one policy truism became an inescapable reality … it was impossible to please everyone. More likely, you would please no one. During those difficult days, I embraced another of my inviolable mantras … I knew I was approaching a truth on welfare reform when literally no one agreed with what I was saying. That made my professional road rather lonely indeed.

    Now, some policy dilemmas were technical in character and, by their very nature, beyond resolution. For example, there was the ‘iron law’ of welfare reform. In this conundrum, you could not satisfy certain ends simultaneously no matter how hard you tried. You could not design a reform package that would result in adequate welfare grants while achieving target efficiency and fostering a positive labor supply … at least within acceptable cost parameters and at the same time. It just couldn’t be done.

    Consider the following. You could achieve the goal of eliminating poverty by raising the welfare guarantee (what the beneficiary with no other income might receive) to sufficient levels. But then you would negatively impact positive labor supply expectations since beneficiaries would have little incentive to work. You might then fool around with what are called ‘marginal tax rates’ or the rate at which benefits are decreased in the face of any earnings. Lower rates improve the reward for work (by putting more money in the person’s pocket), but that erodes target efficiency … the proportion of welfare expenditures that go only (or primarily) to families below the poverty threshold. You might consider mandated work regimes, but those strategies quickly become budget busters. Just trust me on this one, you cannot resolve this small set of policy ends simultaneously. That is why the welfare debate endured for so long. [Note: these conundrums were resolved by pretty much ending cash welfare to poor families and accepting poverty levels higher than those found in our peer nations but, oddly enough, acceptable to most Americans.]
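    The mechanics behind this trade-off can be made concrete. Below is a minimal Python sketch, using purely hypothetical dollar figures and rates (not actual program parameters), showing how the guarantee and the marginal tax rate jointly determine the break-even point … the earnings level at which benefits phase out and below which everyone receives some payment:

```python
# A minimal sketch of the 'iron law' trade-off described above.
# All dollar figures and rates here are hypothetical illustrations.

def benefit(guarantee, earnings, mtr):
    """Monthly benefit: the guarantee reduced by the marginal
    tax rate applied to any earnings, floored at zero."""
    return max(0.0, guarantee - mtr * earnings)

def break_even(guarantee, mtr):
    """Earnings level at which the benefit phases out entirely.
    Everyone earning below this amount receives some payment."""
    return guarantee / mtr

# A modest guarantee with a steep phase-out: decent target
# efficiency (aid ends near a low earnings level) but a weak
# work incentive, since each earned dollar costs 67 cents of benefit.
print(break_even(800, 0.67))   # roughly $1,194/month

# Raise the guarantee to 'adequate' levels and soften the phase-out
# to reward work, and benefits now reach families earning well above
# a hypothetical poverty threshold, eroding target efficiency.
print(break_even(1500, 0.30))  # roughly $5,000/month
```

    Raising the guarantee or lowering the tax rate pushes the break-even point up, so benefit dollars flow to families above any fixed poverty line, which is precisely the erosion of target efficiency described above.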

    Complicating policy even more is our diversity of values. For better or worse, we are a heterogeneous society. That is, we do not have a common culture based on shared understandings. We don’t even come close. Our governing norms are diverse and often contradictory even at their core. We disagree about the nature of truth, about our views of false positives and negatives, about what constitutes worth and meaning. Some of us see people as basically good, and we tend toward positive reinforcements to enhance that goodness. Others of us see people as essentially evil and tend toward universal punishments to effect proper behaviors. Compromise across such different world views is difficult, if not impossible. And such divergent perspectives are everywhere across this strife-ridden land.

    The tendency for many (in America at least) to rely upon absolutes (an inability to engage in nuanced thinking) reflects one dimension of our national dilemma. Abortion is always evil and never can be tolerated for some. For others, it is a matter of basic individual choice and personal freedom. And for still others, it is to be avoided where possible, but the quality of life and other mitigating factors can be considered when deciding if it is appropriate.

    Another example: Is it better to incarcerate ten innocent men rather than let one guilty one go free, or just the reverse? We differ greatly on such hypotheticals, yet such primordial dispositions frame our sense of what constitutes justice. And another: Should we grab onto an ancient text written over hundreds of years by wildly different authors and then organized by a committee driven mostly by politics, and insist it is God’s word that would trump even our nation’s Constitution? Or should we rely upon, and be guided by, centuries of thoughtful evolution that has resulted in the scientific method, something that has opened up an unfathomable universe to our understanding? Finding common ground is difficult when there are rifts within our foundational thinking … when claims of divine inspiration are deemed superior to rigorous thinking and advanced thought.

    The bottom line is this. Policy is hard because humans are involved. We sometimes can work together and solve intricate issues when they are technical in nature. We decided to put a man on the moon and did it. But those challenges that tap our human side are something very different indeed. We declared a ‘war on poverty’ and found that an unsolvable conundrum. Why? In part because we had fundamental disagreements on how people functioned and were motivated. We differed on the very nature of reality and truth.

    And so I looked upon the beleaguered Presidents of three top universities … Harvard, MIT, and Penn. These are very smart people with years of management experience behind them. But they wound up being crucified for their defense of the principle of ‘free speech’ on campus. How could an inviolable foundation of intellectual life, a sacred belief within academia, bring them so low? [Note: the President of Penn has resigned while the other two are under withering fire.]

    They fell from grace, not because they were crucified on the horns of a technical dilemma like the iron law of welfare reform. No, they ran afoul of a newer form of political persecution. In today’s gotcha world, the game involves putting leaders in impossible situations by confronting them with foundational beliefs that rub against one another with considerable friction. Then, the inquisitors go in for the rhetorical kill. On either side of virtually all contemporary issues are partisans who hate those on the other side. This visceral animosity was made popular by Newt Gingrich in the early 1990s, though previous versions existed (for example, in the pre-Civil War 1850s). No compromise is tolerated. Governing is a zero-sum game. You lose if the other side doesn’t, even when they act on principle and for the common good.

    When power is everything and winning is all, intellectual niceties such as science and evidence are easily sacrificed. Okay, some science is still trusted where only technical issues are involved (a smaller and smaller set). Increasingly, even once protected areas of rigorous inquiry are subject to partisan nonsense. Medical science previously was looked to as a way of advancing our health and well-being. Recently, however, even the most respected medical scientists are vilified and threatened with jail simply because that keeps members of the conservative base angry and inflamed. When we cannot trust the best science, whom can we trust?

    While we try to govern and make decisions through accepted process and established evidence, it all comes down in the end to questions of trust. There is a great scene in Hidden Figures, the movie about the Black women who did the NASA calculations for the early space shots. When astronaut John Glenn had concerns about the numbers he was given based on computer algorithms, he asked that the ‘smart’ lady redo them manually. It was a matter of trust for him. Back then, he trusted a human over a machine.

    I was thinking about this trust issue recently in the context of the evidence that the long expected (theoretically) Higgs boson sub-atomic particle exists. The only way to prove it involved sending beams of protons hurtling at each other near the speed of light. This required an apparatus so big and intricate that it straddled the border between two countries and drew fiscal outlays and intellectual resources from some 23 countries. Even then, the products of each collision lasted only a tiny fraction of a second before decaying. Difficulty did not matter. Detecting this elemental particle was seen as a fundamental step toward confirming the ‘standard model’ of particle physics. It was extremely important to those who worry about such things. When they discovered this long-sought particle about a decade ago at immense cost (the Large Hadron Collider alone cost over $9 billion back when it was built), the entire world of physics rejoiced. And the rest of the world believed they had actually found something worth being excited about.

    But here is the thing. We mortals cannot verify that they found anything. In the end, we must trust their integrity and the scientific methods they used. Without such elementary trust, we risk reverting to a new ‘dark age.’ We are in the atomic age because our leaders, driven by the necessity of war, believed in the math posed by a handful of scientists no one understood. We proceeded toward nuclear fission based on leaps of faith.

    The Presidents of the major universities now under attack had a much more difficult task. When they testified before Congress, they were going to be crucified no matter what they said. Still, it struck me that they remained true to their calling as they tried to make intellectually nuanced arguments that they must have known would satisfy no political extremists in the end. When asked repeatedly why some students could express the most hateful opinions and inflammatory rhetoric that called for violence against others, they waffled and danced and kept talking about context and freedoms. They may have been correct but, in today’s environment where only absolutes satisfy, they were doomed to ridicule at the least and outright sacrifice at the worst.

    I could not do any better. I would have defended free speech as a principle. I do believe that people have the right to say really stupid things and, if pressed, will fight for their right to be stupid. But I do have a line in the sand. I rather agree with the old adage that ‘you cannot shout fire in a crowded theater.’ (That was Justice Holmes, not Brandeis.) Likewise, I don’t believe people can explicitly call for violence against others in public places and forums absent any cost. When you put others at imminent risk of physical harm or death, you have gone too far in my book. But that is merely my book.

    I might have tried something like that were I in their shoes. And I would have fared no better than they. In short, there are no acceptable responses to today’s political questions or some of our wicked policy challenges. And so, doing policy these days is still not for the faint of heart. At the same time, someone has to step up to the plate and try. Otherwise, we are doomed.

  • Higher Education … the luster has dimmed.

    December 12th, 2023

    There is an internet site that too often caters to readers who pose questions about elite schools. Is Harvard better than Yale or Princeton? Should they go to MIT or Cal Tech? Can they still get into an Ivy League school if they got one B in the 2nd grade? The variations on this theme are endless.

    I keep wanting to scream that any differences among these institutions, at least in terms of pedagogical advantages, are minuscule. What the interrogators really are asking is whether school A will offer them better social and economic contacts for success later in life than school B. To my mind, they couldn’t care less about education. They are obsessed with their prospects for securing economic advantages during their careers. They look upon their higher educational experience from a purely transactional perspective. Perhaps I’m being overly cynical, but that’s how it looks to me.

    My college career started about six decades ago. 😳 Okay, so we are talking prehistory here. But things appeared very different back then, or so it seems to me. College was affordable, even for a working class stiff like myself. I made it through a pretty good private school with virtually no help from my parents. I recall getting some marginal assistance for one semester. My mother’s hand shook as she wrote the check. She probably was thinking that the money she was wasting on me could be going for more important things like beer and cigarettes.

    The important point is that higher education was affordable in the 1960s. Years later, when my Peace Corps group gathered and reminisced on such matters, we found it remarkable how many of us were from very modest circumstances, often the first in our families to try college. (The tuition at very good public schools in California was virtually free back then.) Yet, many of us managed to make it through excellent schools (e.g., Yale, Columbia, Berkeley) on our way to careers that made significant contributions to society. College was not only affordable, but proved a stepping stone to personal change and social fulfillment.

    I myself did not go to an elite school. After a stint studying for the priesthood in a Catholic seminary (the Maryknoll foreign missionary society), I matriculated at Clark University, located in my home town … then known in my circles as a den of Communists and atheists. But I could live at home, and they would take me in the spring semester. Recently, I came across an article which claimed that Clark was one of a handful of American schools that took in average students and transformed them into academic stars who wound up in academic careers at elite schools. That was me. I did not graduate in the top quarter of my high school class (admittedly a good school) and wound up as a Senior Scientist at the University of Wisconsin-Madison.

    My point is this. So many in my generation did not look upon higher education as a stepping stone to getting rich. I never selected courses based on their utility for some future, hypothetical career. Such mundane purposes never crossed my mind, even as my family pressed the notion that college would make me wealthy. No, I chose courses because (gasp!) they appeared interesting and I might learn something. Things like career trajectory and earning potential were decidedly secondary considerations. I, and my peers at the time, saw college as a place for learning and forming our lifelong values. It was an environment where we might become thinking beings. Fortunately, we were not disappointed.

    Decades later, I would run across fellow ‘Clarkies’ from the 1960s. To a person, we would think fondly back to those days when we would spend hours debating the issues of the day and then remember that we had better study a bit so they wouldn’t kick our fannies out. Back then, grades were earned and meant something. My GPA was good enough to graduate with honors. However, it would have been too low for me to make the cut for the U. of Wisconsin Masters in Social Work program where I later taught. (Note: they gave us admissions committee members some methods for adjusting GPAs from those early years when grades still counted. Today, close to 80 percent of grades at many elite schools are in the A range.) On the committee, I would see endless 3.8 and 3.9 GPAs accompanied by personal statements that were incoherent and, frankly, embarrassing.

    I recall those endless dialogues at Clark on the major issues of the day as the forum through which I sharpened my analytical skills and upgraded my ability to express myself. The conservative, Catholic, working class boy who merely accepted what he had been taught was replaced by a thinking young man who developed his own moral and intellectual center. I had to work hard as I concluded that the war in Vietnam was unsupportable. I did not merely accept the opinions of others. I had to convince myself of that conclusion. That process was very demanding but worthwhile in so many ways. All these decades later, I remain convinced that I was right.

    Clark was where my real education took place. And not just mine. All my closest friends at the time came from equally modest backgrounds. All (including my two girlfriends from that era) went on to get doctorates. One of those close female friends later became a Dean at a major university and the other a research scientist. Everything seemed possible in those days when upward mobility was more the norm … before conservative orthodoxy subverted meritocracy with the neo-liberalism introduced by Ronald Reagan. All seemed possible, no matter how humbly born.

    As you know, I would end up working at an elite, research-oriented university. I helped run a nationally recognized research entity and taught policy courses at both the graduate and undergraduate levels. For many years, I also served on the Master’s admissions committee for the School of Social Work. Beginning in the 1980s, I saw a fundamentally different student before me. They increasingly were debt-ridden and driven much more by career concerns. Who had time to learn and debate and refine one’s world view when you might end up homeless in a few years? The college experience was becoming more of a vocational experience and career training ground where deeper thought was now a luxury one could hardly afford to pursue. Moreover, college quickly was becoming enormously expensive even as many of our peer nations continued to offer higher educational opportunities at a nominal cost, if not free.

    Back in my day, ‘developing a philosophy for life’ was an important goal for a majority of college students. Over time, that goal was replaced by ‘making money.’ In recent times, the perceived importance of a college education among the young fell to about 40 percent … it had been about 75 percent. The percentage of high school grads headed to college has fallen by 8 percentage points recently, perhaps not a bad thing since at least half will drop out before completing their studies. The vocational ‘value’ of a diploma is eroded when too many have these pieces of paper of dubious worth.

    It strikes me that my generation (many of us at least) knew why we wanted the college experience. We wanted to embrace learning for learning’s sake. Making money was the least of our concerns. Acquiring skills had some merit, but we knew education would be a lifelong pursuit in an ever changing world. We wanted to figure things out for ourselves because we knew we had to. At Clark, I don’t recall any liberal indoctrination whatsoever. But I was encouraged to think things through on my own. I intuitively realized that the seeking of truth was a personal, even lonely, endeavor. That made all the difference in the world for someone who grew up in a world dominated by absolute truths. What a freaking gift!

    This is not what I started out to discuss. I wanted to explore the new orthodoxy on campuses as it has exploded across the news in recent days. But that will wait for a future blog … perhaps the next one.

  • Health care in America … markets and our vulnerable citizens.

    December 9th, 2023

    Another mantra I used in the Poverty & Policy series was ‘the test of a nation’s morality is how it treats its most vulnerable citizens,’ though I was far from the first to employ this quite expressive trope.

    Surely, the young and old count as vulnerable in any overall population. We often lump them among the worthy poor since their age makes it less likely, though not impossible, that they can fend for themselves. Thus, society typically takes a paternalistic attitude toward their well being, or so we believe. We can’t let the young and old suffer, can we?

    Let us take a quick look at how our health and related care systems deal with these more sympathetic subgroups. In short, not well. Shockingly, we find U.S. maternal and infant mortality rates are among the highest, if not the highest, within OECD nations (generally those most like us economically). For example, the infant mortality rate in the U.S. is three times higher than the comparable rate in Japan. One might attribute such disparities to lifestyle choices, but that may simply be rationalizing away differential policy regimes.

    When you get into the teen populations, we find strikingly high rates of death by suicide and homicide. What are called assault deaths in the U.S. (often via guns) occur at a rate some 2.7 times higher than the average among OECD countries. Death by violent means now ranks among the leading causes of death among teens. Widespread anomie and hopelessness, coupled with the easy availability of weapons and dangerous prescription drugs, leads to many early exits from life, some self-imposed and others a consequence of the casual carnage on our streets. Again, policy failures and/or indifference play a role.

    A quick detour to the other end of the age spectrum. Here, the policy response is a bit mixed. As far back as the 1960s, we passed Medicare and Medicaid during one of the brief spasms of progressivism in the U.S. Since the early 1980s, we have returned to form, relying more on market forces to provide efficient and equitable care to the elderly. After all, that is the inviolable truth according to neoliberal economic orthodoxy. Our fascination with markets now leads citizens to visit Canada for affordable drugs (if they can) and to take trips to Europe for joint replacements and other procedures available there at reasonable prices. The one contrary blip in the slide toward a conservative medical market in recent decades has been Obamacare, which caused outrage, anguish, and much screaming among conservatives.

    One result of this market-focused approach can be found in how we deal with those elderly needing more intensive care, what we usually term assisted living or enhanced service care. By one account, some 850,000 Americans get such help, and many more need it. But our system of care views this as a profit center for private interests. It has been estimated that half of all facilities providing such care have an ROI (return on investment) of 20 percent or more. Monthly charges range from $5,000 to $10,000.

    I have some personal experience here. My spouse had early onset Alzheimer’s. I could take care of her at home for a number of years, but the time came when that proved very difficult, if not impossible. The initial cost of her care was about $5,500 per month, but her needs quickly increased with time. After some three years, the charge was closer to $10,000 monthly. Fortunately, we had Long Term Care (LTC) insurance, due to her prescient diligence on this matter. But, in a discussion with the facility CEO, I found that only a small minority of the residents were covered by such insurance, and not many had the kind of quality coverage we did. The vast majority were paying out of pocket.

    We face soaring costs for dealing with our aging citizens. Many cannot afford costly institutional care, leaving families to patch together home care as best they can. Again, differential policy regimes across countries are apparent. Most other countries act more aggressively to control prices and to expand care opportunities for the high-need elderly. Once again, we stand out among wealthy nations as relying upon the market to handle things.

    Here is my bottom line. We do need market forces to ensure a dynamic and vigorous economy. Few doubt that. But unfettered faith in markets is, frankly, unsupportable. The Adam Smith fantasy world of free individuals transacting in transparent and open markets to the collective advantage of all borders on the delusional. The imperfections in the medical market should be obvious to all. Who can negotiate the best deal while experiencing a cardiac arrest? How can the average citizen bargain for the best prices when pricing is so convoluted as to defy logic? I have a freaking Ph.D. and I cannot understand the medical statements I get. Just this week I got a statement (not a bill) for an office visit from last June.

    Our over reliance upon market forces to deliver quality health care is a failure. Years ago, I read an illuminating book written by a journalist comparing approaches to health care across nations. The author used his own medical issue to see how he would fare under differing policy regimes. (There are three major approaches … national health services, single-payer systems, and mandated insurance schemes. The U.S. has all three, depending on the population served, in a patchwork system, a medical world that defies all logic.) Hands down, he found the American system to be the most opaque, inefficient, and mismanaged. Of all things, he found France to be the best.

    There are many things about the country into which I was born that confuse the hell out of me. Near the top of the list is this. Why do Americans pay so much for such a substandard product? I ponder this every time I call for a doctor’s appointment in a medically rich community like Madison, Wisconsin and am told the next available slot to see the doc is six months out. 🙃

  • America’s embarrassing health care financing system.

    December 6th, 2023

    Time to move beyond my poverty series. I ended that run by stressing the fact that America’s poor record in addressing economic vulnerability, at least when compared to our peer nations, can partly be explained by explicit national policy failures and outright neglect. That is, other nations that look like us in most important ways have more sensible and compassionate policy regimes. That same insight, assuming it ranks as an insight, might be applied to health care in the U.S.

    Let us start with some basic facts. (Note: some numbers vary across reports due to variation in the years used and other technical causes, but the relative positions are unchanged).

    The U.S. spends almost $13,000 per year, per person on health care. That comes to some 17.8 percent of our GDP. That is way more than our peer nations spend. Germany, with the next highest outlays, lays out $7,400 per citizen. Canada, our neighbor to the north, which shares much with us in terms of culture and other external factors, spends $5,900 per Canadian and, like every other advanced nation, covers all of its citizens. Japan, further down the list, lays out only $4,700 per person.

    Okay, we spend more than all other countries. But we get more in return … right? In truth, reality could not be further from this seemingly reasonable presumption. Let us start with a global metric … life expectancy. Below, I lay out expenditures as a percent of GDP (Gross Domestic Product) and measured national life expectancy averages:

    Nation     % GDP    Life Expectancy
    Germany    12.8%    83.4 years
    France     12.4%    85.5 years
    Canada     11.7%    84.7 years
    Japan      11.1%    87.6 years
    U.S.       17.8%    79.3 years

    Just in case you missed it, we spend way more of our national wealth on health care and still die earlier. One might be tempted to blame issues generally beyond the scope of normal medical interventions (e.g., lifestyle choices). But, even here, the U.S. comes out poorly. The number of annual avoidable deaths (AD per 100,000 persons) suggests a sorry state of affairs in the States:

    Nation     Avoidable Deaths (per 100,000)
    Germany    195
    France     164
    Canada     171
    Japan      137
    U.S.       336
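    For readers who want to check the arithmetic behind these comparisons, here is a small illustrative Python sketch. The figures are transcribed from the two tables above; the variable names and the script itself are my own, purely for illustration:

    ```python
    # Figures transcribed from the tables above: health spending as a share
    # of GDP, average life expectancy (years), and avoidable deaths per
    # 100,000 persons.
    data = {
        "Germany": {"gdp_pct": 12.8, "life_exp": 83.4, "avoidable": 195},
        "France":  {"gdp_pct": 12.4, "life_exp": 85.5, "avoidable": 164},
        "Canada":  {"gdp_pct": 11.7, "life_exp": 84.7, "avoidable": 171},
        "Japan":   {"gdp_pct": 11.1, "life_exp": 87.6, "avoidable": 137},
        "U.S.":    {"gdp_pct": 17.8, "life_exp": 79.3, "avoidable": 336},
    }

    us = data["U.S."]
    for nation, d in data.items():
        if nation == "U.S.":
            continue
        # How much more the U.S. spends (as a share of GDP), and how its
        # avoidable-death rate compares, relative to each peer nation.
        spend_ratio = us["gdp_pct"] / d["gdp_pct"]
        death_ratio = us["avoidable"] / d["avoidable"]
        print(f"vs {nation}: U.S. spends {spend_ratio:.2f}x the GDP share, "
              f"with {death_ratio:.2f}x the avoidable deaths")
    ```

    Against Canada, for instance, the script shows the U.S. devoting about 1.5 times the share of GDP to health care while suffering nearly twice the avoidable-death rate.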

    Think about this. Even when normed for population differences, the U.S. avoidable-death rate is nearly double that of the Canadian system to our north; we fail to save twice as many treatable patients. And they pay a fraction of what we do in both out-of-pocket costs (OOP) and in overall costs, at least relative to us. The bottom line is that we are getting the medical shaft. Despite spending outrageous sums, we have fewer docs per 100,000, fewer hospital beds, and longer wait times to see physicians. And God help you if you are one of the 8.6 percent of Americans without any insurance, never mind those with substandard coverage. Try getting a medical appointment, even in a health care mecca like Madison, Wisconsin. It takes forever … even if you have excellent insurance.

    This is just my intro teaser to our national health care disgrace. More to come!

    Note: A thank you to Mary Rowin for reminding me recently of this issue.

  • Poverty & Policy #8 … reflections.

    December 5th, 2023

    I finished my original talk (and article) on which the last several blogs were based by returning one more time to the ‘Wisconsin Idea.’ Key to that idea is that each generation helps the next, passes the torch so to speak. Each one of you, I told my audience back then, has a responsibility to pass on to the next generation an understanding of and a passion for an issue (poverty) and for a population (the poor). If you do not care, who will?

    Ironically, the passage of a national welfare reform bill which many thought would exacerbate poverty and hurt the poor seemed to diminish interest in such issues. Not immediately, but within a decade or so, it struck me that no one was talking about ending poverty anymore. Perhaps the policy world was exhausted by the extended battles over what to do with welfare. Or perhaps everyone just wanted to fight about other stuff (abortion and immigration) after the anticipated post-reform apocalypse failed to materialize. Who knows?

    Yet, as I look back, there was a time when serious observers thought we might eliminate poverty in America. It did seem to be within our grasp. Were such people, like Nobel economist James Tobin, delirious? No! While no country has completely eliminated want, some have come quite close, though how poverty is defined makes a big difference. International comparisons of poverty point to one truth. We in America can do a much better job, at least as well as our peers. It is merely a matter of will.

    Many of our peer nations (especially in Northern Europe) have poverty rates that put us to shame. When our child poverty rates neared 1 in 5 in some years, their rates hovered below 1 in 20. There are likely several factors involved in their comparative success but our rather pathetic policy approaches cannot be discounted. And let us face it. Public policy is something we control. Consider this. Over 1 in 3 elderly were poor in 1959, before the war on poverty was enacted. That rate fell to less than 1 in 10 by the 1970s. It wasn’t labor markets that accounted for this difference. It was the set of policies we enacted during the 60s and early 70s.

    There is an old truism … the morality of a nation is defined by how it treats its most vulnerable citizens … the young and the old. We decided that the elderly were worth our attention, and we ensured that most of them deserved to be spared from extreme want. Our young were another matter. We can speculate on why, but I sense that we could never forgive the alleged sins of the parents. Thus, our children continue to disproportionately suffer.

    It doesn’t have to be that way. We can do better. Perhaps the next time poverty works itself to the front burner, we will.

  • Poverty & Policy #7 … our neglected agenda.

    November 29th, 2023

    In the 21st century, we appear to be suffering from collective amnesia, at least when it comes to the issue of poverty. We seldom discuss this issue any longer in our national dialogue. It is as if it has disappeared from our consciousness. Where has this collective national neglect gotten us? About a decade ago, when the original piece was first put together, here were some of the numbers I shared:

    Our overall poverty number (almost 50 million) is somewhat higher than it was several decades ago. Our child poverty rate (roughly 1 in 5 kids) would spark outrage in our peer nations though it is hardly noticed here (note: our child poverty rate is some 5 or 6 times higher than in several Scandinavian countries);

    Income and wealth inequality are at levels not seen since just before the Great Depression. The share of all income going to the top 1 percent was back up to 24 percent in 2007, the year preceding our most recent economic collapse. While inequality in most advanced countries is up, the U.S. ranked 4th worst out of 33 peer nations (in 2013);

    Social mobility rates in the U.S. have declined to the point that we have fallen behind our so-called ‘socialistic’ peers in that respect. By some measures of social mobility, such as the probability of moving up the income distribution (e.g., from the 4th quintile to the 2nd), we rank dead last compared to our European peers. Want the American dream? Head to Scandinavia;

    U.S. health care outcomes are middling at best. We stand next to Romania in some rankings despite spending more than anyone else, by far. Worse, we had the 47th highest infant mortality in the world when this piece was written;

    Our educational outcomes suggest that our kids are falling further behind our primary economic competitors, particularly in math and the sciences. Our rate of functional illiteracy among adults (reading comprehension substantially below expected grade level) places the U.S. well down the international list;

    Oh, and we had (when this was written) the highest teen birth rate among advanced nations.

    What I find particularly troubling is that our easy strategies for dealing with declining economic opportunities (e.g., stagnating incomes for most along with growing societal inequality) appear exhausted. We have already delayed marriage, had fewer children, thrown our spouses and partners into the labor market, saved less and borrowed more (using housing equity as personal ATMs), and added more advanced educational credentials after our names. And our children often delay establishing their own households (good luck kicking them out of the nest). And still, economic outcomes and opportunities grow more unequal.

    Yet, we see remarkably little outrage across the country. When new policies are posed, not enough people ask … ‘what does this do for the poor and those falling further behind in an increasingly bitter Darwinian struggle for success?’ So, let us ask again, did we lose the ‘War On Poverty?’ (I should note that this was one of five questions I had on my Ph.D. preliminary examination back when the war was a recent and ongoing conflict.)

    On a superficial level, the answer is yes … at least in terms of anticipated expectations. But let us think about the question in a slightly different way. Think of the adverse trends over the past several decades that would be expected to exacerbate poverty here in the States and also increase the economic struggles for so many. Here are a few of those salient trends:

    Demographic changes … consider the dramatic rise in single-parent households raising children;

    Globalization … American firms seek to lower labor costs by outsourcing higher-paying jobs overseas;

    Technology-driven changes, automation, and especially artificial intelligence … tasks formerly done by humans are increasingly performed by digital technology and robotics. After all, can robot-driven trucks, or college courses led by AI technologies rather than troublesome professors, be far off?;

    Immigration … with migration opening up in the mid-1960s, we saw the proportion of foreign-born residents jump from 5 percent to 13 percent, many (though surely not all) of whom were low-skilled individuals;

    Deunionization … unionized workers in the private sector fell from about one-third of the workforce in the 1950s to about 7 percent in recent years;

    A Fractal Economy … within specific sectors of the economy, compensation packages have grown wildly unequal even in the face of modest differences in talent and contribution. A typical CEO’s remuneration went from 27 times the average worker’s pay in 1973 to 262 times that average in 2008. A professional baseball player who gets one additional hit in every 10 plate appearances earns a multi-million-dollar contract, while another who fails to get that hit struggles to make a living in the minor leagues;

    Macro-Policy Changes … aggregate federal taxes and benefits to individuals reduced inequality by 23 percent in 1979 but by only 17 percent in 2007. In effect, our overall policy structure became less favorable to struggling families;

    When you consider these adverse trends as a whole, including others that might be added, perhaps we did better than many of us thought, at least in moderating the deleterious impacts associated with an increasingly hostile world for those stuck toward the bottom of the income distribution. Without the efforts of so many since the mid-1960s, there would be even more hunger, insecurity, and suffering across the land. Still, so much remains to be done.

    I remember asking a colleague many years ago why he thought the United States had such an impoverished social safety net for the vulnerable and disadvantaged. He gave a one word answer … heterogeneity. I dismissed his answer as overly simplistic. Over the years, though, I came to fully appreciate his terse response. We are too tribal and have no common culture or identity. It is too easy for us to say, and to believe, that the less successful are ‘them’ and not ‘us.’ They are ‘the others’ who did it to themselves. In effect, we are not all in this together. It is instructive to note that Americans are more likely (by some 30 percentage points) than our European counterparts to respond positively to questions that assign success to personal efforts as opposed to luck or social environments or inherited family fortunes. If you struggle, it is your fault. You should have made sure you were born to a wealthy family.

    Alas, we are getting to the end of our journey through the poverty and policy thicket. Next time, however, some final thoughts.

  • Poverty & Policy #6 … ending welfare as we knew it.

    November 27th, 2023

    In the 1990s, even those of a more liberal persuasion saw a different role for government, one where programs ought to be designed in ways more consistent with prevailing norms which clearly were drifting in a more conservative direction. This normative drift was clear when I decided to spend a year in DC to work on President Clinton’s initiative to reform welfare. From June, 1993 to May, 1994, I was technically on leave from IRP and assigned as a senior policy consultant to the office of the Assistant Secretary for Policy and Evaluation (ASPE) at Health and Human Services (HHS). This was the nerve center for the administration’s welfare reform effort, a broad initiative involving several federal departments.

    Not surprisingly, I found the tensions across the partisan divide to be demanding, though not as daunting as they would become in the post-Gingrich era. The normative tensions within the administration, however, were equally challenging. While some attention was directed toward income poverty, for example by liberalizing the Earned Income Tax Credit, the main focus was on mitigating welfare dependency. Whether an initiative would ‘end welfare as we knew it’ became the new litmus test for many in determining the worth of any new idea. At least, that was the message from the White House.

    A debate raged within the Clinton administration: what did his oft-repeated mantra of ‘ending welfare’ actually mean? One thing is certain. I seldom heard the old litmus test (what does it do for the poor?) as reform ideas were being vetted. Ex-UW Chancellor and then-HHS Secretary Donna Shalala was an unabashed liberal. She focused her energies on health care reform, seemingly avoiding the conservative drift of the internal welfare debate. In any case, these internal debates delayed the anticipated completion of the welfare proposal long enough to forego serious Congressional consideration until after the midterm elections. By then, it was too late; the Republicans had taken control of the House.

    That sad end was in the future. I recall rushing to DC as soon as my teaching obligations were finished for the Spring semester in ’93. I worried that the reform package would be completed without my input. No worries. I knew the first morning that I had missed nothing. The internal policy struggles and philosophical disputes would go on for months. In January of 1994, word came down that welfare reform was being put on the back burner. Health care reform, an initiative headed by Hillary, was being prioritized.

    That weekend, however, I watched Senator Daniel Patrick Moynihan on one of the Sunday AM talk shows. He blustered that there was no health care crisis. Rather, we had a welfare crisis. While I thought him dead wrong in his positive assessment of our health care system, I did appreciate the signal he was sending the administration. Welfare was back on the front burner when I arrived at the Humphrey Building (where HHS is located) the very next morning.

    The old tensions remained, however. Essentially, the liberals scattered throughout the administration were desperately trying to preserve cash welfare as an entitlement in the face of changing normative opinions and political realities. In the end, that was a futile delaying tactic. At the 1994 State of the Union address, President Clinton announced he would have a welfare bill before Congress that Spring. I immediately pronounced to all who would listen that the date on the bill would be June 21, 1994 (which I assumed was the final day of Spring). I proved prescient; that indeed was the date on the Act. I must admit, a good deal of my rhetoric survived in the final product. Nevertheless, it was DOA for reasons mentioned above. By summer of 1994, all political attention was focused on the upcoming midterm Congressional elections (which brought Newt Gingrich and the Republicans to power).

    The GOP had their own ideas, as you might imagine. After much back and forth, Clinton finally relented and signed a Republican-sponsored bill in August of 1996. As the story goes, all of his advisors counseled him to veto the Republican Act except Al Gore, his Vice President. When he did sign it, several of his top advisors resigned, including Peter Edelman and Mary Jo Bane. The Act he signed created the Temporary Assistance for Needy Families (TANF) program. This law ended the existing entitlement to cash assistance, imposed time limits on the receipt of assistance, and strengthened work requirements. In reasonably short order, the national cash welfare rolls fell from 14 million recipients to about 4 million.

    That decline should not have shocked anyone. The W-2 reform in Wisconsin suggested what might happen if you transformed the very cultural foundations of a cash welfare program. An earlier conversation I had with a county welfare director in western Wisconsin gives you an idea of how profound the impacts were. Before W-2, she told me they had 1,400 AFDC cases. In the run-up to the reform (where the new expectations were made clear), the caseload fell to about 900 cases. When the county signed a contract with the state to run W-2 (as a block grant, not as a welfare entitlement), the state and county agreed that about 500 cases would be an excellent assumption for the post W-2 caseload. After the dust had settled, they found only 60 cases remaining. Alarmed, the director launched a study to find out what happened to the lost families. Most, she told me, had just disappeared.

    The Welfare Peer Assistance Network (WELPAN) concept, which I put together just before the enactment of national reform in the mid-1990s, periodically brought together top state welfare officials for intense two or three day discussions on the future of reform. For me, it was another delightful counter in what I considered my professional candy store where I confronted captivating and confounding social issues.

    The Midwest group (there were two others for a time) endured for well over a decade and, in my opinion, captured the best thinking of those who were making reform a reality on the ground. Given the new flexibility available to welfare administrators, and enjoying freed-up resources for a time, these officials yearned to go back to dealing with the root causes of poverty rather than merely slapping band aids on the symptoms. They discussed ways of once again joining human services with income support to heal whole families. And they played with ideas for integrating a broad array of human services into coherent packages to deal with the complex challenges many of these families faced. Finally, they sought ways to intervene early … before problems became entrenched and intractable. It was such an exciting and ambitious conversation, at least for a time.

    Once the welfare debate ended on the national level, however, so did any substantive policy conversation regarding poverty and the poor, at least in Washington. For a while, there were great alarms raised among liberals about what would happen to poor mothers and children in the TANF era. Disaster was predicted while the federal government and Foundations poured millions into so-called ‘welfare leavers’ studies. But none of the worst predictions materialized. Most on the left failed to realize that cash welfare already had pretty much disappeared … typical benefit levels having eroded substantially over the previous two decades. When I arrived in DC, I kept saying we better reform AFDC soon, or there would be nothing left to worry about.

    With national legislation a reality, and no catastrophic fallout, our political dialogue moved on. In contrast, Prime Minister Tony Blair announced a kind of British ‘war on poverty’ in 1999. He pledged to eliminate child poverty in Great Britain over the next two decades. It is hard to imagine any U.S. politician, even President Obama, making a similar announcement during the post-reform period. The hard right campaign to reorient the acceptable political dialogue in America had succeeded beyond their wildest expectations. Everyone, it appeared, avoided discussing the poor. When they did, it was often in primitive, Dickensian terms.

    The ferocious welfare debate was over. What did we achieve, if anything? Well, keep reading.

  • Poverty & Policy #5 … the tide turns.

    November 25th, 2023

    Remember the Reagan revolution? ‘Government is not the solution to our problems. Government is the problem.’ Another favorite from the conservative 80s was ‘we had a war on poverty and poverty won.’ Beyond the clever bon mots, we had a visible shift in the dominant political perspective and in governing ideological norms. Suddenly, it seemed, we shifted from aggressive public interventions to remedy social problems to something like the following:

    Supply-side economics, market-based strategies, privatization and related laissez-faire approaches, and the glorification of minimal government;

    A pronounced shift from ending poverty to minimizing welfare dependency; and

    The devolution revolution … the abandonment of federal solutions to the promotion of local solutions to social problems through block grants and turning social issues back to the states.

    The revolution promised by the new crowd in Washington would fundamentally transform America. In the end, though, Reagan delivered tax cuts more than anything else while failing to cut social spending as he had promised his base. When the so-called ‘great communicator’ (a nickname that always puzzled me) took office, the top marginal tax rate was 70 percent. He cut that to 50 percent by 1982 and to 28 percent by the end of his administration. Other taxes were slashed or revised as well. In 1980, the rich paid half of their income in taxes, a rate that fell to 35 percent by the end of the 80s. Progressivity in our tax system took a big hit.

    Reagan, however, did bring about a real neo-liberal revolution in many respects. He limited anti-monopoly actions, loosened the rules on banks and Wall Street, and attacked unions when he could. Union membership plummeted to less than 17 percent of the workforce. Power increasingly was centered in the financial sector and the corporate elite. The gains accruing to the moneyed class skyrocketed, as did the dangers associated with letting them run amok.

    Where Reagan disappointed his conservative base was in cutting spending, especially social spending. There is a very telling vignette told by David Stockman, Reagan’s budget director during the early 1980s. He presented the President with a list of social programs along with three possible actions … spend at current levels, take a modest cut from this program, or take a ‘big whack’ out of it. When confronted with specific decisions, Reagan balked. He had hard rhetoric but soft instincts. Stockman was appalled, knowing what this meant to future budget deficits.

    The net result was that social problems did not disappear, nor did the costs associated with them. The Laffer hypothesis, that government revenues would increase with sharp tax cuts because of the growth associated with lower taxes, proved to be a laugh as most impartial economists predicted. Slowly, poverty rates began to creep back up while inequality began to rise more rapidly. Most of all, the Reagan revolution resulted in alarming federal debt levels, a consequence of the Republican Party abandoning balanced budgets as a policy objective. (Note: only Clinton and Obama would make a serious dent in budget deficits since the 80s.) If the Democrats had been the ‘tax and spend’ party, the Republicans were now the ‘borrow and spend‘ alternative, especially where weapons were involved. You simply cannot have enough toys that kill people. But their wealthy supporters would do well indeed.

    The intellectual tide was also turning. Charles Murray wrote a very popular book titled Losing Ground in which he argued that public interventions for the poor from the New Deal on exacerbated the problems being addressed rather than alleviating them. This was followed up by a book by Lawrence Mead called Beyond Entitlement. Larry’s argument was that welfare type entitlements corroded personal responsibility which provided theoretical support for Murray’s pessimistic views on the efficacy of social investments to society. The two works supported the views that poverty pretty much was a personal failure (the ‘unworthy poor’ thesis) with serious consequences for society.

    A cultural and normative shift was under way, which had been a goal of those underwriting a neoconservative revolution that became quite serious by the early 1970s. Poverty was no longer a salient policy concern; welfare costs and welfare dependency dominated the discussion. More than welfare writ large, it was the Aid to Families with Dependent Children (AFDC) program that stoked public indignation the most, even though it was relatively small in terms of both caseload and expenditures. AFDC, it seems, proved a convenient proxy for a broad array of contentious public battles involving normative disputes concerning family, sex, work, personal responsibility, government overreach, compassion or the lack thereof, and so much more. It was the new front on which cultural battles would be fought politically and, as many would say, would become the ‘Mideast of domestic policy making.’

    In the meantime, income and wealth inequality began to worsen. From a low of 9 percent of total income at the end of the 1970s, the top 1 percent saw their share grow to 12 percent by 1984 and to 20 percent in 1994. After slowing during the Clinton years, this unequal division of the economic pie continued to worsen. Just before the housing bubble crash of 2007-8, inequality equaled what it had been at the onset of the Great Depression in 1929. These inequality changes can be considered tectonic shifts that fundamentally altered the face of opportunity in the U.S.

    America, however, remained distracted by other issues, like Clinton’s failed effort to extend health care to all … something that every other advanced nation managed to do. Meanwhile, we continued to spend way more per capita than anyone else on health care for outcomes that were disappointing at best. We have often struggled to beat out Romania in basic health outcomes while other key measures, like infant mortality, are shockingly bad when compared to our peers. But our public discourse focused on the welfare crisis rather than doing anything about our embarrassing health financing system.

    Soon, much of the world again watched Wisconsin as Governor Tommy Thompson, later Secretary of Health and Human Services under George W. Bush, launched a host of welfare reform initiatives. The first to grab national attention was Learnfare, an initiative that linked children’s school behaviors to their parents’ welfare benefits. It was to be the first of many that introduced what might be termed a ‘social contract’ notion to public assistance, an approach where help was conditioned on an increasing number of personal behaviors. This was, in reality, a very old conception that dated back to the Elizabethan Poor Laws but now was touted as something innovative and exciting. Moreover, Thompson was able to get his ideas implemented where so many others had failed given the political minefield welfare had become. Buoyed by the notoriety of his initial reforms, the Governor proposed more dramatic changes, including a so-called welfare replacement concept known as Wisconsin Works, or W-2.

    While Thompson’s rhetoric was tough, the reality of his reforms was more tempered. He expanded child care help and access to health care and workforce services for struggling families, initiatives that greatly helped the working poor. He was quite willing to help those he felt were playing by society’s rules and expected norms. Through all these changes, the University (especially IRP) played no role. The Wisconsin Idea had hit what might be termed a ‘rough patch.’

    I can’t do justice to the kerfuffle between the Governor and IRP during this period, but the rupture was serious. Mostly, it was a matter of misinterpretation. For the Governor (any politician, really), you were either on his side or against him. There was no such thing as objectivity. I was called by the media to comment all the time and would try to be a real academic … giving both sides on controversial issues. The press, I found, often would choose only those comments that would rattle the Governor. I realized just how negatively he viewed me (and IRP) when he attacked me personally at a public event in Chicago while I sat in the audience. Welfare reform took no prisoners.

    As a final note on the nadir of the Wisconsin Idea, I did start working with Jennifer Noyes, Thompson’s policy aide and eventually the head of W-2. Tentatively, we put the IRP-State relationship back together again. Jennifer eventually came to the University (something for which I lobbied hard), became Associate Director of IRP, and now works directly for the UW Chancellor. Sometimes you get things right.

  • Poverty & Policy #4 … the ‘war’ stalls.

    November 21st, 2023

    By the early 1970s, those fighting to sustain community and personal rehabilitation strategies by employing social work technologies faltered. Rather, the debate began to focus more on direct resource transfers (e.g., cash, food, shelter, health care). There were many reasons for this, but surely one was the feeling that poverty was not a pathology. The poor needed money and not meddlesome service interventions. Critics of the emerging perspective countered with the allegation that merely giving the vulnerable money masked deeper challenges including possible individual and familial dysfunctions. This debate continues.

    In my first professional position (1971) working for the State of Wisconsin’s human services agency, social work types dominated the key administrative positions. In fact, the traditional welfare functions had just been integrated into the agency administering classic human service systems. Oddly enough, this happened just as the federal government was divorcing the giving of money from any offer of social services to struggling families. By the time I migrated to the Institute for Research on Poverty at the University of Wisconsin (1975), virtually all the important scholars doing poverty related work were economists.

    This revolution did not occur overnight. Sometime in the early 1970s, those fighting for community and personal rehabilitation strategies had pretty much backed away from playing an active role in the anti-poverty drama. The debate, as noted, had shifted to direct resource transfers. In fact, I noticed that social workers were abandoning the field in the ‘war’ on poverty by the late 1960s, a retreat that picked up speed over time. I think they are still running. I served on the UW School of Social Work’s master’s program admissions committee for many years. When I ran across an applicant who expressed an interest in working with the poor, I would call for the paramedics so that I might be revived. I also continued to work with the State Human Services agency periodically. By the 1980s, you could fire a cannon down the agency hallways and not risk hitting any social workers. They largely had disappeared, probably all becoming marriage therapists.

    By the time I was engaged in poverty work as an academic, the focus clearly had shifted to cash and cash-like transfers. For example, President Richard Nixon, despite his many flaws, proved to be a big spender on social programs, though he was no fan of rehabilitative services. Among other things, he:

      • Instituted a cost-of-living provision to update Social Security benefits annually;

      • Federalized welfare for the blind, disabled, and aged under the Supplemental Security Income (SSI) program;

      • Nationalized the Food Stamp program (now known as SNAP), making it virtually a funny-money income floor, or what economists termed a ‘negative income tax’;

      • Supported, and almost passed, a real cash-based negative income tax; his ‘family assistance plan’ came within one vote of passing in the Senate.

    A bit later, one of the most important anti-poverty measures was introduced, the Earned Income Tax Credit, which continues to lift working families out of poverty to this day.

    Despite all this progress, the underlying tensions evoked by the poverty wars were never far from the surface. Nixon dismantled or slashed many remnants of the original War On Poverty, oversaw the separation of human services from the transfer of cash to poor families with children, and vetoed the Comprehensive Child Development Act.

    War-fueled deficit spending (financing the Vietnam conflict), a robust social safety net, and declining income inequality worked their magic. In 1973, poverty would fall to its nadir, 11.1 percent, a figure we would not see again for several decades. Moreover, measures of inequality had also fallen dramatically, with the share of the income pie going to the top 1 percent falling from about one-quarter in the late 1920s to slightly less than 10 percent in the 1970s.

    In 1973, the Arab oil embargo disrupted the long hegemony of the U.S. economy. By this time, we had real competition in global markets. And the political right, intent on reshaping government in a hard conservative direction, paid close attention to the strategic tactics laid out by future Supreme Court Justice Lewis Powell. He called for the reframing of every institution underlying American governance and for transforming the very normative culture in how we thought about society. A new ‘war’ had been declared, this time on what was seen as an ‘activist’ government.

    The original War On Poverty was losing momentum. So, what had we achieved? We found ourselves with a social safety net that still reflected an earlier world view of the poor, one based on a notion of the ‘worthy’ and ‘unworthy’ poor. For the worthy poor, those NOT expected to work, assistance was relatively more generous, included reasonable cash transfers, and was more likely to be seen as a federal responsibility. For those deemed unworthy, such as able-bodied adult males who were expected to work, assistance was meager at best, usually in the form of non-cash help, and remained largely a local responsibility. For those in the middle, like single mothers with children, we were torn. Control was split between the federal and state levels, remained uneven in terms of generosity across jurisdictions, and was increasingly conditioned on proper behaviors.

    In addition, one could feel this ideological pushback gaining momentum, abetted by some alarming social trends. Many were frightened by civil discord (urban riots) and by what they saw as a breakdown in law and order. Moreover, there appeared to be a fracture in expected social conventions and norms. For example, the nonmarital birthrate began an inexorable rise from 5 percent in 1960 to about 40 percent before leveling off. And welfare rolls continued to expand through the 1960s and 1970s, not decline as many had predicted given a robust economic climate. Some feared that society was losing its moorings and blamed welfare for all the problems.

    A growing right-wing pushback was aided in no small measure by a planned growth in the conservative voice, as was called for in the Powell strategy memo. In earlier debates, the American Enterprise Institute (created during WWII) had been one of the only think tanks opposing an expansionary public assault on poverty. By the end of the 1970s, contributions from wealthy donors had created a score more centers of conservative thought, with the Cato Institute and the Heritage Foundation leading the way. The Virginia and Chicago Schools of conservative academic study also mounted a serious campaign to undermine the prevailing consensus around Keynesian economics.

    In Washington, reform politics seemed exhausted. President Jimmy Carter’s Program for Better Jobs and Income was a last gasp for a positive, comprehensive national reform as residual concerns about poverty appeared to be going the way of the Titanic, slipping slowly out of sight. Perhaps sensing the shift in where the debate would next settle (in the states), the Wisconsin Legislature mandated its own reform study … an initiative chaired by Economics Professor Robert Haveman and partly staffed by me. Between us, we developed a policy technocrat’s dream that included broad changes to the tax system, the workforce development system, and the child support system, to name just a few. Some of our ideas saw the light of day, including the implementation of a state Earned Income Tax initiative and several significant child support reforms. Still, making headway was increasingly challenging, unlike a decade or so earlier. We would discover that even the vaunted ‘Wisconsin Idea’ would soon encounter rough waters, especially with the State’s new political elite.

    Stay tuned!
