Monday, December 30, 2013

"ObamaCare - A Single-Payer in Our Future?"

Sydney M. Williams

                                                            Thought of the Day
                                        “ObamaCare – A Single-Payer in Our Future?”
December 30, 2013

“I think the Russian problem is not just the president as a person. The problem is that our citizens, by a large majority, don’t understand that their fate; they have to be responsible for it themselves. They are happy to delegate it, say, to Vladimir Vladimirovich Putin, and then they will entrust it to somebody else.” So spoke Mikhail Khodorkovsky in Berlin, upon his release from prison in Russia.

While his syntax was a little confused, Mr. Khodorkovsky’s meaning was clear. He captures the dangers posed when governments intrude too deeply into people’s lives. An increase in dependency translates into a loss of personal responsibility and individual liberty. Certainly in civilized societies government has a responsibility to help those who cannot look after themselves. That includes the ill, the handicapped, the impoverished and the elderly. However, when we include those who are able to care for themselves, but who find it easier to live on the largesse of a “caring” government, dependency can become addictive and corrosive to human progress. Programs that start with benign intentions can morph into the despondency of dependency. Historically, self-reliance and the assumption of personal responsibility for our behavior have been integral to both individual success and the success of our nation. When we delegate responsibility to some other person or institution, we lose freedom.

These thoughts lead, obviously, to ObamaCare, its promises and its perils. The law was passed unilaterally. Members of Congress who supported it famously acknowledged they were unaware of its contents. “We have to pass the bill so that you can find out what’s in it,” then-House Speaker Nancy Pelosi infamously and arrogantly stated. Not surprisingly, implementation has been disastrous. Everyone knows that the prior system had substantial flaws, but the flaws of the ACA appear even more substantial. The question now is whether the Administration will continue to tinker around the edges, effectively changing the bill without Congressional agreement, as it has been doing. At some point, is it not likely that the Courts will rule on the legality of the Executive branch arbitrarily making such changes? Should a Republican majority control both Houses next November, will Congress attempt to repeal the law in 2015? If so, the President would surely veto any such attempt. Will the President, working with Congress, attempt to salvage what is good about the law and toss aside that which is not? As desirable as such efforts would be, it seems unlikely that this President and this Congress will work together to reconcile their differences. Both seem implacable.

I don’t pretend to have a crystal ball, but the consequences of the arrogant grab for power, which the passage of ObamaCare represented, will reverberate through our country for years to come. ObamaCare represents the most significant subordination of the individual to the collective in eighty years.

No matter the “bumps in the road” ObamaCare has encountered thus far, it is my guess that progressive Democrats will stubbornly hold onto the concept of a national health system. They are likely to argue that the problems we have encountered would be rendered moot if only we had a single-payer system.

It seems likely that insurance companies will be demonized for rate gouging. As businesses owned by public shareholders or policy members, insurance companies will have to adjust premium rates in response to the arbitrary changes the Administration has been making. To argue, as some have, that the ACA combines the best of the public sector with the best of the private ignores the fact that insurance companies and providers are in this business (and predicament) because of federal fiat. The Administration may be forced to make good any lost revenues to the insurance companies, effectively having taxpayers bail them out – something, I am sure, that will be met with resistance. Many doctors and other providers are already opting out of the system. It seems probable to me that a single-payer system may have been the intent of Mr. Obama and HHS Secretary Kathleen Sebelius all along. Such sentiments appeal to many. Those who run “big government” and those who prefer the cocoon of “feel-good” government will find a single-payer system as comforting as sipping hot chocolate while wearing red and black-checked pajamas on a cold winter’s night.

None of this is to suggest that government does not have a major role to play in our lives. Government is critical to our well-being. We are not anarchists. But as Americans, we have a self-interest in perpetuating the freedoms and individual rights that have allowed this nation to flourish and prosper. Nevertheless, civilization has made us communal. Community living is our natural state; whether in a small village, a suburban housing complex or a city apartment, we rely on one another. We depend on government to provide protection and security, to educate our children, to build roads and bridges, to care for those incapable of caring for themselves, and for myriad other services. But it is also important to recognize that every time we ask government to assume a new responsibility, we give up some element of individual freedom. It is also important to understand that politicians are not impartial observers. Their success is largely measured by an expansion of their duties. Ceding power, regardless of political affiliation, is not their natural state. The endless debate that engages people like me is not about the necessity for government; it is about the degree to which it regulates our lives. All politics can be seen as a continuum, with despotism at one end and anarchy at the other. The question is always: what is optimum – where on that continuum is the spot that provides the most good for the most people without destroying the individual initiative that allows societies to prosper and economies to expand?

Certainly, I prefer a spot that emphasizes personal responsibility over dependency. By nature, I am one who believes that the wisdom of the individual – of the many, if you will, as expressed through their elected representatives – is far greater than the omniscience of the few, of an executive or appointee, no matter how benevolent or well-intentioned he or she may be, or what promises have been made.

It is true that there was much wrong with the existing healthcare system, and one benefit of ObamaCare is that it brought the subject front and center. While the majority does not like what it sees in the ACA, there are very few people who want to go back to the way we were. The debate should continue; myriad options have been offered.

A lack of competition and a want of individual involvement played important roles in the high cost and inefficiency of the former system. Healthcare is an individual and family matter. No two families or individuals have the same needs. Employer-provided healthcare was established during and after World War II as a benefit for enticing employees at a time when wage controls prevented more direct competition. However, employer-provided insurance assumes a “one-size-fits-all” approach. The system would be more efficient if employers provided equivalent funds to their employees, allowing them to purchase the insurance that met their individual needs. Health savings accounts (HSAs) would allow consumers to set aside pretax dollars to cover any deductible costs. One positive consequence of HSAs would be that providers would be forced to compete for patients on a basis of price and outcomes. Policies owned by individuals would be portable, solving the problem of pre-existing conditions. Catastrophic coverage should come first; many consumers can afford routine physicals and most generic drugs out of pocket. The system would be improved if insurance companies were allowed to compete across state lines. Additionally, any reform would have to reduce frivolous lawsuits, as fear of being sued has too often been responsible for doctors and hospitals prescribing needless and endless procedures and medicines, protecting their butts rather than doing what is right for the patient. ObamaCare has not addressed any of those concerns, and a single-payer system would only magnify them.

In the current issue of Foreign Affairs, Lane Kenworthy, professor of Sociology and Political Science at the University of Arizona, wrote: “…the ACA represents another step on a long, slow, but steady journey away from the classical liberal capitalist state toward a peculiarly American version of social democracy.” He asserts: “But the opponents are fighting a losing battle and can only slow down and distort the final outcome rather than stop it.” He notes that the Nordic countries have adopted market-friendly approaches to regulation. He writes: “The Nordic countries’ experience demonstrates that a government can successfully combine economic flexibility with economic security and foster social justice without stymieing competition. Modern social democracy offers the best of both worlds.”

The problem with this somewhat Panglossian and naive view is that it is difficult to apply to the complexity of American society. Recent events in Scandinavia suggest all is not as copacetic as Professor Kenworthy would have us believe. The populations of the five Nordic countries (Denmark, Finland, Iceland, Sweden and Norway) plus the Faeroe Islands total about 26 million people, versus more than 300 million in the United States. The U.S. is a polyglot mixture of all races, languages and religions. In a recent article in The American Conservative, entitled “The Nordic Mirage,” Samuel Goldman wrote that such policies were “achieved in societies where the vast majority of the population looked the same, talked the same, had names and relatives in common, went to the same churches, and so on.” Small, homogeneous societies are much more willing to bear the burden of supporting their fellow citizens than are large, diverse ones. Besides which, immigration is becoming a problem for Scandinavia, testing whether the social welfare state will work as well in the future as it did in the past. The Economist noted a year ago that their open-door policy toward granting asylum to political refugees is being challenged on economic grounds. In Sweden only 51% of non-Europeans have a job and 50% of all prisoners serving more than five years are foreign born. All of this is threatening the principle of redistribution, which is the heart of the welfare state.

Free market capitalism is not without inequalities and imperfections. We should never stop trying to improve the system. Nevertheless, and despite the constant search for new and better ways, free market capitalism has done more to lift people out of poverty and to raise living standards than any other system yet devised. But with freedom comes responsibility. Its antithesis, dependency, as Mr. Khodorkovsky suggests, leads inexorably toward intolerance and tyranny. While I disagree with Professor Kenworthy’s sense of the inevitability of the social welfare state, I fear the Administration is in agreement with him; thus my concern that they will push for a single-payer system. If successful, the professor’s prognostications will be fulfilled. Caveat emptor.


Monday, December 23, 2013

"Political Correctness at Christmas"



Sydney M. Williams
                                                               Thought of the Day
                                                   “Political Correctness at Christmas”
December 23, 2013

While I am not a particularly religious person, I do know that Christmas is first and foremost a religious holiday. Next to Easter, it is the most important holiday on the Christian calendar – marking, as it does, the birth of Jesus. Christmas Eve marks the last day of the four-week period of Advent. The word advent stems from the Latin “adventus,” meaning “coming” or “arrival.” Following Christmas, the holiday stretches for 12 days, culminating with the feast of the Epiphany on January 6, which celebrates the arrival of the Three Kings to see the infant Jesus.

Just as Christians should be respectful of all other faiths (including those with no faith), Muslims, Jews, Hindus, Buddhists, Atheists and others should respect Christian beliefs. It does not mean others should share our beliefs, and we certainly should not impose our beliefs on those who believe differently. Tolerance is a good thing, but intolerance and (especially) tolerance of intolerance are wrong. The decision by a school in Kings Park, New York (a wealthy enclave on Long Island’s north shore in Suffolk County) to remove all religious references from a rendition of “Silent Night” as sung by fifth graders was a ridiculous example of offending the many to satisfy what a “politically correct” minority, in the form of town and school leaders, thought might offend a few. It was an absurd decision. “Silent Night” is a beautiful, iconic song, and it is religious: it celebrates the birth of Jesus. If the school chose not to have the fifth graders sing it, that would be fine, though it would deprive the students of one of Christianity’s great hymns. The pretension of celebrating the holiday, without actually doing so, makes a mockery of the meaning of Christmas, and it reflects poorly on the town and school leaders. “Silent Night” does not have the same meaning when words and phrases like “Holy infant,” “Christ the Savior” and “Jesus, Lord, at thy birth” are replaced. There is no reason that children should feel embarrassed about their religion. Once we start apologizing for our beliefs, where does it end?

Words matter. When we hide behind euphemisms, and the camouflage such words provide, we do more harm than good. English is a straightforward language. With an estimated 1,000,000 words, English is the most verbally inclusive language in the world. We should be able to express ourselves clearly and avoid the confusion of what George Orwell called “newspeak” and “doublethink.” When we refer to acts of terrorism as “man-caused disasters,” we diminish the horror of the killings and dishonor the victims. When the President referred to the Fort Hood shootings as “work-place violence,” it misled the public about the rantings of Major Nidal Hasan, who shouted “Allāhu Akbar” (“God is great”) as he shot and killed thirteen unarmed people. The victims of terrorism are war-dead and war-wounded. They should be treated as such. We become less vigilant, and therefore less safe, when terrorists and their terrorist-actions are given the cover of benevolent euphemisms.

Political correctness can be defined as the avoidance of expressions that are perceived to exclude, marginalize or insult specific groups of people. While we should always strive to be respectful of others, the perpetual use of politically correct words and phrases risks costing us our moral sense. Political correctness can assume different forms. It can be so ridiculous as to be humorous; it can be so disingenuous as to be deceptive, and it can be uttered with such naïveté as to prove dangerous.

When we see the word “manhole” replaced by “maintenance hole,” we are more amused than offended. The same is true for silly terms like “visually challenged” for blind, or “dysfunctional family” for a broken home. Mankind has become “humankind.” “Selective speech” sounds better than censorship, but it is the same thing. Such words and terms can also be deceptive, or disingenuous. The changing of the name of North America’s highest peak from McKinley to Denali appealed to the Athabaskan Indians, but did a disservice to our 25th President. And, of course, American Indians are no more; they are now “Native Americans.” The suggestion that the Washington Redskins should change their name to “Bravehearts” made controversial a name about which there had been no controversy. Come to think of it, where were the PC police when Elizabeth Warren claimed Cherokee heritage?

Most people no longer die when they stop breathing, they “pass away” – pass to where? No one knows. Barbara Walters must have heard a different Barack Obama than I did in 2008. A few days ago, she lamented: “We thought he was going to be the next Messiah!” Who was “we,” or was she using the personal pronoun in the royal sense? Even if she really felt he was an unusually good man, isn’t “Messiah” an unusual euphemism for a fallible man?

More important, politically correct language can be dangerous. Terrorists have been called “radicals” (BBC), “perpetrators” (New York Times), “militants” (Chicago Tribune), “Criminals” (The London Times) and “assailants” (National Public Radio) – everything but the terrorists they actually are. Such euphemisms are misleading. They serve to downplay the War on Terror. Since the early 1980s, we and our citizens abroad have been attacked multiple times with “man-caused disasters,” “violent extremism,” and “asymmetrical warfare.” We no longer have overseas combat operations, but “overseas contingency operations.” Try telling that to the families of the 6800 military personnel killed in Iraq and Afghanistan, or to the families of the 3000 civilians killed on 9/11 and the thousands killed by terrorists before and after that date.

The Christmas season is filled with such illustrations. Some towns are prohibited from exhibiting symbols of Christendom on the grounds they might offend non-Christians, or that they might violate the intent of the First Amendment in terms of the separation of church and state. The First Amendment, however, says nothing about “separation.” Its first words read: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof…” The Founders did not want to replicate the Church of England with a Church of the United States. They wanted people to be able to assemble and pray freely. Whose rights are being violated when Christmas trees become Holiday trees? Whose rights are being violated when nativity scenes are prohibited, when children are forbidden from acting out Christmas plays, or when Christmas carols are robbed of their proper lyrics?

It is true that Christmas has become commercialized. Seemingly forgotten amid the hustle and bustle has been the religious meaning of the day. From the stampeding of stores on the day after Thanksgiving to last-minute purchases on Christmas Eve, the airwaves and newspapers are filled with stories about how much money consumers are spending this year versus last year. But, no matter the crassness, this period is also filled with joy and laughter. While wars have been fought over religion for centuries, and Christians and others are still persecuted, the inherent message of Christianity is one of peace, love and good-will, of justice, righteousness and forgiveness. These are attributes worth celebrating, not denying.

The fact that the world has never achieved the “peaceable commonwealth” is no reason to quit. Some quests have no end, as Stuart Little discovered: “As he peered ahead into the great land that stretched before him, the way seemed long. But the sky was bright, and he somehow felt he was headed in the right direction.” It is the search that is important and which must persist. There is no end to history, nor is there an end to our journey.

Christians should persist in using words that invoke Christ at this time of year. A Christmas tree is not a “Holiday” tree. This is a “holiday season,” but it is a holiday season because of the celebration of Christ’s birth. “Merry Christmas” is a jubilant greeting, invoking images of joy and wonder. There is no reason to avoid its usage, or to whisper “Merry Christmas” shyly because of shame, or surreptitiously for fear the ‘thought police’ might whisk you off to prison. As a nation and as individuals, and as long as we are respectful, we are far better served with honesty and straight-talk, rather than hiding behind a facade of political correctness.


Friday, December 20, 2013

"Purge in Pyongyang"

Sydney M. Williams

                                                                 Thought of the Day
                                                                “Purge in Pyongyang”
December 20, 2013

Kim Jong-un’s killing of his uncle is a big deal. It is not as though Jang Song-thaek was one of the “White Hats” of North Korea. (There are none that we know of.) He was a notorious thug, a brother-in-law and accomplice of former dictator Kim Jong-il. Mr. Jang was instrumental in upholding the policy that no one in North Korea can travel, speak or worship freely. He was there when North Korea experienced the “great famine” of 1995-97, which saw between 2.5 million and 3.7 million people starve to death. (That represented between 10% and 15% of the population.) Jang Song-thaek was described at his trial as “despicable human scum” who was “worse than a dog.” The speed with which he was arrested at the behest of Kim Jong-un, tried and dispatched suggests a tyrannical government led by a psychotic – not the preferred type of individual for leading a country with nuclear weapons.

Keep in mind, baby Kim is a man who took seriously, earlier this year, an article in “The Onion” calling him the “sexiest man in 2012.” Mr. Kim is a man who, it was said in February, had “moon-walked” – “the first to do so since Eugene Cernan and Harrison Schmitt in 1972.” The man is certifiable, yet he controls a nuclear arsenal and his country is developing missiles capable of reaching our western shores. North Korea may be a client state of China; perhaps Beijing is pulling the strings. Let us hope so. According to the current issue of The Economist, Kim Jong-il was a “provocative menace, but he was at least predictable.” The son appears scarier. He reminds one of a psychotic killer in a thriller movie, the type who says menacingly, with the glaring eyes of a madman, ‘whatever you call me, don’t call me crazy!’

The concepts of freedom and democracy that swept the world following the collapse of the Soviet Union, and which gave life, inspiration and hope to millions of people in places ranging from Eastern Europe to Southeast Asia to South Africa, bypassed North Korea. Since those days that Francis Fukuyama termed “the end of history,” dictatorships have strengthened in places like Pyongyang and radical Islam has been energized. It is a very different and far more dangerous world than the one in which we grew up during the Cold War. The Soviet Union, as a nuclear power, was a menace and millions of people died because of the harshness of its regime, but its leaders engaged with the West. They had skin in the game. One does not have the same confidence with nuclear armaments in the control of leaders in North Korea, Pakistan and, soon, Iran. Will containment work? Can we be certain that previous nuclear powers, like Ukraine, Belarus and Kazakhstan, will not rearm? If the United States is no longer the world’s policeman, is there not risk that South Korea will feel compelled to build nuclear weapons as a defense? Will Japan determine it can no longer count on the United States and go nuclear? In the Middle East, once Iran has the bomb, would you not expect Saudi Arabia and Turkey to become nuclear powers?

One of the more chilling aspects of North Korea is how the place is treated by mainstream media, in what David Feith of the Wall Street Journal has termed a “trivialization of horror.” A good example is ex-basketball player Dennis Rodman’s well-publicized trips to Pyongyang, and his references to Kim Jong-un as a “friend for life.” He is there now, on his third trip of the year. A month ago, “Elle” magazine cited “North Korean Chic” as one of the top fashion trends for the fall of 2013. The plight of North Koreans has never captured the imagination of Americans, in the way, for example, apartheid did in South Africa. In part, that has to do with the fact we identify more with the West and, as a country, we understand White-Black tension and conflict. But it also has to do with the fact that the Kim family has done everything possible to cut off all communication with the outside world. North Korea is known as the Hermit Kingdom. When looked at by satellite at night, North Korea is the black space separating South Korea from China. In addition, our apparent lack of awareness has to do with the fact that very few have escaped. North Korea’s only land boundaries are with South Korea and China, both of which are heavily secured.

One of the few to escape was Shin Dong-hyuk, who was born in a prison camp in 1982 and who escaped in 2005. His story was told in a book by Blaine Harden, Escape from Camp 14. Mr. Shin, who is the same age as Dennis Rodman’s “friend for life” Kim Jong-un, recently sent a public message to Mr. Rodman asking him to speak up on behalf of the thousands imprisoned in slave-labor camps. It remains to be seen if he will. I wouldn’t hold my breath.

During the one-day trial of Jang Song-thaek – he was executed later the same day – he was accused of perpetrating “thrice-cursed acts of treachery.” He was told that “it is an elementary obligation of a human being to repay trust with a sense of obligation and benevolence with loyalty.” The latter words sound ominous to me, as they echo what we hear from Washington. I fear we are losing the meaning of freedom in a nation that places dependency above responsibility. Physical comfort is enticing, but it is no match for liberty and the freedom it brings. I worry that we no longer understand and appreciate the meaning of our “unalienable rights.”

A hundred years ago next summer, World War I broke out. It changed the course of the 20th Century, producing the bloodiest century in the history of the world. In Europe, civilization had blossomed, as did trade and economies. The plane, the automobile and electricity had all been recent inventions. Global trade and international banking, as well, helped lift living standards to levels far beyond what they had been a generation earlier. Yet that period of tranquility came to an abrupt end, for the industrial revolution had also brought with it far more effective means of killing – machine guns, tanks and mustard gas. Countries re-armed, not because they expected war to break out, but to defend their far-flung empires. It was bit-players in the Balkans that lit the tinder that inflamed the world for four years, producing an estimated 37 million casualties. (For comparison purposes, the equivalent today would exceed 100 million.) We, too, live in a time of economic progress. Technology has revolutionized the way we communicate. Much of the world lives in relative comfort. Free market capitalism has lifted millions out of economic deprivation. A global war seems unthinkable. But there are rogue governments – places like Iran, North Korea and other states with totalitarian regimes that have kept their people in ignorance and poverty. Nuclear weapons are proliferating among those unstable nations. There are stateless Islamic extremists that have become more powerful in the past two decades. What is happening in Ukraine is indicative of a Russia that looks to reassert itself as a global power. China has made menacing moves in the East and South China Seas. But North Korea is the place to watch. The country has a population of about 25 million, but an army, including reservists, of over 9 million – the largest military organization on earth.

The atomic bomb has been used only twice – both times by the United States, as a way to convince Japan to surrender. The threat of mutually assured destruction has kept such weapons silent for almost seventy years. Will that continue with rogue nations who support terror now in control of such weapons? As I wrote back in February, in a piece entitled “Kim Jong-un – a Tinderbox,” the “most dangerous of tyrants are those that are stupid.” Nuclear weapons in the hands of a certifiable maniac become even more unpredictable and more ominously dangerous.


Wednesday, December 18, 2013

"Budget Accord - Necessary, but no Applause"

Sydney M. Williams

                                                          Thought of the Day
                             “Budget Accord – Necessary Perhaps, but no Applause”
December 18, 2013

“Politics is the art of the possible,” so said Otto von Bismarck. He should know. He helped unify Germany, which had been a collection of states, into a single German empire in 1871. He served as Germany’s first chancellor, and remained in that position until he was dismissed by Kaiser Wilhelm II in 1890. He forged a balance of power, which helped preserve peace in Europe for over forty years.

Politics is about governance. The word has its roots in the Greek word politikos, which refers to citizens and the practice of influencing people on a civic level. In a democracy, politics is about influencing others, but it is also about negotiating with those with whom one disagrees. In places like Russia, China and North Korea, politics is about ultimatums. Those who disagree embark on a perilous course.

In the United States, government works best when members of both political parties work in a bi-partisan fashion, placing the needs of the nation above their personal or party preferences. Kennedy, Johnson and Carter are the only modern Presidents (post World War II) to have had control of both houses of Congress for their full terms, but it did them little good. Kennedy, as we know, was assassinated in his third year. Johnson won reelection in 1964, but was forced to bow out in 1968. Carter failed to win reelection in 1980. In contrast, Eisenhower, Reagan and Clinton are more fondly remembered and, with the exception of Johnson, accomplished more. All three served two full terms. Eisenhower was opposed in the House and Senate for six of his eight years. Reagan had to deal with a Democrat House for all eight years and two years of Democrat Senate control. Clinton, like Eisenhower, was opposed for six of his eight years in both the House and the Senate. Single party control did not assure personal or legislative Presidential success, nor did a “loyal” opposition prevent significant accomplishments.

All Presidents have had their nemesis. Kennedy had Senator Robert Byrd, but the West Virginia Democrat became an ally to Johnson in passing massive social legislation. With anti-war groups chanting “Hey, Hey, LBJ! How many kids have you killed today?”, Johnson had the people as his nemesis. Reagan had “Tip” O’Neill. Reagan had come to the Presidency as a conservative, but he was also pragmatic. In 1983, in response to a concern that he was retreating from his principles, he said he was not prepared to “…jump off the cliff with flag flying. I have always figured half a loaf is better than none, and I know that in the democratic process you’re not going to always get what you want.” Clinton had to work with the feisty House Speaker Newt Gingrich. Together, they reformed welfare and helped promote a strong period of economic growth. They also presided over the collapse of the tech bubble.

Extremists are common to both parties. They serve useful purposes, in the sense that they can keep party leaders’ feet to the fire regarding principles, but they can be disruptive. Republican “extremists” are labeled “Tea Partiers” (a term that, despite its storied origins, has assumed a pejorative connotation). Tea Partiers represent a significant minority of the population and a few sit in the House and Senate. Extremist Democrats are known by the more complimentary word, “Progressives.” (Democrats have always been more careful about labeling. They are the ones that can truly put lipstick on a pig, to borrow a phrase. If only they would be as careful about the policies they peddle.) New York Mayor-elect Bill de Blasio ran proudly on that label and garnered 17% of the electorate – enough to win the election, but hardly enough to suggest a mandate. However, the most prominent “progressive” is President Obama. The President is never depicted by mainstream media as an extremist, but it is hard to characterize Mr. Obama as being middle-of-the-road when he declared his intent to “fundamentally transform” America and used the “Life of Julia” to depict his vision of the relationship between government and its people.

Extreme adamancy has been harmful. It prevented a “grand bargain” four years ago. In ignoring the findings of the Simpson-Bowles Commission on Fiscal Responsibility and Reform, the President assured that the Federal Reserve would be the only game in town when it comes to addressing our sluggish economy. There has been no regulatory or tax reform, which would have helped kick-start the economy. While some of the blame can be laid at the feet of recalcitrant Republicans, this was not a solo dance. Three days after the inauguration, in response to Republican concerns regarding a stimulus package, Mr. Obama said, “I won.” The implication was, ‘you lost. It’s my sandbox!’ The stimulus package passed with no Republican votes in the House and only three (Arlen Specter, Susan Collins and Olympia Snowe) in the Senate. The stimulus did nothing to help the economy. Two years later, the President joked: “shovel-ready was not as shovel-ready as we expected.” In the room with him, laughter was forced; most winced.

Neither party got what it wanted in the budget resolution that passed the House last week. But, assuming the Senate passes the bill, the government will not shut down in January, and that will be good for Republicans. It remains a mystery to me that when politics in Washington do not work, the inevitable goat is the Republican Party. Democrats walk away scot-free. The current agreement is modest in scope. It represents a baby step toward conciliation. However, as Charles Krauthammer wrote, the deal was “better than the alternatives.” The agreement calls for a $63 billion increase in spending in 2014 and 2015, coupled with $85 billion in deficit reductions over the next ten years, for net savings of roughly $23 billion. Those ten-year savings are roughly equivalent to ten days of deficits! But there are no income-tax hikes and sequester is not dead. Federal civilian employees hired after January 1st will see a 1.3% increase in personal retirement contributions, and there will be a decrease in annual cost-of-living adjustments for military retirees still of working age. However, just the fact that a budget accord has been reached should improve confidence.

Uncertainty has plagued the business community since before the recession began, and what is bad for business is bad for job growth. Modest economic growth, continued productivity improvements, low interest rates and very solid corporate balance sheets have propelled stocks higher. But the economy has lagged and the country needs jobs. There is an irony in that the nation’s most progressive President has promoted policies that have helped the rich and hurt the poor and minorities. But it is unsurprising, as economic growth stems from that bête noire of Mr. Obama – the private sector. To the extent this budget agreement helps confidence, it should be good for business expansion. Despite recent job growth, the economy employs about 3 million fewer people than when the recession began six years ago.

However, the accord does not deserve celebration. The problem of debt remains and will only become worse as ObamaCare is added to entitlement spending, which is crowding out spending on defense and infrastructure. Entitlement spending, as a percent of GDP, is currently running 35% above where it averaged over the past 30 years. Federal debt amounts to $17.3 trillion, or more than $50,000 per person. (With ten grandchildren, I hate to think of my share!) Future obligations to Medicare, Medicaid, Social Security and now ObamaCare are virtually incalculable. Niall Ferguson has suggested the number might be as high as $200 trillion. Social Security is a prime example. When it began in 1935, life expectancy was 61 and eligibility was 65. Today life expectancy is 80 and eligibility is 66. In 1945, there were more than 40 workers for every retiree; by 2030 there will be fewer than three. At some point, some President is going to have to muster the courage to address our entitlement problems. There are things they can do to slow the process. They can means-test recipients. They can raise the retirement age. They can ensure payments to disability recipients go only to those who deserve them. None of these “fixes” will prevent what increasingly looks like an inevitable train wreck, but they might slow the momentum, or delay a crash. If nothing is done, future obligations will be paid off with a depreciated currency, if they get paid off at all. The problem with Democrats especially, but with too many Republicans as well, is that they have assumed the position of an ostrich – head in the sand. The deal was, as Larry Kudlow wrote, “a political solution, not a fiscal one.” It gets us past the first of January, but not much beyond that.
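For readers who want to check the per-person debt figure above, the arithmetic is straightforward. Here is a minimal sketch in Python; the $17.3 trillion figure is from the paragraph above, while the population of roughly 316 million in 2013 is my own assumption:

```python
# Back-of-the-envelope check of the debt-per-person figure cited above.
federal_debt = 17.3e12   # $17.3 trillion in federal debt, as cited
population = 316e6       # assumed U.S. population in 2013 (roughly 316 million)

per_person = federal_debt / population
print(f"Debt per person: ${per_person:,.0f}")  # about $54,700 -- "more than $50,000 per person"
```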

But Paul Ryan’s and Patty Murray’s budget deal was probably the best that could be hoped for at this time. It should allow Republicans time to focus on the 2014 Congressional elections and even on 2016. We’ll see. Nevertheless, we continue hurtling down the highway, in a driving snow storm, riding on bald tires, with a yawning crevasse somewhere on the dark road ahead.


Monday, December 16, 2013

"The Volcker Rule - It's no Panacea"

Sydney M. Williams
                                                             Thought of the Day
                                               “The Volcker Rule – It’s no Panacea”

December 16, 2013

As American soldiers were returning home from World War I, Joe Young and Sam Lewis wrote a song that became a huge hit at the time: “How ‘Ya Gonna Keep ‘Em Down on the Farm (After They’ve Seen Paree).” Something similar happened in the banking industry fourteen years ago. In 1933, the Glass-Steagall Act, which separated commercial and investment banking activities, was signed into law by President Franklin Roosevelt. Sixty-six years later, President Bill Clinton signed the Gramm-Leach-Bliley Act, which sounded the death knell for Glass-Steagall. It allowed commercial banks to fish in waters heretofore reserved for investment banks. Less than a decade later, the world witnessed one of the worst financial crises in history. Was the latter a product of the former?

Once liberated from the constraints of Glass-Steagall, American bankers, like America’s farm boys on leave in Paris in 1917, let loose. Exuberance and profligacy knew no bounds, especially as losses could be socialized while profits remained private. The death of Glass-Steagall, in my opinion, abetted the credit crisis of 2008. It had become easier for banks to speculate. Of course, changing the status of commercial banks was not the only factor leading to the crisis. The concomitant explosion in the use of derivatives and computerized/algorithmic trading added to the speculative nature of the system. But government was a major component. Not only had it done away with Glass-Steagall, it was responsible for creating an environment of “too big to fail.” The threat of bankruptcy is critical to the functioning of capitalism. Removed, capitalism loses its discipline. Capitalism without failure is like religion without sin. It does not work.

It is natural for markets to expand and contract. “Financial markets,” as Seth Klarman noted last June, “will never be efficient because markets are, and will always be, driven by human emotion: greed and fear.” Stability leads to instability, as economist Hyman Minsky noted in the 1960s. Complacency and low volatility encourage reckless behavior. In allowing banks to speculate with depositors’ money, a consequence of the elimination of Glass-Steagall, government helped create a more dangerous situation. It was that careless and carefree attitude that allowed JP Morgan chairman Jamie Dimon to initially refer to the $6 billion loss in London as a “tempest in a teapot.”

Similarly, the development of sophisticated financial risk management platforms across global lines gave confidence to traders, investors and economists that tail risk could be confined. “Tail risk,” as Alan Greenspan explained in the November/December issue of Foreign Affairs, “refers to the class of investment outcomes that occur with very low probabilities, but that are accompanied by very large losses when they do materialize.” Most economists rely on mathematical models to calibrate risk. The ability to quantify risk formulaically instills an unwarranted sense of confidence in the forecaster’s prognostications. In economics it can cause one to ignore human behavior, which cannot be quantified with any precision. It is not often that I find myself in agreement with Paul Krugman, but I do when he says that it is fine to use mathematics in economic forecasts, but the discipline “should be the servant, not the master.”
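To make the point concrete, here is a minimal sketch (my own illustration, not drawn from Mr. Greenspan’s article) of how a standard bell-curve model understates tail risk: under a normal-distribution assumption, a five-standard-deviation daily loss is assigned a probability so small it “should” occur about once in 14,000 years of trading, yet markets have produced such moves far more often than that.

```python
import math

# Probability that a standard normal variable exceeds k standard deviations.
def normal_tail_probability(k: float) -> float:
    return 0.5 * math.erfc(k / math.sqrt(2))

TRADING_DAYS_PER_YEAR = 252

for k in (3, 4, 5):
    p = normal_tail_probability(k)
    years_between_events = 1 / (p * TRADING_DAYS_PER_YEAR)
    print(f"{k}-sigma loss: probability {p:.2e}, expected about once every {years_between_events:,.0f} years")
```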

Economic forecasting is, in fact, as much art as science. Nevertheless, most people, including policy makers in Washington, find it easier to make predictions based on mathematical models than to attempt to understand how humans behave in myriad circumstances. CBO (Congressional Budget Office) forecasts are a prime example.

The Volcker Rule may help temper those problems, but will not resolve them. Its length and complexity will better serve management working with creative lawyers than depositors or taxpayers. Nancy Pelosi once said of ObamaCare, “We must pass it to find out what’s in it.” Dodd-Frank, the parent of the Volcker Rule, is equally incomprehensible. At 828 pages, it is more than twenty times as long as Glass-Steagall. Its 398 separate “rulemakings” run to an additional 14,000 pages. (And two thirds of the rules remain unfinished!) The Volcker Rule alone has an 882-page preamble, which does little to explain all its ambiguities. For example, banks can engage in underwriting and market-making, as long as such activities are “designed not to exceed the reasonably expected near term demands of customers.” What, for example, is meant by “reasonably expected near term demands?” Such legal claptrap is like Manna to lawyers. The denser the material, the better they love it, especially when they bill at $700 or more an hour. “Complexity is regressive,” as Christopher Caldwell wrote in last weekend’s Financial Times. In 1873, Walter Bagehot, the long-time editor of “The Economist,” published Lombard Street. In the book, he wrote, “the business of banking ought to be simple. If it is hard it is wrong.” What was true 140 years ago is true today. Glass-Steagall was written in 37 pages and served the nation well for more than six decades. Will we be able to say the same of Dodd-Frank six decades from now?

Freedom to speculate is inherent to our beliefs and critical to the unencumbered workings of capitalism, but it is best done individually and in private partnerships that have no call on government assistance should trouble develop. As Isaac Newton made clear in his third law of motion, for every action there is an equal and opposite reaction. Similarly, in investments: the greater the potential profit, the greater the probability of loss. Depositors in commercial banks, by the nature of their savings, are risk-averse. Preservation of capital and safety are their concerns. When banks use depositors’ capital to purchase esoteric, high-risk assets, instead of, for example, straightforward commercial or mortgage loans, they violate the basic tenets of banking.

It is not feasible to retreat to a simpler world. Technology has allowed the creation of derivatives of derivatives – assets so complicated that they require algorithmic models and math PhDs to implement. The creative use of leverage permits the possibility of outsized returns. Most senior bankers have little understanding of the creative instruments contrived in the laboratories that are their proprietary trading desks. But they certainly enjoy the fruits when trades are profitable. Being able to pass losses onto taxpayers naturally increases their level of comfort and their risk tolerance. Likewise, most regulators don’t understand these new and complex instruments. If they did, they would be working for the banks, not for some regulatory agency in Washington or Chicago. Rules should be simple: easy to understand, implement and enforce. The simplest requirement would be an easily enforceable tangible common equity ratio, one that demands more equity as banks increase in size, on the thesis that the bigger the bank, the less it can be allowed to fail.
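As a rough illustration of what such a sliding-scale requirement might look like, here is a minimal sketch; the thresholds and ratios are hypothetical placeholders of my own, not figures proposed in the text or by any regulator.

```python
# Hypothetical sliding-scale tangible common equity (TCE) requirement:
# the larger the bank, the more equity it must hold against total assets.
# All thresholds and ratios below are illustrative placeholders only.
def required_tce_ratio(total_assets_billions: float) -> float:
    if total_assets_billions < 50:
        return 0.06    # 6% for smaller banks
    elif total_assets_billions < 500:
        return 0.08    # 8% for mid-sized banks
    return 0.10        # 10% for the largest institutions

def required_equity_billions(total_assets_billions: float) -> float:
    return total_assets_billions * required_tce_ratio(total_assets_billions)

for assets in (25, 250, 2500):  # total assets, in billions of dollars
    print(f"${assets}B in assets -> at least ${required_equity_billions(assets):,.1f}B of tangible common equity")
```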

Weaning banks off the platforms that have made a few of them enormously wealthy will be difficult, but not so difficult, I suspect, as would be the task of cleaning up after another (and larger) Lehman bankruptcy. Instead of transparency, the new rules are deliberately designed to be opaque, so that unscrupulous bankers abetted by equally shameless attorneys will be able to seek out exceptions and loopholes. To misquote Dallas Fed President Richard Fisher (though in a way I believe he would approve), it is time to return to a more humble model of commercial banking. As Walter Bagehot wrote so long ago, banking should be simple – and so should the rules governing bank behavior. While the Volcker Rule is better than nothing, it does not meet Mr. Bagehot’s more stringent requirements.


Friday, December 13, 2013

"Nelson Mandela - Apostle of Reconciliation"



Sydney M. Williams
                                                                      Thought of the Day
                                               “Nelson Mandela – Apostle of Reconciliation”
December 13, 2013

Nelson Mandela never considered himself a saint. He did not want to be put on a pedestal. In 1994, John Carlin of the London Independent asked the newly elected Nelson Mandela if he were a saint. Mr. Mandela responded: “Well, it depends on your definition…Some people have said a saint is a sinner who keeps on trying. From that angle, you may classify me as a saint, because I have many weaknesses.”

It is curious that so many journalists and politicians (including President Obama) made such a point of saying that Mr. Mandela was not a saint. Did they really think he might be? Did they think that the people believed he was? Did they fear they would be seen as worshipping a man, instead of celebrating his life? Holiness and sanctity are not characteristics Mandela would have assigned himself. He fought the ugliness of apartheid, which had deprived millions of blacks of their inherent rights, and imprisoned and murdered thousands more. He spent twenty-seven years in jail, but the years did not leave him embittered. On his release, he found reconciling differences with South African president Frederik Willem de Klerk and the National Party more effective than confrontation. After winning the presidential election in 1994, he named Mr. de Klerk his deputy. In 1993, the two men shared the Nobel Peace Prize. Mr. Mandela was able to transcend the violence of the Umkhonto we Sizwe (MK), the militant wing of the African National Congress (ANC), which he had co-founded in 1961. In prison he had undertaken the study of Afrikaans, the language most associated with apartheid, because he felt one must be able to communicate with one’s political foes. These were the things that made Mr. Mandela an exceptional human being – perhaps not characteristics that cause one to be canonized, but as close as it gets in politics.

The struggle to overthrow apartheid was a legitimate, revolutionary struggle. In 1980, non-whites composed about 84% of the population; yet they were unable to own land or vote. South Africa is richly endowed with natural resources, yet the majority of the population could not benefit. While blacks make up the majority of the population, the nation is truly polyglot. In a nation of 53 million, eleven languages are officially recognized, with Zulu, Xhosa and Afrikaans being the three most prominent. English, the first language of about 10% of the population, is the language of commerce. Today, blacks make up about 80% of the population, with whites and those of mixed races roughly dividing the balance. Among the blacks, Zulus and Xhosas are the largest groups; others include the Pedi, Sothos, Tswanas and Hottentots, each with their own language and customs. The white population is divided between those of British heritage and the Afrikaners, who are descended from Dutch, German and French settlers.

In the 1940s, Mr. Mandela joined the ANC and its bid to rid the nation of what was legal racial segregation, known as apartheid. Besides being a member of the ANC at the time of his arrest in 1962, the forty-four-year-old Nelson Mandela was a member of the then-underground South African Communist Party, including serving on the Party’s central committee. At his trial and conviction in 1964, Mr. Mandela never disavowed the tenets of the MK. He chose not to plead “not guilty” or to seek clemency. It was this clear enunciation of principle that labeled him a terrorist and made him a political prisoner. (Ironically, he was only removed from the U.S. terrorist watch list in 2008.) It was also what made him a symbol of the anti-apartheid movement.

In 1964, Mr. Mandela was sentenced to life imprisonment, in particular for his role in organizing the bombings of police stations. It was in prison that he came to appreciate the concepts of reconciliation and redemption. His cell on Robben Island, filled with books and manuscripts, was said to have had a monastic spirit. Like Gandhi, who had also practiced law in South Africa, Mr. Mandela became determined to appeal to the conscience of whites. Gandhi was recognized as a leader of the Indian independence movement before he returned to India. Likewise, Nelson Mandela became the voice of a free South Africa while still in prison. Reconciliation was imperative, not just between blacks and whites, but between ethnic tribes, particularly between the two largest groups – the Xhosas (Mr. Mandela’s heritage) and the Zulus (the tribe of the current president, Jacob Zuma).

What all revolutionaries have in common is a hatred for oppressive government. It was true in England in 1642, in the American colonies in 1776, in France in 1789, in Russia in 1917, and in other places from China to Brazil, from Cuba to South Africa. But what is striking is that the only truly successful revolution was the one in the American colonies – successful in the sense that the revolutionaries were able to quickly adopt a democratic form of government based on republican principles. More typically, the vacuum created by the destruction of an existing despotic government was filled by another tyrant, as happened in France, the Soviet Union, China and Cuba. England evolved slowly toward a more liberal government. Yet, incomprehensibly, most revolutionaries look to communist nations for help and inspiration – countries with governments as oppressive as the ones that were overthrown. It suggests that the leaders of most revolutions seek power for themselves, not liberty for the people.

But not Mr. Mandela. Following his release from prison in 1990, Mr. Mandela visited the United States. In a speech before Congress he spoke of the American dream, “of liberty for all…without regard to race, color, creed or sex…To deny any person their human rights is to challenge their very humanity.” While Mandela was still in prison, and with economic sanctions being imposed by the United States on the apartheid regime of Pieter Willem Botha, President Ronald Reagan demonstrated his personal views of apartheid. He named a black career diplomat, Edward Perkins, as ambassador to South Africa. Juan Williams recently wrote in the Wall Street Journal: “The American civil-rights movement particularly intrigued him [Nelson Mandela], because the racial dynamics in the U.S. were a reversal of those in his country, where the black majority population was oppressed by a white minority.” It was the non-violence of men like Martin Luther King and Mahatma Gandhi that Mandela saw as examples to emulate.

It is Nelson Mandela’s journey from angry young man to statesman that needs to be told, and which should be seen as an example for the myriad African nations that have emerged from European colonialism into free and independent states. Unfortunately, the paths of most of those nations have been paved with repression, retribution and tribal warfare. It was a love for his country and the belief that liberty was universal and a right of all people that drove Nelson Mandela, or Madiba, as he liked to be called. In an act reminiscent of our first president, Mr. Mandela voluntarily gave up the office of president after one five-year term.

Unfortunately, Nelson Mandela’s successors have not lived up to his legacy. He left the office of president in 1999 and was followed by three other men, all representing the ANC. The economy is suffering. Unemployment in the 3rd quarter of 2013 was 24.7 percent. The murder rate is one of the highest in the world at just over 30 per 100,000, several times higher than that in the United States. Violent crimes, at more than 1,600 per 100,000, are more than three times the U.S. rate and among the highest in the world. Corruption flourishes and was the reason that President Zuma was booed at the memorial services for Nelson Mandela earlier this week. While Mr. Mandela was a remarkable man, one who comes our way only rarely, one wishes he had had the good fortune, as George Washington did, to be followed by the likes of an Adams, Jefferson, Madison and Monroe. By the time James Monroe completed his final term in 1825 – the last of the founding fathers to become president – the United States was thirty-six years old. Its future was assured.

Nevertheless, Nelson Mandela was a man deserving of the celebration that some 100 current and former world leaders gave him earlier this week. He showed that reconciliation works better than confrontation. Amazingly, he included white guards at Robben Island among his friends. He left behind the anger that drove him as a young man, and his humility allowed him to know that power resides in the people, not with a single person. He may not have been a saint, but he was a man deserving of our respect and admiration. He was a good man. How fitting that we should celebrate his life during the Christmas season! His life was one that other political leaders in South Africa and around the world would be wise to emulate.


Wednesday, December 11, 2013

"Lessons from Detroit"

Sydney M. Williams

                                                             Thought of the Day
                                                           “Lessons from Detroit”
December 11, 2013

“Bad news isn’t wine. It doesn’t improve with age.” So spoke General Colin Powell in a different, but still appropriate, context. For years, state and municipal politicians and union leaders enjoyed a symbiotic relationship: union leaders would provide the votes necessary for successful campaigns, while politicians would ensure that unionized employees would receive good wages and even better retirement and healthcare plans. For years, those in Detroit ignored the creeping obligations that would undo their city. They lived in a Panglossian world, with little concern for tomorrow, a world where the sun always shone and there was never a cloud in the sky. Unusually good market returns over two decades allowed pension trustees to ignore the rumblings of a coming pension earthquake. The investment environment changed a few years ago. The trustees remained oblivious. Politicians and union leaders played the same games, combining delusion with corruption. Both would have felt right at home in A.A. Milne’s make-believe Hundred Acre Wood. “What day is it?” “It’s today,” squeaked Piglet. “My favorite day,” said Pooh.

Now it’s tomorrow. Last week, Federal Judge Steven Rhodes issued a wake-up call, allowing the city of Detroit to proceed with its bankruptcy filing. It is a decision that will have far-reaching consequences, as the Chapter 9 filing does not protect the pensions of local government employees. About half of the nearly $20 billion in debt comes from the city’s unfunded obligations to its retirement and healthcare systems. Detroit filed for bankruptcy on July 18th, four months after Governor Rick Snyder appointed Kevyn Orr to be the city’s emergency manager. But it was only last week that Judge Rhodes issued his decision, one that will certainly be challenged, as it represents a clear and present danger for unionized employees in municipalities across the country. On the other hand, the decision represents a path forward for the hundreds of cities encumbered with debt and obligations, all of their own making. It presents an opportunity to look forward, rather than, fearfully, backward. The process will be painful enough that it should preclude a repetition…at least for a while.

The bankruptcy, besides demonstrating the inevitable consequences of promising what cannot be delivered, is a reminder of the risks of one-party rule. For the last 52 years eight Democrat mayors have been in charge, with Coleman Young running the city for twenty of those years. It is also a reminder of the Faustian bargains struck when politicians, who are supposed to represent taxpayers, become beholden to and dependent on the organized labor with whom they negotiate; negotiate they did, but not in good faith toward the taxpayers.

Detroit today is a city of less than 700,000 – about the size it was a hundred years ago. Eighty-eight percent of the population is African-American. The city is a fraction of what it was in 1950 when the census showed 1.8 million people. The city is home to 78,000 blighted and abandoned properties, one fifth of the housing stock. Only 40% of its streetlights work and only a third of the city’s ambulances are operating. Sixty percent of the city’s annual 12,000 fires are in abandoned and blighted structures. According to the Detroit Free Press, it takes police an hour to respond to “priority one” 911 calls. The city has the highest property tax rates and the lowest property valuations in the nation. It appears a war-ravaged city. In fact, looking at photos of Hiroshima and Detroit today, one would be convinced that Japan won the War. Hiroshima is a modern city with towering office buildings. Its population has risen from 83,000 after the War to 1.1 million today. Detroit shrinks; Hiroshima blooms.

Granted, the management of the auto industry, which grew too fat off war-time military contracts and an absence of competition in the early post-war years, was incapable of adapting to global competition. They allowed unions to dictate terms so onerous they gradually bled the goose that laid the golden egg. Fat, dumb and happy would be a polite way of characterizing the management of Ford, General Motors and Chrysler at the time. Greed, corruption and cronyism among and between corporate leaders, politicians and public sector union leaders also played a role. An abundance of riches became a scarcity in a few short years; but, like a character out of Edith Wharton, there was a refusal on the part of the actors in this tragedy to admit that circumstances had changed.

Redemption without pain is not possible. While the guilty in this saga are likely to escape without punishment, the same will not be true for the victims – Detroit’s 19,000 retired municipal workers and the 9,500 still employed. Judge Rhodes’ decision means that what had been promised will not be paid. In assigning responsibility, fingers will be pointed, but the fault lies with those who promised what could never be delivered. One might claim that recipients were naïve to believe in fairy tales, but I don’t believe that is fair. But neither is it the fault of the “rich,” nor of taxpayers across the nation. The fault lies with those who thought they could play the game forever – the politicians and the union leaders who lived off a lie and sacrificed the people who provided votes and paid their dues.

There are a number of troubled cities and counties in the United States. Most we have read about. However, the unfunded liability per household, as currently measured, exceeds $30,000 in Chicago, New York and Boston. Philadelphia’s solvency horizon is set at 2015; Boston and Chicago’s at 2019. In most cities and states the quoted unfunded liability masks the real situation. Most assume reinvestment rates of seven to eight percent. A more realistic number would be four percent. Under Governmental Accounting Standards Board (GASB) rules, governments will be required to use more reasonable investment forecasts. The difference is significant. Over forty years – the expected payout period for a forty-year-old worker – one dollar invested at 8% becomes about $22. That same dollar invested at 4% becomes about $5; at 6%, about $10. Additionally, most pension plans build an annual inflation adjustment of 2% into their obligations. One union leader recently suggested that annual inflation adjustments be eliminated, as one way to reduce future obligations. That it will, but it strikes me as sneaky; by the time the recipient realizes what has happened, the official will have moved on. But then that is to be expected among the ethically-challenged in political and union leadership positions.
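
(A quick back-of-the-envelope check of those figures, assuming simple annual compounding over the forty-year horizon: $1 x (1.08)^40 is roughly $21.72; $1 x (1.06)^40 is roughly $10.29; and $1 x (1.04)^40 is roughly $4.80 – in line with the $22, $10 and $5 cited above.)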

It is a sad story with an unhappy ending. Yet, from such ashes a Phoenix can arise. The advantage of bankruptcy is that it allows for a fresh start, a new beginning. The attractions of Detroit that allowed the city to grow in the first half of the 20th Century are still there – its natural beauty, hidden by urban blight, yet seen in its lakes and hills, but most importantly in its people, who have proved in the past that they are smart, creative and industrious. A new start will allow the city to focus on its decaying infrastructure and allow it to tempt new investors, industries and businesses with favorable tax and regulatory proposals. We have seen the risks of one-party rule, and the corruption that it breeds must be weeded out. Defined benefit retirement plans must move toward defined contribution ones, but reform won’t be easy. Big public-sector unions are pressuring those in Congress to disallow 401(k)-like plans, as being too risky for government employees. A bipartisan bill in the Congress that would amend the IRS code has gone nowhere because of union opposition, as Steven Malanga wrote in last weekend’s edition of the Wall Street Journal.

The lesson from Detroit will most harshly be learned by those whose pensions will be cut. But it will only have lasting meaning if those lessons trickle up to politicians and union leaders. Bad news cannot be hidden forever. And it will not get better over time. The false promises that have been made must be addressed squarely. Evasive answers to simple questions, or looking for someone else to blame will not do. Medicine must be taken for what has been done, and the system must be reformed to prevent future occurrences.


Tuesday, December 10, 2013

"Common Core"

Sydney M. Williams

                                                            Thought of the Day
                                                              “Common Core”
December 9, 2013

Thomas Jefferson believed that education was fundamental to freedom – that only a people imbued with knowledge could achieve and hold onto a republic. George Washington Carver said: “Education is the key to unlock the golden door of freedom.” Schooling should start with the young. Nelson Mandela once said, “Education is the most powerful weapon which you can use to change the world.” In 1779, my four-great grandfather Noah Webster, in an essay on elementary education, wrote, “The only practicable method to reform mankind is to begin with children.”

Nevertheless, until the mid 19th Century, education remained the province of the wealthy. In 1852, Massachusetts, with the prompting of men like Horace Mann, passed the first compulsory school laws. New York did so the following year. School reformers were of the opinion that education would create good citizens, unite society and help deter crime and poverty, concepts that are as true today as they were 150 years ago.

Fast forward to today. High school drop-out rates may have declined from almost 30% in 1990 to about 12% today, but the reason has more to do with grade inflation than with improved learning. Knowledge, if anything, has declined. Teachers and administrators are more motivated to achieve high graduation rates than to actually teach students. In New York, Regents Exams are either being dropped or shortened. The stated excuse has been budget woes, but the failure of students to pass the exams in subjects like geography and world history has also been responsible.

Too many of our schools are failing the students they are supposed to help. The results of the 2012 PISA (Program for International Student Assessment) are in, and they are not pretty. They show, in the words of U.S. Education Secretary Arne Duncan, a “picture of educational stagnation.” When compared to the 34 members of the OECD (Organization for Economic Cooperation and Development), the results are even more dismal. Of those 34 countries, the United States ranked 26th in math, 21st in science and 17th in reading. While the math scores remained unchanged from 2009, the U.S. slipped in both science and reading.

The system needs improving. Besides the failure of our schools to compete effectively with other nations, there is a growing need at home for graduates who can better communicate and a need by employers for skilled workers. The Common Core State Standards Initiative has been one response. The initiative was sponsored by the National Governors Association and the Council of Chief State School Officers and has been supported by the Obama Administration and Education Secretary Arne Duncan. The plan details what K-12 students should know in English and math at the end of each grade.

Raising standards and establishing conformity sound appealing, but the idea is fraught with the potential for unintended consequences. Does “conformity” mean changing the ways in which superior schools have succeeded? Does it mean reducing their standards? Will one consequence be a lowering of the common denominator to allow more schools to meet predetermined standards? Will schools graduate fewer exceptional students? There are some great public school programs in the country: Darien in Connecticut and Millburn in New Jersey are just two examples. While it would not be possible to replicate Darien High School in East Harlem, New York, might it be possible to borrow certain ideas from successful schools? Regarding the PISA results, Wendy Kopp, founder of Teach for America and Teach for All, wrote in last Wednesday’s Wall Street Journal: “Americans should ask how we can emulate other countries’ success rather than envy it.”

The standards were announced on June 1, 2009, with the idea that they would be required in 2014. They are copyrighted by the National Governors Association Center for Best Practices and the Council of Chief State School Officers to ensure the standards will be the same throughout the nation. All but four states – Alaska, Nebraska, Texas and Virginia – quickly signed on to what was an untested program. One reason they did so is that President Obama had required adoption of Common Core if they wanted to receive billions in federal grants and waivers from the 2002 No Child Left Behind Law. Another, more sinister, reason for the fast adoption of the standards is that Common Core was approved in states like Indiana by the Board of Education, without consulting the legislature. Indiana is now reconsidering its position.

A few weeks ago, Frank Bruni wrote a column in the New York Times in support of the Common Core curriculum. It was entitled, “Are Kids Too Coddled?” While I agree that many parents over-protect their children, are not teachers and school administrators being coddled by the unions which defend and represent them? Mr. Bruni quotes Marc Tucker, president of the National Center on Education and the Economy, and a supporter of standardized testing: “Our students have an inflated sense of their academic prowess. They don’t expect to spend much time studying, but they confidently expect good grades and marketable degrees.” Grade inflation would be outed by standardized tests, but that alone does not address critics’ real concern, which is that too many schools have failed their students; the critics question whether Common Core is the answer.

Common Core is the states’ response to a problem we all know exists – an undereducated youth, who finds herself incapable of competing in today’s global marketplace. But is Common Core the answer? There is a lot of blame to go around. In my opinion, it starts with the teachers’ unions. Underperforming and simply bad teachers are allowed to stay on when they should be fired. In the interest of gaining members, unions have applauded the increase in administrators – moneys spent on support instead of on teachers. Technology has brought efficiency to almost all other sectors of the economy, relieving the need for support staff. Yet schools have gone in the other direction. Lindsey Burke of the Heritage Foundation found that between 1992 and 2009, the number of students in American public schools increased by 17%, the number of teachers grew 32% and the number of non-teaching personnel went up by 46%. Compared to other nations, U.S. public schools devote higher fractions of their budgets to non-teaching personnel – and lower fractions to teachers. Despite those increases in personnel, Ms. Burke notes that student achievement has been “roughly flat or modestly in decline.” In my opinion, students are being sacrificed on the altar of the teachers’ unions.

There are of course other problems. Fear of lawsuits and repercussions has prevented schools from appropriately disciplining badly behaving students. Bullying, for example, has always existed, but has recently gotten out of hand.

We are a nation of more than 300 million people, with about 50 million students in 98,800 elementary and secondary schools. They come from myriad social and economic backgrounds with varying ranges in native intelligence, diligence and aspiration. To assume that all students can be tested according to some common standard means that for some the bar will be set too high, but for others too low. Some children work independently; others require hands-on help. In the end, education is an individual experience conducted in a public arena. The purpose of an education is to produce inquisitive, productive and independent young men and women.

The claim is that Common Core State Standards are designed to ensure “real understanding.” The focus is supposed to be on “mastering” the material, instead of memorizing. To achieve that, courses will cover fewer topics, and will focus on “what students will use in life.” This syllabus, if you will, is troubling. What is meant by “real understanding?” What is wrong with memorizing? When my grandmother was dying of cancer in 1961, she was no longer able to read, but her mind was still sharp. She took great pleasure in repeating the poetry she had learned as a child. There is much more to education than memorization, but memorizing is work and helps train the brain. To teach the “Gettysburg Address” without putting it into the context of when and where it was delivered, as Common Core demands, is to deprive the student of Lincoln’s real message. The assumption that some bureaucrat knows today “what we will use in life” tomorrow presumes a Solomon-like knowledge of the future – something I am confident no bureaucrat possesses.

In the end, we must accept certain truisms, some of which are unpleasant. Students do not have equal abilities or desires. The purpose of school is not to stamp out robot-like graduates, all of whom have the same training. It is true that many schools have performed poorly; so into that void, states and now the federal government have stepped. There are schools and communities where such standards may be required, but they should not be for the schools that are doing well. To treat all schools equally means that overall standards will drift towards the lowest common denominator. No matter how well intentioned a standardized teaching program might be, the distance between that platform and state-run propaganda is short. Joseph Stalin, a supporter of compulsory education according to his own standards, once acknowledged: “Education is a weapon, whose effect depends on who holds it in his hands, and at whom it is aimed.”

Education must first emphasize what has been well understood for at least 150 years – the three “R’s:” reading, ‘riting and ‘rithmetic. Without knowledge of the basics, nothing else is possible. Education should encourage curiosity. The amount of information in the world doubles every twelve months. Even the concept of that statement is hard to grasp – a year from now there will be twice as much data as there is today! Two hundred years ago a man or a woman could be broadly educated. That is no longer the case. Education must provide the basics and then provide the tools, so that learning will be fun and will never cease. Teachers should not teach to some pre-determined test; they should educate the young in the basics, recognizing individual abilities, and try to instill in them a sense that the quest for knowledge should never stop – that education does not end in high school, college or graduate school. Learning goes on as long as breath is drawn.

It is easy to understand the idea of Common Core as the federal government’s response to a growing problem, but I believe it fails to address the real problems. I worry about people in Washington who believe they have found the answer to an admittedly intractable problem in an inflexible curriculum designed to have youngsters of varying abilities conform to a single standard. It is best to keep such standards and testing closer to the communities directly responsible. It is better for our economy and safer for our republic.


Wednesday, December 4, 2013

"Thoughts of a Conservative"

Sydney M. Williams

                                                                    Thought of the Day
                                                            “Thoughts of a Conservative”
December 4, 2013

A Caveat: I am not a scholar. I do read and I do think about things, but by no means do I believe I have the last word on what a conservative might be. There are men and women who have spent their lives studying conservative thinkers from Cato to John Locke, from Edmund Burke to Eric Hoffer and Ludwig von Mises. What follows are simply my thoughts and opinions, nothing more.

When I was growing up, I recall my mother saying that her father had told her that if one is not liberal when young, one has no heart, but if one is not conservative by the age of thirty, one has no head. Later I heard that sentiment attributed to Churchill. Regardless of the source, the concept makes sense.

Words matter. Regardless of how I might define a conservative, the Left has done a far better job than the Right of putting its imprint on how liberals and conservatives are described. Democrats and their cohorts in the mainstream media use whimsy and fantasy when describing those they consider “liberal:” innocent, virtuous, youthful, energetic, free and compassionate are adjectives they employ. But there is harshness to their adjectives when conservatives are described: corrupt, impure, elderly, tired, restrained and miserly.

It’s all poppycock of course. Conservatives believe in personal responsibility, free markets and individual liberty. They see the principal roles of the state as protecting its citizens from threats within and without and defending their rights. They therefore believe in a robust department of defense and judicial system. They are skeptical of men in power, so prefer the rule of law to the “enlightened” rule of men. They believe government should permit people to be free to pursue their own goals. While conservatives respect other cultures, most believe in a universal moral sense, that there are absolute values with respect to life and property. They believe in what are termed family values – moral codes that have helped keep the bonds of families strong, and which have allowed communities to grow ever stronger.

They abide by the rights of all citizens as laid out in our Bill of Rights – the right to speak, assemble and pray, the sanctity of life, the ability to purchase, own and sell property – and they have respect for the beliefs of everyone, no matter how different. Conservatives believe in government, but believe its role should be minimal; they acknowledge, however, that there are those who are unable to care for themselves and that government has a responsibility to look after them. Fiscal responsibility is important on an individual level as well as for the state. Conservatives argue that English should be a requirement for citizenship. They also recognize that learning the language is necessary to survive and thrive in our country – that when we don’t require it, we hurt those we purport to help.

Conservatives are more likely to embrace uncomfortable truths. They believe in social equality and believe our unalienable rights provide for equality of opportunity. But they recognize that equality of outcomes is only a dream promoted by would-be tyrants. For example, they recognize that not all high school seniors can go to a top college. They understand that there will be 3.3 million seniors graduating from high school in June 2014, yet the top fifty colleges and universities can only accept about 75,000 freshmen, or roughly two percent of graduating seniors. Is that unfair, or is that life? Conservatives live in the world as it is, not the one that appears in Disney films. Interestingly, the Left, which glories in youth, has, as its two favored candidates for 2016, members of the “over-the-hill” gang. Joe Biden will be 74 that year and Hillary Clinton will be 69. No man has been elected President over the age of 70, and only Reagan was older (by nine months) than Ms. Clinton would be. In contrast, seven of the ten youngest governors are Republican. So, who has youth on their side?

The Left believes in the goodness of government, especially in its role to achieve equality and fairness. They believe that government’s responsibility extends to guaranteeing that no one is in need and that everyone has access to healthcare. As such, they ignore the concerns of those who worry about unhealthy dependency. Conservatives understand that increasing dependency means less freedom. The Left believes that enlightened men and women can make better decisions for us than we can make for ourselves. They tend to be more careless about financing, trusting that future generations can shoulder the burden of debts incurred today. The consequences of such promises can be seen in the hardships they have thrust on credulous retired city employees in Detroit. Most of them believe in moral relativism – that there are differences in judgments across myriad cultures that should be respected – “judge not lest ye be judged” is their motto. So they allow such judgments to dictate untested changes in our culture – an example being those on the Left who want to bring Sharia Law to the United States – a law which treats women more harshly than men. The Left believes in multiculturalism and that it is not necessary to learn English to enjoy the rights of citizenship. In my small town in Connecticut, the Town Clerk is required by state law to ask if one wants his ballot in English or Spanish! Can a Spanish-speaking-only person vote knowingly in an English-speaking community and nation?

Conservatives understand the instability of our climate and that climate change is a fact of life – that it has always been with us, and that change is something with which we must learn to live. Conservatives acknowledge that man bears some responsibility, but, unlike the Left, which is in denial, they recognize that the earth’s climate had been changing long before man appeared. There is much in the science that remains unknown. Seventeen percent of the world’s population has no access to electricity. Fossil fuels are necessary for them to gradually ascend to better standards of living, yet a sanctimonious Obama Administration recently announced that the United States would no longer contribute to the construction of coal-fired power plants financed by the World Bank. Feeding starving people and helping them keep warm should take precedence over priggish political decisions that may or may not help the earth over the next few centuries. And you do not see conservatives traipsing around the world, selling scare stories and raking in millions of dollars, as we see the likes of former Vice President Al Gore and Hollywood’s Michael Moore doing.

One way to define a conservative is to describe what he is not: He does not waste other people’s money; he considers rising dependency a threat to personal liberty; he is never smug or condescending in his treatment of others; he is respectful of opposing opinions. You would never have seen conservatives at Brown blackballing Mayor-elect Bill de Blasio, the way leftists did New York Police Commissioner Ray Kelly, nor would you have seen conservative students hurling insults at Judge Shira A. Scheindlin, the way leftist City College of New York students recently did to General David Petraeus. The destruction of private property and the mess left behind by the Occupy Wall Street crowd were far worse than anything ever seen after a Tea Party convention moves on. The first group feels entitled. The second does not.

At the end of the day, the real question is which political philosophy has done the most good for the most people. The world is our laboratory. The 2013 report on world hunger from the United Nations Food and Agriculture Organization is testament to the role free market capitalism has played in alleviating hunger. Over the past twenty years, as democracy and capitalism spread throughout much of Asia and Eastern Europe, the number of those suffering from “chronic undernourishment” declined to 870 million people from a billion in 1990, or from 20% of the world’s population to 12%. Much still needs to be done, but history shows that free-market capitalism, with all its warts, has done more to lift people from poverty than any other form of government or economic system. Africa, which has been mired in socialism and tyranny, is one of the very few regions of the world where hunger has actually increased.

Perhaps most important, conservatives are optimistic about the ability of the individual to rise and succeed. They revel in the stories of individual American women and men who by dint of their intelligence, grit and aspiration overcame incredible odds to succeed. They did so because of the opportunities they had to succeed as individuals. The most dispiriting thing about an intrusive state is that it squashes individual initiative by requiring conformity to established standards, whether imposed by society, schools or government. Conservatives believe that power resides with the individual, not with the state. So do I.

I am no longer young; so I suppose my grandfather is smiling wherever he might be. I only hope he is sharing his satisfaction with Winston Churchill and a host of other conservative thinkers whose great works have come down to us through the ages.


Monday, December 2, 2013

"The Month That Was - November 2013"

Sydney M. Williams


December 2, 2013

                                                            The Month That Was
                                                               November, 2013

                                                              "November Comes
                                                            And November goes,
                                                          With the last red berries
                                                        And the first white snows.”
                                                                           Clyde Watson – (1947- )
                                                                           Author of children’s stories

The month debuted with Typhoon Haiyan, which devastated the Philippines. Haiyan formed on November 2nd in Micronesia. Five days later it made landfall in the central Philippines with winds of over 160 miles per hour. Estimates are that 10,000 lives may have been lost. The month ended with the government planning a re-launch of ObamaCare, or rather a re-launch of the website. The website permits one to go to an exchange, to see what plans are offered and what subsidies may be provided. Personal information, such as income and Social Security numbers, is required. The federal exchange HealthCare.gov is a $630 million on-line insurance exchange created by friends of the Obamas. Even once it is repaired, as it likely will be, the fundamental problems with ObamaCare are likely to persist.

The stock market, digesting the news of the month, moved smartly higher, up 2.8% as measured by the S&P 500, with perhaps another 15 basis points because of dividends paid out. Going in the opposite direction was the Treasury market, with the yield on the Ten-Year rising 7.9%, reflecting lower bond prices. Year to date, the yield on the Ten-Year has risen 56% (from 1.76% to 2.74%), indicating a meaningful decline in the price of that instrument. During the same time, the S&P 500 has risen 26.6%. While I claim no expertise in markets, it is somewhat disconcerting to see stocks and bonds take such dramatically divergent paths. It is reminiscent of the spring and summer of 1987.
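
(To be clear about how that 56% is computed, it is the relative change in the yield itself: (2.74 – 1.76) / 1.76 is roughly 0.56, or about 56% – a rise in the yield, and therefore a fall in the bond’s price, not a move of 56 percentage points.)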

Equity markets, so we are told, look ahead, not back. That may well be true; though I have found them to be more whimsical and not always predictive. If it were easy, as a wag once said, we’d all be rich. I can say, however, that the natural direction of stocks, over time, is higher, reflecting increased GDP and corporate earnings growth. The natural direction in bonds depends mostly on expectations for inflation and the perceived value of the underlying currency. The first forty-one years of my life, interest rates generally went higher. The last thirty plus years, they have mostly moved lower. Market moves of all assets, fortunately for all of us, tend to be more exaggerated than the changes we experience in our daily lives.

Elections gave few clues. Bill de Blasio, a left-leaning Democrat with a history of youthful Marxism, won the mayoralty of New York and Chris Christie, a pragmatic Republican, won re-election as governor of one of the bluest of states – New Jersey. Terry McAuliffe won the gubernatorial election in Virginia against Republican Ken Cuccinelli. However, McAuliffe failed to win a majority. He was aided by Libertarian candidate Robert Sarvis, who took 6.6% of the vote and whose campaign was largely funded by Obama bundler Joe Liemandt. Mr. Liemandt deserves credit for Mr. McAuliffe’s victory, a situation he is likely to use to his advantage. Republicans were hurt by the government shut-down in the first two weeks of October; Democrats by the abysmal roll-out of ObamaCare. In the land of the blind, the one-eyed man is still king.

Among other major events was the announcement that a deal had been struck with Iran regarding its pursuit of a nuclear bomb. The Middle East is already unsettled. A country that promotes and exports terror gaining access to nuclear weapons will do little for peace in the region. It is especially concerning to those who live in Israel, a country the Iranians have pledged to wipe off the map. But is the deal a good one? Is John Kerry the 21st Century’s Neville Chamberlain, or is he Henry Kissinger gone to China? While I hope Kerry is the latter and that Iran will relent in its pursuit of nuclear weapons, my fear is that he is the former.

And speaking of China, that nation of more than 1.3 billion people is dumping its “one child” policy and flexing its muscles in the East China Sea. The former acknowledges that an aging population is antithetical to long-term growth. The latter is concerning to the Japanese and the South Koreans, as well as the United States. The focus continues to be the Senkaku Islands. Japan’s sovereignty over the islands, which lie about 100 miles northeast of Taiwan, has been disputed since oil was discovered in 1968. China has declared an “air defense identification zone” over the islands. The United States sent unannounced B-52 flights through the zone last week, flights that were not challenged. However, the U.S. did warn commercial airliners flying through that space to give advance notice to the Chinese. The last thing the world needs is a 747 carrying hundreds of passengers intercepted by Chinese fighters. While the U.S. may pay a short-term price in terms of diplomacy and trade, permitting China’s aggression would have more severe, longer-term consequences for peace in the region.

Janet Yellen was approved by the Senate Banking Committee, 14-8, suggesting she will have smooth sailing toward ultimate confirmation in the full Senate as the next Chairman of the Federal Reserve. She has favored the extraordinarily low interest rates that have been the policy of the Federal Reserve over the past five years. ZIRP (zero interest rate policy), quantitative easing and Operation Twist have come to characterize Mr. Bernanke’s Chairmanship. In Mr. Bernanke’s defense, it is fair to say that in the absence of any fiscal policy, monetarism has been the only game in town. Ms. Yellen’s appointment will assure that those policies will continue, despite having done much more for speculation and asset prices (and therefore the wealthy) than for the poor and the jobless. An example of that speculative fervor can be seen in the sale of Francis Bacon’s 1969 triptych of his artist friend Lucian Freud. It was sold by Christie’s on November 12th for $142.4 million – the most expensive piece of art ever sold at auction. Low rates have not done much for Black youth unemployment, which stands at 36.0% according to the November release from the Bureau of Labor Statistics.

The decision by Harry Reid’s Democrat colleagues to detonate the nuclear option – eliminating filibusters for most presidential nominations – shows that hypocrisy is alive and well. The decision is likely to have long-term consequences. The fact that Democrats repeatedly used the filibuster during the George W. Bush years did not deter them from scrapping tradition this time. Democrats claimed that the vote to do so was a response to obstructionist Republicans. The decision may well be rued when Republicans become the majority, which they will at some point. Regardless, eliminating filibusters will do three things – it hurts minority rights, it cedes more power to the President and it increases the partisanship that divides our politicians in Washington. Of course, it is better that a nuclear option be exploded in the U.S. Senate than a nuclear weapon be tested by Iran!

There was other news, both trivial and important. Caroline Kennedy was named Ambassador to Japan. Her principal qualifications are that she was an early supporter of Mr. Obama (alongside her late Uncle Ted) and that her father was a President, whose assassination fifty years ago was remembered days after she presented her credentials to Japanese emperor Akihito. The first day of Hanukkah fell on Thanksgiving, a rare occurrence. The next time the days overlap will be about 79,000 years from now. (I do not expect to be around.) Rob Ford, the mercurial, alcoholic and drug-consuming mayor of Toronto, was stripped of many of his powers. However, he serves as proof that “crazies” in government are not limited to the United States. Income and wealth gaps have led to a widening jobs gap. Employers met in late November in Oklahoma City to discuss the fact that many jobs go begging because of an absence of skills. Worrying for Democrats is that all three “gaps” – income, wealth and jobs – have widened since the end of the recession in June 2009, in other words on Mr. Obama’s watch.

We enter December with stocks at all-time highs and opinions of politicians at all-time lows. Like stocks, opinion polls are notoriously volatile; so while Mr. Obama is currently not very popular, that is bound to change. Anyone who thinks that one Party has a lock on next year’s elections is smoking something that should be shared with the rest of us.

Besides Santa Claus and the start of winter, we shall see what else December brings.
