Monday, April 29, 2013

“Immigration and the Boston Terrorist Attack”

Sydney M. Williams

Thought of the Day

The terrorist attack in Boston should have minimal impact on immigration policy. It did, however, highlight a failure to share information, a violation of at least one mandate of the Homeland Security Act of 2002 – to coordinate intelligence among the CIA, FBI and other intelligence-gathering services. The consequence of this failure, as we know, was deadly. Additionally, the attack emphasizes the importance of critically investigating all applicants for legal residence status, green cards and citizenship. Given recent history, such investigations are especially relevant for those seeking political asylum from states noted for fomenting and exporting Islamic terrorism. Humanitarianism suggests we listen to requests, but we must be certain of whom we are dealing with. Technology makes the job easier than it once was.

Political asylum is an option a caring and open nation should always offer, but diversity should never be the goal, as some would have it. We want the smartest and hardest-working immigrants, those who will endow their children with the same moral sense. Wealth today should be less relevant in a decision regarding citizenship than the ability to produce wealth tomorrow. While we should offer people an opportunity, we should demand that they learn English and American history, and that they understand and obey our laws. New immigrants should be drawn by the culture that has helped make this nation great: the strength of family, rule of law, a work ethic that includes aspiration and a sense of personal responsibility, the concept that there will be winners and losers in all endeavors, a moral certitude as to right and wrong, and tolerance for all but the intolerant. Diversity (an unseemly buzzword) is fine individually, but it should lead to unity within the community. All who seek citizenship should do so because they want to be Americans – not Hispanic-Americans or Chechen-Americans.

Immigration reform is something most Americans agree needs to be addressed. However, it is an issue that stirs tremendous emotion, because we have allowed extremists on both sides to control the debate. And too many of our political leaders are more concerned with appealing to their base and with their bids for re-election than with doing what is morally principled.

We should never forget that the United States was formed by immigrants. They came to escape tyranny and because of their belief in individual freedom and self-determination. In 1630, when John Winthrop immigrated to America, he saw New England as a ‘City on a Hill’ where people could create their own government and worship as they chose. Although the founders were preponderantly of British heritage, America was already a polyglot nation by the time of the American Revolution. At that time, Noah Webster noted that more than 50 languages were spoken in Pennsylvania alone. In that regard, we may not be unique, but we are far from the norm. If I were to emigrate to Spain and my neighbor to Japan, neither of us would ever be Spanish or Japanese. On the other hand, should a Spaniard or Japanese emigrate to the U.S., both, in a few years, would be considered Americans. And we should not forget that most of those who come to this country are driven by a desire to make a better life for themselves and their children. They are aspirational. That fact applies to those who cross the border from Mexico and to those who arrive here on a scholarship to one of our universities.

More recently, there has been a problem with the children of immigrants who seem to see the United States as a punch bowl from which to drink deeply. The cause is cultural and societal. They live in segregated neighborhoods, but so did the Irish, Italians and Norwegians. In large part, the fault lies with our politicians. They find it easier to appeal to constituents by compartmentalizing them, emphasizing the differences rather than the commonalities. Moral relativism means that new immigrants fail to learn and understand our own standards, which have developed over the centuries and continue to adapt. Our schools, thinking they are doing them a service, teach them in their native language – a path that inexorably leads to a subservient lifestyle. Regardless of the cause, this is a fact noted by sociologists Edward Telles and Vilma Ortiz in their book, Generations of Exclusion, a book highlighted by Ross Douthat in Sunday’s New York Times.

We all know that the history of the United States is not without its imperfections. There is no way to gloss over our forebears’ treatment of Native Americans. However, the government those pioneers created and the culture that ensued have never been equaled in the history of mankind, in terms of fairness and respect for the rights of others. It is a history of which to be proud, and it is one that should be taught to all who want to live here. A reading and understanding of American history is necessary to understanding the nature of freedom and liberty. That history also teaches that the individual rights enunciated in the first ten amendments to the Constitution are natural; they are not handed down imperiously by some magistrate or legislator.

In terms of immigration, we are faced with two major problems: what to do with the roughly 12 million illegal immigrants who are already here, and how to secure the border against illegal crossings, especially by criminals and terrorists. We must provide reform that includes a pathway to citizenship. Having almost 4% of the population of the United States living in the shadows is neither healthy nor economically wise. With no legal status they are harder to track, and thus could be more prone to criminal activity. They are consumers of government services without being contributors. Additionally, it is unfair to their children, who will grow up seeing themselves as permanent outsiders. And they should not receive priority over those who have chosen to enter the country legally. At the same time, border crossings must be made secure, without being turned into an “iron curtain.” Neither change will be easy; both will require settling for something less than perfect.

The consequence of a successful immigration policy is rapid and smooth assimilation. The immigration of Muslims into Western Europe is an example of what not to do. Instead of adapting to the customs of the country to which they have chosen to move, many insist on bringing with them the culture and moral teachings of the country from which they were trying to escape. This can be seen in the political influence of the imams, the push for Sharia law and the wearing of burkas by women. Citizenship, like green cards and temporary visas, is not a right; it is a privilege. When immigrants violate the laws of their adopted country, they must understand they are subject to deportation. We can be neither shy about using that term nor squeamish about tossing undesirables out.

The United States grants citizenship to about one million people every year. And estimates are that about 500,000 people enter our country illegally, putting a disproportionate burden on our border states. Our attitude should be one of welcome, but if you come you must learn our customs, which, above all, are about tolerance. You must learn our language, so that you can rise economically. You must read our Declaration of Independence and our Constitution, so as to understand the value we place on rights. You must read our history – how the nation was formed, the principles that guide it, and the sacrifices that have been made by so many so that future generations can live freely. You must aspire to be a contributor, not a taker. You must assimilate – you must become an American.

The terrorist attack in Boston demonstrated some of the weaknesses of the present system. It argues for the enforcement of existing laws and demands greater vigilance and cooperation among intelligence agencies. But it should not stop the debate over immigration. That a man who had been flagged at least twice by different intelligence agencies, and of whom we were warned by the Russians, was able to spend six months in one of the hotbeds of Islamic jihadist terrorism and return unquestioned signifies a lack of common sense. But it does not mean we should shut our borders. We do not want to become xenophobic, but neither should we be careless. The welfare of the many must take precedence over the right of someone like Tamerlan Tsarnaev to come and go as he pleases. His brother Dzhokhar gave up his rights as a legal immigrant the moment he and his brother planned the attack. As the survivor, he should be treated as the enemy combatant he is, rather than as the common criminal our Justice Department would have him be. We must be magnanimous, but we must also be ever-watchful – “trust, but verify,” as President Reagan might have put it.


Thursday, April 25, 2013

“Internet Sales Tax – A Trojan Horse for a VAT?”

Sydney M. Williams

Thought of the Day

Parkinson’s Law, in one of its corollaries, tells us that the stuff we accumulate expands to fill the space available. At home, with books piled on the library floor, my wife often tells me I am in violation. In the same way, the more money a politician has access to, the greater will be his desire to spend it. We can be assured that any revenues raised by an internet sales tax will not be used to reduce debt; they will be used to increase spending. Our economy is growing, albeit sluggishly, and we are badly in need of jobs. More government, whether federal, state or local, is not the answer. As Winston Churchill so graphically put it: “We contend that for a nation to try to tax itself into prosperity is like a man standing in a bucket and trying to lift himself by the handle.”

A sales tax is an aphrodisiac to a politician. It is stealthy and seemingly innocuous, in that it requires only small outlays per purchase, making it almost invisible and relatively painless. Were it to be aggregated annually, taxpayers would resent it far more. It is also regressive, and it raises prices. Don’t think for a moment that the retailer will swallow the tax and the cost of its implementation; the costs will be passed on to the consumer. The proposed Marketplace Fairness Act, for example, would require every on-line merchant with sales in excess of a million dollars to collect (or be able to collect) sales taxes for all of America’s estimated 9,600 state and local taxing authorities. Amazon is the giant of on-line sellers, and it endorses the new tax. Of course, the fact that Amazon sells its own tax-compliance service to other merchants helps explain its position. Amazon is also interested in building additional warehouses – distribution centers, or “fulfillment centers.” It currently has about 40 centers in about 15 states, giving it an expanding national presence. And with each warehouse planned, new tax breaks are negotiated.

The bill has received bipartisan support, including a hearty and unsurprising endorsement from the President. Remember when Mr. Obama suggested that taxes would only be raised on “millionaires and billionaires – they can afford to pay a little more”? Well, this tax is regressive: everyone pays the same rate, which takes a proportionately larger bite out of lower incomes. The Senate voted 74 to 20 on Monday to open debate on the measure, which could mean the bill will reach the Senate floor by the end of this week. It would then move to the House, where its future is less certain. Today’s debate stems from a 1992 Supreme Court ruling (Quill v. North Dakota), which determined that a state cannot force a retailer to collect sales tax unless the retailer has a physical presence in that state. To a large extent, with the notable exception of Amazon, the bill pits brick-and-mortar retailers against those that sell on-line. It also pits states with sales taxes against those without, which accounts for the bipartisan support.

Wall Street is concerned that the adoption of an internet tax will lead to a tax on financial transactions. That is probably true and a worthy concern, for every politician is always on the lookout for new sources of revenue. Remember President Reagan’s quip about government? “If it moves, tax it. If it keeps moving, regulate it. And if it stops moving, subsidize it.” Thus far, sales taxes have been the purview of the states, and the tax contemplated by the Marketplace Fairness Act would be collected for the states. But it requires the imprimatur of Congress, and once Congress gets its claws into a tax, it is like giving red meat to a hungry lion – it will be devoured, not saved, and the lion will want more. The question then becomes: will a VAT be far behind?

The biggest problem with a sales tax of this nature, besides being a harbinger, is that it is regressive, from the perspective of the retailer as well as that of the consumer. While the bill is called the Marketplace Fairness Act, it is anything but fair to small businesses and lower-income consumers. Large companies will find it far easier and, relatively, cheaper to set up the programs necessary to collect the tax. In a discussion of this topic in Tuesday’s New York Times, Andrew Ross Sorkin somewhat blithely retorts: “What Mr. Donahoe [John Donahoe, CEO of eBay] did not mention is that Amazon will already collect this tax for merchants if they ask, and eBay will provide them with third-party technology services that will help them do this, too.” I am sure they will – for a fee. Small businesses have no ability to negotiate the special tax treatments Amazon receives when it decides where to erect its next warehouse. And when someone on minimum wage buys a book from Amazon, he or she will pay the same tax as a “one-percenter.”

It is possible that a compromise will be reached and that companies with fewer than fifty employees and less than $10 million in annual revenues will be exempted. That would be an improvement, but, like the Affordable Care Act, it effectively puts a governor on rapidly growing small businesses – the engines of job growth.

There are some who suggest that Congress should not grant these new taxing powers unless it first requires the states to accept a unified, simplified reporting form. Keep in mind that, besides having to prepare tax filings for up to 9,600 different taxing authorities, on-line retailers would have to be prepared for dozens of audits from various states and municipalities. Having Congress mandate unified forms sounds like a good idea, except that it paves the way for a federal VAT.

Crocodile tears are being shed by many state and municipal officeholders. They complain that revenues do not match expenses. In states with sales taxes, some place the blame on the fact that internet sales are replacing sales from brick-and-mortar retailers, thereby depriving states of revenue they feel is rightfully theirs. There is some truth to that, but keep in mind that internet sales comprise only about 8% of all retail sales, including food; that loss does not account for the growing gap between intake and outgo. Everyone recognizes that internet sales are growing faster than sales in stores; Forrester Research projects that e-commerce will account for 10% of all U.S. retail sales in 2017. But their tears smack of the song “Don’t Cry for Me, Argentina.”

There are those who look at the deteriorating income statements and balance sheets of myriad government entities and determine that more revenue is needed. There are others who review the same statements and decide more fiscal discipline is in order. If you had a son who was addicted to slot machines and, in general, was a wastrel when it came to matters of money, would you refill his wallet, or would you send him to rehab?

The Marketplace Fairness Act is certainly not fair, as it pits start-ups and small businesses against the giants and will add costs for an already strapped consumer. It will damage job creation, but the biggest risk is that it serves as a Trojan Horse for a VAT, the aim of all progressives who believe in the benevolence of an ever-expanding government.


Tuesday, April 23, 2013

“The Art of Freedom – A Glimmer of Hope”

Sydney M. Williams
Thought of the Day

Despite stocks trading at close to all-time highs, it is hard to be an optimist. Federal debt is at record levels and still expanding. The unsustainable growth in entitlements threatens our future and is largely ignored by our political leaders. An ever-expanding federal government, under the twin umbrellas of providing protection and increasing dependency, is encroaching on our civil liberties. Our public high schools are failing our students as they enter an increasingly competitive global environment. Public policy – keeping interest rates exceptionally low – has served especially to help the wealthy, while poverty has increased. The numbers of people on disability and food stamps are at record levels. Jobs, the poor and the retired have been sacrificed on an altar of low interest rates and tougher regulation.

Overseas, the reset with Russia has not worked, nor have olive branches to the Muslim world. Europe is a mess, placing band-aids on wounds that may require amputation. China’s extraordinary economic growth has reached a level where the natural trend will be more moderate. The Arab Spring has devolved into an Islamic autumn, with the Muslim Brotherhood the only winner. The threat from Iran has intensified, with the country four years closer to nuclear weapons; and the North Koreans are now demanding preconditions before any bilateral talks with the U.S. – no discussion of nuclear weapons.

There are, of course, some positive developments. Our economy continues to grow, albeit slowly. Technology is automating the factory floor, making manufacturing in the U.S. more competitive, though at a cost to unskilled labor. There appears little question that the U.S. can become energy independent, but the timing depends upon freeing up exploration and development of natural gas and oil. Growth will be either hindered or helped, depending on the political environment.

However, it has always been individuals who make the big differences in America. There are rare souls who, often in unconventional ways, find answers to problems that appear insoluble. And their very small steps, because of their success, raise the possibility that a way forward will be found. Earl Shorris appears to have been such a man. His book, The Art of Freedom, was published posthumously earlier this year. It tells the remarkable story of one man’s attempt to address extreme poverty in a radically different manner. The book was reviewed by Naomi Schaefer Riley in last Wednesday’s Wall Street Journal.

Twenty years ago Earl Shorris, a novelist and journalist, visited the Bedford Hills Correctional Facility for women, a maximum-security prison. He was there to interview and talk with inmates about poverty and the role it played in their lives. He asked one woman, Viniece Walker: “Why do you think people are poor?” She responded: “Because they don’t have the moral life of downtown.” She was referring to plays, concerts, museums and lectures – the humanities. She said they should start with the children. It was this unusual prisoner and her point about the poor not being exposed to classical works of moral philosophy that caused Earl Shorris, in 1995, to begin a remarkable program: teaching the humanities at the college level to people living in economic distress. The classes were called the Clemente Courses, named for the Roberto Clemente Family Guidance Center in lower Manhattan, where they were first held.

The poor, Mr. Shorris believed, are enveloped in what he called a “surround of force” – hunger, isolation, illness, drugs, police, abuse, criminals and more. It is a prison from which they cannot escape and which prevents them from being political in the classical sense. Earl Shorris realized that “no one could step out of the panicking circumstances of poverty directly into the public world.” These courses would ease that transition. His concept was to recruit a well-qualified faculty that would provide courses in literature, moral philosophy, critical thinking, art, history and logic, with an exclusive emphasis on Western civilization. Students would learn politics in the sense Pericles used the word – meaning activity with other people at every level, from the family to the neighborhood to the broader community. More than anything, Mr. Shorris believed the inability to communicate readily and easily with others was forcing the poor to live lives of fearful desperation. He would respect them, calling them Ms. or Mr. He did not coddle them, but challenged them. He told those who chose to sign up for his courses that they would be tested and made to write papers – that the class would not bend to the lowest common denominator. They would have to keep up.

When Mr. Shorris was recruiting faculty from the University of Chicago, one candidate, Danielle Allen, listened intently and silently to the pitch. When Shorris finished, she sat quietly for a moment and then said, “Oh, I see: it’s about freedom.” She was right. One of the principal purposes of education is to free one’s mind from all the preconceptions that are innate to our beings – the “surround of force” that encompasses us all. Classes were taught using the Socratic method, involving inquiry and debate, with the professor leading the discussion. The success of each student depended upon being able to present an argument in a manner that was logical and did not violate one’s inherent beliefs. Rap talk and hip-hop would not serve.

Mr. Shorris promised nothing, only that one’s mind would be enriched. The courses offered philosophy, literature, art history, logic, rhetoric and American history. Students had to be poor, between the ages of 18 and 35, and able to read a tabloid newspaper. The first class, in 1995, comprised thirty students. According to the Clemente Courses website, more than 10,000 people have taken the classes, with half having completed the year-long program. Many of those have gone on to complete four-year colleges.

There are millions of people living below the poverty line in America, more than ever before. Politicians talk about the problem but do very little to address its causes. Mr. Obama favors redistribution as an immediate answer, but in four years jobs have lagged and poverty has increased. As he was preparing his first course, Viniece Walker warned Mr. Shorris not to forget Plato’s Allegory of the Cave. To Ms. Walker, her life was akin to that Allegory. She had been trapped in poverty and had contracted AIDS. As she saw it, the poor sit, chained to a wall as it were, convinced that the shadows on the wall are reality. The study of the humanities, she believed, would allow the desperately poor to cast off their chains and see who and what casts the shadows.

Reading the book and thinking about the issue, I was struck by how much more positive is this attempt to free people who have been enslaved by circumstances – inadequate education and a lack of hope – than simply making it easier to collect another week’s food stamps or sign up for disability. Not everyone will succeed in classes like the Clemente Courses. Some don’t have the ability, and others are missing the aspiration to succeed. But my guess is that the number of those who would succeed greatly exceeds our expectations. It is not so different from the old Chinese proverb: give a man a fish and he will eat for a day, but teach him to fish and he will eat for a lifetime.

The Western classics have been treated poorly in many of our schools and universities, dismissed as dated and irrelevant in our global, multicultural world. But what has made Socrates, Sophocles, Shakespeare and John Stuart Mill endure over the centuries is not that they were white men, but that they wrote about and discussed universal truths and ethics. They used politics to engage those with whom they disagreed. Contrast those studies with the teachings of Islamic extremists, who see the path to heaven paved with the death and destruction of infidels. Which of the two choices is the more relevant in any world, multicultural or otherwise? The Tsarnaev brothers, as residents of the U.S., were given the opportunity to pursue Western culture. They chose not to do so; rather, they became radicalized. Instead of engaging in debate, they chose to kill. The consequence was the death of four people and the destruction of several families.

By itself, the teaching of the Western classics will not lead people out of poverty, but it will open minds, freeing individuals to pursue ideas and dreams, and to do so in a logical manner. The knowledge they gain will allow them to engage fully with others, in a political sense. As a result, they are far more likely to become productive members of society than dependents. Mr. Shorris, in The Art of Freedom, shows what is possible for people whom society had largely written off. It is a message we should all take to heart, for there is not one of us who could not benefit from his wisdom.

It is enough to make one reconsider one’s negative concerns.


Monday, April 22, 2013

“The President’s Priorities”

Sydney M. Williams

Thought of the Day

Last week, two unrelated events served to reflect the apparent priorities of President Obama. In the first, when gun legislation failed in the Senate, the President said the vote cast shame on the nation. In the second, a Philadelphia abortion doctor, Dr. Kermit Gosnell, was in his fifth week of trial for murder, with scant reporting from the mainstream media. The details of the grand jury’s report and comments from eyewitnesses were finally beginning to be reported. The story is sickening – babies born alive who had scissors stuck into the backs of their necks to sever the spinal cord. Despite the horrifying nature of the crime, the President, when asked to comment, chose not to, hiding behind the ongoing trial.

The President’s plate was overflowing last week: the terrorist attack in Boston on Monday; the disastrous fire and subsequent explosion at the West Fertilizer plant in West, Texas, which killed fourteen and left at least 200 suffering burns, lacerations and broken bones; and the (temporarily forgotten) ongoing nuclear threats from North Korea and Iran. While it took him a day, Mr. Obama did refer to the Boston terrorists as terrorists – something he has not yet done regarding the Fort Hood shootings, in which Army Major Nidal Malik Hasan killed thirteen while shouting “Allahu Akbar” (God is great). That terrorist attack is still referred to as “workplace violence.”

Nevertheless, the gun and abortion issues manifest the President’s ideological biases. Mr. Obama could have forged a compromise in an attempt to contain gun violence. Instead, the President has vilified the NRA as an organization that promotes gun violence by denying governmental authorities the right to restrict magazine capacities and to ban certain types of automatic weapons. Keep in mind that NRA membership among gun owners is relatively small, at just over four percent, but the organization is a convenient and recognizable target. Not surprisingly, Mr. Obama’s Presidency has been good for its membership growth.

The Senate vote was a classic case of seizing defeat from the jaws of victory. An up-or-down vote on the deal struck by Senators Joe Manchin (D-WV) and Pat Toomey (R-PA) would have prevailed, with 54 Senators voting aye, according to the Wall Street Journal. But that would have opened the measure to up to 30 hours of debate. So the White House demanded, and Senator Harry Reid agreed, that the bill be passed without debate – a procedure requiring 60 “yes” votes. It failed, garnering only the 54. A measure restricting magazine capacity failed, 46 in favor to 54 against, with 10 Democrats voting against such restrictions. A vote to ban certain types of automatic rifles failed 60 to 40, with fifteen Democrats voting with the majority. If Mr. Obama had allowed the Manchin-Toomey vote to proceed, it would have represented a symbolic victory. If he had co-opted Republicans and officials from the NRA, Mr. Obama might well have had a bill that could have helped prevent the purchase of guns by known criminals, terrorists and the mentally unstable. The President put politics ahead of the American people and then described the defeat as a “shameful day for Washington.” It was a shame, but the shame is Mr. Obama’s.

In the case of Dr. Gosnell, politics again ruled the day. Dr. Gosnell’s activities were not unknown to the media or to the Left. He did what he did because of a conspiracy of silence – the title of Jonathan Capehart’s op-ed in the April 16th issue of the Washington Post. First, it should be understood that Dr. Gosnell’s clients were poor, mostly African-American women from Philadelphia’s inner city – in other words, women with very little political influence. Apart from a woman who died on his operating table, the victims were infants, often aborted in the third trimester and, at times, born alive. Criticism of abortion, no matter the circumstances, is taboo among the Left. Gosnell had been approved to perform abortions in 1979. Subsequent site reviews detected violations, but there was no follow-up. Complaints to the Pennsylvania Department of Health and the Philadelphia Department of Public Health went unheeded. The police eventually raided his practice, not for the murders, but for “illegal drug activity.” Toward the end of his otherwise enlightening article, Mr. Capehart does his profession a disservice when he writes: “Ultimately, the conspiracy of silence lies not with the press.” He places the blame on co-workers, members of the community and city and state officials. The fact is, all are guilty, including the Press. The Press exists, in part, to prevent such tragedies. It was a refusal by the media and others to confront a politically sensitive subject like abortion that allowed Dr. Gosnell to kill for so many years. It is outrageous and shameful that the President was not outraged.

In many respects, both incidents are examples of the danger of political correctness. Our fear of offending the sensibilities of others can damage society. Consider the reluctance of the mainstream media to use the terms Muslim or Islam when reporting on the brothers Tsarnaev. Most Americans, including 99.9% of gun owners, are normal, sane, law-abiding citizens. There are a few who are criminals, a few who become terrorists and a few who suffer from some form of mental illness. Mass murders are rarely committed by common criminals. Apart from acts of terrorism, mass murders, such as school and college shootings, are almost always the work of a deranged person with a weapon. Chicago is proof that strict gun laws don’t keep guns out of criminals’ hands. Connecticut’s relatively strict gun laws did not prevent Adam Lanza from having guns at home and practicing at licensed ranges. While Adam’s classmates described him as crazy, school officials used politically correct euphemisms like “troubled,” “reclusive” and “painfully shy.” If school officials had described him as “crazed” or “deranged,” he might never have been allowed access to guns, either by his mother or by gun sellers, and twenty-six children and adults might still be alive. We certainly do not want to return to the era of fortress-like lunatic asylums that dotted our landscape from the mid-19th century to the 1950s, but neither should we be releasing into society those seriously in need of mental help.

Criminals will always find weapons, and crimes of passion will always result in someone getting hurt. The nation has done a generally good job in thwarting terrorism at home, but we persist in letting the crazies roam the streets. The focus of gun control should be to get guns out of the hands of the mentally deranged, and my guess is that the NRA would support such measures. If it means tagging people with politically incorrect terms, we should do so. Do their rights to privacy have precedence over our rights to safety?

President Clinton once said that “abortion should not only be safe and legal; it should be rare.” That was a sensible statement, as it suggests that “choice” should not be entered into lightly. There are justifiable reasons for abortion – rape, incest or a child whose defects may put an undue emotional or financial strain on the mother. It should not be seen as a form of birth control. As a State Senator in 2003, Mr. Obama voted against a bill that would ensure medical care to babies born alive after botched abortions. Perhaps it was that thinking that motivated Dr. Gosnell, who referred euphemistically to the severing of spinal cords of babies born alive as “ensuring fetal demise.”

It is proper for a President to avoid commenting on a trial in progress, when an accused stands charged, or when an incident has occurred with no clear cause. But that did not stop the President from commenting on the George Zimmerman-Trayvon Martin case a year ago last March – “If I had a son, he’d look like Trayvon.” When the Cambridge police arrested Harvard Professor Henry Louis Gates for “breaking into” his own house, the President’s first reaction was to assume the police were racially profiling. As we all know, Mr. Obama’s immediate reaction to the murders in Benghazi was to blame a video, and then he continued to have his people tout the same story after it was abundantly clear that the attack was the work of Muslim extremists. It was not obeisance to legal etiquette that held back the President in the case of Dr. Gosnell. It was politics.

Abortion is a complex issue and many people, including me, are conflicted. I have enormous respect for a woman’s rights and determination regarding her body. There are times when abortion is necessary and appropriate. But conditions vary, and what is right for one person may not be so for another. Abortion, as President Clinton said many years ago, should be safe, legal and rare. Unfortunately, Mr. Clinton’s words were dropped from the Democratic Party’s 2012 platform. Professor Leon Kass, in the weekend edition of the Wall Street Journal, admitted to being agnostic as to whether an embryo is “a human being equal to your grandchild,” a sentiment with which I agree, but there can be no doubt in anyone’s mind that life begins at conception. A problem with Roe v. Wade is that it never addressed the question of when life begins, but to argue that life begins at some point other than conception is to deny science. Questions of health and mental well-being must be addressed when abortion is being considered, but so must questions of morality and ethics. If an embryo is to be destroyed, we cannot hide from the truth of what will be done.

America is a unique place. Our priorities help define us. Words and actions have meanings. There was nothing shameful in what Congress did in turning down the President’s gun bill. His words afterward were simply petulant. The mistake was the President’s, in that he apparently had more interest in handing the NRA a defeat than in trying to prevent a repeat of the terrible incident in Newtown. And his failure to show disgust at Dr. Gosnell’s crimes indicates that, for him, politics prevails over an obvious moral wrong.


Friday, April 19, 2013

“Bitcoins – What Are They?”

Sydney M. Williams

Thought of the Day
“Bitcoins – What Are They?”
April 19, 2013

Bitcoins gained wide notoriety last week when their value declined by two thirds. They were created four years ago by an anonymous individual (or group) operating under the pseudonym Satoshi Nakamoto. Bitcoins are not controlled by any country or central bank. They have no physical presence, so cannot be held. They can be described as an experimental, decentralized virtual currency that enables instant payments to anyone anywhere in the world who accepts Bitcoins – a small but allegedly growing number of businesses. Wednesday’s Wall Street Journal suggested “…everything from maple syrup to pornography.” Transactions are anonymous, irreversible and untraceable, and therefore ideal for those who would like to hide what they do.

The idea for Bitcoins was apparently a consequence of the near-fatal financial crash in the fall of 2008. They first appeared in February 2009. They are created through a process called “mining.” In theory, anyone with a computer could create Bitcoins through an automated mathematical process. By rapidly checking various combinations of numbers and letters, it is possible to stumble on unclaimed Bitcoins. But they were designed to become increasingly difficult to “mine” as time goes on, so that the number outstanding will expand at a decreasing rate. Thus the number outstanding is expected to grow from slightly over 11 million today to 21 million in 2140 – an implied compounded annual growth rate of about 0.5%. They are traded on exchanges like Japan’s Mt. Gox, which allegedly controls 80% of the volume, and San Francisco’s TradeHill. They are stored in Bitcoin “wallets.” It should be noted that on Wednesday one Bitcoin exchange, Bitfloor, which handled an estimated 4% of daily volume, announced it was closing and said it would return all customer funds.
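The implied growth rate quoted above is easy to verify with a few lines of arithmetic; here is a rough sketch, using only the article's round numbers (about 11 million coins outstanding in 2013, reaching the 21 million cap in 2140):

```python
# Back-of-the-envelope check of the implied supply growth rate.
start_supply = 11_000_000   # approximate coins outstanding in 2013 (per the article)
cap = 21_000_000            # the protocol's hard cap
years = 2140 - 2013         # 127 years

# Implied compound annual growth rate: (cap / start) ** (1 / years) - 1
cagr = (cap / start_supply) ** (1 / years) - 1
print(f"{cagr:.2%}")  # roughly 0.5% per year, matching the figure in the text
```

The exact answer depends on the rounded starting figure, but any plausible input lands near half a percent per year.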

According to a report last February by Loz Blain, Bitcoins traded for $0.10 in September 2010. They broke the dollar level in February 2011. This past February, when Mr. Blain wrote his article, Bitcoins were selling for $30. Last week, they traded as high as $266. Late Thursday, they were trading around $95. That type of price move is catnip to a speculator. Last Friday, the New York Times ran a front-page article on the Winklevoss twins (of Facebook and Olympian fame), describing them as Bitcoin “moguls,” with a position valued at $11 million when Bitcoins traded for $120. Today’s price implies the twins are out about $2 million.
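The twins' paper loss can be checked from the figures in the paragraph above; a quick sketch, using only the article's round numbers (which are themselves estimates):

```python
# Rough check of the Winklevoss figures cited in the article.
position_value_at_120 = 11_000_000     # $11 million when Bitcoin traded at $120
coins = position_value_at_120 / 120    # implies roughly 91,700 coins
value_at_95 = coins * 95               # at Thursday's ~$95 price
loss = position_value_at_120 - value_at_95
print(round(loss / 1e6, 1))  # about 2.3, i.e. "out about $2 million"
```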

Whether Bitcoins make it into the big time as an acceptable alternative to nation-backed global currencies depends on a few issues. While the U.S. Treasury’s Financial Crimes Enforcement Network has indicated that individual Bitcoin users and miners would not be regulated, exchanges and “wallet” holders may be. According to some reports, about 10% of Bitcoins have disappeared. Despite the fact that Bitcoin’s protocol has not, thus far, been compromised, security is a constant threat. And, obviously, Bitcoins must win broad acceptance by retailers and service providers. Ultimately, the value of any currency is the ease with which it can be exchanged for goods and services – its acceptability. A paper dollar has no intrinsic value; it is only its acceptance by others that gives it value. Its issuance is controlled by the Federal Reserve, which may choose to expand or shrink the supply outstanding – generally expand, not shrink. The word “currency” comes from the Middle English word “curraunt,” which meant “in circulation.” To work, Bitcoins must not be hoarded; they must circulate. Kim-Mai Cutler of TechCrunch recently estimated that the daily trading volume of Bitcoins is $31.1 million. In a market that trades $5 trillion in currencies daily, Bitcoins would barely create a ripple.

Speaking of ripples, Chris Larsen, a venture capitalist with half a dozen successful start-ups under his belt, founded OpenCoin and recently formed an alternative math-based currency called the Ripple. It is virtual, like the Bitcoin, but it doesn’t require mining and apparently its operations will be more open. Whether the Bitcoin and/or the Ripple succeed or fail, what seems indisputable is that my grandchildren’s payment transactions will not be their grandfather’s, just as mine are very different from my grandfather’s.

Regardless of the obvious potential pitfalls and the patronizing, snide comments of detractors, the possible disruptions to traditional transactions could be significant. Average daily electronic payment transfers in the U.S. come to about $5 trillion. Since the U.S. represents about one quarter of global GDP, the global number must be between $10 and $20 trillion. Thus companies like American Express, Visa, MasterCard and Western Union have to take the threat seriously.

Certainly, the venture capitalists that Ms. Cutler interviewed are doing so. They seem to endorse the concept, as today’s international payment transactions are more complicated than, for example, sending a music track to Ecuador. There are also transaction fees. Personal information and credit card numbers are frequently hacked. Is my credit card information safer at Amazon than Bitcoins would be in a “wallet?” Is the fee charged retailers by credit card companies uncompetitive in a Bitcoin world? The VCs are betting that Bitcoin will need a more reliable “ecosystem of payment processors, exchanges, wallets and financial instruments.” But it seems they are taking the concept seriously.

It is the mystery surrounding the creators of Bitcoins, and the fact that buyers and sellers seemingly have no legal obligations or recourse, that may ultimately condemn Bitcoins as nothing more than the 21st Century’s version of Amsterdam’s 17th Century tulip mania. Paul Krugman has criticized their design as encouraging hoarding and thereby risking wild price swings. The anonymity of Bitcoin owners and the fact that transactions are difficult to track have supposedly made them attractive to pornographers, drug dealers and money launderers. However, when one first goes to buy Bitcoins, an account must be opened and either cash or a check must be deposited. Once Bitcoins have been purchased, all future transactions are untraceable. Because no one knows who is transacting, allegations that they are being used for nefarious purposes are pure speculation. Nevertheless, they have been described as being akin to passing over a paper bag filled with hundred-dollar bills.

If Professor Krugman is right, and Bitcoins are hoarded, not spent, then they will never serve as a currency. If they are not readily circulated, their value will diminish. If it is true that the Winklevoss twins bought their Bitcoins last year, they may not be the “Greatest Fool,” but they are playing the game. The jury is out, but it seems obvious that payment transfers in the future will be different from what they’ve been in the past. Bitcoins may serve that function, but at this point caveat emptor appears appropriate.


Wednesday, April 17, 2013

“Boston –A Cradle of Civilization”

Sydney M. Williams

Thought of the Day
“Boston –A Cradle of Civilization”
April 17, 2013

As all schoolchildren know, the “Cradle of Civilization” refers to Mesopotamia, a region centered around the city of Harput in eastern Turkey, in that Fertile Crescent between the Tigris and Euphrates Rivers. It extends into the Kurdish region of Iraq and toward Baghdad. While it was home to the first written records, it is an area that has seen tremendous violence over the years. Five thousand years later, Bostonians, in the years following the American Revolution, looked upon themselves as the new world’s “Cradle of Civilization.”

The designation smacked of being supercilious, but was not surprising. The greater Boston area is home to the nation’s oldest university, Harvard, and to more than 100 other universities. The Boston Symphony and Boston Pops are world renowned. There are more than 40 museums, perhaps the most famous being the former home of Isabella Stewart Gardner. The belief in a “Cradle of Civilization” lay behind the Transcendentalists, a group that was formed in the 1830s. It included notables like Henry David Thoreau, Ralph Waldo Emerson, Walt Whitman, John Muir, William Ellery Channing, Amos Bronson Alcott and his more famous daughter Louisa May. They were idealists who believed in the inherent goodness of people and nature, but felt that society and its institutions (especially political parties) ultimately corrupted the purity of the individual. They considered themselves unquestionably civilized. Self-reliance and independence were characteristics they highly valued. They would have been shocked, but not surprised, by the horrific events in Boston on Monday.

That attack occurred on Patriots’ Day, a day that commemorates the first battles of the American Revolution, at Concord and Lexington on April 19th, 1775. Faneuil Hall came to symbolize Boston’s “Cradle of Civilization.” In the years just prior to the Revolution, it had become a gathering place for those known as patriots. They protested the growing presence of the British army and the raising of taxes by a Parliament in which they were not represented. The day is a state holiday and includes a home game for the Boston Red Sox and the running of the Boston Marathon. The Boston Marathon has its own history. It is the oldest marathon in the Country and among the most difficult. It was first run in 1897. It was run during both world wars. In a less politically sensitive time, Korean Americans were precluded from participating during the Korean War. Women were only formally permitted to run in 1972, though the first ones ran in 1966. The last American man to win was Greg Meyer in 1983 – the last American woman, Joan Benoit in 1979.

In retrospect, it was not surprising that terrorists would choose this day to practice their heinous craft. They knew that the day, and especially the finish line, attracted thousands of people, maximizing their ability to kill and maim as many innocent men, women and children as possible. It was pure luck that more people were not killed. Their very act makes one reconsider the concept of civilization and a civilized society. However, the term “civilized” has long been associated with violence. Indigenous groups, whether they were Visigoths, Celts, Aztecs, African natives or American Indians, were often killed by “civilizing” conquerors.

What is civilization? When asked about Western Civilization, Mahatma Gandhi responded, “I think it would be a very good idea.” In The Philosophy of Civilization, published in 1923, Albert Schweitzer defined civilization as being “the sum total of all progress made by ‘mankind’ in every sphere of action and from every point of view, in so far as this progress is serviceable for the spiritual perfecting of the individual.” Webster’s first dictionary (1828) defines civilization: “The state of being refined in manners from the grossness of savage life, and improved in arts and sciences.” Having worked on trading floors for most of the past five decades, I suspect Noah Webster would have noted a reversion to savagery in much of the “civilized” world.

A lack of civility became commonplace in the past Century, as wars extended from the battlefield to the home front. Gas emitted from trenches on both sides in World War I followed the wind, which sometimes headed toward the enemy, but might drift over a field of cattle, or into a nearby village. When Germany bombed London during the early months of World War II, its purpose was to kill civilians, so as to dishearten the population. More than 28,000 Londoners died over an eight-month period. During three days in February 1945, Allied bombers deliberately killed about 25,000 citizens of Dresden for the same reason. In August 1945, American bombers dropped atomic bombs on Hiroshima and Nagasaki, killing more than 200,000. The purpose was to shock the nation into unconditional surrender. It worked. A few years earlier in Nanking, the Japanese killed an estimated 250,000 to 300,000 Chinese, mostly civilians. The total dead in World War II are estimated at 60 million, two-thirds of them civilians. Almost half a million civilians died in Vietnam during that country’s two-decade civil war. It is estimated that 77,000 Syrians have died over the past two and a half years. So much for the civilizing influence of societies over the past hundred years. In the same book, Albert Schweitzer wrote presciently, “…the ethical ideas on which civilization rests have been wandering about the world, poverty stricken and homeless.”

Recently an old Greenwich friend asked if we might move back to “civilization.” My wife and I moved to Old Lyme from Greenwich twenty years ago. The question got me thinking about civilization. My first reaction was to explain that I enjoy the bucolic sense Old Lyme offers, and especially our place on the marsh rivers near the mouth of the Connecticut. The contrast between standing on the catwalk watching the season’s first Ospreys circling lazily in the sky and combating thirty-somethings, driving SUVs almost the size of my first house, jostling for a parking spot on Greenwich Avenue, made my response easy.

But civilization is not about the rural beauty of Southeastern Connecticut, any more than it is about playing bumper cars in Greenwich. It is not limited to the city of Boston, nor is civility denied Boston because of Monday’s horrific events. The word civil derives from the Latin civilis, which relates to a citizen – a townsman, as opposed to a soldier. Citizens were considered more courteous than soldiers. Civilization is about the arts, but it is more a state of mind; it is an ideal toward which we strive. It represents the freedom each of us has to pursue our dreams, to speak and assemble unafraid. It is living in a society based on the rule of law, one that supports the concept of the greatest good for the greatest number. It is true, as Henry David Thoreau once wrote: “While civilization has been improving our homes, it has not equally improved the men who are to inhabit them.” Nevertheless, it is a worthy goal. Not unlike Stuart Little’s quest, or the search for the Holy Grail, it is ultimately an elusive and unreachable goal. There is no Nirvana. It is, however, an ideal of which a civilized people should never lose sight and for which they should never stop searching.

The bombs that went off near the finish line of the Boston Marathon in no way imply that Boston is uncivilized, but they do say that there are uncivilized people among us, and we must acknowledge their presence. There are those who now say that we will never be the same, that we will never feel safe. Time is a great healer, but we are all products of our experiences, and what happened on Monday will live on not only in the memories of those who were there, but in the minds of all Americans. Time will serve to lessen the pain, both physical and emotional. But we cannot and should not forget that there are those who wish to do us harm. At the same time, we cannot let a vindictive spirit thwart our desire to live peaceful and productive lives.

Terrorism is not limited to al Qaeda. It functions under many names. It can be domestic, as well as foreign. We have been involved in a war against terror for many years – long before 9/11, in fact. Thousands of Americans, including children, have been killed. The concept of such a war is difficult to grasp and even harder to explain and defend. In a speech to the nation shortly after 9/11, President George Bush noted that it would be a war unlike any other we had waged, and that it would last for decades. Euphemisms, such as “workplace violence” or “man-caused disasters,” serve to trivialize what are, in fact, acts of terrorism. They are perhaps meant to console, allowing us to more quickly get on with our lives. But they lack truth, and so should have no place in our lexicon. The screams, the blood and the severed limbs on Monday were reminders that terrorists remain among us. We must remain vigilant, even as we cannot allow such actions to disrupt our freedoms. In the face of such horror, we must catch and punish the offenders – we must not be afraid to refer to them as the terrorists they are – while never forgetting that our liberty depends upon keeping that elusive sense of civility forever in our sights.


Monday, April 15, 2013

“The Message in the President’s Budget”

Sydney M. Williams

Thought of the Day
“The Message in the President’s Budget”
April 15, 2013

Despite its flaws, this year’s budget proposal by the President should fare better than last year’s. That one went down to defeat in the Democratic-led Senate 99-0 and 414-0 in the House.

The President’s unveiling of his budget was not without humor. With a straight face, President Obama stood in the Rose Garden and said: “Our economy is poised for progress as long as Washington doesn’t get in the way.” He then proposed a budget of $3.8 trillion, 3.2% bigger than last year’s. It would hike spending by $247 billion over the next two years above the projected “baseline.” The increases for 2014 would include $76 billion for “preschool for all” and other initiatives, like $40 billion for “fix it first” infrastructure projects. It would create a National Infrastructure Bank (more bureaucracy), spend $1 billion to set up 15 “manufacturing innovation institutes” (even more bureaucracy) and offer a new version of “Build America Bonds.” It would increase tax revenues by $1.14 trillion over ten years. If government is getting out of the way, one wouldn’t know it based on his proposals. Deficit reduction would amount to 1.3% over the next ten years. The national debt would continue to rise. Despite words to the contrary, Mr. Obama insists that government is the way.

It was amusing that papers like the Financial Times, the New York Times and the Wall Street Journal emphasized the risk the President took with his proposed budget. The risk to which they referred was the nominal one of using “chained CPI,” rather than the traditional Consumer Price Index, for determining annual adjustments for programs like Social Security. Chained CPI was created by the Bureau of Labor Statistics (BLS). It uses a time series of price levels of consumer goods and services. The real effects are tiny. A Social Security recipient of a $1,000 check could expect $997.50 instead. Real reform would have meant increasing the age of eligibility and implementing some form of means tests. The President’s budget calls for more spending than that proposed by the Democratic-led Senate. Alan Blinder referred to this budget in Friday’s Wall Street Journal as “…a reasonable model of what might pass for a compromise in a less partisan Congress.” If this is reasonable, I shudder to think what an unreasonable budget proposal would entail.

Mr. Obama makes no pretense at balancing the budget. Deficits come down under his proposal, but they persist as far out as a reasonable person is willing to look – more than ten years. It ignores the least painful and fastest way to reduce deficits, which is to accelerate economic growth. When economies recover smartly, Washington bureaucrats inevitably underestimate growth in tax receipts. The message from the President is that government remains the engine of economic growth. The $1.14 trillion in tax increases, coupled with the $600 billion increase passed to avoid the fiscal cliff in December, will retard, not boost, growth. Additionally, economic growth will struggle against the headwinds caused by the adoption of the Affordable Care Act and all the taxes associated with that bill. People may go hungry, but government will not starve.

On Friday, in his New York Times op-ed, David Brooks noted that under the President’s proposal domestic discretionary spending would be the lowest since the Eisenhower years – a problem because “America faces two giant problems: social unraveling today and cataclysmic debt tomorrow.” What Mr. Brooks ignored is the reason why discretionary spending has fallen as a percent of the overall budget. The answer is that mandatory spending has risen from 25% of the Federal budget in 1962 to 56% in 2011. Mr. Brooks is willing to accept more debt today to help solve issues like unwed mothers, men dropping out of the workforce and students failing to complete college. There is no question as to the magnitude of the two problems Mr. Brooks cited. They represent clear and present dangers. But social unraveling has many fathers, including a morally defunct entertainment industry, which is lauded in a politically correct media. It is also a consequence of government’s fostering dependency – think Julia’s world. An emphasis on personal responsibility would help resolve the social unraveling more quickly. Government intervention is not the answer to today’s ethical lapses. The fifty percent increase in disability rolls has little to do with more people being disabled; rather, it reflects an increase in dependency. Debt may be “tomorrow’s” problem. But, if the Fed stops printing and if it lets interest rates reach a natural equilibrium, it will quickly become today’s problem.

In former times, the Country faced severe financial disruptions that did not create the problems we have today. In 1987, the Dow Jones Average lost 22.5% in one day. That would be equivalent to the Averages losing over 3300 points today. People were shaken; it was a frightening time, but the economy did not collapse. Then, between March 2000 and March 2003, the S&P 500 declined more than 50%, and the NASDAQ fell by 85%, causing $6 trillion in equity losses, yet the economy only suffered a ripple. Things are never the same, and I understand the risk of such comparisons, but the big difference is that Washington did not view those crises as opportunities to exploit.

While Democrats and their buddies in the media like to portray Republicans’ objections to tax increases as favoring the rich and as mean-spirited aversions to social reform, that simply is not true. The two principal arguments in favor of lower tax rates are that such policies help keep government less intrusive and that they promote job growth. Warren Buffett cries crocodile tears, as he admonishes Republicans for not paying more in taxes; yet instead of writing a bigger check to Uncle Sam, he puts his money into charitable trusts. Why? Is it not possible that he believes that private charities are both better fiduciaries and better dispensers of his wealth? Government has been fiscally reckless for many years, across administrations of both Parties. A byproduct has been a rise in crony capitalism, which threatens our economy and moral standards. Reform is needed, and the only reform that makes sense, in this regard, is term limits. Any loss of efficiency would be more than offset by more open and honest governance.

The message in the President’s budget is a call for bigger government and greater redistribution. Increasing dependency may score more votes, but it does little for the young and the poor who are aspirant. It is those who should be encouraged. Implying that $3 million is all one needs to retire simply reflects an innumerate mind. Three million dollars is, as Friday’s editorial in the Wall Street Journal noted, “…roughly the value of a California police sergeant’s pension if she works for thirty years, retires at age fifty and lives to normal expectancy.” Two and a half percent annual inflation erodes the dollar by fifty percent in a generation. Defined pension plans for public sector workers are no longer affordable. The end may come gradually, or it may come after – to use David Brooks’ word – a cataclysmic event. Given the reaction of several House and Senate Democrats to President Obama’s token reform measure of using chained CPI for Social Security payments, it appears that it will take bankruptcy on a major scale for politicians to realize the true financial state of our entitlement programs. What is true for Democrats is also true for most Republicans. In 2005, when George Bush attempted Social Security reform, his efforts were similarly sabotaged by his own Party. Personal savings is what is needed, and the President’s budget discourages savings exactly at the moment when the people need it most.
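The inflation claim above is simple compounding and easy to verify; a quick sketch, assuming a constant 2.5% annual rate:

```python
# How many years until 2.5% annual inflation cuts a dollar's
# purchasing power in half?
rate = 0.025
power = 1.0  # purchasing power of one dollar today
years = 0
while power > 0.5:
    power /= 1 + rate  # each year's inflation erodes purchasing power
    years += 1
print(years)  # 29, i.e. roughly one generation
```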

On March 11th, I wrote a piece, “Coolidge a Primer for Obama.” It was largely drawn from the informative book, Coolidge, by Amity Shlaes. In stark contrast to President Obama’s budgets, Mr. Coolidge’s fiscal 1929 budget was lower in absolute dollar terms than his 1923 budget. I will grant that the problems Mr. Coolidge faced on gaining the Presidency in 1923 were not as formidable as those that faced Mr. Obama in 2009. Nevertheless, Coolidge had his challenges – corruption within the Harding administration, legacy debt incurred during World War I, an over-indebted Europe and sluggish economic growth at home. He rid the country of corruption, erased our debt and presided over one of the greatest growth spurts in America’s economic history. He did so by limiting government and lowering tax rates. Coolidge’s years as President proved to be among the Country’s most productive, especially in terms of innovation in consumer products. When he became President, only 30% of U.S. homes had inside plumbing. When he left, 70% did. Economic growth, during his six years, averaged better than 5% per year.

The message in the President’s budget was one of continued spending and further tax increases. Federal spending is projected to grow fifty percent faster than GDP and inflation. More than anything, it is a document that indicates how far to the left we have drifted. And with that drift has come pessimism about the future of the country, especially about the ability of the young to do better than their parents. The contrast with the optimism of the 1980s is striking. Ronald Reagan signaled a sharp departure from the gloom and cynicism of the 1970s. Since Mr. Obama assumed office four years ago, the country has grown more despondent, marked by a declining workforce and a growing number of people on food stamps and disability.

The President has advocated a “Julia’s world,” in which the government will look after its citizens from cradle to grave. There are many who find such promises comforting, but more who see them as patronizing and stifling. Keep in mind, those are the same promises made by both Communists and National Socialists in the years between the Wars. Mr. Obama is a great speaker, and sells his message well. But the benefits he offers come, definitionally, with a loss of personal freedom – the value of which can only be realized when it is lost. The history of America is the story of a successful struggle for independence and individual freedom. The President’s message leads in the opposite direction.


Thursday, April 11, 2013

“Margaret Thatcher – R.I.P.”

Sydney M. Williams

Thought of the Day
“Margaret Thatcher – R.I.P.”
April 11, 2013

One great irony of Margaret Thatcher’s death is that it comes at a time when the Western world is once again in need of someone of her character, her economic and political common sense and the moral courage she had to make unpopular decisions. “The principles she established,” wrote Andrew Roberts in Tuesday’s Wall Street Journal, “…have perhaps more relevance now than at any time since the 1980s.” Europe is a mess. Britain is in turmoil. The U.S. is like a ship that has lost its anchor. We live in a sea of moral turpitude, breeding dependency while starving personal responsibility. Our government has assumed debt at alarming rates and made promises that will be impossible to keep. At the same time, we seem unable to temper terrorist regimes, be they in the Middle East or East Asia.

Mrs. Thatcher was one of the last Century’s great political leaders of the Western world. She joins a pantheon that includes Churchill, Roosevelt, Reagan, de Gaulle and Adenauer. All were leaders who had a significant impact on their countries and the world. Three of them came to prominence during World War II. Do men cause events, or do events make men? It is an age-old question, without any real answer. History always remembers those who led during times of war. But Ronald Reagan and Margaret Thatcher did something different. They were elected when neither country was at war, and they each saw their respective countries in economic and cultural decline. They both altered that path. The Western world was fortunate that their years in office overlapped so precisely. Their most vociferous critics have been beneficiaries.

Baroness Thatcher of Kesteven, as she became following her resignation as Prime Minister, has certainly been controversial. She made enemies. There were the coal miners, who endured a strike lasting 362 days without a settlement and who, at its end, saw the elimination of at least 20,000 jobs and the collapse of their union. There were the pro-Europeans who complained that her policies appeared contradictory and who disapproved of her reversing Edward Heath’s moves toward greater dependency on Brussels and the European Community. There were feminists who claimed she did not respect their cause. “The battle for women’s rights has largely been won,” she told them. She once referred to the movement as “fashionable rot.” In 1983, when she ran for re-election, many feminists taunted her. “Ditch the Bitch” was their rallying cry. Even in death, there are those who still see her that way.

But a lot of the criticism was patronizing. Most British politicians were men, and most were products of public schools (Britain’s private boarding schools) and Oxford or Cambridge. While she had two Oxford degrees, she attended on scholarship. She was born in a cold-water flat in Grantham, Lincolnshire, over a grocery store owned by her father – a flat with outside plumbing. Her upbringing had more in common with early 20th Century American political success stories than late 20th Century British ones. In 1970, in Edward Heath’s government, she was appointed Secretary of Education, the only cabinet position then open to women. Her next job would be leader of her party in 1975 and then, four years later, Prime Minister. Personal responsibility, ambition, honesty and hard work were part of her persona.

The arts and academic establishment “loathed” her, according to Joseph Gregory in Tuesday’s New York Times. She was not ‘politically correct.’ A proposed honorary doctorate at Oxford was denied after a faculty debate, despite it being a tradition to offer such degrees to Prime Ministers who had attended the University. Even Paul Johnson, the conservative historian who wrote a wonderful column on Mrs. Thatcher in Tuesday’s Wall Street Journal, admitted that “she had become more imperious during her years of triumph and that power had corrupted her judgment.” Some of the criticism was mean-spirited. Youths held death parties on hearing of her passing, which says more about their ignorance and the moral relativism of our day than it does about her policies. Without her revitalizing the British economy thirty years ago, the demonstrators would not have been wearing their designer jeans and texting on their iPhones.

In an interesting aside, on Monday night in New York, the audience watching “Billy Elliot,” the Broadway musical set during the coal strike of 1984-85, was asked whether the song “Merry Christmas Maggie Thatcher,” which anticipates her death, should be deleted. Properly, in my opinion, they elected to include it. Whether one agrees with its sentiment or not, the song is part of the show. On the other hand, reflective of the antipathy that Mrs. Thatcher still generates, Judy Garland’s “Ding Dong, the Witch is Dead” shot to number 27 on iTunes’ UK list and Elvis Costello’s “Tramp the Dirt Down” appeared at 92.

Margaret Thatcher was outspoken. Taking on British labor unions in the late 1970s, she said that those who compromised, who strode the middle of the road, risked getting hit by traffic coming both ways. “I’m not here to be liked,” she stated. In 1975, while the Conservatives were in opposition, she told the Tory leader Edward Heath that she was putting herself forward for his job. Heath responded rudely, without standing: “You’ll lose, you know.” She won handily on the second ballot. Four years later, she was Prime Minister and Heath was history. At her first meeting with Mikhail Gorbachev, she said: “I want our relationship to get off to a good start, and to make sure there is no misunderstanding between us – I hate Communism.” In 1984, following an IRA bombing that killed five people, including a member of Parliament, she declared: “The fact that we are gathered here, now – shocked, but composed and determined – is a sign not only that this attack failed but that all attempts to destroy democracy will fail.” Years later, as President George H.W. Bush was putting together a coalition to chase Iraq out of Kuwait, she famously told him, “Remember, George, this is no time to go wobbly.”

It will be for her successes, in dealing with an ailing economy and a still-aggressive Soviet Union, that she will be remembered. She served as Prime Minister for just under twelve years – the longest-serving British Prime Minister of the Twentieth Century, and the longest continuous tenure since Lord Liverpool. From her earliest years, she was grounded in the fundamentals of individualism and fiscal prudence. By the time she left office, the principles of Thatcherism were well known and had been copied, according to Paul Johnson, in fifty countries. Those principles included a belief that economic freedom and individual liberty are interdependent; that personal responsibility and hard work are the only means to national prosperity; and that free-market democracies must stand firm against aggression. As is obvious, she was a disciple of Friedrich Hayek and Milton Friedman.

It is hard today to remember the desperate straits in which England found itself during what came to be known as the “Winter of Discontent” – the winter of 1978-79. Like Turkey at the start of the 20th Century, Britain was being described as the “sick man of Europe.” Inflation was 13.4% in 1979. It would peak the next year at 18% before declining. By 1983, it was down to 4.6%. The top income tax rate was 83%. She would cut it to 40%. Electricity shortages had reduced the work week to three days in many industries. A third of the economy was linked to loss-making state-owned industries. Trade-union power was undermining the economy with persistent strikes and inflationary wage settlements. The historian Paul Johnson described the scene: “The Boilermakers Union had already smashed the shipbuilding industry. The Amalgamated Engineers Union was crushing what was left of the car industry. The print unions were imposing growing censorship on the press. Not least, the miners union, under Stalinist Arthur Scargill, had invented new picketing strategies that enabled them to paralyze the country whenever they chose.”

Before she died, Mrs. Thatcher left instructions that she not be given a state funeral. As Parliament would have to approve the funds, she felt she was too divisive a figure. (Interestingly, another British Prime Minister who asked not to have a state funeral was Benjamin Disraeli, the first – and only – Prime Minister of Jewish birth.) Her wishes will be complied with, as they should be. Yet I am bothered that a nation that so benefitted from her years in office will not accord her the honor it provided Field Marshal Douglas Haig in 1928. As commander of the British Expeditionary Force, he was responsible for the losses at the Somme in July 1916. On the first day alone of that infamous battle, the British suffered 57,470 casualties, of whom 19,240 died. Before the battle ended, five months later, Allied casualties totaled 623,907, of whom 146,431 were dead. Yet the trenches were only a few hundred yards from where they had been. And Haig was accorded a state funeral – but not Margaret Thatcher!

She was not warm and fuzzy. There were many who were offended by her directness. She did not have Ronald Reagan’s ease of manner and sense of humor. The author Andrew Klavan, in a memorial tribute to Mrs. Thatcher, put it well: “…in order to accomplish the truly good things in life, you sometimes have to court the disapproval of the finest, most sophisticated, most appealing, most educated and most urbane people…” Mrs. Thatcher bravely did so. The economic situation she faced, especially in regard to the unions, makes ours pale in comparison. She brought prosperity and pride to the British people. She reacted decisively and with overwhelming force to defend British subjects in the Falklands.

She played a significant role in two global, transformational changes. At home, she halted the economic slide toward statism and despair, broke the unions’ grip on the economy and privatized industries that had been nationalized. Abroad, she was instrumental in the collapse of the Soviet Union. No one will ever know how many people died in the Gulags or in other Soviet prisons and camps. Wikipedia suggests that between 85 and 100 million people were killed in Communist countries during the 20th Century. Whatever the number, the world – and especially the Russian people – is far better off with Communism gone. Its collapse reflected the combined efforts of Ronald Reagan, Pope John Paul II and Margaret Thatcher, with Mrs. Thatcher providing the crucial link between Reagan and Gorbachev. That it all happened without a shot being fired was a testament to luck and, more important, to their remarkable good sense.

Margaret Thatcher had her share of faults and more than her share of critics. She was opinionated and indomitable. As leader of the opposition in 1976, Mrs. Thatcher earned the sobriquet ‘Iron Lady’ for her anti-Communist rhetoric – a term bestowed by the Soviet Defense Ministry’s newspaper. The intent was derogatory, but Margaret Thatcher took it as an honor. She was a giant. Men and women are ephemeral. Ideas are eternal. Her legacy will live on.


Tuesday, April 9, 2013

“Education and Political Correctness”

Sydney M. Williams

Thought of the Day
“Education and Political Correctness”
April 9, 2013

Is political correctness harming our students? It is well known that our high school students do poorly when compared to their peers in other countries – 25th in math, 17th in science and 14th in reading. U.S. Secretary of Education Arne Duncan cites a McKinsey study which found that the U.S. ranked 22nd of 27 OECD countries when the average salary of teachers with 15 years’ experience was compared to the average earnings of full-time workers with college degrees; consequently, he proposed raising teachers’ salaries. He did not cite the fact that the U.S. spends on average $10,995 per student, $2,826 more than the average OECD country. The difference is the bureaucracy of administration.

Schools face many problems, not the least of which is the egregious amount of money spent on administrators and nonteaching staff. A report last year from the Friedman Foundation for Educational Choice studied public high schools, their enrollments and staffs, over the fifty-nine years from 1950 to 2009. It found that, while student enrollment doubled and the number of teachers grew 252%, the number of administrators and other nonteaching staff grew by 702%! In another instance, Maine saw an 11% decline in students between 1992 and 2009, yet saw the number of teachers increase by 3% and nonteachers by 76%. Following the implementation of No Child Left Behind in 2002, teachers and nonteachers were hired at the same rate, but still double that of student growth. Perhaps the best exposé I have seen on this subject is Mind the Gap by Dr. Richard J. Soghoian. Dr. Soghoian is the headmaster of Columbia Grammar School, a private day school in New York, but his observations on this matter are relevant to public schools and colleges, as well as private schools. There is something wrong, and it isn’t a lack of money.

Political correctness plays a role. School choice has always been the right of the wealthy, but is generally unavailable to the poor and middle class. The concept of co-locating charter schools in traditional public school space is a choice that Leftists don’t want to offer the poor, as the New York Post noted in an editorial on Leonie Haimson, a New York public school parent-activist who does what most public school parents cannot – she sends her children to private school.

There has been an inordinate reliance on building bureaucracies within schools, which, while increasing union membership rolls, diverts money from teachers and more productive purposes. There has also been an undue emphasis on political correctness, which is harming diversity of opinion and limiting ideas. This is especially true in elite universities. Ross Douthat, writing in Sunday’s New York Times, took issue with the criticism of Susan Patton, the Princeton alumna who urged Ivy League women to use their college years to find a mate. While she was denounced as a traitor to feminism and the university “ideal,” all she really did, Mr. Douthat noted, was express a truth known to all who graduate from an Ivy League, or similar, university – the desire to pursue “assortative mating,” in other words, to perpetuate existing inequalities. Victor Davis Hanson put it most succinctly and eloquently: “Apartheid is the unifying theme of coastal aristocracy.”

Another example: Friday, April 19 will mark the 13th annual Day of Silence, a student-led event that brings attention to anti-LGBT (lesbian, gay, bisexual and transgender) name-calling. I agree with the notion that bullying and harassment are issues that need to be addressed. But I know of no school that has focused to the same degree on the bullying of fat kids, those who are poor, or those who suffer some physical handicap. The Day of Silence has become an excuse for teachers and administrators to promote homosexuality, bisexuality and transgenderism. While I suspect that such time spent by teachers may help students in their applications to coastally-elite universities, it will do little for them in a globally competitive world. A Day of Dialogue, sponsored by the evangelical Christian organization Focus on the Family, has not received the same support. Its purpose is to engage “in honest and respectful conversation among students about God’s design for sexuality.” But it is not politically correct. Personally, given the opportunity, I would sit out both days, but one is optional and the other is not.

Defenders of what we term political correctness claim that what they are doing is embracing a richer, more dynamic and complex vision of Western culture and its relations to other societies. Twenty years ago, in a defense of political correctness, Professor Marilyn Edelstein of Santa Clara University wrote: “But no one I know wants to inhibit genuine free speech or an open exchange of ideas.” Unfortunately, that is exactly what has happened. So-called liberal universities have become blind to ideas that do not conform to their preconceived opinions. They have, in fact, become illiberal. David Burton, author of The History of Mathematics, noted last year that fifty words have been banned by the New York City Department of Education from use on tests. Included are such words as: wealth, poverty, Halloween, terrorists, slavery, divorce and, mind-numbingly, birthdays and dinosaurs. For each there is a politically correct explanation, such as the word might evoke “unpleasant emotions in students,” or, in the case of dinosaurs, because “they might offend people who do not believe in evolution.” All of my ten grandchildren have more wisdom than the idiots who proposed such rules. How can children learn when such restrictions are deployed? Minds are designed to be expanded, not restrained.

An extreme example of how far left university professors have descended was the recent comment by Tulane professor and MSNBC host Melissa Harris-Perry. She let slip that the raising of children should be the responsibility of the community, not just the parents. Not surprisingly, Rush Limbaugh picked up on her comment, as only Rush can do. In response, in my opinion, she simply dug deeper the hole into which she had fallen. She declared that she was not saying families should be replaced. She claimed she said: “Once it’s [raising children] everybody’s responsibility and not just the households then we start making better investments.” The words are reminiscent of Mao Tse-tung’s, and not too different from those of a former first lady who once said it takes a village to raise a child. My wife, for one, would heartily, and respectfully, disagree.

Jennifer Schuessler, writing in Saturday’s New York Times, noted that a new specter is haunting university history departments: “the specter of capitalism.” “After decades of ‘history from below,’ focusing on women, minorities and other marginalized people seizing their destiny,” she writes, “a new generation of scholars is increasingly turning to what, strangely, risked becoming the most marginalized group of all: the bosses, bankers and brokers who run the economy.” Who will be marginalized, in my opinion, are the suckers who sign up for such classes. The economy is far more complex than banks and brokers. The focus on so narrow a field is fine for graduate students, but does not serve the needs of those who need a comprehensive study of American history. Amazingly, as David Feith wrote in the weekend edition of the Wall Street Journal, Bowdoin College, one of the nation’s finest colleges, has no requirement for history majors to take a single course in American history.

Last Tuesday, Thomas Friedman’s column in the New York Times was titled “My Little (Global) School.” He puts a positive spin on the subject of underperforming schools by noting that there are schools that compete effectively against the best in the world. That seems to me a statement of the obvious. There are about 27,000 public high schools in the U.S. Statistically, a few of them will be superior. But he does note that the best schools have some things in common: strong fundamentals and strong cultures that believe anything is possible with any student. They emphasize “soft skills,” like “completing work on time, resilience, perseverance – and punctuality.” Those are all characteristics that would have been familiar to students and teachers 100 and 200 years ago – they do not represent multiculturalism; they reflect common sense. They are an acknowledgement that real values are absolute and eternal, not relative. They suggest a return to basics, and they affirm that our commonalities as Americans are critical and, frankly, should be emphasized over our differences.

Universities, the media and politicians have sliced and diced Americans every which way, so that we all have some reason to feel life has cheated us. We are African-Americans, Hispanic-Americans, Latinos, Mexicans, Jews, Asian-Americans, rich, poor, gay, straight, conservatives, liberals, cultured and rednecks. We all belong in at least one box, providing us some comfort and some grievance. It encourages people to seek help from a caring government. However, with a people so divided, it is impossible not to offend someone – even if it is only those who are politically incorrect.

Tolerance toward all is a noble goal, but when it means tolerating the intolerant, it loses its honor. Schools and universities should be places where students are taught to learn, and to think and to question conventional thinking, not for the purpose of promoting a particular political agenda, but so they can better form their own opinions. Thomas Klingenstein commissioned the study by the National Association of Scholars that, among other findings, documented that odd fact about Bowdoin’s history majors not needing to study American history. The purpose of the study, Mr. Klingenstein alleges, was not to call for conservative affirmative action, but to push for an end to the “silent discrimination against conservatives.”

Ultimately, all of this politically correct foolishness will collapse of its own weight. To a large extent, we all reflect the age in which we grew up. My era was the 1950s. Most of today’s senior professors and university decision makers came to maturity during the Vietnam era. Another, younger group – my children’s age – grew up during the 1980s, the Reagan years. Our behavior is influenced, to an extent, by the times in which we grew to maturity. Fads change, for good or bad. Unfortunately, this last one has lasted a long time and has severely and negatively impacted at least two generations of our youth, but eventually it will collapse, as the professors who espouse such ideas and the universities that support them are found to be as ephemeral as they appear to people like me. The way in which we educate people is changing. Already, the move toward on-line education is making noticeable waves. On-line classes more closely link studies with jobs. But some things are forever. One hundred years from now, people will still be reading Shakespeare, the Bible and Dickens. Will one be able to say the same about The Feminine Mystique? I very much doubt it.


Monday, April 8, 2013

“Are Stocks Still Relevant?”

Sydney M. Williams

Thought of the Day
“Are Stocks Still Relevant?”
April 8, 2013

The question is spurious, of course. Stocks are relevant. They reflect the value of thousands of public companies. They add financial leverage to an economy’s production. Thousands of people make their living because of them, and millions have their savings tied up in them. And everyone knows they have doubled in value over the past four years. But attitudes toward stocks have changed so greatly in the past four or five decades that sometimes it feels as though individual stocks have become irrelevant. It is the shrinkage in their numbers, the demonization of wealth, the proliferation of investment intermediaries, and the increased costs and regulatory burdens of being public that are concerning.

Wealth was once something to which people aspired. It wasn’t something to be scorned publicly, while being tapped privately by politicians. One of the main avenues to wealth was the purchase of individual equities and the building of portfolios over time. Hard work and thrift were considered virtues, the reward being that in time one could become, if not rich, at least comfortable. Attitudes have changed. On Friday, a couple of Bloomberg reporters disclosed that President Obama’s budget would prohibit taxpayers from accumulating more than $3 million in individual IRA accounts. He has been bashing the “rich” regularly, so putting his pen where his mouth is should be expected. Nevertheless, why should thrift, hard work and investment skill be punished? And why an arbitrary number like $3 million?

Over the past few decades, individual stocks have been lumped, commodity-like, into index funds, ETFs and high frequency algorithmic trading programs. Despite the market making new highs, recent increases in capital gains taxes and stricter regulation have reduced incentives to invest – the life blood of economic growth. At the same time, the number of companies in which one can invest has been shrinking. Regulations like Sarbanes-Oxley, the ever-higher costs of being public, and the fact that the quarterly focus of analysts tends to distract management’s attention from longer term strategic plans have caused a shrinkage in the number of public companies. CFO Magazine, quoting Grant Thornton, recently stated that the number of publicly traded companies in the U.S. declined from 8,823 in 1997 to 5,091 in 2011. An analyst with whom I used to work, and who was consistently ranked number one in his industry, told me over the weekend that two thirds of the companies he once followed are gone. Simultaneously, according to SEC data, initial public offerings (IPOs) have declined. Between 1980 and 2000 they averaged 311 annually; from 2001 to 2011, the annual average was 102. Birinyi Associates recently noted that announced corporate buybacks were $117.8 billion for the month of February, the highest in a year. At the same time, the population of the United States continues to grow. It has expanded by almost 50 million since 1997 and 100 million since 1980. And the population is aging, with 10,000 people reaching retirement age every day. The need for investment vehicles has been growing, while the number of publicly traded companies has shrunk. This, in part, explains the proliferation of so many derivative products based on underlying stocks.

The political demonization of wealth and the ubiquity of rogue traders (and cronyism) are also taking their toll. The former seems odd, for even populist politicians like our President are constantly on the prowl for money. (Perhaps Leftist billionaires are not considered “rich?”) The latter (rogue traders) reflects the insatiable greed of bankers who reap rewards when they succeed, while losses become the responsibility of taxpayers. The former (the demonization of wealth) has multiple fathers and has been a long time in the making. Elected officials in Washington and most government employees have defined-benefit retirement plans, and thus are not subject to the necessity most in the private sector face – building wealth, either through direct savings or by way of defined-contribution retirement plans, like IRAs and 401Ks. As a consequence, government employees have less sensitivity to the needs of those in the private sector who must save and invest for retirement.

Despite the need for households to increase their ownership of financial assets, the cumulative effect of these factors has been a decline in household ownership of stocks and mutual funds, and a drop-off in trading volume. A recent Gallup Poll indicated that 54% of households own stock or mutual funds, down from 65% in 2007 and 67% in 2002. The number of households owning individual stocks is less than one in five. A New York Times article from a year ago suggested that average daily trading volume was about one half of what it was at its peak in the fall of 2008, and volume has fallen since. Financial and cultural incentives emanating from Washington have been misplaced. They should be aimed at wealth construction, not wealth destruction. Unfortunately, what has been happening is indicative of a government that foments dependency and erodes responsibility. A Treasury Inspector General report, issued August 9, 2010, noted that average financial assets for those between the ages of 55 and 64 were $72,400, while the average income was $54,600. As a nation, we are over-leveraged. As a people, we have invested far too little.

Not unlike healthcare, one of the problems individual equities face is that the consumer has become increasingly distanced from the market. Today, very few individual investors have any idea what companies they own, and fewer still know what those companies do. The Peter Lynch concept of investing in what you know best seems as old-fashioned as rotary telephones. Most investments that individuals do own are professionally managed, in mutual funds, ETFs, pension plans or 401Ks. While professional managers have generally been good for investors, one cannot help thinking that the loss of a direct link between the investor and his or her investment has been unfortunate. Beneficiaries of retirement plans are concerned only with monthly checks, not whether they derive from the sale of corn flakes or Chevy trucks. At the same time, the relationship between companies and shareholders has changed. To a passive investment manager – an index fund or an ETF – the individual stock components are of little consequence, other than representing a percentage of the portfolio. Thus, such managers have little interest in the vagaries of managements. With holding periods sometimes measured in minutes, quant-like funds have further changed that relationship. Companies’ managements often have no idea who owns their shares. Those who seem to care the most are activist investors, whose interests may or may not be aligned with those of individual investors.

It must be remembered that every mutual fund, ETF or managed account, every index or quant-like product relies on individual companies. Without them, those funds would not exist. The degradation of financial success, in the interests of populism, does little to instill the long term confidence needed for economic success. An obsession with protecting investors can lead to public policy advocates destroying the very fabric of our capitalist system – a system necessary for our economic well-being. The decline in the number of publicly traded companies should be a warning shot across the bow for policy makers. Stocks and their intermediaries not only have relevance, they are crucial to the financial well-being of the nation.


Thursday, April 4, 2013

“North Korea – Time to Revisit Containment?”

Sydney M. Williams

Thought of the Day
“North Korea – Time to Revisit Containment?”
April 4, 2013

“Japan Shifting Further Away from Pacifism;” “U.S. Positions Missile Destroyers off South Korea;” “North Korea to Re-start Nuclear Facilities ‘Without Delay’,” and “China Mobilizing Troops, Jets near Korea.” Those were four headlines Tuesday morning. Early this morning, CBS radio reported that North Korea had moved missiles to its east coast, closer to the U.S. and Japan. Bellicosity on the Peninsula is rising.

Other than the fear of a nuclear war during the first fifteen years of the Cold War, and more recent threats from terrorists, the United States has been largely immune from concerns of attack. Consequently, it is difficult for us to imagine what it must be like to have mortal enemies on one’s borders. Europeans understand the threat. Asians do. African nations do as well. Since the Mexican-American War of 1846-48, however, the United States has had generally friendly relations with its two neighbors – something that is untrue for most of the world. Now, it increasingly appears that war in East Asia is a possibility, if not a probability.

Kim Jong-un may be a kid; he may be stupid and a nut. But, as the dictator of the Democratic People’s Republic of Korea, he has nuclear weapons and commands the world’s largest military – counting reserves, larger than China’s and more than three times the size of America’s. He has to be taken seriously.

In a February 4th TOTD entitled “Kim Jong-un – A Tinderbox,” I wrote that North Korea was warning that it planned a third nuclear test. Eight days later it did just that. While the magnitude of the tremor, as measured by the U.S. Geological Survey, was bigger than those of its previous detonations, it is the possible miniaturization of the device that is most troubling. North Korea recently launched an Unha-3 rocket, capable of reaching the United States and of carrying a nuclear warhead. On March 11, Kim Jong-un said that the armistice ending the Korean War had been invalidated and that he was “bracing for a showdown.” Pyongyang declared that a “state of war” exists with South Korea, which is literally true, as no peace treaty was ever signed – only the 1953 armistice. Two days ago, the country said it would be putting all of its nuclear facilities to work expanding its nuclear weapons arsenal.

But it is not just North Korea that is disrupting East Asia. Following its devastating defeat in World War II, Japan renounced the right to wage war, or even to possess a military. Its Self-Defense Forces, created in 1954, were, according to an article in Tuesday’s New York Times, constrained from acting in “too offensive a manner.” But that is changing. In late December, the country elected Shinzo Abe, a conservative who has increased military spending. Mr. Abe is calling for rewriting the postwar Constitution to scrap restrictions on the military. While that idea remains unpopular, opinion polls show Mr. Abe has strong public support. Japan’s southern islands, known as the Senkakus, are claimed by China. “China is in their face,” is the way MIT political scientist Richard Samuels put it. “The mood has shifted toward giving more legitimacy to the guys in uniform.”

In February, 280 Japanese soldiers participated in war games with American Marines in a mock invasion of San Clemente Island, off San Diego. As Martin Fackler, writing in the Times, noted: “There is only one country that Japan fears would stage an assault on one of its islands: China.” Even in the only country to have felt firsthand the force of atomic explosions, the passage of time has muted the horror of that moment. Today’s Japanese soldiers are the grandchildren and great-grandchildren of World War II’s soldiers. Many of them are too young to have known those who fought in that war.

But it is the Korean Peninsula that seems most combustible. The Peninsula, Mr. Kim declared, has reverted to a “state of war.” The Korean conflict ended in an armistice sixty years ago this July. Whether North Korea is blustering or actually mobilizing troops, South Korea’s newly elected President Park Geun-hye is taking no chances. In a message to the South’s generals, she said: “If the North attempts any provocation against our people and country, you must respond strongly at the first contact with them without any political considerations.” (Emphasis mine.) Ms. Park does not want a repeat of the somewhat feeble response by her predecessor to the 2010 shelling of Yeonpyeong Island, just a few miles off the North Korean coast, which killed two soldiers and wounded twenty.

In response to Pyongyang’s provocative words, the United States sent two B-2 Stealth bombers on a practice run over South Korea. F-22 Stealth fighter jets were also deployed. Separately, the Department of Defense sent the USS John S. McCain, an Aegis-equipped guided-missile destroyer, to be positioned off the southwestern coast of the Peninsula. The five largest armies in the world are located in Asia, with North Korea, South Korea and China all on the list. (The other two are Vietnam and India.) North Korea, with a population of 25 million, has a total military, including reserves, of 9.5 million. South Korea has a population of 50 million and a total military of 5.2 million. In contrast, the United States, with a population of 315 million, has a total military of 2.3 million, and China, with a population of 1.3 billion, has 4.6 million. The Peninsula is armed.

However, the real question is: what role will China play? When one looks at the region on a map, the Korean Peninsula looks like a natural appendage of China. The border between the two countries stretches for 880 miles, parts of which are protected by a fence and by two rivers. The North Koreans don’t want their people leaving and the Chinese don’t want them arriving. China has been amassing military forces, including jet aircraft, tanks and personnel carriers, along the border. According to one report, the PLA (the Chinese People’s Liberation Army) is now at ‘Level One’ readiness, its highest. Additionally, China has been conducting live-fire naval exercises in the Yellow Sea, off the west coast of South Korea.

China, which has long been North Korea’s biggest ally, has recently been vocal in its opposition to Pyongyang’s announcement that it would restart the nuclear facilities at the Yongbyon complex, and to its aggressive rhetoric. Since North Korea depends on China for food, oil and electricity, Beijing’s opinion matters. While China has limited imports and frozen assets in two North Korean banks, it has not abandoned its ally. 38 North, a website offering informed analysis of North Korea, wrote on March 29: “No, China will not abandon North Korea, at least in response to the recent nuclear test.” Rather, its analysts argue that Beijing’s policy has “evolved from a one-dimensional policy based on ‘friendship sealed in blood’ to a multi-dimensional one that seeks diverse strategies – including punishment – to manage different types of risks surrounding the Korean peninsula.” China fears regime change, as that could cause massive defections into its northern provinces.

The resemblance between East Asia today and Europe in the first decade of the twentieth century is eerie. Electricity, autos, planes and the telephone were all reasonably recent inventions. World trade had created national wealth on a global scale. The opulence of the rich – Downton Abbey – was a manifestation of the enormous differences between rich and poor. The desire for material goods blinded people to the risks of a combustible world. Weapons technology had surpassed the ability of generals to understand the consequences of their firepower. Other than the Franco-Prussian War of 1870, the Continent had been relatively free of major wars since Napoleon’s defeat at Waterloo in 1815. Today, trade has brought riches to countries across the globe. Technology has birthed giant leaps in communication and the internet, shrinking distances between all parts of the world. We have weapons of such awesome power that their use could destroy mankind. The divide between rich and poor has been widening. Seeking material comfort has replaced seeking meaning. In 1914, small nations on Europe’s periphery catapulted the major powers into a war that no one really wanted. UN Secretary General Ban Ki-moon has declared that tensions are such that the world must negotiate with North Korea’s Kim Jong-un. Negotiating with dictators is a futile exercise, as history teaches. Nevertheless, we must be wary lest a small country in East Asia cause a repeat of August 1914.

In a world of proliferating weapons capable of unbelievable destruction, President Reagan’s decision to establish a Strategic Defense Initiative seems uncannily prescient. Eliminating nuclear weapons would be a wonderful ideal, but because they have become ubiquitous such wishes will remain only that – wishes. The concept of a missile defense shield may seem akin to living in a fortified castle, but the alternative – of being vulnerable to attack, of living in a straw house during a hurricane – is far worse. I have never understood the policy decisions that caused first President George H.W. Bush and then President Clinton to talk down these programs. They were restarted by the second President Bush, but then curtailed again under President Obama…until the emergence of this latest threat, which reasserted the need for a missile defense shield. If we cannot control offensive weapons – and history suggests we cannot – we should concentrate on defense.

Following the devastation caused by World War II, and the concomitant rise of the Soviet Union, a policy of containment ensued. The term ‘containment,’ in a foreign policy sense, derived from George F. Kennan’s article in Foreign Affairs in 1947. He concluded: “The main element of any United States policy toward the Soviet Union must be that of a long-term, patient but firm and vigilant containment of Russian expansive tendencies.” There is an irony in the fact that the policy traces back to a man who later disavowed the concept. Nevertheless, the policy extended through the start of the Vietnam War. President Lyndon Johnson, citing the domino theory, used the policy of containment to justify the build-up in Vietnam in the mid-1960s. With the collapse of South Vietnam, the policy of containment fell into disrepute.

A second article in Tuesday’s New York Times quoted Michael Green, a former North Korea policy adviser to President Bush. He noted that, given the lack of success in curbing North Korea’s nuclear and missile programs, the White House has little choice but to pursue a policy of containment, no matter the name given it. That sounds right to me. Certainly, we do not want to get drawn into what could be a conflagration of frightening dimensions.