Thursday, May 28, 2015

"Debt + Lotteries = Social Decline"

                     Sydney M. Williams

Thought of the Day
“Debt + Lotteries = Social Decline”
May 28, 2015

When asked how he planned to finance his retirement, a wag once responded that he would rely on three sources – Social Security, lawsuits and the lottery. His answer was an expression of hope over experience. It was also bittersweet, reflecting a cultural shift – a growing dependency and a belief that riches can come with no effort. Social Security, unless changes are made, will be technically defunct in 2033. Frivolous lawsuits are a growing problem. In 2013, about 15 million such suits were filed in the United States – on a population-adjusted basis, a 150-fold increase since 1967. Americans spent $78 billion playing lotteries last year, a form of tax that did not exist before New Hampshire introduced the first state lottery in 1964.

Frivolous lawsuits cost American taxpayers about $275 billion a year in legal and court fees. The suits tie up courts. The lawyers who encourage them send false promises of millions to millions of people. The Institute for Legal Reform estimated that in 2008 the average American spent an additional $3,500 on goods and services because of frivolous lawsuits.

By debt, in this instance I refer to entitlements that have been promised by elected officials, but exist only on paper. They are Ponzi-like in that their authors did not provide proper funding. Promises are the bait that gets politicians elected. Responsibility is left to the next generation. My home state of Connecticut ranks fourth in terms of household income; yet it is first in terms of state debt per capita. Future retirement and healthcare obligations represent the bulk of that debt. It is a sad commentary on a state that has so much to offer – its people and their skills, its educational institutions and its proximity to major cities – that 12.5% of its population is on food stamps. Connecticut’s poorly managed government, with its burdensome regulation and heavy taxation, has driven out businesses and many productive citizens.

Residents of Connecticut spend about $1.3 billion a year on state-run lottery tickets. Of those dollars, all of which could have been spent in more productive ways, about $315 million goes to state coffers and $63 million to the store owners who sell the tickets. The concept of lotteries was originally sold to the public as a means of raising money for social good, which generally meant the money would be earmarked for education. As states’ finances worsened, the money has gone to general budget items, which means supporting pension and healthcare plans for retired state employees. Advocates for the lottery are legion, beginning with the politicians who see this as “free” money, the merchants who collect 5% of each sale, and the gullible consumers who are told that untold wealth is but a $2 ticket away.

Additionally, the state of Connecticut receives roughly $300 million from the state’s two Indian-run casinos, which combined had revenues of about $2.0 billion in 2014. While that $300 million represents a small fraction of what the State takes in each year, politicians love it, because it is painless. Nevertheless, there is a social cost, and it is long-tailed – again, something politicians prefer. But it is that social cost – the habits that get formed and the personal tragedies that ensue – that should concern us. In general, it is the poorest among us who sit endlessly before slot machines and who get suckered into paying for the false promises of lotteries. And it is the trial lawyers, not the victims, who get rich from frivolous lawsuits, at the expense of the people. Earlier this year, in response to plans to expand casino gambling, former Governor Lowell Weicker, who in 1991 cut the deal that introduced casinos into the state, said, “We need other kinds of jobs. I think we need to adjust our priorities, so they don’t include more gambling.” In this regard, he is right; we do. The state is already seeing the effects of competition on in-state gambling. Revenues in 2014 at the two casinos were about $1 billion below where they had been in 2013. Casino employment, which peaked in 2006, has declined from 22,300 to 14,750. Is the state better off encouraging more gambling – and the inevitable personal losses such activities promote – or should it encourage new businesses and rebuild its defense and insurance industries, businesses that made Connecticut rich in the post-War years?

I am not a fan of any tax, but I recognize that government is necessary and must be funded. National defense cannot be farmed out; neither can local police forces. Disputes between different interests must be arbitrated. Public schools and universities play a vital role in our well-being. The elderly, the sick and the indigent must be cared for. Transportation has a public aspect that must be funded, be it roads, bridges, rail systems or airports. Museums, parks and such add to the quality of life. Since we live under the rule of law, rather than the rule of men, we need legislators, judges and administrators to ensure that laws are constructed and enforced fairly, so that no individual is treated unjustly and no single position becomes too powerful.

But bureaucracies become bloated. Administrators see growth in expansion, rather than in better service. When the majority of the people have a financial interest in the perpetuation of government, the natural braking system of checks and balances becomes compromised. I understand the value of taxing luxury goods, cigarettes, liquor, etc. But in general, I favor a progressive income tax system over the regressive use of broad-based sales taxes, VATs and hidden fees. We need a system that encourages investment and the individual creation of wealth, especially with Social Security at risk. We need to encourage aspiration, skills and hard work. (As an aside, the automatic withholding of federal, state and local taxes by businesses has been an insidious conspiracy between government and business to mitigate the pain of paying taxes. The self-employed and many retirees must write quarterly checks to the IRS. If everyone had to do so, government would be smaller.)

Perhaps there is no direct relationship between the advent of frivolous lawsuits, lotteries and the growing dependency of Americans on government, but the timing is suspicious. At the time lawsuits began to explode and lotteries came into being, the sense of community in America began to break down. It was a trend chronicled by Robert Putnam in his book, Bowling Alone. He described the decline of social intercourse and the collapse of civic groups that did so much to hold communities together. In part, this was due to the simultaneous rise of government programs designed to help the needy, the sick and the elderly. But the result is that we have become alienated from many upon whom we once relied.

As government programs expanded, people became increasingly dependent. That dependency replaced the concept of self-reliance and personal responsibility. When we add in an addiction to promises made but likely never to be realized, the consequence is a people who look to lawsuits and the lottery to sustain them in old age. It is an equation that foretells a social decline.




Wednesday, May 27, 2015

"Where Have All the Frogs Gone?"

                                                                                                                 May 27, 2015
                    Sydney M. Williams
Notes from Old Lyme
“Where Have All the Frogs Gone?”

“Where have all the flowers gone?
Long time passing.”
                                                                                                                           Pete Seeger
                                                                                                                          “Where Have All the Flowers Gone?” 1955

Every spring morning, once the swimming pool has been opened, I clean the filters. Inevitably, there are one or two frogs that wandered into the pool during the night. This is common after a night’s rain has lured them out on a nocturnal stroll in search of snackable insects. The temptation of cold, clear water causes them to hop in. Unfortunately, finding no easy way out, they lose strength and get pulled by the currents into the filters. By the time I get there, most have drowned.

This year there have been no frogs. Not being a herpetologist, or even much of a naturalist, I could think of no reason other than the cold winter, with its heavy blanket of snow, or some fungus that had become rampant. Lacking an explanation, I read and contacted some experts. Frogs are amphibious, meaning they can live both on land and in water. The cold winter should not have affected them, as frogs are ectothermic, meaning they rely on the environment to regulate their body temperatures. They can also survive long periods without eating. In the winter, frogs find a cozy place known as a hibernaculum, which protects them from extreme temperature changes, as well as from predators. It is only when their resting spot warms above freezing that the frog’s body thaws. He awakens, ready to eat and to mate.

The males emerge harrumphing, uttering mating calls, a sound with which those of us who live in the country are familiar. For the females that respond, the burden – after a few moments of delight – has just begun. A female typically lays around 10,000 eggs, making my mother, who raised nine children, look like a piker. She lays such a large number because the odds of survival in this Darwinian world are small. (I wonder if my mother had similar thoughts?) Within a few weeks, the eggs that survive become tadpoles. In two to three months, tadpoles become small frogs. Life expectancy varies by species, but generally runs between six and eight years.

Writing about frogs got me reflecting on the extraordinariness of nature and the interdependency of all species. Frogs, for example, are pretty far down the food chain. Like most people, I marvel at and seek to understand what I understand least. Ospreys, one of nature’s most beautiful birds, have returned in abundance to the marshlands at the mouth of the Connecticut River. Dr. Paul Spitzer, a naturalist who grew up in this area, explained that their return is due to the Menhaden, which has resurged. The Menhaden is a foraging fish often used as fertilizer or as crab and lobster bait by humans, but found especially tasty by Ospreys. In this “knee-bone connected to the leg bone” world of nature, the Menhaden’s return is due to Plankton, which grows in abundance in our creeks, and to fear of the Bluefish, Striped Bass and other predators that inhabit the Sound. The Osprey’s real name – even for those who are not interested – is Pandion haliaetus, which derives from Pandion, a mythical king of Athens, and haliaetus, a sea eagle. To watch them soar and then dive, talons poised for a fish that has no idea its life is about to end, is a beautiful sight – except, of course, for the fish. No matter, the Osprey is worthy of such a distinguished name.

While Ospreys feast on fish, their feathered friends, seagulls and hawks, have been known to toss down a frog or two. So frogs, when not drowning in my pool, form a critical link in the food chain for the shore birds in our marshes. Typically, frogs eat insects, ridding us of natural pests. Having no teeth, they swallow their prey whole. In turn, they are eaten by foxes (one of which lives under our hedge) and swallowed whole by the various snakes that slither about.

Living at the mouth of the Connecticut River is an extraordinary blessing. The marsh and the creeks that abut it, with the River and Sound a short swim or kayak ride away, are abundant with life. The estuary is one of the Western Hemisphere’s “40 Last Great Places,” so proclaimed by the Nature Conservancy.

But to return to my concern about frogs: there are, from what I have learned, eleven species living in Connecticut. Among those that have found their way into my pool and its filters have been Wood Frogs, Pickerel Frogs and Bullfrogs, but most commonly Green Frogs – or at least that is what I believe from looking at pictures in the “Field Guide to Reptiles and Amphibians” by Roger Conant.

Like the flowers that Pete Seeger wrote and sang about, frogs die, as do all living things. Not only the individual, but also, over varying periods of time, the species. “The history of life,” wrote Evolutionary Ecologist James P. Collins in 2004, “is a story of extinction: ninety-nine percent of the species that ever existed are now extinct.” Regardless of what actions we may take, the same fate ultimately will be mankind’s. We do what we can to survive – we try to limit our impact – but eventually nature wins. Its forces exceed anything man has devised.


In the meantime, however, I was happy to hear from Gregory Watkins-Colwell, collections manager for Herpetology and Ichthyology at the Yale Peabody Museum of Natural History. In response to my question about no frogs appearing in my pool, he told me that the cold winter had delayed their regeneration and mating. He added that a dry spring meant fewer nocturnal wanderings. He assured me they would show up. Wait, he said, for a morning after a good night of soaking rain. It hasn’t rained, but I remain vigilant and hopeful. 


Tuesday, May 26, 2015

"Obama - Unfocused or Delusional?"

                   Sydney M. Williams

Thought of the Day
“Obama – Unfocused or Delusional?”
May 26, 2015

Nero, allegedly, fiddled while Rome burned. Today we have a Commander in Chief who seems equally detached from reality. In a world fraught with Islamic terrorists and muscle-flexing autocratic nations, the enemy on which he is focused is climate change. On the Wednesday before Memorial Day, President Obama came to the Coast Guard Academy in New London, Connecticut to warn the newly commissioned ensigns of the far-reaching consequences of climate change, and of man’s responsibility to halt its effects. Like the Norse King Canute, who, after conquering Denmark, England and Norway, tried to hold back the waves, Mr. Obama went to Denver in 2008 and promised that his Presidency would bring the time “when the rise of the oceans began to slow.” Global warming is the yardstick he has used to define his Presidency.

It seemed to make no difference that the world was being shattered – at least in part because of our neglect. The day before the President’s speech, the Iraqi city of Ramadi fell to ISIS, the North Koreans revealed they had the ability to attach a nuclear device to an intercontinental ballistic missile capable of reaching the U.S., and the Iranians said that UN nuclear inspectors would not be allowed into Iran. The next day, the ancient (and strategically important) Syrian city of Palmyra fell to ISIS. Two days later, tensions rose between the U.S. and China over the latter’s construction of artificial islands 800 miles off its coast in the South China Sea.

In New London, Mr. Obama said: “I am here to say that climate change constitutes a serious threat to global security, an immediate risk to our national security, and make no mistake, will impact how our military defends our country.” He left no doubt as to the imperativeness of his message: he accused those who are dismissive or skeptical of man-caused climate change of being “guilty of negligence and dereliction of duty” – an ominous threat from the Commander in Chief.

His speech was akin to a ship that had slipped its anchor. He spoke of the dangers of climate change with the closed-minded fervor of a fundamentalist preacher, leaving no room for climate agnostics. There was in the speech an absence of any apparent concern regarding terrorism and the homeland. He ignored the fact that there are those who would do us harm, who would upset the security of the world. Mr. Obama’s studied avoidance of those risks seemed odd when speaking to those responsible for defending our shores.

The President has the intellectually dishonest habit of calling his “climate” opponents deniers, while claiming that he and his believers are truth-tellers: “I know there are still those back in Washington who refuse to admit that climate change is real…” That’s hogwash. While I know of many who are skeptical as to the magnitude of the role man has played in climate change, I know of none who claim that climate change is not real. In making such outrageous accusations, Mr. Obama refuses to engage the real debate – What effect has man had on climate change? Is nature more or less powerful than man? What will be the economic costs to emerging and developing countries of complying with standards set by rich nations? How can we realistically enforce reductions of emissions of other nations without causing economic hardships?

A recent study by Dr. Philip Lloyd, a South Africa-based physicist and former lead author for the Intergovernmental Panel on Climate Change (IPCC), is of interest. Dr. Lloyd examined ice core-based temperature data going back eight thousand years; his purpose was to gain perspective on the magnitude of 20th Century global temperature changes. What he found was that the standard deviation in temperature over that time was about 0.98 degrees Celsius, which compares with the 0.85 degrees climate scientists say the world has warmed over the past century. Keep in mind that the 20th Century experienced the industrialization of much of the world and two world wars, in which the price of victory included ecological devastation. “The key challenge in understanding climate change,” Dr. Judith Curry, a climate scientist at Georgia Tech, told the Daily Caller News Foundation in April of this year, “is to assess the natural climate variability.”

Is man’s effect on a changing climate more important than the dangers we face from Islamic terrorism? Is it greater than the threat from a nuclear-armed North Korea, a nuclear arms race in the Middle East, an increasingly militaristic China that looks to dominate the South China Sea through which a third of all global trade passes, and a Russia looking to re-create its lost empire? From the files of Osama bin Laden that have been made public, it is obvious that al Qaeda’s real target has always been the “great Satan” that is the United States. The same is true for ISIS, as they have publicly stated. The religious freedom we enjoy, along with our Constitution and Bill of Rights, is a direct threat to those Islamist militants who would establish a caliphate, which is simply a dictatorship under the guise of religion.

Iran’s ability to build nuclear weapons – a possibility that seems likely – will cause a nuclear arms race in the region. Saudi Arabia, the UAE, Egypt, Jordan and Turkey cannot allow Iran to exert preeminence. North Korea is a nation run by a madman, but a madman with nuclear weapons capable of reaching our West Coast. The Chinese military’s creation of artificial islands in the South China Sea could seriously (and negatively) impact world trade. Vladimir Putin wants to recreate the Russian empire. Should he decide to invade the Baltic Countries, what nations will stand against him? If not us, who will be the sheriff?

The Earth has been undergoing climate change since it was formed billions of years ago. What is new to the world is the freedom we enjoy as a people. It is the continuation of that individual liberty that should be the focus of our leaders, not just during Memorial Day week, but at all times. Contrast the words of President Obama in New London last Wednesday to those of President Reagan on Memorial Day in 1982 at Arlington Cemetery. Mr. Obama: “Climate change will affect everything you do in your careers…it will impact how our military defends our country.” Mr. Reagan: “War will not come again, other young men will not have to die, if we will speak honestly of the dangers that confront us and remain strong enough to meet those dangers.”

The world was already dangerous. It has become more so in recent years, in part because of decisions we have taken. For the President to come and tell 223 newly commissioned officers that the major enemy they face is climate change was, in my opinion, the act of a delusional man.






Thursday, May 21, 2015

"The Media Dumbs Down...Further"

                   Sydney M. Williams

Thought of the Day
“The Media Dumbs Down…Further”
May 21, 2015

The off-duty, undercover cop who watched while members of a bike gang hauled the driver out of an SUV on New York’s West Side Highway last year and beat him was asked why he did nothing. His response: “If I knew what was going to happen, I would not have gotten out of bed.”

The question currently being asked of candidates – knowing what we know now, would you have invaded Iraq in 2003? – does little to reveal the judgment, temperament or character of the one being asked. It serves no purpose, other than to fill the questioner with supercilious indignation, and to make the interrogatee, no matter the response, look foolish.

The current uproar began when Megyn Kelly of Fox News asked Jeb Bush, “knowing what we now know,” would he have authorized the invasion of Iraq? Governor Bush answered what he thought was the question, but ignored the hypothetical introductory phrase. From a political perspective, it was a mistake on Mr. Bush’s part, but it was the question that was absurd. How does one answer such a hypothetical question? Ms. Kelly was surely trying to trap Jeb Bush and, unfortunately for him, she succeeded. But did her audience learn anything of importance? Was it newsworthy, or did she and the question become the news? On Sunday evening, Chris Wallace asked the same question of Senator Marco Rubio. When Mr. Rubio pushed back, Mr. Wallace became exasperated; so the Senator gave the answer Mr. Wallace wanted. The audience learned nothing, other than that Chris Wallace, whom I generally admire, can be an ass.

We cannot relive the past. It is gone. We must live with its consequences. Time changes our perception of events, as much as does the discovery of new information. The repetition of a favored narrative makes it even more difficult to reconstruct yesteryear with any accuracy. The decision to invade Iraq in 2003 is widely seen today as a monumental blunder, and the Left is not shy about saying so; neither is much of the Right. The invasion, so goes the story, was engineered by neo-cons in the Bush Administration. Their only wish was to engage in military action; so they conceived the idea that Saddam Hussein had weapons of mass destruction (WMDs) and pressured the intelligence services and Congress to go along. In other words, we are asked to believe that a few evil guys in the Bush Administration duped a naïve Congress and intelligence service.

It’s what my grandfather would have called “poppycock.” What revisionists conveniently forget is that the invasion was not only about WMDs. Congress had supported 23 writs for Saddam Hussein’s removal. Regime change in Iraq had been fundamental to American foreign policy since the Clinton years. He had used chemical weapons against the Kurds and Marsh Arabs. He had broken the 1991 ceasefire agreement. He had stockpiled chemical and biological weapons and was working toward developing nuclear capability. While the attack on 9/11 (the seminal event in the Bush Presidency) was carried out by al Qaeda, there were (and are) many other Islamic terrorist groups operating in the Middle East, including Hamas, Hezbollah, the Palestine Liberation Front, the Islamic Jihad Group and ISIS (to name a few). Saddam Hussein was a supporter of such groups. Over a 20-year period, he had murdered at least 100,000 of his own people.

The Middle East is a tinderbox, much as the Balkans had been in 1914. For decades, the hatred between sects lay dormant, muzzled by tyrannical governments. Leaders in the Arab world kept a lid on dissension and freedom. George W. Bush believed, perhaps naively like Woodrow Wilson, that democracy was the answer. Mr. Wilson was an idealist, and I suspect Mr. Bush is too. It took two devastating wars and many years, but eventually democracy came to most of Europe. In time, one hopes democracy will come to the Middle East. If Mr. Bush and Mr. Wilson were guilty of anything, it was that they underrated the roots and intensity of sectarian animus – religious and racial hatred that goes back generations.

On October 11, 2002, 70% of Congress authorized President Bush to use military force against Iraq. This was not a quick decision. More than a year had passed since the attack on 9/11. Nevertheless, reasonable people can disagree as to whether invading Iraq was the right thing to do, but any debate should be based on facts, not innuendos, or the re-writing of history. “Gotcha” questions serve only to “put a smile on the face of the tiger” that is asking the question.

In my opinion, it was not the invasion that was wrong; it was the mishandling of subsequent events. There was obviously little or no pre-planning as to how to work with an Iraq suddenly devoid of the leader who had been there twenty years. The “surge,” engineered by President Bush and General David Petraeus, came three years late; nevertheless, it worked. Predictably, its success was wasted when the Obama Administration withdrew troops too quickly in 2011.

There are legitimate questions that the media should be asking regarding the Middle East and other hot spots. Where do the candidates stand in terms of defense? What do they see as the role of the U.S.? What path should the U.S. follow in a Middle East descending into chaos? What should we do about China’s growing military? What about North Korea? Where is the “red line” Putin must not cross, as he attempts to reassemble the Russian Empire? Where does the candidate stand in regard to our allies, specifically Israel, the Baltic States, Japan and those in East Asia?

While we learn from history, it cannot (and should not) be altered to fit an agenda. Every experience affects how we respond to the next. In terms of Megyn Kelly’s question, one could as well ask every divorced person, or everyone who has been in an accident – knowing what you know now, would you have married, would you have stayed home? Other than to raise her profile, consume oceans of ink and hours of air time, her question provided no revelations.


We live in dangerous times. When we allow 30-second sound bites, Twitter feeds, slogans and hashtags to be the source of our news, we are ill-informed. When we let newscasters with ulterior motives frame questions that do not allow insights into the minds, temperament and characters of those who would run our country, we become losers. Asking “what if” questions does not enlighten audiences. In the instance at the start of this essay, Megyn Kelly became as much the news as did Jeb Bush. Media and television revel in “stars.” They drive ratings; so we must live with them, but people should understand that the consequence of such newscasters – and there are more on the Left than on the Right – is a biased and ignorant consumer.


Monday, May 18, 2015

"A Culture of Deceit"

                  Sydney M. Williams

Thought of the Day
“A Culture of Deceit”
May 18, 2015

“For the history of our race, and each individual’s experience,
are sown thick with evidences that a truth is not hard to kill, and that a lie well told is immortal.”
                                                                                                                                                                         Mark Twain
                                                                                                                                                                         “Advice to Youth,” 1882

An old joke goes: “How can you tell when a politician is lying?” The answer: “When his lips are moving.” While that may not be universally true, lying and deceit have infested our culture to such an extent that we no longer expect the truth. Lying is not new, but it has become pervasive.

White lies have always been around; they have always been acceptable and, in fact, are critical to a smoothly-functioning society. What characterizes such lies is that they are told to make someone else feel good, with little or no harm inflicted. For example, when my wife shows off a new outfit it is in my interest to express admiration. In turn, she will say things to inflate my ego, while (I am sure) crossing her fingers behind her back. Lying begins early. I recall occasions when, as a child, lying was preferable to the spanking I would get for a broken window or letting goats into the garden. The 2009 film “The Invention of Lying” depicted what the world would be like without lying – intentionally blunt and cruel, with no religion and no fiction.

There are also lies we expect, especially those from politicians and the marketing departments of consumer goods companies, banking institutions and the like, which unload their messages on a gullible public – those of whom P.T. Barnum was thinking when he said, “There’s a sucker born every minute.” We learn to become skeptics; if we don’t, we become victims.

But there is a line, once crossed, where lies and deceit damage civility and the trust we must have in one another and in the institutions that serve us. Today, such deceit infests our culture. These are lies designed to promote the teller. They are deliberate misstatements: cheating in schools and in sports, lying to advance a career, politicians promising that which cannot be delivered and bald-faced lies to those whom we love and who love us. We even lie in accusing others of lying.

In 1940, when college students were asked whether they had cheated in high school, 20% admitted they had. In a recent poll, the response was 75%. Tellingly, fewer college professors today consider student cheating a problem (35%) than does the public as a whole (41%). While both numbers are low, cynicism seems to come naturally to many who teach in universities.

The hullabaloo surrounding Tom Brady speaks to this cultural decline. He is perhaps the best quarterback in the NFL, but it is beyond credibility to believe Mr. Brady did not know the football he was tossing was deflated. This is a man who has played more than two hundred games over thirteen years as a pro. He was first-string quarterback at the University of Michigan for two years, and before that played high school football in San Mateo, California. Of course he knew the football was deflated. But we live in a culture that puts winning above integrity. In doing so, we set terrible examples for children who look up to star athletes. Why did Lance Armstrong feel the need to take steroids and then lie about having done so? He was already the world’s best cyclist, and he’d had cancer! He was a certified hero. Did neither he nor Brady consider the effect their lying had on millions of youngsters who idolized them?

We expect politicians to lie. As H.L. Mencken once said, “Looking for an honest politician is like looking for an ethical burglar.” When we see politicians bobbing and weaving, as Hillary Clinton is now doing in search of her progressive self, or Jeb Bush in regard to Iraq, we are reminded of Groucho Marx’s line: “Those are my principles. If you don’t like them, I have others.”

It is amazing that there is no sense of remorse from public figures who deceive. Social media has meant there is no place to hide from one’s past. We are left with absurd excuses: “everyone does it,” or “get over it; it’s time to move on” – excuses apparently acceptable to mainstream media – consider the lies of Dan Rather, Brian Williams and George Stephanopoulos.

There are those in public life who deliberately lie in order to get legislation passed that they believe will provide a public good. I don’t believe Mr. Obama lied with malicious intent when he told the American people they could keep their doctor if they chose. He did so because he believed that passage of universal healthcare coverage was worth a few lies. But, in doing so, he prevented an open and honest debate as to how best to achieve that goal. It could be – though it seems far-fetched – that Susan Rice and Hillary Clinton actually felt the video was responsible for the attack on the Consulate in Benghazi and the death of Ambassador Christopher Stevens and three others. But the information that has since emerged, and the timing (less than two months before a Presidential election), suggest that the statements were made knowing they were false. Their purpose was self-serving. Such deception diminishes us as a people.

When those on the left bellow out, “Bush lied,” it is the bellowers who are lying. Rational people may disagree with Mr. Bush’s decision that took our Country to war in Iraq in 2003, but to accuse him of lying serves only to lessen the credibility of those who disagree with the decision. All the facts we now know support the argument that he, along with the intelligence community and most of Congress, believed that Saddam Hussein had biological and chemical weapons of mass destruction and that he was working to get nuclear capability. The lies of the opposition, in this instance, have become so ubiquitous that it is no longer possible to have a reasonable dialogue as to the causes of the war. Instead we get “gotcha” moments that provide no illumination.

One of the worst deceptions of the 20th Century, and one still perpetuated, has been the relatively benign depiction of Communism, especially when compared to Nazism. The Nazis were ruthless killers and rightly deserved our condemnation. In a thirteen-year period they murdered between fifteen and twenty million people, including six million Jews. However, Communists were (and are) just as brutal. A study by University of Pennsylvania History Professor Alan Charles Kors (“The Age of Communism Lives”) notes that the Soviets and Chinese killed a hundred and thirty million people, something to ponder as we open our arms to the Castro brothers.


As Mark Twain warned in the quote at the top of this piece, “a lie well told is immortal.” For the sake of our future and the preservation of our liberty and culture, we must reverse this acceptance of deceit. It is the truth, which is frangible, that sets us free.


Thursday, May 14, 2015

"Barack Obama's Next Job?"

                     Sydney M. Williams

Thought of the Day
“Barack Obama’s Next Job?”
May 14, 2015

Mr. Obama will be a relatively young man when he retires from the most powerful position on earth – the Presidency of the United States. He will be 55, just a year older than Bill Clinton was when he left office, and seven years younger than was George W. Bush. What will he do for an encore? Will he go back to Hawaii and paint, like Mr. Bush? Will he use his years in public service as a means to accumulate personal wealth, as Bill Clinton has done? Or will he use the Presidency of the U.S. as a stepping stone to become the leader of the world – free and not free?

Descending from the imperial throne of the American Presidency cannot be easy. Still, there have been Presidents like Harry Truman, Gerald Ford and George W. Bush who, like Cincinnatus, exchanged the robes of Commander in Chief, if not for the plowshare, at least for a life away from the media and the siren call of fame. But humility is not in Mr. Obama’s DNA.

Modesty has never characterized him or his ex cathedra habits. Who can forget the Roman columns in Denver when, in 2008, he accepted his Party’s nomination for President? He is, as he has told us, “a better speech writer than my speech writers” and “a better political director than my political director.” He has, as he has repeatedly told us, “a healthy ego.” He has taken narcissism to a new level – a difficult accomplishment in a world that contains William Jefferson Clinton. With incongruous self-contradiction, Mr. Obama, like many anti-Imperialists, believes in rule by elites – those who, because of their talents and moral certitude, can best decide for the rest of us.

The job of Secretary-General (SG) of the United Nations beckons. Were it not for the fact that Barack Obama comes from one of the five permanent members of the Security Council, the job would be a shoo-in. Historically (to the extent that seventy years of existence can be said to have a history!), the job of SG has gone to someone from a mid-sized country, never to an individual from one of the five permanent members of the Security Council and never to a person who had trod the world stage as a colossus. Nevertheless, an exception might be made in Mr. Obama’s case. This, I hasten to add, is not an original thought. Others have concluded that Mr. Obama may throw his proverbial hat into this global ring, most notably James Lewis, who writes for American Thinker.

An interest in the job of Secretary-General would explain some of the inexplicable things Mr. Obama has done as President – such as not calling Islamic terrorists, Islamic terrorists; ignoring Congress; not confronting Russia regarding Syria or Ukraine; removing, at the behest of Russia, missile defense systems from Eastern Europe; leaving China to flex its muscles in the South China Sea; making nice with Iran; alienating Israel, and befriending the Castro brothers. It would explain his “apology” tour, his grasping the hand of Hugo Chávez, and his bowing to Middle Eastern dictators. The job would feed his preference for a greener world, where climate change would take precedence over dealing with the survivability of Social Security and whether racial tensions in inner cities will worsen. He would no longer have to worry about a recalcitrant Congress or an interfering Judiciary. Since no prior Secretary-General has tested the limits of the job, he would be unfettered by rules or tradition.

Since its inception, there have been eight Secretaries-General of the UN. All represented mid-size countries, from Trygve Lie of Norway – the UN’s first Secretary-General – to Ban Ki-moon of South Korea, the man currently in the job. In terms of continents, besides Europe and Asia, South America has been represented (Javier Pérez de Cuéllar of Peru) and so has Africa (Kofi Annan of Ghana and Boutros Boutros-Ghali of Egypt). While there are no term limits, no individual has served more than two five-year terms.

The Secretary-General is chosen by the 193 member states. The Security Council, made up of the five permanent members and ten rotating members, is the governing body. Any of the five permanent members of the Security Council (China, France, Russia, the U.K. and the U.S.) can veto the selection made by the full body. With 57 member states (almost a third of the body) being Muslim nations, their support is critical to an election. The fame, charisma and prestige of a man like Barack Obama, who had a Muslim father and spent four years in a third-world nation, might transform what has generally been a delightful sinecure into a meaningful job. Having Mr. Obama in the role of SG would mark a radical shift from the past, as no one of his renown has served in the position before.

The United Nations is limited in what it can and cannot do. It has no ability to tax, relying instead on assessments and voluntary contributions from member nations; its legal authority is subordinate to that of its member states; the International Court of Justice can decide disputes, but its decisions depend on the voluntary participation of member states; and it does not command its own army. While the UN Charter states that the SG shall be the chief administrative officer, it does not dictate any specific duties.

It would be unusual for the United Nations, with both China and Russia capable of wielding a veto, to have a former American President elected its head. But Barack Obama is not your typical American, and certainly not a typical President. His two nearest predecessors, while coming from opposite sides of the political aisle, were quintessential Americans. He is more cosmopolitan – more a man of the world than inherently American. He was born in Hawaii in 1961, 2,400 miles from the California coast. His father left his mother shortly after his birth. At the age of four he moved in with his maternal grandparents. He then spent four years (1967-1971) living in Indonesia, where his mother had moved two years earlier with her second husband. At age ten he returned to Hawaii to attend Punahou School – the largest independent school in the United States – from which he graduated in 1979.


Mr. Obama is young, charismatic, articulate and ambitious. He appears to have little interest in knowing how things work, or in managing a cumbersome bureaucracy. He prefers speech-making and extolling a vision that incorporates an anti-imperialistic perception of the world. The United Nations, by itself, is powerless. But the job comes with a pulpit that looks out over a large congregation. If one believes, as I do, that it is the person, not the position, that lends power to an institution, then the possibilities for Mr. Obama are open-ended – that is, if the world lets him. Ban Ki-moon’s second term ends on December 31, 2016, about three weeks before Mr. Obama leaves office – providing a convenient moment for Barack Obama. The power and the influence he could amass would be unlike anything the world has ever seen.


Monday, May 11, 2015

"When Speech Is Not Free"

                      Sydney M. Williams

Thought of the Day
“When Speech Is Not Free”
May 11, 2015

Free speech is fundamental to ensuring that any country remains free. Trifling with it should not be taken lightly. Three recent events in the U.S. remind us of its value. One was the Prophet Muhammad Art Exhibition and Contest in Garland, Texas. That incident sparked a debate over “free” speech versus “hate” speech. Another was the PEN (poets, essayists and novelists) award to Charlie Hebdo, which was boycotted by some prominent writers who claimed the magazine is “racist.” The third, and scariest, was the assertion by Hillary Clinton and others that the Constitution may have to be amended, so that Congress in its wisdom can determine what is appropriate and what is not in regard to political speech during Presidential campaigns.

The example that is always used to define the limits of free speech is the crying of “Fire!” in a crowded theater when there is no fire. It is malicious and is intended to scare and harm those who are there. But words that are distasteful to some, or even to most, are protected. When Chris Ofili displayed his elephant dung-covered Madonna at the Brooklyn Museum in 1999, it was described by then-Mayor Giuliani as “sick,” an assessment with which I agreed. But when the mayor tried to have the City of New York withhold a $7 million grant, the museum sued on the grounds that his action was an infringement of its First Amendment rights. The museum, rightly, won.

The exhibition in Texas was in poor taste. The New York Times alleged in an editorial that it was “an exercise in bigotry and hatred,” and thus should be banned. In my opinion, it qualified as protected speech, despite the anguish it may have caused millions of Muslims. The Times did not seem overly concerned about the effect a dung-covered Madonna would have on millions of Christians, nor did they see anything hypocritical in using the words “hatred,” “bigotry” and “blatantly Islamophobic” to describe Pamela Geller, the woman who put the exhibit together. While the exhibit was in bad taste, probably reflected bigotry and was not something I would have attended, it certainly should be considered free speech. It surely did not warrant the attempt by Islamists to kill exhibitors and attendees.

The Charlie Hebdo situation is a reminder that freedom comes with a price. There are those who, in the name of political correctness (or fear), would take it away. In terms of speech, it is not prejudice on the part of the few that should concern us; it is when society willingly accepts limits to expression. We saw that happen in colleges and universities when Ayaan Hirsi Ali and Condoleezza Rice were denied opportunities to speak last year. As a conservative, I welcome a diversity of ideas. I only wish my friends on the Left felt the same way. The decision by those like Peter Carey, Francine Prose, Joyce Carol Oates and Michael Ondaatje to boycott the ceremony at the PEN awards was reminiscent, as Amanda Foreman reminded us in Thursday’s Wall Street Journal, of the Congress of Dubrovnik in 1933 when a small group of authors refused to take a stand against book-burning Nazis. 


When Hillary Clinton (who says she will need $2.5 billion for her Presidential run!) asserts there is too much money in politics, she is, from my perspective, preaching to the choir. But her answer – that the Constitution may have to be amended, so that Congress can determine what speech is appropriate and what is not – is, she claims, merely an attempt to curtail campaign spending. In truth, her proposal is a step down a steep, dangerous and slippery slope.

The scapegoats that prompted Hillary’s illiberal recommendation were the Citizens United decision and the Koch Brothers, who have become the alleged evil stepmothers to the Left’s self-anointed Cinderella. Their names have become synonymous with money in politics, despite the fact that the recipients of the largest amounts of money, in recent years, have been Democrats. Money flows where it can get the best return; thus money from public sector unions – the largest source of money for either Party – consistently ends up in the laps of Democrats. Warren Buffett argues against the Keystone XL Pipeline, not because he has environmental concerns, but because its construction would hurt Burlington Northern. Wall Street is apolitical. They give to whoever they feel is likely to win. There are others, like George Soros and Tom Steyer, who for policy reasons give millions to Democrats. The Koch Brothers, similarly, give to Republicans. They give because the policies and values of the recipients accord with their own beliefs. That is, and always has been, the American way.

It is natural for people in politics to desire power. It was that understanding of human nature that caused the founding fathers to include checks and balances on government. The government we have today is far different from that envisioned two hundred and twenty-five years ago. Because we live in a different era, changes are to be expected. But the power and reach of government today should concern us. In 2013, the Code of Federal Regulations numbered over 175,000 pages. More than half of all Americans are, in some way, dependent on government for at least part of their livelihood. Seventy percent of the federal budget involves payments to individuals, versus fifteen percent in 1950. We have, in short, become dependent on the beneficence of government. Increased dependency and less self-reliance do not bode well for a society that wants to remain free. What tyrants fear are ideas contrary to theirs. Anything government does to diminish the ease and frequency with which ideas flow should make a freedom-loving people fearful. Snuffing out the candle that lights the darkness is not the way to a freer and fairer society.

Congress should require full disclosure of every person and organization contributing to every campaign, directly or through a PAC, along with the amount given. Anybody who makes a contribution that lends support, either urging the adoption of specific policies or helping a candidate, should do so knowing that their name and affiliation will be in the public domain. Disclosure may or may not inhibit contributions, but transparency should be welcomed by all who live and participate in a free society. “Open the books,” as OpenTheBooks.com would say!


It was limiting free speech that first characterized Nazi Germany and the Communist Soviet Union. It is true in all tyrannically run countries, like Cuba, Venezuela, North Korea, Iran, Saudi Arabia, Syria, Somalia and China. When government leaders advocate limiting speech, the consequence is tyranny.


Thursday, May 7, 2015

"Lessons from Baltimore" - Sydney M. Williams

                      Sydney M. Williams

Thought of the Day
“Lessons from Baltimore”
May 7, 2015

The most visible teaching moment from Baltimore was the unrehearsed scene of a mother chasing after her son whom she had seen on television throwing rocks at police. It was important because it manifested the hurt and determination of a mother for a son whom she loved and who was at risk of destroying his life. She was not angry at the Baltimore police. She did not look upon herself as a victim. She understood right from wrong: that no matter the provocation, it was wrong for her son to cover his face and throw rocks at the cops.

The immediate source of the riots, as we all know, was the death of Freddie Gray while in police custody. But the violence that followed had little to do with reasons suggested by the media and those like Al Sharpton: Black youth alienation, police violence toward African-American teens, poverty, White racism and economic inequality. Those are consequences, real and/or perceived, not the root causes that divide a nation by race, wealth and social status.

The genesis of the problem that led to recent racial riots is, in my opinion, obvious, simple and fundamental; yet it remains unaddressed. It is as though a politically correct society has deliberately conspired to ensure the continuation of an inner-city underclass. There are four principal causes: dysfunctional families, an education system that has failed inner-city youths, municipal tax and regulatory policies that discourage private investment in inner cities and thus the creation of jobs, and the politics of division, which compartmentalizes constituents into easy-to-reach groups. The unintended consequence of the latter is to keep us segregated.

Out-of-wedlock births have soared in the past few decades, especially among Blacks and particularly among those with a high school degree or less. Fifty years ago, Daniel Patrick Moynihan, then a labor department official, released an alarming report, which noted that 25% of Black children were born to unwed mothers. Today, that number is over 72%, and shows no sign of diminishing. Innumerable studies have shown a link between children raised in single-parent households and poverty. Without assigning blame for the reasons, we can all agree that it is a cultural issue – that parenthood takes commitment and personal responsibility, traits key to individual success. Nuclear families should be encouraged, not maligned.

It is not just in Baltimore that our public education system has been failing; it is in most urban areas. A report commissioned by America’s Promise Alliance, entitled “Closing the Graduation Gap,” showed that the graduation rate in the country’s fifty largest cities was 53%, compared to 71% in the suburbs. In that report, Baltimore had the second largest gap – 41% in the city versus 81% in its suburbs. More recent studies have shown that the graduation rate in Baltimore has risen to 56% – still dismal. Apart from a loving family, there is nothing more important than education in helping our youth become productive members of their communities. Too many public schools fail in this regard. They fight competition from non-unionized charter schools and voucher programs. Even President Obama, who should know better, failed to support the voucher program in Washington, D.C. that had given hope and opportunity to thousands of poor and minority students in that city. As unions seek to increase memberships, the ranks of administrators and non-teaching staff have exploded, raising costs, but not helping students. And too many who do graduate do so illiterate and innumerate. A good education is requisite for a good job. It is not money that is needed in public schools; it is a total cultural transformation.

Maryland, with its proximity to Washington and its federal bureaucracies, is the richest state in the nation, with a median household income of $73,500. In contrast, Baltimore, which is 63% Black, has a median household income 44% lower – $41,400. Black youth unemployment in Baltimore (ages 20-24) is 37% versus 10% for Whites. In 1960, Baltimore was wealthy and was the 6th largest city in the U.S., with a population of 939,000. With a population of 622,000, it now ranks 26th and is poor. Like its sister cities, Detroit and Newark, Baltimore is a one-Party city that never recovered from the well-intentioned but ill-fated effects of the Great Society, or from the riots of the 1960s. As they did last month in Baltimore, rioters destroy places that employ their neighbors. More than a third of businesses in Baltimore are Black-owned. The ownership of private property is fundamental to our system. When it is perceived that property will not be protected, owners tend to move out. The concept of “broken windows” policing, which holds that if a neighborhood is maintained, crime rates decline, is under pressure in our new anti-police environment. Jobs are dependent on a government that uses its taxing authority to encourage businesses and a police force that affords protection. Without police protection, there are no businesses. Without businesses, there are no jobs. Without jobs, there is no hope.

The fourth cause has been the insidious political practice of dividing the electorate. It is especially common among those on the Left who would rather appeal to emotions and special interests than deal with ideas. It is done so that politicians can more easily address the peculiar needs of specific constituencies; the consequence has been to exacerbate natural differences – Black from White, rich from poor, women from men, young from old, traditional values from modern mores and liberal from conservative. Government’s slicing and dicing has created “pluribus” out of “unum.”

Political correctness and the misguided, sanctimonious nature of liberal elitists are the nemeses of racial and political harmony. Dependency has replaced personal responsibility. In a desire to be inoffensive, we have given up having school children salute the flag and repeat the Pledge of Allegiance. The Lord’s Prayer, Christmas and Easter celebrations are banned for fear of offending atheists or those of other religions. Through social and popular media, we celebrate those who live lives of purposeless immorality. Morals sink to the lowest common denominator. We express compassion for transgenders, and no matter which way the Court decides, we are generally tolerant and welcoming of gays who want to marry. While we don’t overtly condemn heterosexual marriage, it takes a back seat in the pantheon of our “inclusive” culture.

Raising a child requires loving and caring parents. Obviously, that cannot always be the case, but we should acknowledge its importance and it should become a goal toward which we strive. It also takes an education system where the focus is the student. It takes a city that is willing to lure businesses back, for the jobs they create and the dynamism they bring. But we must keep in mind that people are not the same. We are individuals with differing capabilities and aspirations. Outcomes will never be equal. But the opportunity to succeed should be given equally to all children. The current system has failed too many, especially in inner cities like Baltimore where despair has replaced hope. Love your child, strengthen traditional families, and let government open the doors to competition in public schools. Those are the lessons from Baltimore.




Monday, May 4, 2015

The Month That Was - April 2015

                 Sydney M. Williams
                                                                                                                             May 4, 2015
                                                                                                             
The Month That Was
April 2015
(A month of Remembrance)

“The first of April is the day we remember
what we are the other 364 days of the year.”
                                                                                                                Mark Twain (1835-1910)

As will be true for the next eighteen months, Presidential campaigns dominated the news. In the weekend edition of the Wall Street Journal a week ago, political commentator Michael Barone noted that we had better get used to long election cycles: “We ain’t going back.” That won’t change unless both Parties adopt the coronation method used by Democrats this season. Mainstream media will look into every dark recess – going back to pre-natal days – of every Republican candidate’s past. Whatever dirt they discover (and even some that will have been manufactured) will be prominently displayed. Fox News, the Wall Street Journal, talk radio and others will return the favor by revealing secrets of Democrat candidates. Vice sells better than virtue. We will learn more of indiscretions than accomplishments.

This April was no different than most months, in that it was chock full of news, some of it even important: The earthquake in Nepal, and the more than 7,000 who have died. The fact that Yemen, Syria and Libya are becoming failed states. Iran continued to taunt the U.S., despite desperate attempts by the Obama Administration to complete a nuclear agreement. The Taliban gained more ground in Afghanistan. The race riots in Baltimore highlighted Black alienation and concerns about police. The problems in inner cities, however, are more grounded in dysfunctional families and an education system that has failed their youth. Hillary Clinton announced that she had deliberately destroyed the server on which she had deleted more than 30,000 e-mails. Oral arguments were heard by the Supreme Court regarding gay marriage. The NASDAQ, after a lapse of fifteen years, reached new highs. These stories and others represented important news, yet the most widely watched television program in the U.S. was Diane Sawyer’s two-hour interview of Bruce Jenner. Transgenders are people too, and we all wonder why a man would prefer to become a woman after 65 years of urinating from an erect stance. But this focus on trivialities and personal quirks, at a time when the Middle East is imploding, East Asia is at risk of erupting and our schools are failing our inner-city youth, seems misplaced. It says a lot about the way we live and our sense of priorities. Restoring morality in a pluralistic society will be a Herculean task.

April was a month of remembrances. In April 1789, George Washington took the oath of office as the first President of the United States. One hundred and fifty years ago, Robert E. Lee surrendered to Ulysses Grant at Appomattox. General Grant, sometimes known as the “Butcher,” was magnanimous in victory, allowing Southern soldiers to keep their swords, weapons, and horses. Five days later, on April 14, Lincoln was shot. One hundred years ago marked the start of the Armenian Genocide. (During the month, Pope Francis became one of the few world leaders to refer to those killings as genocide.) It was also in April 1915 that French, British and Canadian forces began what would become known as the 2nd Battle of Ypres. By its end, there would be 120,000 casualties, many of whom died of chlorine gas. It was also this battle that prompted Lt. Col. John Alexander McCrae, a Canadian, to write “In Flanders Fields,” a poem still read every Remembrance Day.

“…if you break faith with us who die
We shall not sleep, though poppies grow
In Flanders fields.”

The amphibious assault on the Ottoman Turks at Gallipoli began on April 25, 1915. The campaign lasted eight months. By its end, when the British finally evacuated the peninsula, there were 350,000 casualties, including 110,000 dead. Particularly hard hit were forces from Australia and New Zealand. Blame for the debacle fell on Winston Churchill, then First Lord of the Admiralty. The 1981 Australian movie “Gallipoli” starring Mel Gibson etched that tragic and failed assault into the minds of millions.

Seventy years ago, on April 30, Hitler (finally) committed suicide. The full horror of what had happened to Europe’s Jews was revealed in the liberation of Bergen-Belsen and Buchenwald in April 1945. In the Pacific, the 82-day battle for Okinawa began on the first of April. By its end, 12,500 Americans were dead, along with an estimated 110,000 Japanese. It was on April 30, 1975 that the United States pulled out of Vietnam, leaving Saigon to the Viet Cong Communists. In the aftermath of our ignoble retreat, thousands died, including those who had sided with Saigon’s government. Our hasty (and frankly craven) evacuation added new terms to our lexicon: “boat people,” “re-education camps,” and “killing fields.” In Cambodia, more than a third of a population of 5.7 million were killed by the Communist Pol Pot and his black-clad soldiers of the Khmer Rouge. We may debate whether we should enter wars, but when we leave precipitously, disaster inevitably follows. Consider the experience of Germans, Japanese and South Koreans, in whose countries we still have troops today. Compare them to the people of Vietnam and Iraq, whom we had pledged to help but whom we abandoned to an unknown but almost certain horrific fate.

Internationally, besides the terrible news still drifting out of Nepal, Islamic terrorism continued its rampage. In northeastern Kenya, 147 Christian students at Garissa University College were killed by the Somalia-based Islamic militant group, al-Shabaab. ISIS, in a repeat of what they had done earlier to Egyptians, beheaded dozens of Ethiopian Christians working in Libya. More than a thousand Libyans, fleeing a failed state that the U.S. quit, drowned in the Mediterranean trying to reach Italy. On a lighter note, Mo Hai-long, an official with a Chinese agricultural company, was accused of stealing seeds from Monsanto and DuPont in Iowa, reminding me of Henry Wickham, who in 1876 smuggled 70,000 rubber tree seeds from the rainforests along the Amazon. In Wickham’s case, he got away with it – or, more precisely, the British government did, as they took those seeds and started rubber plantations in Malaysia.

The race riots in Baltimore, with attacks on police and the burning and looting of neighborhood businesses, brought back memories of Watts and Detroit in the 1960s. The impetus was the death (and now alleged murder) of young Freddie Gray while in police custody. The rioters, with some professional help, were encouraged when Baltimore Mayor Stephanie Rawlings-Blake inexplicably instructed police: “… make sure that the protestors [are] able to exercise their right to free speech…we also gave those who wished to destroy space to do that.” What was she thinking? With Rand Paul and Marco Rubio joining Ted Cruz in the race for the Republican nomination, we now have three junior Senators pitted against a covey of governors and ex-governors. Hillary Clinton announced her candidacy on social media and then “vanned” down to Iowa to meet with small, intimate groups of “real” people. Rahm Emanuel won re-election as mayor of Chicago. Loretta Lynch was confirmed as the first African-American woman to serve as Attorney General. General David Petraeus was fined $100,000 and given two years of probation for “pillow talk” with his girlfriend, Paula Broadwell. Peter Schweizer had his book Clinton Cash published, which, predictably, raised tempers in the Clinton camp.

For the first time in fifteen years, the NASDAQ finally crawled above its March 9, 2000 close, though it finished the month below that record high. The biggest difference between now and then is that today the multiple on the index is about one fifth what it was in 2000. Switzerland became the first government in history to sell benchmark 10-year debt at a negative interest rate. Lucky buyers will have to pay 0.055% for the honor of owning these bonds. Royal Dutch Shell is buying BG Group for $72 billion, excluding debt – the biggest energy deal in a decade. The Euro-Stock Index, up almost 20% year-to-date, had a flat month. But in Asia the Shanghai Index had a strong month – up 18%. U.S. stocks gained nominally. While the U.S. had asked the U.K., Germany and France not to join – at least immediately – China’s Asia Infrastructure Investment Bank, President Obama denied that the U.S. had ever opposed the bank. He said he just wanted to make sure it is run “based on best principles.” Preliminary first-quarter U.S. GDP numbers were reported below expectations at 0.2%. This followed March jobs numbers of 126,000, the lowest in two years. The labor force participation rate, at 62.7%, remains the lowest since the 1970s. GE went back to its industrial roots by spinning off GE Capital and its real estate holdings. Comcast called off its proposed $45 billion merger with Time Warner.

Baseball season opened in the Bronx, with the Yankees losing 6-1 to the Blue Jays. Duke and Wisconsin made it to the NCAA men’s final, which Duke won 68-63. In the women’s NCAA, Connecticut defeated Notre Dame 63-53, giving Coach Geno Auriemma his 10th national title. Jordan Spieth, at age 21, became the second youngest player (behind Tiger Woods) to win the Masters in Augusta. On the last Wednesday in April, the Baltimore Orioles played the Chicago White Sox before an eerily empty stadium, the first time that has happened in Major League Baseball’s history. Riots two evenings before had prompted security concerns. Baltimore won 8-2.

Günter Grass, winner of the Nobel Prize in 1999, author of The Tin Drum and once considered the moral conscience of a Germany confronting its Nazi past, died at age 87. However, his reputation had been tarnished. Nine years ago, as he was preparing his memoir Peeling the Onion, he admitted to having been a member of the Waffen-SS during World War II. Victor Gotbaum, who in 1975 played a key role in helping to avert a possible municipal bankruptcy in New York, died at age 94. He was a longtime leader of New York’s largest municipal-workers union. And Gary Dahl, inventor of the “pet rock” in 1975, died at age 78. He was a validation of P.T. Barnum’s claim that there is a sucker born every minute, as Dahl made a fortune off a gullible public. While he did place his rocks on excelsior, package them in boxes with “air holes” and provide instructions for care, his cost of goods was nowhere near the sale price of $3.95. Later products, such as the Original Sand Breeding Kit, did not do as well. “Fool me once…!”


So endeth April. Let us hope that last month’s showers in the East extend to the drought-stricken West, and also bring this month’s flowers!
