Wednesday, November 27, 2013

"Whistleblowers and Extortion"

Sydney M. Williams

                                                                 Thought of the Day
                                                       “Whistleblowers and Extortion”
November 27, 2013

There is nothing wrong with “blowing the whistle” on someone doing something corrupt or unethical. In fact, it is our moral obligation to be the “rat” in such instances. For example, whistleblowers regarding the cover-up of Benghazi and the IRS’s harassment of conservatives would have served justice. However, when rewards are calculated in millions of dollars, the incentive to “rat” either prematurely or fraudulently may be a temptation too rich to refuse. The “rats” may see things that do not exist, or fabricate stories about an event or transaction that occurred only in the vividness of their imaginations. The motivation to be a good citizen metastasizes into an opportunity to exploit – to be an extortionist, either for money or fame.

A case in point was the incident at Rutgers last year. Eric Murdock, director of basketball player development at Rutgers, was fired at the end of June 2012. Toward the end of the year, he showed Tim Pernetti, the Athletic Director, tapes of head basketball coach Mike Rice haranguing and intimidating his players. Rice was suspended at the end of December. Two weeks later, Rutgers received a request from Murdock’s lawyer for $950,000. The University declined, so Murdock released the tapes to the media. By April, Rice had been fired and Pernetti had resigned. In the midst of this chaos, Murdock filed a wrongful termination lawsuit against Rutgers, a claim the University denies.

The bottom line, from my perspective, is that bringing Rice’s antics to the attention of University officials was the morally correct thing to do, though his abusiveness toward his players was so widely known that the University should have acted on its own. Nevertheless, for doing so Murdock deserved praise. But in demanding payment, Murdock stepped over the line from doing a good deed to being an extortionist.

Curtis C. Verschoor, research scholar at the Center for Business Ethics at Bentley University and writer of an ethics column for “Strategic Finance,” recently wrote: “It’s well understood that a whistleblower is the most important source of evidence in detecting fraud and other misdeeds.” While I am skeptical, the government seems to agree. The federal government collected $13.2 billion in such recoveries during the first four years of the Obama Administration. In fiscal 2012, it collected $5 billion under the False Claims Act, up from a record $3.2 billion in fiscal 2011. According to an article in the October 1, 2012 issue of the New York Times, the government has paid out $1.6 billion to whistleblowers since Mr. Obama became President, “with law firms taking a cut in some cases of up to 40% of the proceeds.”

Dodd-Frank expanded the government’s whistleblower program to include the SEC and the CFTC. Consequently, the former set aside $430 million for payouts to whistleblowers and the latter, $100 million. Anti-fraud is becoming a lottery – providing a chance for a few to get rich. Over the past four years more money has been collected by the federal government in financial penalties against drug companies than in the previous eighteen years. Whistleblowers have been credited with about three-quarters of the proceeds. Business has been so good that a Corporate Whistleblower Center (CWC) has been established. A press release from the CWC last week advertised: “There are potentially huge awards for whistleblowers who have proof that a pharmaceutical company is actively involved in kickback schemes to sell more drugs and medical devices.” It spoke of “massive” reward settlements for whistleblowers. Even the most upright must be tempted.

While there is an obligation to alert authorities when corruption or criminal activity is observed, there are moral hazards when rewards are involved, and perhaps even when there are none. One of life’s first lessons is not to become a “tattletale.” For years, snitches, stoolies, canaries and squealers were portrayed as unattractive. Even the words sound creepy. Perceptions, however, are in the eyes of the beholder. A changing culture is becoming friendlier to whistleblowers. Americans are notorious for favoring the “underdog.” We make heroes of such people. John Grisham made millions on his novel, The Firm. Edward Snowden seems a traitor to me, but is a hero to many. There are even those who sympathize with Bradley Manning. Hollywood is cashing in on both stories and will probably be sympathetic to both men, as it was sixty years ago (and deservedly so) with Marlon Brando in “On the Waterfront.” Julia Roberts played the part of a working-class, single Mom in “Erin Brockovich,” a story in which she blows the whistle on a company contaminating the water supply of her community – a great movie, but a blatantly political, anti-business story.

There is an acknowledged mercenary side to whistleblowing. Besides dozens of movies, books have been written addressing the market, with titles like The Whistleblower’s Handbook and The Whistleblower’s Survival Guide. As rewards have risen, whistleblowers have emerged from the woodwork, a fact that has not gone unnoticed by the legal profession. A Google search for “whistleblower law firms” a few days ago returned 620,000 results in 0.36 seconds. Litigators are among the profession’s highest-paid members. The median lawyer in the United States makes about $110,000 annually, while trial lawyers can make between $11 million and $40 million a year, according to Sally Kane of the Hearst Corporation.

Unsurprisingly, trial lawyers want to keep this gravy train rolling. They are among the largest donors to the Democrat Party, arguing that they help the helpless in hopeless cases against evil corporations. Litigators find kindred souls in federal prosecutors who feed off the same temptations. When penalties are paid, the money comes from millions of shareholders or, in the case of government agencies, from taxpayers. Lawyers make a killing. The actual perpetrators, whether it is Jamie Dimon, Franklin Raines or Barney Frank, get to live another day.

There is no question that boardrooms and government agencies house few angels. In a nation of more than 300 million, there will always be corruption. But I worry less about corrupt business practices, which are usually outed by whistleblowers, competitors and the media, than I do about cover-ups in government, where the mainstream media show little interest and where employees, governed either by undue loyalty or fear, stand mute. And I rue a culture that celebrates extortionists who pretend to do good, but who, in truth, are more interested in feathering their own nests.

What does it all mean? Have we become so materialistic that we see nothing untoward in ex-Presidents using their former office to make fortunes giving speeches? The brave person who dares challenge the establishment in an attempt to right a wrong has been sullied by those whose desire for fame and money is paramount. It is the chance to win the lottery that has become the motivating factor in blowing the whistle. Perhaps I overstate the case, but I worry about the culture of a society that breeds such attitudes.


Monday, November 25, 2013

"Senate Goes Nuclear"

Sydney M. Williams

                                                                  Thought of the Day
                                                               “Senate Goes Nuclear”
November 25, 2013

After 226 years that included some threats but no violations, Nevada Senator Harry Reid detonated the “nuclear option” – doing away with the filibuster rule designed to protect minority interests and to guard against what the Founders feared could become a tyranny of the majority. Mr. Reid explained that the elimination of the filibuster would apply to executive and federal judicial appointees, but not to Supreme Court nominees. The only example of bipartisanship was the three Democrats who joined the forty-five Republicans in voting no.

Ironically, but not surprisingly, when Republicans threatened to detonate the same “nuclear option” eight years ago, the loudest voices crying foul came from those who now say yes. Senators Harry Reid and Joe Biden, along with Barack Obama, then the junior Senator from Illinois, were adamant in their opposition. They saw such a move as an “un-Constitutional attempt” to wrest power away from the minority. Mr. Obama was especially outraged: “The American people don’t expect…for one Party – be it Democrat or Republican – to change the rules in the middle of the game.” The “Gang of 14” interceded and the option wasn’t detonated. This time, President Obama summed up his feelings in his schoolmarmish, disingenuous way: “Enough is enough.” He added: “The American people’s business is far too important to keep falling prey day after day to Washington politics.” He avoided the inconvenient fact that Reid’s decision was based solely on politics (as were his own comments) – diminishing the role of the Senate, expanding the role of the President and, at least temporarily, the power of the Democrat Party.

Living in America, we have been protected, thanks to our Constitution, against significant internal seizures of power and diminishments of liberty. These are rights we take for granted. In our trusting acceptance of a government that is “…a force for good,” as Senator Chuck Schumer recently described it, we set aside skepticism about the behavior of politicians in positions of power. We have, however, grown more ignorant of history and of the Constitution, leaving us vulnerable to charismatic leaders seeking power.

The desire for power is universal and omnipresent. Efficiency does not define our government. When government assumes responsibilities in the name of fairness, or when security has expanded in the interest of safety, individual dependency has increased and rights have suffered. Since the days of the New Deal, dependency on government has gradually grown. During the Civil War, Lincoln suspended the writ of habeas corpus. During World War I, individuals were jailed for speaking out against President Wilson. Franklin Roosevelt made an aborted attempt to “pack” the Supreme Court in 1937, and he did send Americans of Japanese descent to internment camps following the attack on Pearl Harbor. The Patriot Act traded individual rights for national security. The Boston police virtually shut down the city and surrounding towns in the hunt for the Tsarnaev brothers last April. One might argue that in those cases such restrictions were warranted by events at the time. Nevertheless, it behooves us to be vigilant and questioning, not blindly accepting of such limitations on our personal liberty. “If angels were to govern men, neither external nor internal controls on government would be necessary,” wrote James Madison on February 6, 1788 in Federalist No. 51. In framing a government in which men rule over men, “you must first enable the government to control the governed; and in the next place oblige it to control itself.”

Harry Reid’s decision showed no wisdom and no forethought, and it was disrespectful to the institution he leads. It was purely political. He surrendered the deliberative nature of the Senate to ideology and expediency. In doing so, he emasculated the Senate and handed more power to the President.

The reason he did so, he claimed, was that he had become frustrated with Republican “obstructionism,” especially regarding the appointment of three judges to the U.S. Court of Appeals. Democrats claim that a three-seat vacancy needs to be filled. Republicans counter that six retired judges have been filling in, and that in any event the Court is under-worked. I don’t pretend to know which is accurate, but it raises the question of whether expediency should trump deliberation when it comes to the operations of the Senate. The cynic in me says the President and Senate and House Democrats have a strong incentive to divert attention from what continues to be a disastrous roll-out of ObamaCare.

One definition of leadership is the ability to get people of myriad persuasions to work together. In the Senate that means finding common ground, even among varying and warring factions. In that regard, Harry Reid has been a failure. He is far too partisan. U.S. Senators’ loyalty should be to the states they represent and to the American people, not to a Party and not to a President. That is why they serve six-year terms. As Americans, we all have common interests. It is a question of seeking them out. Insults and harangues may appeal to the media, but do little to help people of differing convictions find conciliation.

When frustrated by Republicans, or simply by the deliberate way in which bills and appointments make their way through the Senatorial process, Mr. Obama has taken to issuing executive orders and creating czars to further his agenda. The powers of the EPA and unions have been enhanced through orders that bypass the Congress. Mr. Reid, in his recent decision, has abetted that process by limiting the ability of the minority to filibuster appointments, whether to the courts, the NLRB, the Department of the Interior, the EPA, or the agency that oversees Fannie Mae. Too little regulation may be bad for the environment; too much is harmful to the economy. Defending unions against right-to-work states impedes employment growth. Immigration reform, when it is aimed at increasing Democrat voter registration, is not reform. Fannie Mae symbolized the cronyism that helped bring about the financial meltdown in 2008. Appointing Mel Watt, a leftist Democrat Congressman, to lead the Federal Housing Finance Agency, which oversees Fannie Mae, risks a repeat of everything that went wrong in 2008.

People should not lose sight of the fact that this Congress and this President have been historically divisive and partisan. Throughout our history, landmark legislation has been passed with votes from both Parties. Social Security, the Civil Rights Act, Medicare and Medicaid received bipartisan support, as did Medicare Part D. Debates were often heated, feelings ran high and votes were frequently close, but in all cases support came from Democrats and Republicans. Even Dodd-Frank had votes from three Republican Senators. But ObamaCare did not garner a single Republican vote. It was passed unilaterally, in the same way the Stimulus Bill was. And Democrats and much of the mainstream media wonder why its implementation has been so catastrophic? Blame does not lie on the shoulders of the opposition Party. When the fault is so fundamental and reconciliation is impossible, one should look to the leaders.

The wisdom of the Founders may seem archaic today. Modern times, so goes the argument, require modern responses. While that may sound appealing, one should be cautious, for the human traits with which the Founders were dealing are ageless. The desire for power, as I wrote earlier, is universal. It is also timeless and dangerous. Self-confidence is required for success, but hubris precedes failure. Surrendering rights for security may seem sensible when threats are imminent, but one must consider the consequences. Dependency precludes self-reliance. Having successfully fought a revolution, the Founders were not about to give in to despotism, no matter its origin. It did not matter whether tyranny emanated from the masses via unfettered democracy, or through an all-powerful executive. More than anything, the Constitution is about restricting power. The Founders ensured that the three branches of government would be checked and balanced. Reducing the role of one branch increases the power of the others.

In disallowing the minority party the right to filibuster Presidential appointments, Mr. Reid has opened Pandora’s Box. He has raised the possibility that all filibustering may become disallowed. Supermajority requirements for numerous Senatorial responsibilities were a means of protecting the minority against the tyranny of the majority. Doing away with such a requirement suggests that the Senate may well decide future matters with a simple majority, making it similar to the House. That was not the intent of the Founders, who certainly possessed more wisdom than Mr. Reid or Mr. Obama. In Federalist No. 10, James Madison warned against “the cabals of the few,” just as he did against “the confusion of the multitude.” Can you imagine any politician uttering such a warning today?

One cannot help but wonder if Mr. Reid fully thought out the consequences of his actions. Does he expect that Republicans will never again control the Presidency and the Senate? Or is he convinced that the Republican Party is in permanent retreat, thereby allowing him to act unilaterally with no unintended consequences? Or perhaps he just doesn’t care? Was Mr. Reid’s real intent to provide cover for Mr. Obama, who has sunk so low in the polls? Perhaps he concluded that expediency in governing is a virtue, and that patience in such matters is a vice? The questions remain unanswered.

However, when erecting gallows, one should take care to consider for whom they are being built. Keep in mind, Robespierre (no matter one’s feelings for the scoundrel) was guillotined in 1794 by the very instrument whose creation he had encouraged, and to which he had sent so many of his former friends and enemies. Be careful what you wish for!


Friday, November 22, 2013

"JFK - A Perspective"

Sydney M. Williams

                                                                Thought of the Day
                                                              “JFK – A Perspective”
November 22, 2013

The most commonly asked question today: Where were you on this date in 1963? Of course, a shrinking number of Americans can respond. The median-aged American today was not born until twelve years after President John F. Kennedy was shot. I was in a college classroom, daydreaming as usual, when I noticed the flag outside lowered to half-mast. A little more than a year earlier, I had been at Fort Dix during the Cuban missile crisis. We were ordered to stand in formation, duffle bags ready, until ordered to stand down a few hours later. The President’s handling of that crisis showed a toughness and maturity, in contrast to the disastrous Bay of Pigs operation, which occurred three months after he took office. Even I, at age 22, knew Kennedy was growing into his role as Commander in Chief.

Judging Mr. Kennedy as a President has always been difficult, as he was only in office for a thousand days. Any answer is speculative. While JFK’s time in office was short, six Presidents, including Gerald Ford, spent less time in office than he did, with much less written or said about them. He was a supply-sider, in that he proposed lowering the top tax rate from 91% to 65% (a cut enacted, at 70%, shortly after his death), and noted the increase in revenue such cuts brought to the Treasury. Like everyone at the time, he was a Cold Warrior. He combined youthful self-assurance with political pragmatism. In his inaugural, he said: “we shall pay any price, bear any burden…in order to assure the survival and the success of liberty.” Because of concerns regarding nuclear weapons, he pledged to renew efforts at peace, but added a caveat: “…only when our arms are sufficient beyond doubt can we be certain beyond doubt that they will never be employed.” In the same speech, he reminded Americans of the limits of government, that “…the rights of man come not from the generosity of the state, but from the hand of God.” And he uttered the phrase most remembered, but often misquoted: “My fellow citizens of the world: ask not what America will do for you, but what together we can do for the freedom of man.”

But it was his youth – “that the torch has been passed to a new generation of Americans” – for which he will always be remembered. At the age of 43, John F. Kennedy was the youngest man to be elected President. (The youngest to become President was Theodore Roosevelt, who was 42 when William McKinley was assassinated in 1901.) He was the first President born in the 20th Century. The previous five Presidents – Eisenhower, Truman, FDR, Hoover and Coolidge – had been born within eighteen years of one another, between 1872 and 1890. His immediate predecessor, General Eisenhower, had been Supreme Commander of the Allied Expeditionary Force during World War II. Kennedy had served in the War as commander of a PT boat – a young, heroic junior naval officer. He did indeed represent a new generation.

Kennedy was the eighth President to die in office and the fourth to be assassinated. Three Presidents had been assassinated in a 36-year period, between 1865 and 1901 – Lincoln, James Garfield and William McKinley. Since Kennedy’s death, Presidential security has become far more intense, with Presidents living virtually in bubbles, no longer exposed to the masses. The idea of a President strolling along Pennsylvania Avenue, or an ex-President driving himself and his wife on a motor vacation – as Truman did in June 1953 – seems as remote as the building of the Pyramids.

While Garfield and McKinley have disappeared into the mists of history, the memories of Lincoln and Kennedy have survived. Lincoln’s greatness is understandable because of the Civil War, the freeing of the slaves and his holding the Union together. Kennedy’s memory has been maintained in large part because of the myths that emerged in the months and years following his assassination in Dallas on this date fifty years ago. His Administration had included many of the “best and the brightest,” as David Halberstam immortalized them in his book of that title. Young academics from Harvard and other Ivy League colleges descended on Washington, joining other intellectuals – all young, athletic, wholesome and aspirant. Their ranks, according to Robert Dallek’s Camelot’s Court, included 16 Phi Beta Kappas and 4 Rhodes Scholars. The ironic twist to the title of Halberstam’s book is, of course, that it was those same bright young men who got the United States enmeshed in Vietnam.

Since the wartime death of his older brother, Jack Kennedy had been groomed to become President. He was the scion of a rich and powerful father. He grew up privileged in a large, wealthy, active and charismatic family. (Thirty years earlier, his father Joseph Kennedy had been a bootlegger and Wall Street speculator. The elder Mr. Kennedy had then been appointed the first chairman of the S.E.C. – a fox to watch over the henhouse was the explanation given by FDR. Later, he served as Ambassador to the Court of St. James’s, before leaving the post in 1940 for being too anti-British. By 1960 he was the patriarch of America’s version of a royal family.) In 1946, Jack Kennedy was elected to the House of Representatives and, six years later, to the U.S. Senate. In 1953, he married socialite Jacqueline Bouvier, a wedding that made the cover of Life Magazine. In defeating Richard Nixon in 1960, Kennedy became the first (and only) Catholic to be elected President of the United States.

The early 1960s marked a sharp change from the relative placidity and optimism of the middle and late 1950s – a welcome time after a decade of Depression and seven years of war, including Korea. By the end of the 1960s all innocence was gone – evaporated in the miasma of Vietnam and student unrest. Robert Kennedy and Martin Luther King lay dead, killed by assassins in 1968. Did JFK’s death provide a foreboding of what was to come? Would the war in Vietnam have expanded had Kennedy lived? In 1962, Kennedy increased the number of advisors in Vietnam from a few hundred to a few thousand. A U.S.-backed coup by South Vietnamese generals deposed and killed President Ngo Dinh Diem and his brother a few weeks before President Kennedy was assassinated. Would the SDS and other counter-cultural movements like the Black Panthers and the Symbionese Liberation Army have become as widespread? Would colleges from Columbia to Berkeley have had to suspend classes? Would Woodstock have happened, with all it represented in terms of drugs and sexual license? Would the police have had to storm protesters, as they did in Chicago in 1968 at the Democrat convention? We can only speculate, but I would suggest such movements and demonstrations are far greater than any one man – that the ingredients were already there. Their unruliness and disruptions became the misfortunes of future Presidents.

The Peace Corps, established in 1961, was an enormous success and remains a lasting legacy of JFK’s Presidency. However, the big social legislation of the 1960s – the Civil Rights Act, Medicare and Medicaid, and the war on poverty – became bills signed into law by Lyndon Johnson in 1964 and 1965. But, when the counter-cultural movement was at its most outrageous, a vestige of Kennedy’s optimistic idealism was realized when Neil Armstrong took man’s first steps on the moon, on July 20, 1969.

Despite being a country that revolted against the royal edicts of George III, many Americans are enamored of royalty. Frankly, it is a habit I don’t understand. Yet, it was in the mythical blanket of Camelot that Kennedy followers chose to wrap their fallen hero. It turns out that the first to use the term “Camelot” regarding JFK’s Administration was Jackie Kennedy. She contacted Theodore White, a writer with Life Magazine and the author of The Making of the President 1960, shortly after the assassination. Jackie wanted to be sure that the first piece written about her husband would be exactly what she wanted it to be. The musical “Camelot” had opened in 1960. According to Mrs. Kennedy, the song her husband most enjoyed contained the line, “Don’t let it be forgot, that once there was a spot, for one brief shining moment that was known as Camelot.” Whether or not that story is apocryphal, she succeeded in mythicizing his Presidency. This past week, JFK’s daughter Caroline was received in Japan, as our new ambassadress, like a princess. The more unattractive qualities of Kennedy’s character, such as his frequent trysts with women, were largely covered up. We were left with an image of sparkling, charismatic youth – a man and a Presidency that died with dreams unfulfilled and would remain forever young. One cannot help but recall lines from A.E. Housman’s “To an Athlete Dying Young.”

                                                       Smart lad, to slip betimes away
                                                  From fields where glory does not stay,
                                                     And early though the laurel grows
                                                       It withers quicker than the rose.

The future will look at Kennedy’s Presidency through analytical rather than rosy lenses. He did represent a break with the past, a break that was without pain at the time of his death, but a break that devolved into chaos a few years later. As a nation, we still represent the greatest hope for mankind, but we have yet to recover the optimism, character and moral sense necessary for a civilized society to perform well both at home and abroad. We had it for a few years under another Irish President, Ronald Reagan, but it proved ephemeral. In a world that favors moral relativism, we struggle to differentiate universal truths of right and wrong. Learning more about the world Kennedy inherited and the world he left behind should help achieve a better understanding.


Wednesday, November 20, 2013

"The American Experiment - A Lesson for Today"

Sydney M. Williams

                                                            Thought of the Day
                                   “The American Experiment – A Lesson for Today”
November 20, 2013

Life is an experiment. We begin as infants. Everything that comes later is untried, at least in our own experience; thus everything we face is new – every time we pick up the phone or cross a street. Almost exactly fifty years ago, my then fiancée Caroline (who had been under pressure from her family about marrying a New Hampshire rube) told me: alright, we’ll get married; we’ll give it a try. We were married in April 1964. Our marriage (like all marriages) remains a work-in-progress. But that is what gives it excitement. It is what has kept the experience fresh.

The United States had an opportunity, rare for a nation, to begin with a clean slate, or at least a relatively clean one, in 1789. It was geographically large, with a diverse population of nearly four million scattered over close to 900,000 square miles. It had the benefit of English common law and the wisdom of philosopher-moralists from Plato to Adam Smith. The Founders knew that the Declaration of Independence and the Constitution with its Bill of Rights, should they prove successful, would become examples for all mankind. They also realized that no two political systems are ever exactly the same, as cultural and moral issues are unique to a people and state. Our Experiment began with the election of George Washington in late 1788 and early 1789, though its origins went back almost two centuries, to the earliest settlers in Jamestown, Virginia and Plymouth, Massachusetts. The American character was forged in that wilderness.

The Founders recognized that what they produced in Philadelphia was an experiment, unlike anything before attempted. They also recognized its fragility. Benjamin Franklin, exiting what is now Independence Hall in September 1787, was asked by a passer-by what had been accomplished. Allegedly, his response was, “A republic, Madam, if you can keep it.” Seventy-six years later, at Gettysburg, Abraham Lincoln spoke of the country’s engagement in a great war, “testing” whether any nation “conceived in liberty and dedicated to the proposition that all men are created equal,” with a government “of the people, by the people, for the people,” could “long endure.” In defense of that experiment, the Civil War took the lives of three quarters of a million men. Their sacrifice has allowed the Union to endure these past 150 years.

The concept of a people’s government was both radical and conservative. The Constitution was proposed and written by a few. It was then debated by a larger group. The finished product was then sent to the states, where it was ratified by the many. It was radical, in that ultimate power lay with the people. It was conservative, in that checks and balances were imposed. America had no aristocracy, nor did it want one. Kings who served by divine right were left to the Europeans. While the Founders frequently invoked God (our unalienable rights were endowed by “our Creator”), there was to be no central or State religion; people had come to this country to escape persecution, so they could pray freely to a God of their choice. Most importantly, they created a government in which ultimate power rested with the people, but was exercised through their elected representatives.

James Madison predicted that the most likely invasion of natural rights would be the robbery of the few by the unpropertied many, whether by unjust taxation or debasement of the currency. With one percent of the population paying 36% of all federal income taxes, are we now approaching that point? With a dollar that has declined by a third over the past eleven years, are we debauching our currency? Continental Congressman Richard Henry Lee once suggested that an indication of “elective despotism” would be when legislators passed laws from which they exempt themselves. Isn’t that what happened with the passage of ObamaCare?

To protect against that possibility, a mechanism of checks and balances was created, providing for a government with limited and defined powers. At a Memorial Day service in Northampton, Massachusetts in 1923, Vice President Calvin Coolidge succinctly described the function and limitation of each branch: “The executive has sole command of the military forces, but he cannot raise a dollar of revenue. The legislature has the sole authority to levy taxes, but it cannot issue a command to a single soldier. The judiciary interprets and declares the law and the Constitution, but it can neither create nor destroy the right of a single individual.” He added: “The chief repository of power is in the legislature, chosen directly by the people at frequent elections.” But, “It does not perform an executive function.” The concept of revolution may have been revolutionary, but moderation and conservatism determined the means and the outcomes. The Founders created institutions that would weather future storms.

Unlike the French and Russian Revolutions, which were fought for equality and fraternity (and delivered neither), the American Revolution was fought to guarantee liberty, which included a provision for equal opportunity, and it has delivered both…so far. Madison observed that in a genuinely free society you will always have inequality. (As, of course, is true in all societies.) People have different talents and abilities. Some are ambitious, others not. Some are energetic, some are passive. With varying skills and aspirations, some people will prosper and others will not. Definitions of happiness are as varied as are individuals. Government cannot force round pegs into square holes, but it can provide opportunity and ensure that each individual plays by the same rules and is subject to the same laws.

Is the experiment at risk of failure? Has the hypothesis on which we thought the American Experiment was based been invalidated? Much has been written about the loss of a moral sense. As America and the West have grown in material wealth, moral values have declined. Elitism has become pervasive within government bureaucracies. Cronyism is alive and well in the halls of Washington, in the canyons of Wall Street, and in the offices of big business and union leaders. The idea of a “nanny state” is becoming reality. Do we really need government to tell us what size drinks we should down, or how many calories exist in a “Big Mac?” Are the rights of a single person more important than the welfare of the many? We don’t permit an individual to stand up in a crowded theater and yell, “Fire!” when there is no fire; yet the police in New York are being discouraged from practicing “stop and frisk,” despite the program’s proven value. Are the rights of the accused more sacrosanct than those of victims? Civilized society cannot survive without laws and regulations, but neither can we become the Eloi, wholly dependent on the state. It is balance that must be found. It is why the American Experiment will always remain an experiment. We cannot let independence of spirit be exchanged for dependence on the state. We must be responsible: we cannot be enslaved.

Robert Putnam’s book, Bowling Alone, enumerates the decline in community organizations. This has not been a sudden or recent change. Aleksandr Solzhenitsyn, in a harsh but honest speech at Harvard’s commencement thirty-five years ago, noted the shift. “The West has finally achieved the rights of man, and even excess, but man’s responsibility to God and society has grown dimmer and dimmer…All the celebrated technological achievements of progress, including the conquest of outer space, do not redeem the Twentieth Century’s moral poverty, which no one could have imagined even as late as the Nineteenth Century.” What has happened, Mr. Solzhenitsyn said, was a loss of civic courage. “A decline in courage may be the most striking feature that an outside observer notices in the West today.” Some of that lack of courage is being countered by a growing number of people who are willing to challenge conventional thinking. They may have helped create political dissonance and polarization in Washington. They, and organizations like OpenTheBooks.com, are condemned for being “outside the mainstream” and promoting dysfunction, but history has shown that such people are the ones in the vanguard of change.

What came to fruition in 1789 was remarkable. A few dozen men in what was a remote part of the world created a government unique in the annals of history. It was an experiment and, like most experiments, must be monitored closely to ensure it stays true to its intent. The government then created was based on the individual and the ideal of liberty. The Founders did their best to anticipate attempts to wrest power from the individual, but they could not have anticipated the growth of the welfare state. Their concern lay more with despotism in the name of “fairness and justice.” They understood the dynamic between the people and their government – more dependency equals less individual independence. The expanding reach of government is insidious in the damage it does to independence, subtly but irreparably. It is a concern better understood by those born enslaved than by those who have grown up with liberty. In the same Harvard commencement address, Mr. Solzhenitsyn said: “Even biology tells us that a high degree of habitual well-being is not advantageous to a living organism.”

Experiments, whether we speak of marriage or a political system, are always works-in-progress. They can never be taken for granted. Whether we were born in the United States or immigrated to this country, no matter our economic or social background and regardless of our race or religion, we have been provided a unique opportunity, because of the men who created the American Experiment. We have been lucky. It behooves all of us to ensure that the experiment continues.


Monday, November 18, 2013

"No Excuse is Better than a Bad One"

Sydney M. Williams

                                                             Thought of the Day
                                               “No Excuse is Better than a Bad One”
November 18, 2013

It is commonly thought that all politicians are inveterate liars. Apart from a small number of idealistic freshmen Congressmen, the statement is probably true. Unmasking the dissemblers should be a crucial role of the Press. Unfortunately, the media have become supplicants and advocates, rather than skeptics. Blogs on the internet are doing more to dispel this miasma of lies than the mainstream media.

In 2008, when Barack Obama was running for President, there was only the promise – a promise of change, which the people took to mean an “un-Bush-like Presidency.” They expected government to be more open, more caring about its people, more brilliant in its execution, more dazzling in its demeanor, more respected among its allies and more feared by its enemies. Somehow it became lost on voters (or they failed to seriously consider what he meant) that Mr. Obama had promised to “fundamentally transform America.” Enamored of style, the Press ignored the substance of his comments. Any curiosity was subsumed by Mr. Obama’s articulateness and coolness, and by the fear that any criticism would be considered racist. But, in the months before it became obvious he would win the Democrat nomination, racism was being exhibited by members of his own Party. Their comments were mean and patronizing. In 2008, Senator Harry Reid claimed that he was “wowed” by Mr. Obama’s oratorical skills and that the country was ready to embrace a black Presidential candidate, especially a “light-skinned” African-American “with no Negro dialect….” Senator Joe Biden, in 2007, was just plain crude: “I mean you got the first mainstream African-American who is articulate and bright and clean and a nice-looking guy.” Both comments were not only in bad taste, they dripped with arrogance and condescension. Both men later apologized, but why Mr. Obama subsequently added the Delaware Senator to the ticket is a question worthy of the Oracle at Delphi.

Fast forward five years: we have seen change – not what the people expected or what the Press reported, but what Mr. Obama promised: a fundamental transformation. Yesteryear’s speeches in Denver and Philadelphia were, in fact, prophetic. Secrecy has intensified. Lies, deceit, dissembling and incompetence have become commonplace. It is obvious that Mr. Obama prefers the “Julia’s World” of his website. The New York Times sounds forced and tired in defense of its idol, comparing the disastrous unveiling of ObamaCare to the Bush Administration’s bumbling response to Katrina. Katrina was a force of nature. ObamaCare, as we all know, was a legally passed law (but one passed without a single Republican vote). Everyone knew the Law would only work if those with existing private coverage were forced off their individual healthcare plans and made to sign on to ObamaCare. The pretense that this was somehow unexpected assumes that most Americans are idiots. Calling such individual and family plans “substandard” is wrong and misleading. It reeks of conceit and perpetuates a lie. Many are called “substandard” because they are high-deductible plans – insuring against catastrophic accidents and illnesses, rather than routine office visits. Most do not include maternity care and birth control pills. Requiring that all plans have such coverage, regardless of the individual’s sex or age, is absurd, wasteful and expensive. Existing plans incorporate the insured’s choice of doctor and hospital – something ObamaCare does not and cannot.

As David Gergen recently wrote, “We’ve seen the hubris. And now we’re seeing the scandals.” Like Icarus, Mr. Obama soared through the skies, levitated by adoring crowds and unexamined by the Press. Like Icarus, whose wings were made with feathers and wax, he came too close to the sun, and has crashed to Earth. It remains to be seen whether he will be resurrected.

The predictable crash of ObamaCare has consumed more press than the other scandals because of the political fallout. Virtually every American has been touched: those whose policies have been cancelled, the millions who were counting on Mr. Obama’s promises to have healthcare provided for the first time, and those with pre-existing conditions who had been unable to get health insurance. It is the latter two groups that are most distressing; for lying to or about one’s political opponents is to be expected in today’s environment, but hoodwinking those whom one purports to help is despicable. These were not, as the New York Times euphemistically put it, “incorrect promises.” Mr. Obama is not a simpleton. He knew he was lying when he promised that you could keep your healthcare plan if you liked it: “Period.” It was the only way ObamaCare would be affordable, and the only way the highly partisan bill could be passed. He has betrayed his supporters. One recently said to me that Mr. Obama has probably destroyed the prospect for universal healthcare for another generation. The decline in Mr. Obama’s poll numbers reflects the man’s hubris, his belief in the omniscience of the state and his disdain for those who oppose him.

But we should not, in the rush of revelations about the failures of ObamaCare, take our eyes off what, in my opinion, are worse scandals, especially Benghazi and the IRS. “Fast and Furious” was an example of incompetent bumbling. It is not usual to hand weapons to those who would turn them against you, yet that is in fact what happened in Mexico. As for the NSA, spying must rank as the second or third oldest profession. It has, I suspect, been going on as long as nation states have existed. One can argue that people in government should not spy on those in the press who disagree with them, because freedom of the press is essential to democracy, but to assume no one ever put a bug in Ms. Merkel’s hotel room would be a mark of naïveté. When Mr. Obama travels, he brings along a special and secure tent, which is set up in an adjoining hotel room. It is in that tent that the President conducts sensitive meetings. His security detail knows that foreign governments spy on us. Why would one assume we do not do the same?

But Benghazi and the IRS scandal are, in the opinion of this non-lawyer, impeachable offenses. Both involved deliberate lies, but they went beyond lying. One was designed to cover up a killing at a sensitive time in an election cycle; the other was a deliberate attempt to emasculate the President’s political opponents.

It has now been more than fourteen months since the September 11th attack on the U.S. Consulate in Benghazi. I still shudder every time I see or read the comments from the President and Secretary of State Hillary Clinton the day they met with the families of the dead at Andrews Air Force Base, on September 14, 2012. Their blaming the attack on that “awful video” was a lie, and they both knew it. I still get shivers when I think of their callous disregard for the truth, and the gross unfairness to the families of the fallen. It was a heinous example of placing politics above honor. Stonewalling the investigation has been the Administration’s response. Presumably, they feel the longer any substantive investigation can be delayed, the better off they will be. And, unfortunately, they are probably right. Nevertheless, there are questions that should be addressed: Why was Ambassador Stevens in Benghazi that night, and why had he not received the protection from State Department counter-terrorism teams he had requested? Why had the Consulate been allowed to remain open when security requirements had not been met? Was the U.S. involved in a gun-buying mission for the Syrian rebels, as some have speculated? Why was requested assistance not forthcoming? Who gave the “stand down” order? Why did the Administration persist in blaming the attack on a video when they knew that was untrue? Why have survivors not been permitted to testify? Why were certain CIA personnel asked to sign non-disclosure agreements on May 21, 2013 at a memorial service honoring Tyrone Woods and Glen Doherty, who were killed that night in Benghazi? Why has Nakoula Basseley Nakoula been the only person punished, especially when it has been known from the first that his video bore no responsibility for the violence? This scandal needs the disinfectant of sunshine.

In terms of the IRS scandal, any use of federal agencies for personal or political purposes is a step toward tyranny. The office of President of the United States is the most powerful office on Earth. But with the office comes a responsibility to uphold the inherent rights of the people. Sadly, that sense of responsibility and the moral sense that should accompany it have become alien to the modern Presidency. The British historian and moralist Lord Acton, writing to Bishop Mandell Creighton in 1887, reminds us: “Power tends to corrupt, and absolute power corrupts absolutely.” Adding to the problem is that Presidents are surrounded by sycophants. “It is difficult for men in high office to avoid the malady of self-delusion,” wrote Calvin Coolidge in his Autobiography. During times of crisis, Presidents from Lincoln to Wilson, from Franklin Roosevelt to George W. Bush, have suspended basic rights. The public has largely, in the interest of national security, gone along. Nevertheless, whenever that occurs, the concept of liberty suffers. While Mr. Obama may not have been directly implicated in the IRS’s bullying of conservative groups, as President he set the tone for his Administration and, thus, is responsible. The buck stops with him, or it should.

The backlash against ObamaCare appears primarily motivated by individuals incensed that what had been promised has turned out to be a lie. But I hope it represents more than that – that people are in fact upset with an ever-intrusive government, one that believes solutions to problems should be government-centric rather than consumer-centric. The violations of the principles of our country, manifested in the incident in Benghazi and its aftermath, and in the use of the IRS for political gain, reflect a decline in trust between the people and their government. I am happy to see the insurrection taking place against the ACA. The failings of ObamaCare may give pause to what has been an ascendant belief that a beneficent government is the answer to all problems – that Washington bureaucrats know better than we do how our lives should be managed and lived. However, for Republicans to gain the upper hand, they will have to offer a viable alternative. Dissent is not enough.

The title of this piece comes from a maxim of George Washington, a man noted for his character, humility, moral sense and fervent belief in the wisdom of the people. The scandals surrounding Mr. Obama stem from many years of disregard for the truth, a belief in his own destiny and an assumption that personal charisma will erase any fundamental flaws. One might argue that Mr. Obama’s aides are too quick to provide inadequate excuses, which the President simply repeats, but that ignores the fact that, as President, he bears sole responsibility. It is he who determines the culture. His assistants only carry out what they believe to be his wishes. A bad excuse simply makes a terrible situation worse. Mark Twain once said: “If you tell the truth, you don’t have to remember anything.” It is a lesson Mr. Obama, and a host of other politicians, would do well to inscribe above their Teleprompters – “Veritas vos liberabit.” The truth will set you free.


Thursday, November 14, 2013

"Suffonsified"

Sydney M. Williams

                                                                      Thought of the Day
                                                                          “Suffonsified”
November 14, 2013

An unexpected revelation from two days spent in Indianapolis at a Liberty Fund-sponsored colloquium last week on Calvin Coolidge was learning a new word, “suffonsified.” The Liberty Fund, based in Indianapolis, is an organization devoted to teaching the precepts of liberty, as described in our Declaration of Independence, the Federalist Papers, and the Constitution and Bill of Rights.

After a particularly big lunch at Fogo de Chao, one of the conferees told me that his grandmother used to lean back after such a meal, with a smile on her face, and declaim: “I’m suffonsified.” While I had never heard the term, it is a word, at least according to the “Urban Dictionary.” It is Canadian in origin and has relevance, in my opinion, to the study of Coolidge and concerns of today, but, of course, for opposing reasons. When one thinks of the thirtieth President, the words economy and restraint come to mind; whereas exorbitance and hubris better define today’s occupant of the White House. Despite Coolidge being noted for his silence, he spoke frequently. Reading his speeches, along with his autobiography, one is struck by his great knowledge, his beautiful use of language and his persistent references to the ideals of the founding fathers.

While he was a product of the 19th Century (he was born in 1872, so was 42 when World War I erupted), Coolidge was the first “modern” President, in that his Administration coincided with the incredible commercial developments of the “roaring” ‘20s. (Warren Harding was only President for two and a half years, and his last year was tainted by scandal.) Radios, cars, planes and telephones had all been invented earlier, but became commonplace during the 1920s. In 1920, about one-third of all homes had telephones. By the end of the decade, the number had doubled to roughly two-thirds. In 1920, 35% of all homes were electrified. By 1929, 68% were. That meant toasters, washing machines, electric stoves, irons, etc. – labor-saving devices for the home. Packaged foods added convenience and saved time. Birds Eye frozen foods and Del Monte canned foods arrived on grocery shelves during the decade, as did Wheaties, Jell-O and Planters Peanuts. By 1925, the Ford Motor Company was producing two million Model T’s a year. In all, the company produced some 15 million units – a record for a single model that remained unsurpassed until Volkswagen’s Beetle bested it in 1972. Consumer products and autos brought with them, however, the concept of installment buying. And, as we all know, the margin requirement on stocks was 10% when the decade ended, and stocks crashed.

Amid this prosperity and growth, Coolidge’s words acted as a governor, a restraint on excess. His Vermont upbringing had taught him to abhor speculation and show, and even debt. But he did not see government as a caretaker. If people could reap rewards, they had to be held responsible for their losses. He understood the importance of history and had a clear understanding of the nature and uniqueness of our founders and the representative government they created. That is not to say that he did not celebrate economic growth, for he did. But in his speeches there was an emphasis on the spiritual as being more important than the material. In a speech at Wheaton College in June 1923, he said: “We do not need more of the things that are seen, we need more of the things that are unseen.”

Character, morality, self-reliance and forbearance were important to him. He knew that government, through elected representatives, reflected the will of the people. He wanted to ensure that the electorate was educated in the rudiments of their country’s past, so they would be better able to deal with the present. He knew that only an educated voter could ensure the maintenance of our enduring but fragile government. As a product of Yankee thrift, he feared people losing the proper balance between the secular and the sacred so necessary to living both freely and prosperously.

Mr. Coolidge was surely aware of Benjamin Franklin’s admonition to the Constitutional Convention on its final day, September 17, 1787, as the Constitution was being sent to the states for ratification. At the age of 81, Franklin was almost twice as old as the average delegate. With age had come wisdom. In his speech, Franklin said he agreed “to this Constitution with all its faults.” He said that it “may be a blessing to the people if well administered.” (Emphasis is mine.) He added: it is likely “to be well administered for a course of years, and can only end in Despotism, as other forms have before it, when the people become so corrupted as to need despotic government, being incapable of any other.” As a student of human behavior, Franklin was aware of the failings of man. His words were an endorsement, but with caveats. “Thus, I consent, Sir, to this Constitution because I expect no better, and because I am not sure that it is not the best.” He concluded by urging his fellow delegates to get beyond approval and concentrate “our future thoughts and endeavors to the means of having it well administered.”

It is the “well administered,” as it applies to government today, that should concern us. Since the 1930s, the nation has been tilting left. Even before that, the federal government had been assuming more power and the Presidency was becoming more isolated and more imperial. In his autobiography, published in 1929, Coolidge sounded cautionary alarm bells that today seem remarkably prescient: “It is difficult for men in high office to avoid the malady of self-delusion. They are always surrounded by worshippers. They are constantly, and for the most part sincerely, assured of their greatness. They live in an artificial atmosphere of adulation and exaltation, which sooner or later impairs their judgment. They are in grave danger of becoming careless and arrogant.”

A compassionate government is good, when the fruits of compassion are received by the impoverished, the sick and the elderly. But when those gifts are extended to almost half the population, it suggests one of two things, neither good: either we have become too poor, or too generous. In fact, in a remarkable bit of legerdemain, Mr. Obama has made both prospects come true.

With a federal stimulus that did not stimulate, quantitative easing that brought profits to big Wall Street banks but did nothing for Main Street, and a federal bailout of GM that saved unions but violated contract law, the recovery has been the weakest in the last sixty-five years. Further suffocating the feeble embers of economic growth, the government has raised taxes and increased regulation of the environment, banks and healthcare. Labor force participation remains three million below where it was in 2007. The poverty rate, at 15%, is the highest it has been since the early 1960s and matches the levels reached in 1982 and 1993. Joining the wealth and income gaps is a “jobs” gap. Four and a half years after the recovery began, the unemployment rate among those aged 16 to 24 is 15.5%, versus 6.7% for those between the ages of 25 and 54. The gap is the widest it has been in over two decades.

There is no question that Americans are a generous people. In 2012, charitable giving by individuals amounted to $316 billion. But Washington politicians are even freer with our tax dollars. We spend about $80 billion a year on SNAP (the Supplemental Nutrition Assistance Program), commonly known as Food Stamps, and $520 billion on unemployment insurance. Federal welfare spending in 2012 totaled $746 billion, according to the Washington Times – 21% of the federal budget.

From a fiscal perspective, the situation is tenuous and difficult to address. No politician wants to be known as the “Scrooge” of Washington. But at some point, government runs out of “other people’s money.” We should remember, as Coolidge frequently reminded his constituents, that every dollar sent to Washington is one dollar less in the hands of taxpayers – one dollar less that can be reinvested privately in the economy.

There is a level at which recipients of welfare become dependent and lose their sense of personal responsibility. The problem is that, in a country of our size, there is no clear line between the truly needy and those who are capable of self-reliance but have become addicted to government handouts. Thus we err on the side of compassion. In doing so, however, we risk a downward spiral in which entitlement spending prevents us from focusing on economic growth. We risk debauching the currency and ignoring a weakening infrastructure. We forget the lesson Coolidge never lost sight of – that the purpose of government is to protect our liberties, not to protect us from ourselves.

How far down this path have we traveled? The “Life of Julia” slide-show depicted how Mr. Obama’s policies would take care of a woman from the age of 3 to 67. It showed an Orwellian world, in which people have been reduced to the Eloi of H.G. Wells’s The Time Machine. Coolidge frequently reminded the people that the more responsibilities government assumes – allegedly in the people’s interest, as with today’s bans on fracking, smoking, trans-fats or big sodas – the more our individual liberties are diminished. He acknowledged the need for an ordered society and the importance of the role government plays in maintaining peace and security. But he also recognized the fine line democracy must tread between tyranny and anarchy. It was balance that he sought and balance that he achieved – a balance we may now be losing.

Most of those who decry persistent and increasing entitlement spending are not insensitive or indifferent to those in need. Their concern is that we are on a path that leads inevitably to declining economic growth, lower standards of living and reduced liberties – and, ultimately, to fewer funds available for the truly needy. They are interested in preserving for future generations what has been created in the United States – the richest and freest nation the world has ever known. Dependence on government, when it is not absolutely necessary, is insidious, becoming addictive over time. While it serves the re-election interests of the politicians who hand out the goodies, it does not benefit those recipients who are aspirant, talented and motivated; such ties keep them tethered to the mother ship that is Washington. Nor is it in the interest of those of us who are forced to divert investments into tax payments, and who worry about the erosion of our rights and what that means for the forces of freedom versus those of despotism.

We have been (and are) a fortunate nation. When we read of previous civilizations that rose and then declined, we know there is no such thing as permanence in civic societies. When we see the consequences for those who live in unfree societies, it should make us cling more tenaciously to the principles that underlie the American Experiment. Our appetite for entitlements should by now be satiated, lest we overeat and die of indigestion. It is time to sit back and say: we are suffonsified.

Monday, November 11, 2013

"Man versus Machine"

Sydney M. Williams
                                                                  Thought of the Day
                                                                “Man versus Machine”
November 11, 2013

When the chess grandmaster Jan Hein Donner was asked what strategy he would use against a computer, he replied: “I would bring a hammer.”

The problem of a widening income gap has long plagued commentators and policy makers alike. The passage in 1993 of section 162(m) of the U.S. Tax Code, which limited the deductibility of executive compensation, played a role. It caused an increase in the use of options, particularly among internet start-ups during the greatest bull market of the last hundred years – a run that ended in March 2000, when the gap reached its widest point. Since then, the use of options has declined. Amplifying the divide, however, has been the cronyism that has grown up among Washington, Wall Street and big business, concentrating capital in the hands of a few corporations, big banks and individuals. Hypocrisy runs thick and deep among those who pose as egalitarians. They repeat the same platitudes, smile benignly and, with eager, grubby hands, reach discreetly behind for dollars. While all politicians partake, the combination of greed and dissimulation lies most obviously with people like the Clintons and the Obamas. But perhaps as insidious as anything has been the march of the machine and its economic consequences. An article on the subject, by Daniel Finkelstein, appeared in last Wednesday’s London Times.

Mr. Finkelstein wrote: “In the past 30 years, the proportion of national income taken as a reward in the form of wages has fallen while the proportion due to owners of capital has risen. And this has happened all over the world, pretty much regardless of what politicians have tried to do about it.” The owners of technology stand in stark contrast to the individuals who are replaced. But there is nothing new in this; it has merely accelerated in recent years, as intelligence has been incorporated into increasingly sophisticated programs and algorithms. The origins of Schumpeter’s “creative destruction” go back much further. In 1776, Adam Smith wrote in “The Wealth of Nations” of the division of labor as it applied to the manufacture of pins – the start of applying specialization to manufacturing. In 1913, Henry Ford created the assembly line for the manufacture of the Model T. With a slow-moving conveyor belt, workers repeatedly performed the same task – mind-numbing for the individual, but it decreased the time needed to produce a car by a factor of eight. The result was more cars at lower prices to consumers. Increased production meant increased employment; higher-paid workers and lower car prices meant increased demand.

Machines are increasingly able to do things that once required humans. In The Power of Habit, a recent book by the New York Times investigative reporter Charles Duhigg, a man walks into a Target outlet complaining that the store had been sending his daughter discount vouchers for baby clothes and equipment. “She is only in high school,” he protested. A few days later, the man was back, apologizing. His daughter was indeed pregnant – a fact known to Target’s computers because of her buying habits, but unknown to her father.

Algorithms allow American Express to notify us when our spending patterns change; if we cannot be reached to confirm a purchase, our card is cancelled. Artificial intelligence allows surgeons to operate remotely and drones to kill terrorists in distant locations with little collateral damage. Google’s driverless cars have traveled well over 200,000 miles. A programmer with a PC, using algorithms to measure consumer behavior, can replace a company’s entire marketing department. Electronic trading replaced the floor of the London Stock Exchange twenty-seven years ago, and machines today handle much of the trading in New York Stock Exchange-listed securities. Engineering and design firms have been affected as well, improving output per employee much as manufacturing firms did thirty years ago. Newspapers face competition from bloggers, as dangerous and elusive to them as terrorists are to Western democracies. Massive Open Online Courses (MOOCs) are revolutionizing college education – reducing prices and increasing the value of a select few professors, but eliminating the “college experience.”

Wall Street has applauded many of these productivity improvements, as the consequence is generally improved earnings and higher stock prices. But corporate productivity and individual productivity are on different axes. Machines, in saving labor costs, improve a business’s bottom line. On the other hand, especially as machines become increasingly “smart,” they take on tasks that negate the need for skilled labor. The consequence, as Tyler Cowen noted in his new book Average is Over, is an America divided in two – the children of “Tiger Mothers” and the rest. Mr. Cowen wrote that markets are exceptionally accurate at measuring an individual’s economic value, sometimes with “oppressive precision.” Thus we have millions of individuals who have less value to the economy than they had a few years ago. We either become masters of computers, or they become our executioners.

There are no easy solutions. Income redistribution is the usual answer on the Left, but social engineering does not address the cause and, in fact, tends to aggravate the situation. The answers lie in education and in the need for people to adapt and to anticipate the “next new thing.” Creative destruction is an unpleasant but necessary aspect of economic growth.

Equality is a nebulous condition. Our Constitution ensures we are equal under the law. The Declaration of Independence states that all of us are endowed with certain unalienable rights. Civil society demands that we have equality of opportunity. But there is no way a democracy can ensure equality of outcomes, and those who suggest otherwise are knaves and hypocrites. If there is inequality in our society, it exists in Washington, where legislators pass laws exempting themselves from the consequences. There will always be bosses and workers. Equality of outcomes has never been achieved under socialism, communism or despotism. In Russia and China, power and wealth accrue to those who govern and their cronies. The gap between rich and poor in those countries is far greater than in ours – which is why we should fear the growing cronyism in our own. The need is to find balance in a fluid and ever-changing society. The source of that balance lies in the steadfast principles on which our country was founded – the embedded and unchanging rights granted each of us: that all men are created equal; that we are endowed with unalienable rights; and that government derives its just powers from the consent of the governed. Those are the ideals we must defend, and those are what we must never lose.

Individuals should not throw up their hands in despair. There are personal characteristics that will always be in demand and that are not (yet) replicable by a computer. They will drive future success: intelligence, aspiration, motivation, creativity, a willingness to accept change, fearlessness about the future and, perhaps most important, a willingness to take risks. None of this answers the problem of a widening income gap. In a global world, in which financial assets move easily and location matters less than it once did, the power of the very wealthy grows relative to that of their governments. Patriotism, for many of these people, means less than the preservation of their wealth. The manifestation can be seen in major cities like New York, London, Geneva, Hong Kong and Singapore – cities in which many of them keep homes and assets. If taxes become too high, a new jurisdiction is found. Money can be moved with the click of a mouse.

Government can expand educational opportunities – emphasizing the needs of the student over the demands of teachers’ unions, for example. Cities like New York should acknowledge that charter schools are still public schools, even though their teachers are not necessarily union members. Federal and state governments can reduce regulation and simplify tax codes. Government can never mandate equality of outcomes without greatly reducing overall wealth and voiding liberty. But it can encourage equality of opportunity, so that the motivated children of the poor are not disadvantaged relative to those of the rich.

While politicians sell this condition as unfair, the world is the way it is. Teaching people to fish is far better for them than giving them fish; dependency leads to poverty. Another characteristic of success is optimism. A friend recently sent me this quote from the British novelist J.B. Priestley: “I have always been delighted at the prospect of a new day, a fresh try, one more start, with a bit of magic waiting somewhere behind the morning.” Those are good words to wake up to.

One thing is certain – a hammer is not a realistic option.

Wednesday, November 6, 2013

"Humanities in the Age of Twitter"

Sydney M. Williams

                                                                Thought of the Day
                                                     “Humanities in the Age of Twitter”
November 6, 2013

“College is increasingly being defined narrowly as job preparation, not as something designed to educate the whole person,” said Pauline Yu, president of the American Council of Learned Societies, as quoted in last Thursday’s New York Times. The front-page article was headlined “Interest Fading in Humanities, Colleges Worry.”

A decline in the percentage of students studying the humanities was the subject of two reports last spring. On May 31, Harvard issued “The Teaching of the Arts and Humanities at Harvard College: Mapping the Future.” A second study, released in June by the American Academy of Arts & Sciences, indicated that the share of students majoring in the humanities nationwide has dropped from 14% in 1970 to 7% today. (At Harvard, counting history as one of the humanities, the decline has been from 36% to 20%.)

While the reports caused handwringing among some academics and columnists, Princeton history professor Anthony Grafton, in the July 1, 2013 issue of The Chronicle of Higher Education, put some of those concerns into perspective. He noted that humanities enrollments in the 1940s and 1950s were at levels similar to today’s. Michael Bérubé, director of the Institute for the Arts and Humanities at Pennsylvania State University, wrote in the same issue of the Chronicle that enrollments in the humanities rose from 14% in 1966 to 18% in 1970; most of the decline, he noted, happened between 1970 and 1980. Professor Grafton concluded that what we are witnessing, nationally and at Harvard, is not so much a decline in the humanities as “a large-scale fluctuation with a bubble in the middle.”

The last time humanities enrollment was this low, in the late 1940s, the economy (and the age of many students) played a role. The GI Bill gave returning veterans a once-in-a-lifetime opportunity to attend college. Most were older; many were married; and all wanted to get on with the job of being civilians. An education was the best means. The Bill, which expired in 1956, provided cash payments for tuition and living expenses to all veterans who had spent at least ninety days on active duty and had been honorably discharged. Today’s anemic economy, with high unemployment and soaring tuition, has created uncertainty. It is unsurprising that aspirant but nervous students look at college more as job training than as a place to get a rounded education.

But comparing today’s humanities courses with those of the late 1940s and early 1950s is comparing apples to oranges. At some universities, “cultural studies” have displaced traditional English departments, as professors teach what is politically correct and trendy – women’s literature, hip-hop, comic books – rather than Aristotle, Herodotus, Shakespeare or Milton. Historically, the humanities consisted of English literature, philosophy, religion, western civilization and history. Those courses, as David Brooks wrote in a June 20 op-ed for the New York Times, focused “not only on the external goods the humanities can produce (creative thinking, good writing), but also on the internal transformation (spiritual depth, personal integrity).” Today, according to the National Center for Education Statistics (NCES), the humanities include ethnic and gender studies, along with the visual and performing arts. In the same op-ed, Mr. Brooks added that the “humanities turned from an inward to an outward focus…Liberal Arts professors grew more moralistic when talking about politics, but more tentative about private morality because they didn’t want to offend anybody.”

Changing dynamics are causing colleges to adjust. John Tresch, a historian of science at the University of Pennsylvania, was quoted in the Times article: “There’s an overwhelming push from the administration to build up the STEM (science, technology, engineering and math) fields, both because national productivity depends in part on scientific productivity and because there’s so much federal funding for science.” Tamar Lewin, in the same article, wrote: “Some 45% of the faculty members in Stanford’s main undergraduate division are clustered in the humanities – but only 15% of the students.” Stanford appears to be a case of resource misallocation. Harvard is looking to reshape its first-year humanities courses to sustain student interest. Wake Forest has integrated its career-services department into its curriculum.

Any discussion of a decline in the humanities leads to the question of the purpose of education. The question has become more pressing, with soaring tuition costs and continuing high unemployment among college graduates, especially those with degrees in the humanities. Jennifer Levitz and Douglas Belkin, writing in the June 6, 2013 edition of the Wall Street Journal, reported that the unemployment rate for recent graduates with degrees in English was 9.8%; it was 9.5% for history and philosophy majors. In contrast, it was 5.8% for chemistry majors and 5.0% for graduates with degrees in elementary education. For comparison, the rate for all recent college graduates is 7%.

A survey, “Is College Worth It?,” was conducted by the Pew Research Center in 2011. It interviewed more than 3,000 adults aged 18 and older, along with the presidents of several two- and four-year private and public colleges. Forty-seven percent of respondents said the main purpose of a college education is to teach work-related skills, while 39% said it is to help students grow personally and intellectually. The answers were skewed by the education of the respondent: those without a four-year degree tended to see college as a place to develop work-related skills, while those with graduate degrees favored college as a growing and learning experience, by 56% to 26%. Interestingly and, I believe, notably, respondents valued character more highly than college. Asked what is “extremely important” to succeeding in the world, 61% cited work ethic, 57% the ability to get along with people, and only 42% a college degree.

Our high schools are regularly judged inferior to schools in other parts of the world, as detailed in two reports: the Trends in International Mathematics and Science Study and the Progress in International Reading Literacy Study. A renewed emphasis on STEM has been pushed by many, including President Obama, and MOOCs have been among the private sector’s responses. Yet our universities are always counted among the world’s best. In the QS World University Rankings, compiled by a British company that specializes in such studies, seven of the top eleven universities are in the United States (Princeton and Caltech tied for 10th place); the other four, not surprisingly, are in England. Tellingly, five of the highest-ranked American universities – Harvard, Yale, Stanford, the University of Chicago and Princeton – enroll humanities majors at more than double the national average.

A problem with today’s world is that, while we read as much as we ever did, we no longer take time for the close reading that was common in my growing-up years. We skim the news; read e-mails, text messages and IMs; scan the internet; and glance at postings on Facebook, LinkedIn and Twitter. David Mikics, professor of English at the University of Houston, argues the shift away from the classics started in the late 1960s, “when traditional literature and poetry were jettisoned at many universities in favor of hip-hop, fashion ads, graphic novels, comic books – whatever facilitates sloganizing about gender, race and class.” And it is the proliferation of information that makes finding the needle in the haystack so difficult. Buckminster Fuller, architect and futurist, noted that in 1900 knowledge doubled every 100 years; by 1945, it was doubling every 25 years. Today, by some estimates, information doubles every 12 months – much of it redundant or useless – and IBM expects the interval to shrink to 12 hours within a few years. How will our grandchildren cope?

Studying the traditional humanities gives students the opportunity to read and study some of the greatest minds that ever lived. A course in western civilization allows a student to experience vicariously every imaginable emotion through the plays of Shakespeare, and to understand some of the earliest concepts of democracy and liberty by way of the writings of Thucydides and Plato. How can anyone understand representative government without reading Edmund Burke, or free-market capitalism without looking into Adam Smith? The Federalist Papers provide an understanding of the thinking that went into our Constitution and the mechanics of our government. That human behavior is timeless is something one learns from reading the classics – Dickens, Shakespeare, Jane Austen – yet only recently has knowledge of behavior become common in the study of economics. In a time of moral relativism, a moral sense is relevant and ageless, as one learns from Greek and Roman mythology and the Bible. Of course, the teacher is crucial to one’s learning experience. Lee Siegel, the New York writer and cultural critic, wrote last summer: “[While] the British scholar Frank Kermode kindled Shakespeare into an eternal flame in my head – there were countless others who made reading of literary masterpieces seem like two hours in the periodontist’s chair.” Encouraging people to continue reading and learning long after graduation is the mark of a good teacher.

Education is a privilege and an opportunity. Circumstances may have made it more likely that the four years one spends in college will be looked upon as preparation for a job, but those years should in fact be seen as preparation for life – instilling curiosity, the desire to question and the ability to think. It makes no difference whether we live in the age of Dostoevsky or the time of Twitter: the ability to communicate matters no matter one’s job, and no matter whether one writes by Tweet, on an iPhone or in essays. Most of us spend ten times as long in our jobs as in college; those four years represent the only time in one’s life when one can read books and study subjects that have nothing to do with one’s ultimate career. The decline in degrees granted in the humanities may be due to an economy that has forced people to look at education as a means to a job, but I also suspect that many humanities departments have committed suicide by replacing the great books with “cultural” studies. Whatever the cause, the consequence is unfortunate.

Monday, November 4, 2013

"Requiem for Republicans?"

Sydney M. Williams

                                                                   Thought of the Day
                                                             “Requiem for Republicans?”
November 4, 2013

“The report of my death was an exaggeration.” Mark Twain was referring to an article that had appeared in the June 1, 1897 edition of the New York Herald, which reported that Twain was “grievously ill and possibly dying.” Mr. Twain, being mortal, did die, but thirteen years later, on April 21, 1910.

Today, similar obituaries are being written for the Republican Party. A few days ago, in the Wall Street Journal, John G. Taft, a grandson of Senator Robert A. Taft, wrote a plaintive op-ed, “The Cry of the True Republican,” expressing what he felt his grandfather, “Mr. Republican,” would say about the Party today: “If he were alive today, I can assure you he wouldn’t even recognize the modern Republican Party, which has repeatedly brought the United States to the edge of a fiscal cliff – seemingly with every intention of pushing us off the edge.” Perhaps that is so, but I suspect the elder Mr. Taft would despair even more over the fiscal state of the United States – the size of our deficits and of our debt, both legal and contractual. I think he would mourn the loss of a sense of personal responsibility and what growing dependency means for individual freedom. He would weep for a country that debauches its currency in a bid to keep alive unaffordable promises. He would grieve for a nation in which fewer people are working than are receiving government aid, where less than half of citizens pay federal income taxes and more than half receive some form of assistance.

If we assume that self-interest is the principal motivation of most voters, we may already have slipped the mooring that keeps our ship of state seaworthy and steady. No one receiving entitlements will vote for their reduction or elimination. That is a truth better understood by Democrats than by Republicans, and its manifestation can be seen in the ruthlessness of political campaigns that celebrate victory regardless of the means – a lesson learned the hard way by Hillary Clinton in 2008 and Mitt Romney in 2012. It has sharpened the edge between the Parties and led to the partisanship that divides Congress.

Years ago, people like John Taft’s grandfather entered public service because they cared about the country – they wished to serve a nation they viewed as special, to preserve and expand a unique experiment that had done more good for more people than any other ever had. They knew that republicanism depends upon an informed and responsible electorate. They wanted government to maintain its historical and moral values. They came not for acclaim or riches, but to work as servants of the master that is the people. They came to Washington with differing views as to how best to perpetuate the American experiment, but they were all loyal sons and daughters of America. No one questioned their loyalty or patriotism. Together they were able to forge laws and maintain the security that met the needs of a polyglot nation.

The polarized world of growing extremism has been fed by many sources, but especially by gerrymandering, which has created “safe” seats for members of both Parties. In those districts the nominating process matters more than the election, allowing special interests to dominate. A number of seats on both the Left and the Right are occupied by Representatives more interested in staying in power than in doing what is right for the nation. But schisms have long existed among Republicans. In 1912, “progressive Republicans” nominated Theodore Roosevelt under the “Bull Moose” banner; by splitting the vote with Taft, he handed the election to Democrat Woodrow Wilson. In 1992, Independent candidate Ross Perot siphoned off enough votes from George H.W. Bush that the election went to Bill Clinton. More than anything, today’s divide is between those who believe in “big” government and those who believe in “small” government.

The rise of the Tea Party can be seen as a visceral, grass-roots reaction to the drift toward socialism, in which the state assumes more and more responsibility for the well-being of its citizens. While the intentions are honorable, such trends are fraught with risk for liberty. In Friday’s “Notable & Quotable” column, the Wall Street Journal quoted from C.S. Lewis’s essay anthology “God in the Dock”: “Of all tyrannies, a tyranny sincerely exercised for the good of its victims may be the most oppressive.” Lewis added: “To be ‘cured’ against one’s will and cured of states which we may not regard as disease is to be put on a level of those who have not yet reached the age of reason, or those who never will; to be classed with infants, imbeciles and domestic animals.” It was exactly that thinking that lay behind ObamaCare – the notion that we individually (and markets) are incapable of taking care of ourselves, and that only government, reaching insidiously further and further into our lives, can perform that task. Like pets, we are first pampered and then scolded.

The genesis of the Tea Party in early 2009 was reflexive and intuitive – perhaps not well thought out or well expressed, but a true grass-roots movement. It was a call for smaller government, one that recognized that laws must be obeyed and guidelines observed. The Tea Party was populated by average, middle-income Americans – plumbers, retirees, small businessmen. They were for a government that provided security for its elderly citizens and a safety net for those incapable of caring for themselves. But they wanted a government that operated with the acknowledged understanding that every dollar it received from taxpayers was a dollar taxpayers did not have – that tax dollars were sacred, to be used with care. They wanted a government that understood that prosperity stems from the private sector, while recognizing that government must provide the rules and their enforcement. They were concerned about political and corporate cronyism. They did not want a government that managed their lives, or one in which politicians saw power as an end rather than a means. They did not want a government that wasted money or intruded where it was neither needed nor wanted. Instinctively, they understood that government can influence behavior, but cannot dictate it and still maintain a republic.

The Tea Party has been demonized because it is seen as a threat by politicians of both Parties who treat politics as a career – who view themselves as entitled to all the trappings, power and wealth that come with being a Senator or Representative. Washington has made good men and women arrogant and supercilious, giving them the sense that they are the masters and the people their servants. When citizens give up their rights and responsibilities, they reap the whirlwind that is tyranny. In 1926, in a speech before the Daughters of the American Revolution, President Calvin Coolidge made that point: “It is not in violence and crime where our greatest danger lies…A far more serious danger lurks in the shrinking of our responsibilities of citizenship, where the evil may not be so noticeable but is more insidious and likely to be more devastating.”

Nevertheless, Republicans, pulled in opposite directions, have stumbled. They have let Democrats dictate the agenda. Too many are humorless, in a business where humor is an asset – one Ronald Reagan used to favorable effect. Too many Republicans sound defensive, knocking the programs of the opposition rather than promoting their own. Politics is “feeling,” as Governor Chris Christie has said. Platforms must be simple and understandable; they must explain why their policies matter to the freedom of the individual. The willingness to shut down the government was short-sighted and wrong. President Reagan had a 70% rule: if you agreed with him 70% of the time, you were his friend.

Republicans, in my opinion, have been on the wrong side of the immigration issue. Pettiness and meanness among them have handed the advantage to Democrats on this critical and sensitive issue. Phyllis Schlafly, the lawyer and conservative activist, had an op-ed in last Thursday’s Investor’s Business Daily: “For Republicans, Amnesty Will Be Political Suicide.” She wrote: “The current level of legal immigration here adds thousands of people whose views and experience are contrary to the conservative value of limited government.” Earlier, however, she had noted that the “borders test” (immigration versus emigration) “proves that people are coming to America, not fleeing from America to exit to other countries.” The two observations are inherently contradictory. My guess is that Ms. Schlafly underestimates the character and the value of immigrants. Most immigrants should naturally belong to the Party of individual freedom and less intrusive government – the Party that wants you to keep more of what you earn. It is the opportunity America offers that explains why immigrants have disrupted their lives, taken a chance and left the homes of their forefathers. Republicans have simply ignored them when they should have been cultivating them – speaking positively of the opportunities America offers aspirant, hard-working people, and explaining why Republicans are the Party best able to advance those goals.

Democrats have a natural constituency in the growing number of recipients of government largesse; Republicans should have a natural constituency among the providers of those gifts. Money is not free. Every dollar government spends is one less dollar in the hands of taxpayers, and every working person – including immigrants – recognizes that every entitlement paid out means higher taxes on workers. Democrats are generous with taxpayers’ money, but according to most studies, Republicans are more generous with their own. That speaks to character, and it is an easily understood concept, whether one comes from Bangladesh, Mexico, Illinois or New Hampshire.

The failure of the progressive state, as manifested in ObamaCare, should be Republicans’ trump card. Seven of the richest counties in America are in the Beltway region. Since the recession ended, the number of people employed has declined by three million. Over the past five years, the rich have become richer and the poor poorer. Since 1990, while the population has grown 25%, the number of people on food stamps has risen 140%. More people receive some form of federal assistance than work. That trend is ominous, and it should work to Republicans’ advantage.

Republicans do have a registration disadvantage to Democrats, but it is only about four points – 27% versus 31%. Independents, at 40%, are the largest group – a sign of discontent with both Parties. Nevertheless, Republicans hold 30 governorships and control half the state legislatures. Keep in mind that people’s political convictions are more accurately reflected in state and local elections than in national ones, where media hype and charisma play bigger roles. The idea that Republicans have met their Waterloo is wishful thinking on the part of Democrats. The Republican Party, like Mr. Twain, may be mortal, but it is too early for a requiem.
