Thursday, June 24, 2010

1948: The Fateful Year of the Israeli-Palestinian Conflict


©2010 Daniel A. Brown

As the world reacts in revulsion to Israel’s latest ham-fisted tactics against the Palestinians trapped in Gaza, it might be a good time to revisit 1948, the year that created both the State of Israel and the Palestinian Diaspora. Both sides have generated a fair share of self-serving myths, but there are truths and falsehoods in each account.

The events leading up to this critical impasse are far too many to relate here. During World War I, with the Middle East a critical battleground, the British were exploiting the nationalistic goals of both the Jews and Arabs in order to defeat the Ottoman Turks. Of course, when the war ended, the Brits and the French carved up the former empire into several new countries and maintained their own level of control over puppet leaders. And as Jewish immigration picked up, the native Palestinian population became more restive and resentful, a resentment that exploded in the Arab Revolt of 1936-1939.

This revolt was led by Haj Amin al-Husseini, the spiritual leader of the Palestinian majority (who went on to spend World War II in Berlin supporting the Nazis). The British crushed it so thoroughly that Palestinian social, military and governmental infrastructures were all but ruined, consequences that had a great deal of bearing on the events of 1948. The Jewish community, meanwhile, was quietly crafting its own nation-in-waiting, which left it better organized and prepared.

In November 1947, the now-famous UN Resolution 181 had divided the British Mandate into two separate homelands, one for the Jews and one for the Palestinians. However, both sides made it clear, either secretly or openly, that they would not be satisfied with this solution.

The Arab nations invaded the day after the Jews declared independence. Contrary to what American peace activists believe, the United States did not arm the Israelis (it wouldn’t do so in earnest until 1970, as part of the Cold War chess game with the Soviet Union). The US declared an arms embargo against all combatants and stuck to it. Israel got almost all of its weapons from Czechoslovakia and whatever it could finagle on the vast post-WWII surplus arms market. Still, according to various CIA reports, no one in the American government expected the Jews to win.

Oddly enough, most of the leaders of the invading Arab nations expected to lose. They knew that the Jewish fighters were better armed, organized, and most importantly, motivated, but after years of agitating the Arab Street against “Zionist aggression”, they couldn’t back down.

If you need a villain in this piece, the unlikely candidate is King Abdullah of Jordan. The center of gravity of this conflict has mostly been the West Bank, which had been designated as part of the new Palestinian homeland. Abdullah decided that he’d rather conquer it himself and therefore engaged in secret negotiations with both the Israelis and the British. The Israelis agreed for obvious reasons: better to have one less enemy at your throat. The British went along because they wanted Jordan to remain a client state, especially if the newly emerging Cold War with the Soviets got hotter. With this arrangement in place, the Arab Legion, Jordan’s superb British-trained army, swept into the West Bank, allowing Jordan to control it until 1967.

The other Arab armies were equally interested in grabbing chunks of Palestine for themselves instead of helping their Palestinian “brothers”, an attitude of neglect and exploitation that holds to this very day. They didn’t succeed. As history records, the Jews won but at the cost of 1% of their population. That doesn’t sound like much until you realize that 1% of the United States translates as three million dead Americans.

One contentious issue of the 1948 war involves who practiced ethnic cleansing and committed atrocities. The answer is that both parties were equally guilty of massacres, rapes and the blatant murder of prisoners. Who did more is irrelevant, given the mutually vicious nature of the struggle. The Israeli process of ethnic cleansing depended more on the whims of local commanders than on any national policy, as evidenced by the major Arab population centers that remain in Israel today. There were also several anarchic paramilitary units that operated separately from the main Israeli army, one of which, the Irgun, committed the infamous Deir Yassin massacre.

But Palestinian sympathizers choose to forget that not only did their friends commit their own share of atrocities in 1948 (the mass murder of the Etzion Bloc, for one) but had they won, the Jews would have been the ones ethnically cleansed out of their homeland. One wonders whether these activists would be expressing the same moral outrage if it were the Jews being oppressed today in Gaza or Ramallah. I think not.

The Israelis cynically believe that the world has a better opinion of Jews when they allow themselves to be exterminated (an observation recently validated by Helen Thomas). Needless to say, such a fate is not an option. But having proved to the world that they can defend themselves, the Israelis now must demonstrate a willingness to live peacefully with their Palestinian neighbors. At some point, when the religious and nationalist extremists on both sides are invalidated, perhaps that day might come about.


For further reading on this topic, I suggest “1948” by the Israeli revisionist historian Benny Morris.

Tuesday, June 15, 2010

The Parting of a Mother


©2010 Daniel A. Brown


Decades after extolling the virtues of youth, we Baby Boomers are finally discovering death.

We are doing this courtesy of our aging parents, many of whom are coming into their nineties and defying most of the conventional wisdom their offspring have mistaken for truth. For example, most of my generation has been led to believe that if they lived a healthy smoke-booze-red meat-free lifestyle with plenty of exercise and a daily meditation practice to render them blissfully positive, they would stay young and beautiful forever.

But by now I have attended too many funerals of friends who did all that and died before they reached 65, while my own parents, violating every one of the above precepts, lived a combined 194 years between them. However, they never told their children how to prepare for their eventual deterioration and demise, believing that they were going to live indefinitely. My Dad was still functioning quite well at age 95 and could beat me in arm wrestling, to his immense gratification. When I dared ask what I was to do when he got old, he snapped “I’m not old!” and walked out of the room. Inwardly, I knew that the day was coming when their lifestyles would implode and I dreaded that day, knowing how unprepared I would be. It was that proverbial ticking time bomb, unwanted yet inevitable.

And when it happened, it happened fast, and my sister and I were suddenly immersed in a crash course on the care of the elderly. My mother fell and needed 24-hour care, which I arranged, only to discover that the caregivers were using her failing memory to obtain multiple payment checks per week. Their apartment was sold to a new owner who wanted to evict them and go condo, so it was only a matter of time before we packed 44 years of life and memories into one truck and moved them both to Langdon Place in Keene. My dad died soon afterwards at age 100.

My mother was placed in the semi-independent wing, but her memory collapsed so rapidly that she had to be moved to the locked-down Alzheimer’s unit. And so it was that within a year, her life transformed from a spacious apartment overlooking Central Park to a tiny room right next to the nurse’s station. This new truncated life was as much a shock to us as it was to her. As her dementia cemented in place, she wondered when she would be leaving this nice hotel she was staying at and returning home to New York.

Mentally, I was going through my own roller-coaster ride, made bearable by accepting the sometimes irrational thoughts that were bouncing around inside my head. Initially, I resented my mother for not being young and vibrant anymore. I would stare at pictures of Mom when she was in her thirties and try to remember a mother who wasn’t presently gazing owl-eyed from a wheelchair and repeating the same questions a dozen times. I wanted my mommy back but knew that if I was going to stay emotionally healthy I would have to accept this woman under the new terms.

Which I learned to do. I also had to accept how little I knew about her, despite the family mythology that was based on as much fabrication as truth. And I had to confront her on some of the more dysfunctional elements of our turbulent family, a reckoning that happened during a lively two-hour car ride among the hills of southern New Hampshire. Suddenly, Mom was speaking with a clarity and an honesty I had never encountered before, and it was as if a mighty wall had come cascading down. Although those circumstances rarely returned, subsequent visits with her became events to look forward to instead of endure. In the last five years of her life, we saw more of each other and shared more than we had in the previous 35.

Her memory was all shot to hell by this time, remembrances flickering in and out like a faulty circuit breaker. She would relate recent conversations she had had with family members who were long gone and I would correct her and we would both joke about it. At times, her mind would clear allowing her to converse insightfully about aspects of my life she was interested in. I always brought an album of photographs which we would look at and even though she would get me, her father and her late husband all bollixed up, the act of sharing the images was pleasant for both of us.

As the end approached, slowly but irrevocably, she decided that she didn’t want to eat anymore or get out of bed. A week before she left, I asked her about this and she answered humorously, “I’m lazy!”, and then more cryptically, “It’s time for me to be the audience” meaning it was time for her to put aside her sense of duty and giving to others and receive. I kissed her on the forehead and told her I’d see her again soon. A partial truth.

Because when I did see Mom again, she was lying in bed almost mummified, unconscious under doses of morphine. Instinctively, I knew that her spirit had already departed a body that was slowly winding down. I sat with her during those final hours, holding her hand and waiting.

Our images of deathbed scenes are conditioned from too much television and hokey movies. In reality, there is no soaring background music, no poignant parting words. Mom was breathing laboriously and then she stopped. It was like a fan shutting off. In the background, I could hear the night nurses discussing a movie they were watching. The overhead light hummed. Mom was gone.

The tears came later, brought about from both loss and the loving support of my friends. At her service, I closed my eyes and found myself thanking God that she had been my mother, warts and all, and in my family, the warts were predominant creatures. But in the end, the thankfulness won out.

Sunday, June 13, 2010

Beckoned

©2008 Daniel A. Brown

When I was 18 years old, I found out that my deceased sister, Deborah, was, in fact, very much alive. She was supposed to have died in childbirth back in 1940, ten years before my birth. But in 1968, I was looking at my birth certificate out of curiosity. The names and occupations of my parents were duly noted as was another fact that subsequently changed my life. Under the heading: “How many children alive at birth”, someone had typed, “2”.

2?

That couldn’t be. There was only Janet, who was born two years before me, and if Deborah had died in childbirth in 1940, what was she doing alive in 1950? When I asked my parents, “Who’s Deborah?” their response was to jump out of their skins, shocked insensible by a name that they had never thought to hear again. Recovering, they informed me that I did, in fact, have another sister, one who had been born profoundly retarded (their words) and placed in the care of the State of Minnesota soon afterwards.

Years later, I got a job at Monson State Hospital in Palmer, Massachusetts and worked on the wards of Simons Building which housed total care residents. Monson was founded for people suffering from epileptic seizures and it was common until the 1960’s to incarcerate those who were so afflicted. Most of them were eventually transferred to halfway houses when state institutions were shut down decades later. But the people imprisoned in Simons Building weren’t going anywhere. They were alone and forgotten, never visited and never loved.

Except for Stevie.

Stevie, like most others on the ward, was a baby in a 25-year-old body. Each day, he remained in his crib, locked in a fetal position, seemingly oblivious to his surroundings. But every Sunday, something unusual would occur in his orbit. Stevie’s entire family would come for a visit, dressed in their best clothing. Surrounding his crib, they would talk to him like he was one of the family and try as best as they could to interact. There was nothing fake about it. It was an oasis of love that contrasted with the desperate apathy that otherwise encompassed his reality.

I marveled at this and began to think about my sister Deborah for the first time in years. Without revealing my intentions to my parents, I wrote to the state of Minnesota and learned that she was living at Faribault State Hospital right outside Minneapolis. On my first visit, I expected it to be grim like Monson but was amazed when I entered the wide, green campus. Faribault resembled a thriving, bustling college town. Patients, parents, and care workers were everywhere in abundance.

I went into Deborah’s ward in anticipation of our meeting; the workers there were glad to meet me, the first member of her family to appear in thirty-odd years. When she appeared, we went for a walk around the campus and I bought her a cup of coffee. She held and drank from it, much to the astonishment of her care workers, who told me that she had never done such a thing before. But there was no conscious recognition of me on her part. I visited her again years later when she had been transferred to a pleasant halfway house in the tidy suburb of Maple Grove.

I also decided to tell my parents who were still unaware of my visits over the years. I wrote to my mother and reassured her that Deborah was in good hands. Weeks later, I received a heartfelt letter that must have released decades of fear and guilt. She was relieved to hear Deborah was doing fine and thanked me for staying in touch with her. My father was less than enthusiastic and tried to discourage me from further contact with her. “You might wake up one day and find her on your doorstep!” he warned.

But several years later, Dad’s own health began to deteriorate. Moved to a local nursing home, he spent most of his time semi-conscious. After my last visit with him, I got a call from the facility warning me that he was failing fast. The next night, I was in an irritable mood, so out of sorts that I went to bed early and fell into a fitful half-sleep.

A vision, not a dream, appeared.

In it, Dad was on a wooden wharf, stepping onto a small boat. Seeing me, he beckoned in my direction, inviting me to accompany him on his impending voyage. Without hesitating, I waved him off, thinking out loud, “There is no way I am getting on that boat with you!” a reaction to a less-than-ideal lifetime relationship. He departed and I fell into a deeper slumber. At 3am, the phone rang and I knew instantly what it was. It was the home. Dad was gone. He was 100 years old.

Three days afterwards, I got a call from Minnesota. Deborah, too, had passed away. A week later, I got her quarterly medical report, dated a few days before her death. There was nothing seemingly wrong with her. She was in good health and was expected to remain so.

Apparently, Dad had beckoned to her as well and she had agreed to go, freed at last from her physical servitude and finally reunited with a father she had never known.

Tuesday, June 8, 2010

Honor Your Teachers

©2010 Daniel A. Brown


Ten years ago, I concluded my teaching career after a decade at a local rural elementary school. Those were perhaps the ten most constructive years of my life, and as any teacher can tell you, one never completely leaves the classroom, even after retirement. Whenever I travel to somewhere exciting, view an illuminating movie or program on TV, or read an interesting article, my first thought is always “The kids would love this,” and I proceed to mentally craft a lesson plan on the topic. Then the present reality kicks in and I realize that “the kids” are long gone, some raising children of their own.

I’ve always felt that everyone should teach at least once in their lives. Not only does it show you what qualities you have as a person but you are truly (and I mean no disrespect to veterans) serving your country in the highest sense. Although I was surrounded by children, I felt that I grew up into a full adult during my years among them. Nothing teaches responsibility, diplomacy and communication skills better than a profession where you have to deal flawlessly with children, parents, peers and administrators on a daily basis for years at a time. Like combat, teaching is learned in the field and it’s a long process of trial and error, patience and frustration brought to eventual fulfillment by dedication and support.

During that period, I heard a fair number of silly opinions voiced about teaching, mostly by people who wouldn’t last an hour in a typical classroom. The most common misconception is how “easy” teachers have it, working only six hours a day and having all those summers off. In reality, teachers spend hours at home grading papers and preparing lesson plans, researching information and gathering materials. Summers are spent taking professional development courses so that they can qualify for re-certification every five years. Beyond that, there is the mental quotient. Unlike workers in most jobs, who can leave their work behind after 5pm, we take it home with us because, as noted above, we are always thinking about the kids. And not only thinking about them but fussing, exulting, worrying, applauding and, quite frankly, praying over them, because their needs and potentials are always in the backs of our minds. As teachers become fixtures in the local communities, we become friends with our students’ families, attend their houses of worship, watch ballgames and dance recitals, and participate in after-school programs. And when they leave our care and become adults, we go to their weddings and, sadly, sometimes their funerals. When they have kids of their own, I glow like a grandparent.

So that cliché, “those who can’t do, teach,” has it all wrong. Those who care, teach.

I’ve always considered teaching to be a creative art, not unlike music or painting, and I was lucky to depart before teaching to the test became the standard method of American education. While necessary in some venues, it mostly misses the gifts all children have, some in realms that have little to do with arid academic memorization. I once had a sixth-grade student who wrote on the first-grade level and could barely cough up a book report. But when asked to tell the story of what he had read, he verbally recounted the tale with a detailed thoroughness that would have made Shakespeare proud. Another child routinely flunked exams but was an expert on oceanography, an interest she discovered on her own while engaged in a dozen other creative endeavors. The boy who once looked like a candidate for reform school is now a successful and proud father of two, and one girl who never attended college is a resourceful entrepreneur. I never met a kid who didn’t have a gift to share, whether the tests revealed it or not.

And there are some fine teachers out there, whether or not they are publicly noticed or commended. One reason my school worked so well was the diversity of its teachers, who brought a wide range of backgrounds and experiences into the mix, an example to young people that those who have different styles and values can not only get along but complement each other as well. In particular, I owe a debt of gratitude to our master teacher, who provided the kind of mentoring that is the backbone of successful education. She is a former Catholic nun; I’m a former Jewish hippie, and our teaching styles could not have been more dissimilar. But because of our mutual love for our kids and respect for our profession, we developed not only an excellent working relationship but an enduring friendship as well.

In terms of pay, teaching stinks, but that is beside the point. No one is in it for the money. While Wall Street bankers get annual million-dollar bonuses for ruining people’s lives, teachers are lucky if they get several coffee mugs with “You’re an A+ Teacher!” printed on them. But what makes it all worthwhile are those rare moments when, years later, one of our grown-up students greets us on the street and casually mentions that we not only made them enjoy learning but changed their lives for the better. These are their words and they are worth more than a million bucks.

So as the school term comes to an end, please take the time to thank the teacher who either made a difference in your life or the lives of your children. While the coffee mugs are nice, the recognition of a job well done is worth even more. Although the following term has been grossly overused, teachers are indeed true American heroes and should be recognized by our local communities as such.

That Old Nuclear Threat is Not Completely Gone

©2010 Daniel A. Brown


Lately, I’ve been thinking about nuclear war.

I can thank the recent bomb found in Times Square, which could easily have been nuclear, for planting such a topic in my mind. True, it’s not the jolliest of subjects to contemplate, but if you need a clue as to why the Baby Boomers are so neurotic, it’s worth noting that they were the first generation in history to grow up under the threat of total annihilation. I’ve personally considered it a minor miracle that there was never a thermonuclear war between the United States and the Soviet Union during those long decades when such a horror was not only likely, but considered inevitable. Not that we didn’t come close, especially during the Cuban Missile Crisis of October 1962. We should all thank our lucky stars that we had the cool-thinking John F. Kennedy at the helm, deflecting the hawks who wanted us to invade Cuba. Had the likes of Sarah Palin or Dick Cheney been president at that juncture, it’s a good bet that the United States would still be a radioactive wasteland.

The concept that a full-scale nuclear exchange between the US and the USSR would result in what eventually became known by its apt acronym MAD, or “Mutually Assured Destruction,” was still far from accepted during the early and middle decades of the Cold War. Herman Kahn, a RAND Corporation military strategist, conjectured that a nuclear war would probably be “an unpleasant experience” but that things would eventually get back to normal. The Post Office would resume delivering the mail and we’d all go back to viewing our favorite programs on television. Kahn further suggested that the government should offer homeowner’s insurance against nuclear bomb damage.

That such a monster in human clothing could be in a position of power and influence tells you just how closely we dodged the most lethal of bullets. Fortunately, he was countered by more sane analysts who predicted that a nuclear conflagration would instead reduce humanity to “medieval levels”.

The popular media has, at times, tried to portray atomic warfare in ways that transmit its full terror while still retaining some form of entertainment value. The most well-known attempt is ABC’s “The Day After,” released in 1983. Despite some dramatic and poignant scenes, it remains a sterile and forgettable offering. Not so Peter Watkins’ “The War Game,” a World War III “documentary” made in Great Britain in 1965 and deemed so horrifying that the BBC banned it from its airwaves for a full twenty years. Filmed in black and white with a flat-voiced narrator, it spares the viewer nothing. When I first saw it in college, I fled the theater in panic.

Two contemporary books explored the evils of nuclear warfare with equal realism. “Alas, Babylon”, written by Pat Frank in 1959, focuses on a small town in central Florida surviving amid a nation that has been completely destroyed by World War III. It is not until the final page of the tale that the ragged survivors learn that the United States has, in fact, won the war, but note that their “victory” is rendered meaningless by their current unendurable reality. Such a view was considered heretical during the Eisenhower era which might explain why the book was never a runaway best seller.

“Warday” by Whitley Strieber and James Kunetka enjoyed a better fate, receiving accolades from such distinguished personages as Senators Ted Kennedy and Mark Hatfield as well as our own Randy Kehler. “Warday” also plays as a documentary as the two authors travel across the United States five years after a “limited” nuclear attack by the Soviets that lasts a mere thirty minutes. The eventual result of that half hour is 60,000,000 American dead from blast, radiation sickness, famine and the effects of the annual flu on a severely weakened population. Although our nation survives, it is a radically transformed and impoverished society.

But since the Cold War ended, the threat of a worldwide nuclear holocaust has receded. Unfortunately, the planet has been rendered more vulnerable to the threat of nuclear terrorism, as well as to secondary nations arming themselves with the same weaponry. Be assured that even a minor nuclear conflict between Israel and Iran, two volatile nations bordering the Middle East oil fields, would reduce the American, if not the global, economy to “medieval levels,” not to mention the horrific loss of life.

This further emphasizes that there is no longer any distinction between civilian and combatant in nuclear warfare. The human race, therefore, would do well to heed the words of a fictional character in “Alas, Babylon,” a retired admiral, who makes the following observation about the old Cold War rivals: “Once both sides had the maximum capacity in hydrogen weapons and the means of delivering them, there was no sane alternative to peace.”

Fortunately, we Americans and our Russian adversaries eventually stumbled to this realization. Let’s pray that the newer nuclear-tipped powers follow suit as well.


Is the Tea Party Movement a Right-Wing Front?

©2010 Daniel A. Brown


Remember the “Militia” movement from the 1990’s? If you recall, these were armed groups of men and women, galumphing through the boonies in their cammies and vowing to defend Liberty from the excesses of “Big Government”. At the time, I predicted that their beef was with Bill Clinton’s quasi-liberal government and, if a right-wing conservative re-appeared in the White House, all these so-called “patriots” would strangely disappear.

Which they did, of course, though not before one of their crazier number perpetrated the Oklahoma City bombing in 1995, the worst act of terrorism on American soil before 9/11. And, true to my prediction, they remained completely silent during the Bush-Cheney years when the Republican majority assaulted the Constitution, deregulated Wall Street and replaced the Clinton surplus with a whopping deficit.

Well, I figured that they were back, but maybe under a new guise, namely the Tea Party Movement, which has gotten all kinds of press over the past year, especially after a summer of town hall meetings, gun-wavings, and rallies where some of the signs portraying President Obama harkened back to something you’d see in Mississippi during the Jim Crow era.

After reading all the punditry about them, I went on Facebook to find out what kind of people made up this movement, which is supposedly a leaderless polyglot of concerned American citizens. Entering various Tea Party discussions, I found the personal contact I established there to be both illuminating and alarming. Illuminating, because there are many sincere people involved who are scared to death that their government’s debt is spiraling out of control. Most of them bent over backwards to convince me that they hold both Republicans and Democrats in equally low regard, and some polls validate that sentiment. If anything, they were excited to be politically energized, some for the first time in their lives. I brought my usual left-wing conservative (yes, we exist) viewpoint into the mix, which inspired several people to invite me to join their movement as a fellow traveler.

But I demurred on their kind offers because not one of them could honestly answer the following question:

“Where were you when George W. Bush and the Republican neo-conservatives were expanding Big Government from 2000-2006, wasting billions of our dollars on tax cuts for the rich and bogus wars?”

A simple question, really, but one that was never answered. Mostly, I was admonished for “harping on the past,” as if four years ago were an era frozen in amber. Others promised to hold all politicians’ feet to the fire (a phrase they borrowed from the Progressive Left, whom they consider their natural adversaries) regardless of party. Another question the Tea Partiers dodged was this: if Big Government is so awful, are you willing to give up all those goodies you get from it, like Medicare, Social Security, the G.I. Bill, and farm subsidies? This, too, was met with silence or evasive double-talk.

Essentially, the Tea Party espouses three main themes: throw the bums out, cut taxes, and let the free market take care of itself. All of which sounds uncomfortably like the philosophy of arch-conservative Grover Norquist and his ilk.

One wonders if the Tea Party folks realize (or care) that many of the bums they want to throw out are the same bums they elected between 1994 and 2008, and if they eject them, what’s to prevent a new bunch of bums from taking their place in 2010? Part of this conundrum is the American voters’ limited attention span and their inability to realize that they need to participate in the democratic process instead of merely complaining about it.

As for tax cuts, Tea Partiers forget that if they want good schools, efficient police and fire departments, and usable roads and bridges, they have to pay for them, and in our system, taxes are how that is done. Several individuals expressed sympathy with Joseph Stack, the man who killed himself flying his airplane into the IRS building in Texas, while others made a point of distancing him from the movement.

And the free market? The Tea Party would rather have Big Oil, Big Pharma, Big Insurance, and Big Banks play them for suckers and ruin their lives than allow the federal government to regulate them. This is considered “a return to the free market principles on which this nation was founded.” Go figure.

Unfortunately, if you read the language on the various Tea Party websites and check out their espoused causes (not to mention all the ads promoting Sarah Palin’s book), you cannot help but conclude that this is a conservative movement which, at its extreme, echoes the same kind of right-wing paranoia that motivated Timothy McVeigh. Despite the honorable intentions of many of its members, populist movements have historically tended to find their energy in fear and extremism, yielding demagogues who range from monsters like Adolf Hitler to minor-league thugs like Huey Long and Hugo Chavez.

And herein lies the danger. Movements like the Tea Party, as noble as they appear to their followers, are eventually co-opted by sleazy pols who use that populist rage for their own benefit and ambition. And once elected, they will thank their erstwhile supporters, laugh at them behind their backs and then begin to initiate their own agenda, one that has little or nothing to do with the sentiments of those who placed their trust in them in the first place.

America Before the New Deal: How FDR's "Socialism" Saved Our Nation

©2010 Daniel A. Brown



Chances are most of you reading this have never heard of Lorena Hickok. Although she is known in the gay-lesbian community as Eleanor Roosevelt’s confidante and rumored lover, her impact on American history was far broader than that titillating tidbit.

Hickok grew up in the Midwest, became a newspaper reporter and had a reputation for being both physically and emotionally expansive. In that curious 1920s definition of feminism, she could reputedly out-cuss and out-drink most of her male companions while playing a mean game of poker. Her rise in the world of journalism culminated in her reportage of Mrs. Roosevelt during the 1932 presidential campaign.

Mrs. Roosevelt’s husband entered the White House the following year and faced a nation slowly sinking to the nadir of the Great Depression. FDR had hundreds of ideas and an equal number of experts, but what he really needed was someone on the ground who could describe what the average American was up against on a daily basis. Harry Hopkins, director of the Federal Emergency Relief Administration, charged Hickok with the task of traveling across the length and breadth of the United States, talking with people in all walks of life and, in too many cases, no walks at all. He didn’t need statistics or bureaucratic jargon. He wanted the straight truth, sent to him in weekly reports.

And so, Hickok loaded up her automobile, nicknamed “Bluette,” headed out on the road and gave him just that.

What she discovered was a rural America that resembled a dilapidated, impoverished Third World country, with a population that was alternately terrified and apathetic about its dismal fate. The term “poverty” back then did not mean what it does for current Americans who live below the poverty line but still have a car and a television. It referred to our ancestors who owned nothing more than the clothes on their backs and endured slow starvation. As her quest intensified, Hickok discovered that this level of misery had been in place long before the Depression heightened its effect and brought it out into the light of day.

“Oh, the crushing drabness of life here,” Hickok reported from the Great Plains, “and the suffering of both people and animals. The people here are in a daze. A sort of nameless dread hangs over them. Half the people, farmers particularly, are scared to death.” The stark statistics gave them reason to be. In 1934, the per capita income of a typical American farmstead was $167. Families lived in hovels where the children’s shoeless feet were purple from cold. Only one home in ten had an indoor toilet and only one in five had electricity. Millions of rural Americans, afflicted with malnutrition, parasites, and deficiency diseases, had no hospital or even a public health nurse to turn to.

Not surprisingly, the most desperate conditions were found in the American South and in Appalachia. In the first, she found people who were “half-starved” and “struggling in competition for less to eat than my dog gets at home.” She met children and their parents who were not only completely illiterate but, in some cases, inarticulate as well. Sharecropping, which was the fate of nearly the entire African-American population in the South, reduced them to a level of servitude merely a fraction above slavery. Secretary of Agriculture Henry Wallace, on his own journey of discovery, witnessed poverty in the cotton states “so abject” that European peasants were better off by comparison.

Conditions were no better in coal mining country, where miners were forced to work for less than a dollar a day while living on a diet of flour, water, and lard that was “actually below domestic animal standards,” according to United Mine Workers president John L. Lewis. Like the sharecroppers, these miners existed in a semi-feudal environment, living in company-owned towns and chained in debt to company-owned stores. Starvation, disease and malnutrition were widespread. “It’s fairly common to see children entirely naked,” Hickok observed. “Dysentery is so common that nobody says much about it.” She added, however, that babies so stricken routinely perished.

Hickok ultimately returned home, and her combined reports added fuel to FDR’s eventual New Deal policies. Derided as “Socialism” by conservative critics, flawed and possibly unconstitutional at times, they nevertheless allowed the United States to begin its transformation from a demoralized and destitute nation to one that enjoyed the highest standard of living in human history a mere twelve years later.

A fact that is conveniently forgotten in some quarters today. But Hopkins’s advice to Hickok as she began her hegira still rings true and would be sound advice for all who pontificate about the state of the American people.

“Go talk with the preachers and teachers, businessmen, workers, and farmers,” he told her. “Go talk with the unemployed, those who are on relief and those who aren’t. And when you talk with them, don’t ever forget that but for the grace of God you, I, or any of our friends might be in their shoes.”