
The end of Leslie Berlowitz’s reign at American Academy of Arts and Sciences – about academic integrity, management style, and?

(This article was originally posted on September 5, 2014, on my Facebook community page, History, Culture and Politics.)

On July 31, 2013, Leslie Berlowitz, president and chief executive of the prestigious American Academy of Arts and Sciences, resigned following reports she had embellished her resume.

Berlowitz, who had overseen the honorary society for 17 years, had been on paid leave for over a month after The Boston Globe reported that she had falsely claimed a New York University doctorate and misstated her work history in federal grant applications and other documents.

Berlowitz also came under fire for harshly treating staffers, micromanaging the Academy’s affairs, barring scholars from viewing the Academy’s historic archives, and receiving an outsized pay package—more than $598,000 in fiscal year 2012 alone for an organization with only a few dozen staffers, several times what her peers at other institutions were paid.

(“Embattled head of American Academy of Arts and Sciences resigns after questions about resume“, Todd Wallack, July 25, 2013, Boston.com)

In June, The Globe had reported that in at least two applications for federal grants over the past decade, Berlowitz had stated she received a doctorate in English from New York University in 1969. For example, in the application for funding from the National Endowment for the Humanities she claimed to have a “D. Phil” — the British abbreviation for a doctorate of philosophy or PhD — from NYU. In an employment ad, the Academy also repeatedly described her as a doctor.

(“No record of academy head’s doctoral degree: Where deeds are honored, one is in doubt“, Todd Wallack, June 4, 2013, Boston.com)

The nonexistent doctorate was also in a draft of an obituary the Academy prepared for use in the event of her death. The obituary praised her as “a scholar of American literature” who “received undergraduate and doctoral degrees from New York University”.

(Todd Wallack, June 4, 2013, Boston.com)

NYU spokesman James Devitt said the university had no record of Berlowitz receiving a doctorate or completing her dissertation. A resume on file at NYU from when Berlowitz worked there indicated she was still working on her doctorate in the late 1980s or early 1990s.

Academics typically have little tolerance for people exaggerating their educational credentials. At other academic institutions, people who fabricate degrees have often faced severe consequences. Marilee Jones, a popular admissions dean at the Massachusetts Institute of Technology, left in disgrace in 2007 after she admitted falsifying her degrees, and Doug Lynch, a vice dean at the University of Pennsylvania, resigned in 2012 after revelations that he had falsely claimed to have a doctorate from Columbia University.

“In most situations at a university, lying about a professional degree would be grounds for instantaneous dismissal”, said Ronald G. Ehrenberg, director of the Cornell Higher Education Research Institute. “In academia, academic integrity is what we hold most dearly.”

(Todd Wallack, June 4, 2013, Boston.com)

The controversy attracted national attention because of the institution’s prestige. Founded during the American Revolution by John Adams, John Hancock, and other Harvard College graduates as Boston’s answer to Benjamin Franklin’s American Philosophical Society in Philadelphia, the American Academy of Arts and Sciences conducts research, holds lectures for members, and elects scores of the brightest scholars, artists and leaders every year.

As the controversy went public, New York University released a summary of Berlowitz’s NYU career record:

“Leslie Cohen (Tuttleton) Berlowitz began her career at NYU in 1968 as a graduate assistant and teaching fellow in the English Department at NYU’s University College at University Heights in the Bronx. In 1970, while a graduate student and teaching assistant, she became one of three assistants to the dean of University College. In 1971 Berlowitz also became a lecturer in the English department. With the closing of the Heights campus in 1973, she moved to the Washington Square campus and was Assistant Dean for Administration at Washington Square and University College from 1972 to 1974. From 1974 on, Berlowitz held several administrative positions in goals planning and academic planning and affairs including Assistant Vice President for Academic Affairs 1981-1984, Associate Vice President for Academic Affairs 1984-1988 and Deputy Vice President for Academic Affairs 1988-1991. In addition, she was Director of the Humanities Council from 1979 through 1997. Her last position at NYU was Vice President for Institutional Advancement from 1991 through 1997 when she left the university to become executive officer of the American Academy of Arts and Sciences.”

(“Guide to the Records of the Associate Vice President for Academic Affairs, Leslie Berlowitz 1970-1980“, June 14, 2013, New York University Archives)

The NYU record indicates a fast career launch and smooth rise for Berlowitz within NYU, on an administrative track: in 1970, as a graduate student, she became an assistant to the dean; a year later she was on the faculty, and 2 years later Assistant Dean for Administration. From 1981 on, she was a university-level executive as Assistant Vice President, Associate Vice President, and then Deputy Vice President for Academic Affairs, and in the 1990s, before moving to the Academy, she was Vice President for Institutional Advancement.

Others noticed that her Academy resume had identified her as a former NYU vice president for academic advancement, her most senior NYU position, when the title was actually vice president for institutional advancement, a fund-raising role rather than an academic one.

It should have come as no surprise what Berlowitz was and wasn’t.

At the Academy, she impressed directors by raising money, balancing the budget, and launching new programs, according to several former members of the governing board (council). She helped to energize a once-sleepy institution by stepping up fund-raising and launching new initiatives, such as modernizing the categories of fellows, including adding the fields of Computer Science and Philanthropy.

While it appeared that her past resume fudging had now ended her long reign at the Academy, Berlowitz’s critics had in fact tried a number of times to oust her, for other reasons, without success.

In 1997, her first year at the helm of the Academy, Berlowitz was almost fired because of her heavy-handed management style, according to a former member of the governance council. Robert Haselkorn, a professor of Molecular Genetics and Cell Biology at the University of Chicago, told the press in 2003, “I have been trying to get rid of her for the past seven years.”

(“A controversial subject at the academy: Is Leslie Berlowitz reviving the intellectual institution or just dividing it?“, Alex Beam, August 23, 2003, Boston.com)

In 2000, Roger Myerson, an Economics professor at the University of Chicago and vice president of the Academy’s Midwest Center, tried to get the council to move Berlowitz out of administration to concentrate on her forte, raising money. Myerson also opposed the appointment of Boston businessman Louis W. Cabot to the Academy’s vice presidency. According to Myerson, “The administration was not being monitored full time by somebody who really cares about scholarship”.

(Alex Beam, August 23, 2003, Boston.com)

Then-Academy president Dan Tosteson, a former dean of the Harvard Medical School, had hired Berlowitz in 1997 while commissioning a strategic plan to transform the Academy into a broader, more diverse national organization. But by 2003 Tosteson and Dudley Herschbach, a Nobel Prize-winning Harvard Chemistry professor, “made a thorough investigation of her performance and found it to be very uneven”, Tosteson said. “Everyone told us the same story”, Herschbach said. “She was an incredibly nasty person who chewed people out in unacceptable ways. She kisses up and kicks down.”

(Alex Beam, August 23, 2003, Boston.com)

In 2003, faced with opposition within the Academy’s top level, Berlowitz mobilized support from other members of the Academy’s 17-member governing council, including former MIT professor Carl Kaysen, former Academy president Leo Beranek and former Harvard professor of Political Economy Francis Bator. Berlowitz pointed out that her 7th anniversary at the Academy was just around the corner. “Maybe we should have a party”, she mused. “A survival party.”

(Alex Beam, August 23, 2003, Boston.com)

Berlowitz’s 7th anniversary survival then turned into a personal triumph: the governing council voted to make her a member of the Academy.

Every year, the members of the Academy of Arts and Sciences elect some of the world’s most accomplished scholars, artists, and leaders to join their institution. The 2004 election took place in the spring, but shortly before the October induction ceremony the 17-member governing council decided to add one more name of its own: Leslie Cohen Berlowitz.

The Academy then quietly inserted Berlowitz’s name into the original 6-month-old announcement, making it look as though she had been voted in by the roughly 4,000 members in the spring. “It was a terrible thing to do”, Stanford University History professor emeritus Peter Stansky, a former council member, said. “It’s a lie.”

(“Academy’s council added its chief to honoree list: 2004 selection, executive’s role in annual process draw criticism“, Todd Wallack, June 18, 2013, Boston.com)

An Academy spokesman noted that the council had the option of electing one candidate a year on its own (since increased to two) under the Academy’s bylaws, and that Louis W. Cabot nominated Berlowitz based on her service to the Academy.

In 2009, Louis W. Cabot became chairman of the governing council, and in 2010 Berlowitz consolidated control of the Academy by also taking over the title of president, a position previously reserved for an honored scholar from outside the administration, such as Dan Tosteson who had hired Berlowitz and then tried unsuccessfully to remove her.

So it was no coincidence that in July 2013, when Berlowitz was to step down, council chairman Louis W. Cabot, in the letter announcing the decision to the Academy members, also announced his own departure in October, to be replaced by Don M. Randel, former president of the University of Chicago and of the Andrew W. Mellon Foundation, for a 3-year term.

(“Embattled President of American Academy of Arts and Sciences to Resign“, Jennifer Schuessler, July 25, 2013, The New York Times)

It appeared that “academic integrity”, which academics “hold most dearly”, was what mattered and prevailed once Berlowitz’s resume altering was discovered and publicized.

Berlowitz believed it was just a distraction. “I never intentionally misrepresented my accomplishments to obtain an improper benefit for the academy or for myself”, Berlowitz said. “The current debate has become a distraction for the academy.”

(Todd Wallack, June 18, 2013, Boston.com)

Still, some critics felt that Berlowitz had also become overly involved in the member-election process, acting as a gatekeeper for who gets in and who stays out based on her friendships or other reasons. Several former employees said she pushed committees to add or drop candidates, and demanded to see all the ballots before they were tallied by the membership office.

“There needs to be a complete inquiry into how the academy has been managed, across the board, including how the academy chooses fellows”, demanded Jean Strouse, a biographer inducted into the Academy the same year as Berlowitz.

(Todd Wallack, June 18, 2013, Boston.com)

But even without digging into the delicate issue of the election ballots, the press has carried a considerable amount of information about the larger controversies of Berlowitz’s reign at the Academy.

During that time, the number of business executives and philanthropists inducted annually rose from roughly 7 to 11, including philanthropist Teresa Heinz Kerry, New England Patriots owner Robert Kraft, and former Liberty Mutual chief Edmund F. Kelly. In fact, 5 of the Academy’s 6 biggest donors, accounting for more than 1/3 of the $39 million the Academy raised from 2006 to 2010, were either inducted into the business and philanthropy category themselves or had the heads of their foundations inducted.

For example, Boston Scientific Corp. cofounder Peter Nicholas, who became a member in 1999, gave $2.4 million during the period. John Cogan, a Boston investment executive who joined the Academy in 2005, gave $1.9 million. And Gershon Kekst, who founded a prominent Wall Street communications firm and was elected in 2006, gave $1 million through his family’s foundation.

“Honoring the mere accumulation of wealth taints the honor of authentic achievements in the arts and sciences”, said James Miller, former editor of the Academy’s scholarly journal, Daedalus. “It’s supposed to be an academy, not a highfalutin club for the leisure class.”

(Todd Wallack, June 18, 2013, Boston.com)

An Academy spokesman, however, noted that the institution has always included business leaders. Ray Howell said that philanthropists and business leaders are typically among the most generous donors for nonprofits, but he declined to say who picked the executives to appear on the Academy’s ballot or what criteria they used. Several members said they did not know either.

In 2012, Hillary Clinton, Melinda Gates and Sanford “Sandy” Weill, a prominent New York businessman and corporate executive, were among the new members of the Academy.

(“Academy of Arts and Sciences Honors Creator of “Too Big to Fail”“, Noel Brinkerhoff and David Wallechinsky, April 20, 2012, AllGov.com)

A former president of the American Express Company and Chairman and CEO of its subsidiary Fireman’s Fund Insurance Company, former Chairman of Primerica Corporation, and former Chairman and CEO of Travelers Group, Weill became Chairman Emeritus of Citigroup after retiring as its CEO on October 1, 2003, and as its Chairman on April 18, 2006. He was elected a member of the Academy in 2012 for his contributions to banking and philanthropy.

Honorary Chairman of the Committee Encouraging Corporate Philanthropy, a nonprofit forum of CEOs and Chairpersons, Sanford Weill and his wife Joan had donated more than $800 million to non-profit organizations, especially for healthcare, including to Weill Cornell Medical College and Memorial Sloan-Kettering Cancer Center in New York City.

(“Sanford I. Weill and Dr. Joseph J. Fins Elected to American Academy of Arts and Sciences“, April 17, 2012, New York Presbyterian Hospital/Weill Cornell Medical Center/Weill Cornell Medical College)

Noted political journalist Robert Scheer became really indignant about this one, writing:

“How evil is this? At a time when two-thirds of U.S. homeowners are drowning in mortgage debt and the American dream has crashed for tens of millions more, Sanford Weill, the banker most responsible for the nation’s economic collapse, has been elected to the American Academy of Arts & Sciences.

So much for the academy’s proclaimed “230-plus year history of recognizing some of the world’s most accomplished scholars, scientists, writers, artists, and civic, corporate, and philanthropic leaders.” George Washington, Ralph Waldo Emerson and Albert Einstein must be rolling in their graves at the news that Weill, “philanthropist and retired Citigroup Chairman,” has joined their ranks.

Weill is the Wall Street hustler who led the successful lobbying to reverse the Glass-Steagall law, which long had been a barrier between investment and commercial banks. That 1999 reversal permitted the merger of Travelers and Citibank, thereby creating Citigroup as the largest of the “too big to fail” banks eventually bailed out by taxpayers. Weill was instrumental in getting then-President Bill Clinton to sign off on the Republican-sponsored legislation that upended the sensible restraints on finance capital that had worked splendidly since the Great Depression.

At the signing ceremony Clinton presented Weill with one of the pens he used to “fine-tune” Glass-Steagall out of existence, proclaiming, “Today what we are doing is modernizing the financial services industry, tearing down those antiquated laws and granting banks significant new authority.” What a jerk.

Citigroup went on to be a major purveyor of toxic mortgage-based securities that required $45 billion in direct government investment and a $300 billion guarantee of its bad assets in order to avoid bankruptcy.”

(“For He’s a Jolly Good Scoundrel“, Robert Scheer, April 18, 2012, truthdig.com)

“The banker most responsible for the nation’s economic collapse” was elected a member of the American Academy of Arts and Sciences?! Now that’s surprising.

Let’s see how the mainstream press described Sanford “Sandy” Weill. Two years prior to his election by the 4,000 members of the Academy, journalist Katrina Brooker had interviewed Weill and The New York Times published her detailed story about him in January 2010, under the title “Citi’s Creator, Alone With His Regrets”:

“Over the last two years, Mr. Weill has watched Citi — a company he built brick by brick during the final act of a 50-year career — nearly fall apart. Although every taxpayer in the country has paid for Citi’s outsize mistakes, for Mr. Weill the bank’s myriad woes are a commentary on his life’s work.

Citi’s troubles are well chronicled: a failure to integrate its disparate parts worldwide or to keep tabs on risky investments and free-wheeling operations. These lapses led to billions of dollars in losses and multiple bailouts, and the government now owns a quarter of the company. Citi’s shares fell from a high of $55.12 in 2007 to about a dollar early last spring, and now trade at $3.31.

“Sandy took advantage of changes in the industry to build a financial colossus,” says Michael Holland, founder of Holland & Company, a money management firm. “In the end it didn’t work, and we are now paying for that as taxpayers.”

One news item, in particular, was crushing: Last winter, The New York Post ran a picture of Mr. Weill on its front page with the headline, “Pigs Fly: Citi Jets Ex-C.E.O. to Cabo.” He had taken the corporate plane to vacation in Mexico, weeks after Citi had accepted a $45 billion taxpayer bailout. The flight provoked a public outcry and media frenzy.”

(“Citi’s Creator, Alone With His Regrets“, Katrina Brooker, January 2, 2010, The New York Times)

Wait: money manager Michael Holland said Weill merely “took advantage of changes in the industry to build a financial colossus”, despite all the media fuss about his ‘pig-flying’.

But some critics, not just Robert Scheer of Truthdig, saw Weill as the villain, or, as Katrina Brooker put it, “the architect” of the chaos:

“Mr. Weill built his wealth, status and power by creating what was once the world’s largest bank. Now, as Citi struggles to regain its footing, Mr. Weill’s legacy has taken on a darker hue. Though he was once viewed as a brilliant dealmaker, some critics now cast him as the architect of a shoddily constructed, unmanageable financial supermarket whose troubles have sideswiped investors, employees and average citizens nationwide.

“The dream, the mirage has always been the global supermarket, but the reality is that it was a shopping mall,” says Chris Whalen, editor of The Institutional Risk Analyst, of Citi’s evolution over the last decade. “You can talk about synergies all day long. It never happened.”

Old accomplishments — once sources of admiration — now draw criticism. Mr. Weill’s successful push to repeal the Glass-Steagall Act is under attack. To create Citi, he fought to change laws that had prevented banks, insurers and brokerage firms from merging. But in the wake of the economic crisis last year, Congress has introduced laws to reinstate parts of the legislation. In November, Mr. Weill’s former co-C.E.O. at Citi, John Reed, told Bloomberg News that he was sorry for his role in helping to end Glass-Steagall.”

(Katrina Brooker, January 2, 2010, The New York Times)

Weill focused on talking about Citigroup, acknowledging the bank’s failure and expressing personal sadness about it, but otherwise remained unrepentant about his own responsibility, blaming the collapse on having picked the wrong successor, Chuck Prince:

“During a series of recent interviews, Mr. Weill spoke candidly about the loss, frustration and humiliation caused by Citi’s fall. “I feel incredibly sad,” he says. He remains baronially wealthy, but says he has endured financial pain, too: until a year ago, he says, the bulk of his investment portfolio was split equally between Citi stock and Treasuries.

“It’s never going to be the same company that it was,” he said one morning shortly before Christmas.

Mr. Weill says that the model on which he built the company was not at fault, that it was the management that failed. For this, he accepts partial responsibility.

“One of the major mistakes that I made was my recommending Chuck Prince,” he says of his handpicked successor, who ran the company from 2003 to 2007. Mr. Weill blames Mr. Prince for letting Citi’s balance sheet balloon and taking on huge risks.

In addition to initially supporting Mr. Prince as C.E.O. — even though Mr. Prince had never run a bank — Mr. Weill also pushed out Jamie Dimon, a well-regarded banker who now runs JPMorgan Chase. And Mr. Weill personally recruited Robert Rubin to Citi after Mr. Rubin stepped down as Treasury secretary. Mr. Rubin, who has since left Citi and declined to comment about his tenure there, has been criticized as failing to help rein in the bank’s excesses.

“Look what it’s done,” he says. “It’s hurt the dreams of so many people.””

(Katrina Brooker, January 2, 2010, The New York Times)

Well, given that former President Bill Clinton’s Treasury Secretary Robert Rubin, personally recruited to Citigroup by Weill, couldn’t prevent the debacle, perhaps there was some merit to “Sandy’s” conviction.

By the time of the recent financial crisis, Weill had retired; he then focused his retirement energy on charitable fundraising and giving:

“These days, Mr. Weill keeps busy with charities and his personal investments. He is up at 5 a.m., reads all the papers, turns on CNBC. He is chairman of Carnegie Hall, Weill Cornell Medical College and the National Academy Foundation and is on the boards of six other institutions.

His foundation gave $170 million in cash last winter to Weill Cornell; such generosity has endeared him to the philanthropic world. He has raised $950 million for Weill Cornell’s $1.3 billion fund-raising campaign and recently put together a $110 million bond offering for Carnegie Hall.

“It was like being back in business again,” he says. “I get the same kind of kick by getting somebody to make a major charitable contribution. It’s the same kind of adrenaline rush.””

(Katrina Brooker, January 2, 2010, The New York Times)

When the world is all about winners, not losers, Weill was right on. His business “adrenaline rush” took him to the top of the financial world, creating a monster whose frenzy is attributed to him; and “the same kind of adrenaline rush” in charitable causes has now brought him the honor of being elected to the prestigious American Academy of Arts and Sciences as a Philanthropist.

After all, unlike Leslie Cohen (Tuttleton) Berlowitz, “Sandy” didn’t fudge his resume, did he? We can all read.


The baffling rise of suicides in the U.S. military — plausible theories and grim reality

(This article has been expanded from a posting on my Facebook community page, History, Culture and Politics.)

Of the crises facing American soldiers today, suicide ranks among the most emotionally wrenching — and baffling.

The New York Times’ James Dao and Andrew W. Lehren reported in May 2013 that during the past 12 years and two wars, suicide among active-duty troops has risen steadily, hitting a record of 350 in 2012, twice as many as a decade before, surpassing not only the number of U.S. troops killed in Afghanistan but also the number who died in transportation accidents.

Even with the withdrawals from Iraq and Afghanistan, the suicide rate within the military continued to rise significantly faster than within the general population, where it was also rising. In 2002, the military’s suicide rate was 10.3 per 100,000 troops, well below the comparable civilian rate. But by 2012 the rates were nearly the same, above 18 per 100,000 people.

Since 2001 – when the U.S. military operations in Afghanistan began following the September 11 terrorist attacks on U.S. soil – more than 2,700 service members have killed themselves, not counting National Guard and reserve troops who were not on active duty. Suicides among veterans have also risen somewhat, to an estimated 22 a day.

According to The New York Times reporters, though the Pentagon commissioned numerous reports and invested tens of millions of dollars in research and prevention programs, experts concede they still do not understand the root causes of the suicide increase.

An emerging consensus among researchers is that, just as with civilians, a dauntingly complex web of factors underlies military suicide: mental illness, sexual or physical abuse, addictions, failed relationships, financial struggles. Indeed, a recent Pentagon report found that 1/2 of the troops who killed themselves in 2011 had experienced the failure of an intimate relationship and about 1/4 had received diagnoses of substance abuse.

The same Pentagon report also found: about 9 of 10 suicides involved enlisted personnel, not officers; 3 of 4 did not attend college; more than 1/2 were married; 8 in 10 died in the United States; and – most baffling of all – most did not communicate their suicide intent.

The New York Times’ James Dao and Andrew W. Lehren put forth a question that the loved ones left behind by the suicides often anguished over:

“Each of those suicides comes with its unique set of circumstances, its own theory as to why. But in the voices of loved ones left behind, themes echo. Surprise. Confusion. A relentless question: Could we have done more?”

(“Baffling Rise in Suicides Plagues the U.S. Military”, James Dao and Andrew W. Lehren, May 15, 2013, The New York Times)

The article also pointed out that, just 12 years ago when the rate of military suicide was much lower, many experts believed the military culture insulated young people from self-harm: not only did it provide steady income and health care, structure and a sense of purpose, military service also screened personnel for criminal behavior as well as for basic physical and mental fitness.

But a recent medical study has identified something in the military culture that was missing from the screening mentioned by The New York Times article: those who have served in the military are more likely than others to have suffered childhood abuse or to have lived in homes where there was violence. In other words, the military has been a refuge for young people, especially young men, who suffered traumatic childhood experiences.

This could be a hidden factor in the suicides, as people who have experienced severe childhood abuse are at higher risk of attempting suicide. But the researchers caution that no actual link has been established between these childhood experiences and concrete cases of military suicide. “We don’t know anything about whether or not these early life adversities are actually impacting the health of service members”, said study co-author John Blosnich of the Veterans Affairs Pittsburgh Healthcare System.

(“Study: Military a refuge for those exposed to childhood abuse”, John Vandiver, July 23, 2014, Stars and Stripes)

The previous perception of the military as a refuge was also changed by war. “There is a difference between a military at war and a military at peace”, said Dr. Jonathan Woodson, assistant secretary of defense for health affairs. “There is no doubt that war changes you.”
(May 15, 2013, The New York Times)

Another recent medical study has aimed to articulate the role played by deployment and combat experiences in the suicide rise. The researchers argue that high rates of depression or post-traumatic stress disorder from the combat experience can lead to suicidal behavior: the illnesses can lead to a sense of burdening others and social isolation; add to this loss of personal relationships a familiarity with firearms, and the resulting toxic stew can drive suicides among troops and veterans.

“It’s best to view the increase in military suicides as a result of an increase in mental health issues of service members driven in large part, but not entirely, (by) combat and deployment experiences”, wrote the University of Southern California researchers, retired Col. Carl Castro, former director of psychological health research for the Army, and Sara Kintzle. But they conceded that researchers have not found any specific reason with absolute certainty to explain the rise in military suicides.

(“Study: Indirect link between combat and suicide risk”, Gregg Zoroya, July 19, 2014, Stars and Stripes)

In the following, I review a select number of U.S. military suicide cases reported in the press: each had a clearer, sharper context than those described in the article by James Dao and Andrew W. Lehren, and together they may help shed new light on the above, and possibly more, issues surrounding the rise in U.S. military suicides.

The most infamous of these suicide cases is that of Steven Dale Green of the elite Army 101st Airborne Division.

In March 2006 while deployed to Mahmudiya, 20 miles south of Baghdad, Iraq, Green and three other soldiers — Jesse Spielman, Paul Cortez and James Barker — went to the home of the al-Janabi family near their checkpoint station, where Green shot and killed the mother, father and their 5-year-old daughter, and raped their 14-year-old daughter, Abeer Qassim al-Janabi, before shooting her and setting her body on fire.

Green became the first ex-soldier charged and convicted under the U.S. Military Extraterritorial Jurisdiction Act, which gave the U.S. civilian courts jurisdiction over crimes committed overseas. In 2009 he was sentenced to multiple life sentences without the possibility of parole.

(“Convicted US war criminal Steven Green dead in ‘suicide’”, February 19, 2014, BBC News)

During court proceedings, Green issued a public apology, a qualified one that the victims’ relatives did not accept. “I helped to destroy a family and end the lives of four of my fellow human beings, and I wish that I could take it back, but I cannot”, Green said, reading from a statement at a victim impact hearing. “And, as inadequate as this apology is, it is all I can give you.”

(“Former soldier at center of murder of Iraqi family dies after suicide attempt”, Steve Almasy, February 18, 2014, CNN)

Green’s case is psychologically intriguing: despite the heinousness of his crimes, he showed some sensitive human qualities, such as understanding and caring.

In his statement to the victims’ relatives he said, “you wish I was dead, and I do not hold that against you. If I was in your place, I am convinced beyond any doubt that I would feel the same way”.
(Steve Almasy, February 18, 2014, CNN)

Green was discharged from the army due to a “personality disorder” shortly after the crimes. When law enforcement came to arrest him on June 30, 2006, he had taken his grandmother to dinner and was planning to take her to a movie; he said to the FBI agents, “Knew you guys were coming.”

(“Convicted soldier: ‘You probably think I’m a monster’”, Dave Alsup, May 11, 2009, CNN)

Green’s case is also politically intriguing: despite the undeniability of his crimes, he persistently tried to shift the blame to the political higher-ups.

At the time of his arrest, Green lamented to the FBI agents, “All of my buddies were getting killed over there. My lieutenant got his face blown off. … George Bush and Dick Cheney ought to be the ones that are arrested.”
(Dave Alsup, May 11, 2009, CNN)

Green also declared to the FBI agents, “Joining the Army was the worst decision I ever made.”
(Dave Alsup, May 11, 2009, CNN)

Ironically, Steven Green was a high school dropout from then-President Bush’s hometown of Midland, Texas.

Then, in an October 2013 press interview, Green argued, “I was made to pay for all the war crimes. I’m the only one here in federal prison”, and, “I’m not a victim, but I haven’t been treated fairly.”
(February 19, 2014, BBC News)

Four months later, in February 2014, Green was found unresponsive in his prison cell, and his death was ruled a suicide by hanging.

It’s unclear if Steven Dale Green died with a sense of regret, or of being unfairly singled out.

The suicide of former U.S. Army soldier Levi Derby was a case very much the opposite of Steven Green’s.

Derby was shocked, and then continued to be haunted, by the death of a young Afghan girl unwittingly caused by his own friendly gesture: he offered the child a bottle of water, and when she came forward to accept it, she stepped on a land mine.

After Derby returned home, he locked himself in a motel room for days, and his mother, Judy Casper, saw a vacant stare in his eyes. A while later, Derby was called up for duty in Iraq, but he did not want to kill again; he refused, and eventually agreed to an “other than honorable” discharge.

On April 5, 2007, Levi Derby hanged himself in his grandfather’s garage in Illinois. According to his mother, by that time he had suffered from post-traumatic stress disorder for five years following his Afghanistan experience.

(“Monticello Marine gets warm welcome home”, Will Brumleve, November 5, 2010, The Journal-Republican; and, “Why suicide rate among veterans may be more than 22 a day”, Moni Basu, November 14, 2013, CNN)

For both Steven Green and Levi Derby, their deployment and combat-related experiences directly affected their state of mind and behavior.

The circumstances of Green’s death have not been clearly reported, but his reaction to the harsh reality of war had led him to commit extreme crimes against Iraqi civilians, in particular the rape and murder of a young Iraqi girl, and he was facing the consequence, the rest of his life in prison, when he died there.

On the other hand, Derby’s mother, Judy Casper, clearly indicated that her son’s death had to do with a lingering psychological disorder brought on by his deployment experience, especially the freakish death of a young Afghan girl.

Although no traumatic childhood experience of their own is known, one may wonder what their frames of mind were at the time, given that the triggering events were so closely tied to children: why a young soldier like Green resorted to raping a 14-year-old Iraqi girl and killing her and her 5-year-old sister, and yet, with an inkling that the FBI was closing in, took his grandmother out to dinner and a movie; and why the bizarre, freak death of a young Afghan girl, touched by his gesture of friendship, so permanently marred the psyche of Derby, a soldier in a war where death was commonplace, often the norm.

In the other suicide cases I review, the U.S. military failed the soldiers: instead of being a refuge for them, it became a place of cruel suffering, suffering at the hands of some of their fellow soldiers.

Much publicity has been given to the 2011 suicides of Chinese American soldiers Danny Chen from New York and Harry Lew from California, partly because Lew happened to be a nephew of U.S. Congresswoman Judy Chu, who in August 2012 wrote about the horrific cruelties in a New York Times opinion piece entitled “Military Hazing Has Got to Stop”:

“Last fall, at an outpost in Kandahar, Afghanistan, Danny Chen, a 19-year-old Army private, was singled out for hazing by Sgt. Adam Holcomb and five other soldiers, all of whom were senior in rank to their victim. They believed Danny was a weak soldier, someone who fell asleep on guard duty, who forgot his helmet. So for six weeks, they dispensed “corrective training” that violated Army policy. When he failed to turn off the water pump in the shower, he was dragged across a gravel yard on his back until it bled. They threw rocks at him to simulate artillery. They called him “dragon lady,” “gook” and “chink.”

Finally, Danny could take it no longer. He put the barrel of his rifle to his chin and pulled the trigger. The pain was over.

Earlier this week, a jury of military personnel found Sergeant Holcomb guilty of one count of assault and two counts of maltreatment, for which he was sentenced to one month in jail — far less than the 17 years that he could have received.

On April 3, 2011, my nephew, 21-year-old Lance Cpl. Harry Lew, was serving his second year in the Marines in Afghanistan’s Helmand Province, when he was hazed for over three hours by two of his fellow soldiers because he, too, fell asleep on duty. At the urging of their sergeant, who told them that “peers should correct peers,” they punched and kicked him. They poured the contents of a full sandbag onto his face, causing him to choke and cough as it filled his nose and mouth. Twenty-two minutes after the hazing stopped, he, too, used his own gun to commit suicide, in a foxhole he had been forced to dig.”

(“Military Hazing Has Got to Stop”, Judy Chu, August 3, 2012, The New York Times)

Both Chen and Lew were deployed in war zones when they committed suicide, driven not by enemy hostility but by their comrades’ cruelty.

Representative Judy Chu referred to what Chen and Lew had suffered as “hazing”, something that typically occurs within a social group, especially a group of youths, particularly young men, as a form of initiation. One may wonder whether Danny Chen and Harry Lew, from any prior childhood or young-adult experience of social hazing, had ever expected that it could be a ritual not of acceptance into life, but of rejection.

The suffering and death of U.S. Marine Carri Leigh Goodwin was subtler, involving rape by a fellow soldier, alcohol abuse, and prescription-drug use for her mental-health problem, all without duty in a war zone, let alone combat experience.

Goodwin enlisted in the Marine Corps in 2007, at only 18, to make her former-Marine father proud. She was raped by a fellow Marine at Camp Pendleton, bullied by the commander to whom she reported the rape, and eventually forced out of the Corps with a “personality disorder” diagnosis, not unlike the medical basis on which Steven Green was discharged after raping and murdering others.

After Goodwin returned home, her father, Gary Noling, noticed that she was drinking heavily. But she did not tell her family that she had been raped, or that she had thought about suicide. She also did not tell them she was taking Zoloft, a drug prescribed for anxiety.

Five days after arriving home, Goodwin went drinking with her sister, who then left her intoxicated in a parked car; the Zoloft interacted with the alcohol, stopping it from being processed by her liver, and Goodwin died in the back seat, her blood alcohol level six times the legal limit.

Goodwin was only 20; police charged her sister and a friend with involuntary manslaughter and furnishing alcohol to an underage person.

It was only later, when Noling went through Goodwin’s journals, that he learned of the rape his daughter had endured in the Marine Corps and her intention to drink herself to death.

(Moni Basu, November 14, 2013, CNN; and, Stormie Dunn, Silenced No More, December 2013, Author House)

Carri Leigh Goodwin never got to be deployed to a war zone. Her outlook on life was critically impacted by her Marine Corps experience, while her family remained the closest to her in her life, before and after her service, and was where that life ended.

With the above five cases of U.S. soldier suicides, each of which has received considerable community attention and some national media exposure, one can clearly see that deployment experience played a major role in most of them.

On the other hand, combat-level experience did not appear to be more influential or critical than the personal experiences within the military culture in which these soldiers lived, worked and were deployed.

And while none of these four young men and one young woman discussed direct childhood experiences of their own, understandably so given a soldier’s expected gung-ho attitude, a personal or personality theme subtly associated with children, or with earlier life at a tender age, can be seen in each of their stories.

As discussed earlier, in their May 2013 article, The New York Times’ James Dao and Andrew W. Lehren publicized a question often asked by the loved ones left behind by the suicides: “Could we have done more?”

At the time of his arrest, Steven Dale Green sighed that joining the military was his worst decision ever.

Short of such grandstanding, the cases reviewed here seem to indicate that family intervention would have had only limited influence: in the case of Levi Derby, if not that of Carri Leigh Goodwin, the family appears to have done what it could, while in the other cases the critical events occurred inside a system the family obviously had little ability to directly affect.

Therefore, other questions should also be asked, such as: Can some things be done differently in the U.S. military? And can some things be done less, if at all, in the military culture?
