The Lawyers’ Epidemic: Depression, Suicide, and Substance Abuse

In a departure from the usual at Abnormal Use, we offer this Abnormal Public Service Announcement.

A study by Johns Hopkins University found that among more than 100 occupations studied, lawyers were three times more likely to suffer from depression than any other profession.  Ted David, Can Lawyers Learn to Be Happy?, 57 No. 4 Prac. Law 29 (2011).  According to this piece,  “a quality-of-life survey conducted by the North Carolina Bar Association in 1991 reported that almost 26 percent of the bar’s members exhibited symptoms of clinical depression. Almost 12 percent of them said they contemplated suicide at least once each month.”  See Michael J. Sweeney, The Devastation of Depression.  The North Carolina study was prompted in part by the suicides of eight Mecklenburg County, North Carolina lawyers in a seven-year period.  Several years ago, in a period of just 18 months, six lawyers died by suicide in South Carolina.

Suicide is the third leading cause of death among attorneys, after cancer and heart disease, and the rate of death by suicide for lawyers is nearly six times the suicide rate for the general population.  Suicide can be prevented.  While some suicides occur without any outward warning, most do not.  We can prevent suicide among lawyers by learning to recognize the signs of someone at risk, taking those signs seriously, and knowing how to respond to them.

The National Institute on Alcohol Abuse and Alcoholism estimates that 10 percent of the U.S. population is alcoholic or chemically dependent.  In the legal profession, the rate of abuse may be as high as 20 percent.  David, supra.  According to this piece, “[a]lcoholism is a factor in 30 percent of all completed suicides.”  Reports from lawyer assistance programs indicate that 50 percent of lawyer discipline cases involve chemical dependency.

Whether you are the husband, wife, employee, judge, law student, law partner, law firm associate, friend, or colleague of a person challenged by depression or substance abuse, your understanding of the nature of the problem can play a vital part in helping that individual to achieve and maintain recovery.  Please remember that there is hope, and there is help.  You are not alone.

In South Carolina, call the Lawyers Helping Lawyers toll-free helpline at 866-545-9590.  Check with your state’s bar for a lawyer assistance program or click this link for the ABA directory of lawyer assistance programs.

(See also here for a recent similar article by Stuart Mauney in the January 2012 issue of the South Carolina Lawyer).

Can Attorneys Reclaim Civility?

Nationally syndicated columnist Kathleen Parker recently asked whether civility can be saved.  Parker noted that Americans have always been “a bunch of rowdies and rascals,” citing as a “perennial favorite” the “caning administered by South Carolina Rep. Preston Brooks upon the person of Massachusetts Sen. Charles Sumner over a disagreement about slavery and a question of honor.”  Parker defined civility as “courtesy in behavior and speech, otherwise known as manners.  In the context of the public square, civility is manners for democracy.”  Parker then argued that our manners have deteriorated, particularly in recent years: “Manners have become quaint, while behaviors once associated with rougher segments of society have become mainstream.”

How did Parker suggest we fix the civility problem?  She said that change “has to come from within, each according to his own conscience.”  The media must strive to be “honest, accurate and fair, and reward the coarsest among us with scant attention.”  Parker claimed that the greatest threat to civility is not the random outburst but “the elevation of nonsense, and the distribution of false information.”  She concluded by reminding us that the Golden Rule works well.  “Best taught in the home, it could use some burnishing.”

Parker’s column was published in my local newspaper, The Greenville News, on February 19.  Just two days later, that same newspaper published a column by another nationally syndicated columnist, Cal Thomas, titled “Learning a Civility Lesson.”  Thomas recently spoke at the Conservative Political Action Conference in Washington, and in his own words, “failed to live up to one of my highest principles.”  The story of the day was the Obama administration’s recent move to require faith-based institutions to provide contraception as a part of health care coverage.  A video clip was played from Rachel Maddow’s program on MSNBC in which she commented on the subject.  After the clip was played, Thomas told the audience, “I think she’s the best argument in favor of her parents using contraception, and all the rest of the crowd at MSNBC, too, for that matter.”  In his column, Thomas admitted that he spoke before thinking: “I am not supposed to behave like that.”  The morning after the speech, Thomas called Maddow to apologize.  Maddow graciously accepted the apology and commented on her show that she believed Thomas’ apology.

Thomas concluded his column by reminding his readers that he has had many liberal friends over the years.  “They became my friends because I stopped seeing them as labels and began seeing them as persons with innate worth.  That is what I failed to do in my first response to Maddow.”  Parker referred to “the food-fight formula that attracts viewers to cable TV” and would surely be pleased with Thomas’ apology.

Our friends at Legal Blog Watch noticed that the Fourth Circuit recently called out the U.S. Attorney’s Office for uncivil language in an appellate brief.  The court felt “compelled to note that advocates, including government lawyers, do themselves a disservice when their briefs contain disrespectful or uncivil language directed against the district court, the reviewing court, opposing counsel, parties, or witnesses.”

In South Carolina, lawyers are required to sign an oath, pledging “fairness, integrity and civility, not only in court, but also in all written and oral communications” to opposing parties and their counsel.  In striving to remain faithful to this oath, lawyers would do well to remember Parker’s reference to George Washington’s writings on this subject: “Let your Conversation be without Malice or Envy, for ‘tis a Sign of a Tractable and Commendable Nature: And in all Causes of Passion admit Reason to Govern.”  Finally, we would also do well to remember Thomas’ civility lesson, including the willingness to admit we are wrong and apologize for our behavior.

Ten Years Ago Today: Dedman Graduates From Baylor Law School

As you know, we here at Abnormal Use often pause to reflect upon sentimental anniversaries.  We can’t help it.

Today, we offer this piece on the tenth anniversary of my graduation from Baylor Law School.

It was February 9, 2002, in Waco, Texas, when I graduated from law school, ten years ago today.

First things first: yes, I graduated from law school in February.  Baylor Law School runs on quarters rather than semesters, which occasionally produces odd graduation dates.  So, there I was, in February of 2002, preparing to graduate and take the February bar exam later that month.  That’s just the way we roll at Baylor University.

I always enjoyed my time at Baylor Law.  A relatively small institution, it boasted a total of 450 enrolled students at the time of my graduation a decade ago.  When I started at Baylor Law in May of 1999 (another unusual start date, due to the quarter system), there were only 30 or so students in my starting quarter.  You always hear stories of cutthroat classmates at larger schools, but this was not the case at Baylor; the school was simply too small for anyone to get away with such antics.  Really, there was an unusual esprit de corps in the student body, brought about both by the size of the institution and by the shared looming dread of Baylor’s very difficult third-year curriculum (a mandatory year-long advocacy and civil procedure program known as Practice Court).

For the occasion of my graduation, my parents, my brother, and even some friends trekked to Waco.  Few of them had previously visited my fair city; most knew the town only because of its relatively recent notoriety from the Branch Davidian standoff a decade before.  But we all met at the brand new Sheila and Walter Umphrey Law Center, which had opened just a few months before in the fall of 2001.  (In the late 1990s, Baylor Law alum Walter Umphrey, a famous plaintiff’s attorney from Beaumont, Texas, gave a $10 million gift to Baylor to fund most of the new building.  There is a dash of historical irony in the funding source, as Baylor has traditionally been defense oriented in its legal philosophy, but its palatial new building was funded mostly by a trial lawyer’s mighty gift.)  However, in 1999, I began my legal education in the old Morrison Hall.  At that time, the administrators of the law school knew that they would soon be building a brand new law center, and so most funds were earmarked for that purpose; general upkeep of old Morrison Hall was – shall we say – not the highest priority.  It wasn’t until August of 2001 that the new building would be completed and opened.

In the autumn of 2001, the new law center was immense, immaculate, and quite simply, amazing.  So new was the building, in fact, that there were no televisions in the public areas of the building on September 11, 2001.  Many students sat in the student lounge by the radio, of all things, listening to the news in the same way people must have on December 7, 1941.

So it was, in February 2002, that we congregated at nearby Miller Chapel on the main campus for the graduation ceremony.  Twenty-six of us graduated that day, and the commencement speaker was Professor Gerald Powell, who taught me both Evidence and Advanced Evidence.  Just a few months before, in November, at the new law center’s first graduation ceremony, Umphrey himself was the commencement speaker.  But Powell was someone all the graduating students knew well, as he had taught them all.  I had been his research assistant and in 2001 wrote a paper for him on the admissibility of email and Internet evidence, new topics back then.

Powell’s speech was weighty and very well received.  It was just a few months after 9/11, and that tragedy was on everyone’s mind.

That day, he said:

You can no longer focus on just yourself, on your career, or even on just your own family.  More will be asked of you.  As Americans, and especially as lawyers, you will carry with you great responsibilities.  After September 11, each of you must be willing to stand guard over our liberty, to serve your country selflessly, and, if the need arises, be a hero.

Each of us must take our turn as sentinels.  And as lawyers we have our own post to man.  Our watch is over the Constitution.  Our perimeter is the outposts of liberty.  Our weapon is the law.  Our mission is to see that justice is done.

[W]e also hope that each of you will have inside of you that seed of heroism perhaps dormant until a moment of truth, when it will spring forth in the energizing light of adversity to give us the hero we need.  And until that time comes, or whether it ever comes, we hope and pray that you will act heroically in the conduct of your everyday lives, professional, public and personal.

The speech was later circulated by email to those in attendance, likely by Baylor Law’s unofficial historian, Eric Nordstrom, who would graduate later that year.

After a reception at the law school, but before that evening’s festivities, I had a bit of free time, so my younger brother, Bert, my old pal, Alistair Isaac, and I decided to do the one thing that I had never done in Waco but had always wondered about doing: visiting the remains of the infamous Branch Davidian compound.  In the late 1990s and early 2000s, and probably today, you could not attend school in Waco without constantly being asked by friends from other cities whether you had visited “the compound.”  Prior to my graduation, I never got around to doing so, but it seemed like an appropriate final quest on the day of my graduation, my last official day as a student in the city.  So, we found a set of directions on the Internet (which are still online today!) and ventured out to find the compound.  I drove my 2000 Honda Civic with Bert and Alistair as passengers, and we followed the directions, but somehow, along the way, we found ourselves lost.  This was rural Central Texas.  We were in an area of large fields, farms, and farmhouses.  There were not many commercial establishments at which to stop and ask directions.  In fact, as we slowed the car to look for places to ask for assistance, we saw one house with a large sign on it which exclaimed simply “Don’t ask!”  We took that advice.  A few minutes later, we drove past a field in which a farmer was plowing or riding a horse or doing something along those lines.  My brother hopped out of the vehicle and walked toward the man.  Before my brother could utter a word, the man said simply, “You’ve already passed it.  Go back a mile or two and take the left that you missed.”

How about that?

Even in February of 2002, the compound was no longer the structure you might recognize from the 1993 media coverage.  There was a grove of trees planted to commemorate those who had not survived the standoff.  Some minor portion of the housing structure was still in place, but not really enough to recognize it for what it was.  On some level, the visit was anticlimactic; after being asked about the compound for all the years that I lived in Waco, it was just a field of sorts with a handful of derelict structures.  We saw a burned-out passenger bus at the scene, which we later learned was the result of vandalism years afterward and not the standoff itself.  (Alistair and I thought the old bus had something very cinematic about it, but that’s a different story for a different day.)  And that was basically it for the compound.  We returned to the city and readied ourselves for the evening to come.

Later that night, we congregated at George’s Restaurant, a local watering hole that has been memorialized in Texas country songs in part for its Big O’s – very large glasses of beer.  The whole graduating class was there, as were many other friends and students, and I suppose that was the last time we were all together in the same room before scattering off to different corners of the world.

And that was ten years ago today.  At that time, I was 26 years old, having reached that age just a month or so before, in late 2001.  My concept of being a lawyer was not completely uninformed, as Baylor focuses on the practical components of legal education (a topic we’ve discussed here on occasion).  Although I am confident that on that day I never paused to reflect upon what my career would be like ten years later, I certainly would not have predicted that I would be 1,000 miles away from Texas in North Carolina.  But here I am.

It’s funny where life takes you.

So, what does it all mean?  Like all the others who graduated that day, I’ve been a law school graduate for a decade.  For those of us who began and developed our careers during that time period, almost everything has always been online – whether it be treatises, the laws and statutes themselves, cases and orders, law review articles, or other such things.  And, of course, as time has progressed, they have only become more accessible, with the advent of laptops, wifi, and, of course, iPad apps.  However, unless graduates have been particularly lucky, trials have not been in abundance.  The older lawyers talk about the days in the 1970s when you could get called to court on a moment’s notice to try a case unexpectedly.  But those pesky discovery rules we learned in law school arm clients and advocates with enough information to accurately gauge exposure, and thus, trials can be (and regularly are) avoided.  There are fewer surprises, and the days of trial by ambush are long in the past.  It’s a different world than the one our professors and bosses knew when they graduated.

The legal blogosphere came along just about ten years ago and facilitated great discussion about the (major and incredibly minor) issues of the day – which is a great boon to the profession.  But, really, when I look back at the last ten years, I don’t face some existential dilemma as to what might have been had I not become a lawyer.  Rather, I am reminded of the fun moments that the career has afforded me.  There are silly moments, and there are meaningful ones.  Most enjoyable are those moments, at a deposition, hearing, or trial, when you realize that your preparation and hard work are about to pay off and that no one else in the room has realized it yet.  That feeling – that sense of accomplishment and victory, moments before you officially prevail – is what makes being a lawyer fun and interesting.

This is not to say that every day offers such moments.  There are those weeks that we spend in faraway places reviewing documents in old warehouses without air conditioning.  There are long drives and long waits in airports and courthouse hallways.

But in the end, we realize that one appeal of this profession is that it is different every day.  There are new challenges to face with every case and every hearing and deposition.  Although fewer and fewer cases go to trial these days, we must remain vigilant and prepare in case the one we are working on at present does go that route. And that’s something I learned way back in Practice Court at Baylor Law.

(Special thanks to Jerri Cunningham, the Baylor Law School registrar, for confirming some details for me and forwarding me a copy of Professor Powell’s speech).

Litigating in the Arena

Today, in our last substantive post of the year, we remember the trial.  For many reasons, there just aren’t as many trials as there were in those days of yore.  We often hear the older lawyers in our community tell tales of the old days – prior to the adoption of more formal discovery rules – when litigators litigated and juries rendered verdicts.  These days, with the voluminous information yielded by years of discovery, both sides of a case know each other’s strengths and weaknesses such that they can readily evaluate the worth of that case.  That knowledge, coupled with the rise of mandatory mediation, means most cases settle before being fought in the courtroom.

Sure, there are many cases that should not go to trial for a variety of reasons, most notably cost and uncertainty. But many that can and should see the courtroom are not tried. We, as lawyers, should not be afraid of trying cases. Nor should we refrain from advising our clients to take their meritorious defenses to trial if the circumstances warrant it. That’s part of our job, too.

Earlier this year, one of the shareholders at our firm defended a difficult case and prevailed after a long, hard-fought trial. Fresh off that victory, he sent the following email to all of our firm’s attorneys:

[W]e do not need to be afraid to try cases to juries. We need to properly evaluate the case for settlement purposes, but if a reasonable settlement cannot be obtained, we need to convince the client to try the case. At mediation, if the plaintiff doesn’t get into an acceptable range for settlement, simply advise the mediator and opposing counsel that we appreciate their attendance at the mediation but we will be delighted to see them at the courthouse for a jury trial. Juries almost always do the right thing. While there certainly have been bad jury verdicts, and occasionally a jury will do something crazy and deliver a runaway verdict, often those cases can be corrected on appeal or settled during the appeal for a much more reasonable amount, and these results are not typical.

As many of you know, I have decried the decline of jury trials over the last few years, and hope we can once again restore the jury trial to our arsenal of defense of civil litigation. There is simply nothing more grand than a jury trial, and no feeling more thrilling than a defense verdict after a hard-fought trial. . . . [W]hile trials are stressful and extremely hard work, the thrill of victory makes it all well worthwhile.

And, even if we don’t win, let’s always remember the immortal words of Teddy Roosevelt:

“It is not the critic who counts; not the man who points out how the strong man stumbles, or where the doer of deeds could have done them better. The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood; who strives valiantly; who errs, who comes short again and again, because there is no effort without error and shortcomings; but who does actually strive to do the deeds; who knows great enthusiasms, the great devotions; who spends himself in a worthy cause; who at the best knows in the end the triumph of high achievement, and who at the worst, if he fails, at least fails while daring greatly, so that his place shall never be with those cold and timid souls who neither know victory nor defeat.”

That quotation, of course, comes from Roosevelt’s 1910 “Citizenship in a Republic” speech.

Think about that the next time you’re drafting a case status update letter.

Views of 2011 From 1931

1931 was a long time ago, and few who live today can claim to remember it all too well.  Just two years after the stock market crash of 1929, Herbert Hoover was President of the United States (which that year had 48 states).  Movie monsters were the rage: Bela Lugosi starred in Tod Browning’s Dracula, and Boris Karloff did his star turn in Frankenstein.  Cab Calloway recorded the classic “Minnie The Moocher” (and was 49 years from performing it again in 1980’s The Blues Brothers).  James Dean was born that year; so were William Shatner and Leonard Nimoy.  That December, the first Christmas tree was placed at the construction site that would later become Rockefeller Center.  The Lindbergh kidnapping was a year in the future, and the attack on Pearl Harbor – precipitating the country’s entry into World War II – was a full decade away.

It was a far different time culturally, socially, and politically.  The question: What did the great minds of 1931 predict the rapidly approaching 2011 would be like?

There is actually an answer to that question.

Way back on September 13, 1931, The New York Times, founded in 1851, decided to celebrate its 80th anniversary by asking a few of the day’s visionaries about their predictions of 2011 – 80 years in their future. Those assembled were big names for 1931: physician and Mayo Clinic co-founder W. J. Mayo, famed industrialist Henry Ford, anatomist and anthropologist Arthur Keith, physicist and Nobel laureate Arthur Compton, chemist Willis R. Whitney, physicist and Nobel laureate Robert Millikan, physicist and chemist Michael Pupin, and sociologist William F. Ogburn. Since these guys all have their own Wikipedia entries so many decades later, they had to have been important for their time, right? Perhaps not a diverse lot, but it was 1931.

Ford, perhaps the most recognizable name to modern readers, set the tone of the project in his own editorial of prognostication:

To make an eighty-year forecast may be an interesting exercise, first of the imagination and then of our sense of humility, but its principal interest will probably be for the people eighty years on, who will measure our estimates against the accomplished fact. No doubt the seeds of 1931 were planted and possibly germinating in 1851, but did anyone forecast the harvest? And likewise the seeds of 2011 are with us now, but who discerns them?

We’re not certain why The Times chose to celebrate an arbitrary 80 years of existence. Whatever the case, the predictions are full of gems, so we encourage you to read the original articles (which, hopefully, The New York Times will unlock from its paywall as 2011 approaches). Today, we are just two weeks shy of 2011, so we must ask, how did some of these men fare in their predictions? Let us do as Ford suspected we would and measure their estimates against accomplished fact (at least as much as a humble products liability blog can do).

Dr. Mayo had this to say:

Contagious and infectious diseases have been largely overcome, and the average length of life of man has increased to fifty-eight years. The great causes of death in middle and later life are diseases of heart, blood vessels and kidneys, diseases of the nervous system, and cancer. The progress that is being made would suggest that within the measure of time for this forecast the average life time of civilized man would be raised to the biblical term of three-score and ten.

Dr. Mayo predicted the average life span in 2011 would be 70. He wasn’t far off. According to this post at the Centers for Disease Control and Prevention, it’s currently 77.9 years.

Interestingly, Keith warned of the coming perils of overspecialization in medicine:

Eighty years ago medicine was divided among three orders of specialists – physicians, surgeons, and midwives. Now there are more than fifty distinct special branches for the treatment of human ailments. It is this aspect of life – its ever growing specialization – which frightens me. Applying this law to The New York Times, I tremble when I think what its readers will find on their doorsteps every Sunday morning.

Any litigator who has ever attempted to secure a medical expert in an obscure field certainly understands the concerns espoused by Keith.  All we can say is that Keith would probably not be pleased to see all the various branches of medicine that have arisen in the past eight decades.  (But we here at Abnormal Use, as consumers of medicine, are pretty pleased about all the smart folks out there who know lots and lots about important fields and sub-fields of medicine.)

Ford, writing in 1931, just two years after the stock market crash, predicted that we as a nation might focus more on the intangibles of life than the bottom line:

We shall go over our economic machine and redesign it, not for the purpose of making something different than what we have, but to make the present machine do what we have said it could do. After all, the only profit of life is life itself, and I believe that the coming eighty years will see us more successful in passing around the real profit of life. The newest thing in the world is the human being. And the greatest changes are to be looked for in him.

Uh, okay. In these troubling economic times of ours today, we’ll just say, “No comment.”

Millikan observed:

Among the natural sciences it is rather in the field of biology than in physics that I myself look for the big changes in the coming century. Also, the spread of the scientific method, which has been so profoundly significant for physics, to the solution of our social problems is almost certain to come. The enormous possibilities inherent in the extension of that method, especially to governmental problems, has already apparently been grasped by Mr. Hoover as by no man who has heretofore presided over our national destinies, and I anticipate great advances for moving in the directions in which he is now leading.

Certainly, the scientific method has not solved all of our social problems (and Millikan would likely be displeased to learn how history now views President Herbert Hoover).

Pupin was optimistic that workers would come to share in the profits of what they produced:

The great inventions which laid the foundation of our modern industries and of the resulting industrial civilization were all born during the last eighty years, the life time of The New York Times. This civilization is the greatest material achievement of applied science during this memorable period. Its power for creating wealth was never equaled in human history. But it lacks the wisdom of distributing equitably the wealth which it creates. One can safely prophesy that during the next eighty years this civilization will correct this deficiency by creating an industrial democracy which will guarantee to the worker an equitable share in the work produced by his work.

Er, not quite.

Compton predicted:

With better communication national boundaries will gradually cease to have their present importance. Because of racial differences a world union cannot be expected within eighty years. The best adjustment that we can hope for to this certain change would seem to be the voluntary union of neighboring nations under a centralized government of continental size.

Well, national boundaries are just as important as they were back in 1931.  (And in fact, there have been a ton of wars in the past 80 years over just that issue.)  The United Nations would be formed fourteen years after Compton’s call for a “voluntary union of neighboring nations,” but its efforts and successes over the past 65 years have been, at best, a mixed bag.  (Interestingly, Compton also predicted that China, “with its virile manhood and great natural resources,” would take “a more prominent part in world affairs.”)

Our favorite set of predictions, though, comes from Ogburn, who actually went out on a limb and made some bold predictions (some of which were dead on, others of which were not so much):

The population of the United States eighty years hence will be 160,000,000 and either stationary or declining, and will have a larger percentage of old people than there is today. Technological progress, with its exponential law of increase, holds the key to the future. Labor displacement will proceed even to automatic factories. The magic of remote control will be commonplace. Humanity’s most versatile servant will be the electron tube. The communication and transportation inventions will smooth out regional differences and level us in some respects to uniformity. But the heterogeneity of material culture will mean specialists and languages that only specialists can understand. The countryside will be transformed by technology and farmers will be more like city folk. There will be fewer farmers, more wooded land with wild life. Personal property in mechanical conveniences will be greatly extended. Some of these will be needed to prop up the weak who will survive.

Inevitable technological progress and abundant natural resources yield a higher standard of living. Poverty will be eliminated and hunger as a driving force of revolution will not be a danger. Inequality of income and problems of social justice will remain. Crises of life will be met by insurance.

The role of government is bound to grow. Technicians and special interest groups will leave only a shell of democracy. The family cannot be destroyed but will be less stable in the early years of married life, divorce being greater than now. The lives of women will be more like those of men, spent more outside the home. The principle of expediency will be the dominating one in law and ethics.

Not too bad for a man born in 1886 who didn’t live to see 1960. Sure, he was off by about 150 million on the United States population for 2011. Sure, he didn’t predict the microchip or the Internet. Oh, and yeah, poverty hasn’t been eliminated and hunger is still a problem worldwide. But he generally seemed to understand the coming material leisure culture, the rise of big government, and the differences in the family unit in the world eight decades from his prediction.

Oh, and for the record, we here at Abnormal Use do not plan to use this occasion to make predictions about 2091, save for the lone augury that we here will still be toiling away at our desks in an effort to bring you fresh and insightful commentary each business day.


All of the articles listed below are linked and available online, but they’re also all behind The New York Times paywall archive. Unless you have access, all you’ll get is the abstract.

Compton, A.H. “Whole of the earth will be but one great neighborhood; Dr. Compton envisions the great development of our communications,” The New York Times, September 13, 1931.

Ford, Henry. “The promise of the future makes the present seem drab; Mr. Ford foresees a better division of the profits to be found in life,” The New York Times, September 13, 1931.

Keith, Sir Arthur. “World we hope for runs away with the pen of the prophet; Sir Arthur Keith doubts if his individualist longings can be realized,” The New York Times, September 13, 1931.

Mayo, W.J. “The average life time of man may rise to the biblical 70; Dr. Mayo says also that a proper use of our leisure will be evolved,” The New York Times, September 13, 1931.

Millikan, Robert A. “Biology rather than physics will bring the big changes; Also, says Dr. Millikan, the scientific method will aid in government,” The New York Times, September 13, 1931.

Ogburn, William F. “The rapidity of social change will be greater than it is now; and hunger, says Dr. Ogburn, will not be a danger as a revolutionary force,” The New York Times, September 13, 1931.

Pupin, Michael. “Our civilization will create a new industrial democracy; it will give the workers a fair share in wealth, says Michael Pupin,” The New York Times, September 13, 1931.

Whitney, W.R. “Better world-wide education will serve our experiments, self-improvement is viewed by Dr. Whitney as the great task set for mankind,” The New York Times, September 13, 1931.

Thanksgiving in 1810, 1910, and 2010

Every Thanksgiving, American readers of newspapers and magazines are treated to similar nostalgic pieces about the origins of Thanksgiving and the uniqueness of the holiday.

The year 1910, one hundred years ago, was no exception. In its November 1910 issue, St. Nicholas: An Illustrated Magazine for Young Folks, a then-popular family magazine, published a piece by writer Clifford Howard called “Thanksgiving in 1810,” in which he looked back a century to see how far the nation had progressed since that time. What a fun and intriguing article to stumble across exactly one hundred years later (particularly with the stellar illustrations by C.T. Hill, some of which we’ve embedded here in click-to-enlarge format).

“The world has changed more in the last 100 years than in any 1000 years that have gone before,” Howard wrote, not knowing how much that change would accelerate in the coming years. But surely, in writing such a piece, Howard wondered whether anyone a century from his time would look back to 1910 and comment upon similar changes in the culture. Of course he did. In fact, he ended his piece with the question, “[W]hat will it be in 2010? Who can tell?”

So, we here at Abnormal Use, denizens of 2010, will take it upon ourselves this Thanksgiving week to revisit Howard’s long-forgotten article from that long-forgotten magazine. (Considering the nature of his task, we think he would appreciate our responding via the Internet, a medium that he could not have imagined in his wildest dreams way back in 1910.)

Most of Howard’s commentary concerned the huge advances in technology that occurred in the century preceding the publication of his piece. Thus, he began with the following premise:

A hundred years back may seem a long while ago, but when you remember that there are men living to-day whose fathers saw General Washington, a century does not seem so long a time after all. And up to the time of Washington a hundred years did not mean very much to the human race. The world moved very slowly. When Washington died, in 1799, people were using the same sort of appliances and doing the same things in the same way that they did in 1699 and even 1599. In former times, if a man could have returned to earth at the end of a hundred years, he would not have been very much surprised at any of the changes that had taken place during his absence. But if Washington or Franklin, or even Thomas Jefferson, who died less than a century ago, were to come back to earth now, he would not know where he was.

Howard notes the obvious, that the citizens of 1810 had no “air ships or automobiles or motor-cycles,” and so of course, travel was not nearly as speedy as it was for those of 1910. But then he ponders how those of 1810 would interpret the technological marvels of the early 20th century:

In fact, not only the humble farmer of that day, but the scientist and philosopher as well, would have found it impossible to believe all the wonderful things that were to take place within the century. If you could have lived then and looked ahead a hundred years and told your friends and neighbors that men would travel by steam and electricity, that they would fly in the air from London to Manchester, or from New York to Philadelphia, that they would talk to one another from Boston to Chicago, that they would flash news across the ocean in the twinkling of an eye, that the great wilderness beyond the Mississippi would be populated with millions of people and contain some of the big cities of the world, that men and women would go across the Atlantic and across the vast continent of America in perfect ease and comfort and in less time than it then took to journey from New York to Washington – if in 1810 you had foretold these marvelous things, your friends and neighbors would have shaken their heads and whispered sadly to one another that you were crazy. If the wonders you related to them were to come to pass during the next thousand years, they perhaps would have admitted that there might be truth in some of your stories; but to say that they would all come true inside of a hundred years and that some of the very people to whom you were talking would live to see many of these magical inventions, would have been really too much for any sane person to believe.

Fifty years later, Arthur C. Clarke would summarize the same sentiment when he wrote that “[a]ny sufficiently advanced technology is indistinguishable from magic.”

Of particular interest is Howard’s comparison of the communications infrastructure of both time periods. When we, as modern readers, study history, we have an omniscient view based upon the many events pieced together by the historian. We know what was occurring at all relevant times in all relevant places. But the participants of those historical events had no such luxury. News traveled very, very slowly in 1810, at a molasses-like pace even by 1910 standards:

As there were no railroads, news traveled only as fast as a horse could run or a ship could sail. There were no wires to carry messages, for there was no telegraph and there was no telephone.

If the farmer of 1810 got a newspaper at all, it was a week or a month or perhaps three months old before it reached him.

Imagine what Howard would think of live television or the Internet. Would he be able to comprehend Facebook or Twitter? Or the technology which allows each of us, with everyday devices, to capture a moment on film or video and share it with the world instantly? What would he think of the notion that in this age we are all pamphleteers and publishers?

On a side note, we, as proprietors of a legal blog, can’t help but wonder just how different the practice of law was in 1910 based, in part, on the aforementioned differences in communications technology. It was certainly slower, in that litigators could not easily save and alter legal forms and blast them out instantly via fax or email. Never mind the fact that the information gathering process must have been slow, as well, simply because not everyone had telephones. Documents were locked away in dusty file rooms of courthouses, not available with a quick digital search. But the advantage of that may have been that lawyers weren’t scurrying about all the time in such great haste to perform this task or file that motion. Might the practice have been described as slow but rewarding? We can only surmise based on what we know in hindsight.

Some other fun bits:

  • Howard observes that in 1810, the states of Florida, Texas, and California were not yet a part of the nation and were, thus, merely “waste places or foreign lands.” Ouch.
  • Howard notes that Thanksgiving, as his generation knew it, was not celebrated officially outside of New England in 1810.
  • Most newspapers in 1810 were issued only weekly, and the would-be news contained therein was a few days to half a year old.

What will it be in 2110? Who can tell?

So, what became of Howard the writer and the publication to which he submitted this piece?

The St. Nicholas magazine, which began publishing in the 1870s, folded in the 1940s. Howard, for his part, didn’t make it to the halfway point of the 20th century, either. He died in 1942, at the age of 73, apparently after spending some time in Hollywood writing movies. According to his brief New York Times obituary (behind that site’s paywall archive), Howard “worked with Cecil B. DeMille and his research was largely used for the film King of Kings.” He was the author of many magazine articles and a number of books (and his work wasn’t always family friendly, either).

Here’s the best nugget we discovered about Howard’s life and education: According to this 1895 mini-biography of Howard published in a poetry journal, he once studied the law! It notes: “Like many others in their gradus ad Parnassum, he devoted some time to the study of law, graduating with the title of L.L.B. from the Columbian University in 1890, only to find that Blackstone and Kent were uncongenial masters and that his literary aspirations would never be content within the narrow bounds of prosaic law.”

Well, at least that’s something that hasn’t changed since 1910.

Thoughts on a Practical Legal Education

Ah, law school curriculum reform. A popular topic, and one always worth discussing, though true reform rarely occurs. The Law School Innovation Blog alerted us to a recent discussion at the Prawfs Blawg about potential 1L curriculum reform. The author of the Prawfs Blawg post noted that students fail to read cases and statutes closely enough and seem ill-prepared even after their first year. As a part of its post, The Law School Innovation Blog highlighted two comments to the original Prawfs Blawg piece, including one from a law student who complained that law school utilized too much of a “hide the ball” approach, and a response from an attorney who declared that hiding the ball is, in fact, the best method of instruction.

These issues, though, are symptoms of a much larger curricular problem, one that is not solely confined to the first year. As we all know, the first year curriculum is mostly uniform throughout the nation, focusing on the law of no jurisdiction in particular and the common law as it supposedly existed at some point in the eighteenth century. Though first year students do not typically learn the substantive law of an actual jurisdiction, the curriculum is helpful in training students in legal thinking and disabusing them of any judicial system stereotypes they picked up from their poli-sci profs. Are we really going to utilize the Rule in Shelley’s Case in our daily practice? Probably not. But that’s not really the point of the first year, which is to teach law students legal reasoning and analysis.

Despite the occasional outcry against the Socratic method, the first year curriculum is actually quite distinctive and helpful, notwithstanding its natural stresses. Sure, there may be law professors who delight in frightening those young would-be attorneys. But that approach has worked for many a decade, and sometimes, tradition should be honored, especially when it is useful.

So, let’s forget first year curriculum reform. Really, it’s the second and third years that are in need of radical overhaul, anyway. After an intensive first year, students drift away from difficult and demanding courses to focus on finding a job (no small feat these days) or perfecting their putting stroke. Many others spend that time taking interesting, though impractical, elective courses which will not serve them in the future. Although some students relish the opportunity to take “bar courses,” few law school graduates take the opportunity – or even have the opportunity – to learn the practical skills that will become their bread and butter.

Licensing entities have taken notice. Some state bars have imposed additional requirements upon law school graduates based upon the assumption that there is still something left for them to learn before entering the profession. For example, here in South Carolina, bar applicants must take a course called “Bridge The Gap,” which by its very name, suggests that there is at least a minor deficit in the education of said graduates. (The Bar has also set up a mentoring program for new attorneys). Although some institutions have implemented practical components into their curriculum, why aren’t most of the law schools out there addressing these issues?

The burden also shifts to the law firm, as the employer of the gap-bridging graduate, to inculcate the tricks of the trade. Larger firms may simply absorb this responsibility as a part of their general associate training. But in these troubling economic times, this can become a problem of import. Many law school graduates, unable to find jobs, are starting their own firms without any practical skills or experienced guidance. How did they find themselves in that position?

What to do? Instead of tweaking the first year curriculum, law school administrators should consider more dramatic changes to the law school paradigm. The second and third years can be transformed into true opportunities to learn practical legal skills (as well as everyday ethical issues and the business of law). State bars should be confident that students graduating from accredited institutions have been properly trained not only in substantive law and legal thinking but also in the practical skills they will employ on a daily basis as lawyers. Last but not least, law students are consumers of legal education. If we expect them to spend three years of their lives and hundreds of thousands of dollars to enter our profession, there should be no gap to bridge upon graduation.

Legal Lessons from The Magnificent Seven (1960) on its Fiftieth Anniversary

[Editor’s Note: This coming Saturday, October 23, 2010, marks the fiftieth anniversary of the release of the classic Western movie, The Magnificent Seven, which starred Yul Brynner, Robert Vaughn, Charles Bronson, James Coburn, and of course, Steve McQueen. Directed by John Sturges, the film was based upon the 1954 Japanese film, Seven Samurai, directed by Akira Kurosawa. To celebrate this occasion, we here at Abnormal Use asked our boss – senior partner Mills Gallivan – for his thoughts on the film and lessons we can learn from it as lawyers.]

“If God didn’t want them sheared, he would not have made them sheep.”

If you are familiar with this quote, then you are probably a fan of Westerns and, in particular, The Magnificent Seven. This cult movie is on most, if not all, lists of the Top Ten Westerns ever made. This week marks the fiftieth anniversary of the film’s American release. The movie was originally released in Europe and was so popular that it was re-released in America, where it immediately became a huge hit and financial success. The musical score for the movie was composed by Elmer Bernstein and nominated for an Academy Award in 1961. The theme song is stirring and has been used in numerous other movies, musical compositions, and ads, including the old Marlboro commercials. Anyone over the age of fifty would immediately recognize it.

The movie is inspiring as you watch a small band of dedicated professional gunmen take on huge odds in the defense of a hapless Mexican village. I recently read about a college football coach who shows the movie to fire up his team the night before each game. Throughout his career, he has now shown it over 500 times to his various teams.

The quote above is from Calvera, the bandit who regularly pillages a small village in Mexico. He is speaking to the members of the Magnificent Seven, trying to talk them out of defending the villagers, whom he sees as his sheep. His is a great rationale if you are a bully and a thief! As you might expect, this argument does not persuade the seven professionals, who have taken the job as a matter of principle. Consider the following exchange between Chris (Yul Brynner) and Vin (Steve McQueen), the two leaders, about their commitment:

Chris: You forget one thing. We took a contract.
Vin: It’s sure not the kind any court would enforce.
Chris: That’s just the kind you’ve got to keep.

So what does this have to do with products liability law?

Oftentimes, corporate defendants in products cases feel much like the villagers in the movie: victimized, bullied, and about to be sheared. Certainly, the villagers are much more vulnerable and sympathetic than a corporate defendant. However, the perceived motivation of the plaintiffs’ trial bar is sometimes very similar to that of the bandit Calvera. That motivation can be greed, fueled by money and power. One has only to look at the tragic demise of the now infamous trial lawyer Dickie Scruggs to understand that for some plaintiffs’ lawyers, justice is not the ultimate goal. Scruggs pleaded guilty to mail fraud and bribery, and when Judge Glen Davidson imposed his sentence, he quoted William Barclay, a Scottish theologian, who said, “The Romans had a proverb that money was like sea water. The more you drink the thirstier you become.”

So what makes The Magnificent Seven magnificent? I like to think it is their courage in the face of insurmountable odds and their unwillingness to cut and run when given the opportunity. One definition of magnificent is noble, and these hired guns see their salvation in taking up the noble defense of the villagers. They are determined that Calvera will not shear the villagers again without paying a heavy price. The bandit releases them after the first skirmish, thinking that they really are not willing to die for peasants who cannot pay them and who will not fight for themselves. The concept of a noble cause also resonates with good defense trial lawyers; as a group, we believe in our clients’ positions, seek justice, and will not be intimidated by an adversary or a judicial hellhole. Calvera underestimated the commitment of the men he faced, and it was a huge mistake.

Shortly after being released and ordered back across the border to Texas, James Coburn’s character Britt foretells the final showdown when he says: “Nobody throws me my own guns and says run. Nobody!”

A capable plaintiff’s lawyer will not underestimate the defense legal team, or at least not more than once. So when you are in a battle to defend your products and keep your company from being sheared, where do you turn? After fifty years, six of the members of the Magnificent Seven are dead, and the lone survivor, Robert Vaughn (Lee), is an actor, not a lawyer.

When it is all on the line, you need defense trial attorneys with consummate skill, integrity, courage and a willingness to fight to the last barricade. Lawyers who know that ultimately justice can and will prevail and who are not afraid to say to the Calveras of the world:

Calvera: Somehow I don’t think you’ve solved my problem.
Chris: Solving your problems is not our line.

Bluejays and Mockingbirds

“Shoot all the bluejays you want, if you can hit ’em, but remember it’s a sin to kill a mockingbird.” This advice from Atticus Finch to his young daughter Scout is as poignant today as it was when Harper Lee first published it 50 years ago this week in To Kill a Mockingbird. This Sunday, July 11, 2010, marks the Golden Anniversary of Lee’s iconic novel about courage, racial prejudice, compassion, and access to justice. This beloved story is second only to the Bible on several reading lists for books that “make a difference.”

While he is fictional, Atticus Finch has served as a role model and inspiration for many lawyers since his creation in 1960. His unwavering courage in defending Tom Robinson is best summed up in his own words to his son, Jem: Courage is “when you’re licked before you begin but you begin anyway and you see it through no matter what.” Atticus was persistent. He never lost faith in the American jury system, notwithstanding the terrible injustice of the guilty verdict rendered against his client.

What does Atticus Finch have to do with products liability? Not much. What does access to a jury trial have to do with products liability? Everything! In the same 50 years since the publication of To Kill a Mockingbird, the Federal Rules of Civil Procedure have been amended 10 times. With the exception of the 2007 rewrite to make them easier to understand (which is worthy of an entirely separate blog entry), it is challenging to argue that the amendments have made access to a jury trial in federal court easier or more efficient. These assertions are borne out by the dramatic falloff in civil cases tried to a verdict in U.S. District Courts across the country. In 2009, according to the Clerk’s Office for the U.S. District Court for the District of South Carolina, 3,532 cases were filed and only 21 cases, or roughly 0.6 percent, were actually tried to a verdict. Using a cost-benefit analysis, one must at least ask the question: does this level of utilization justify the cost of the system?

Blame it on ADR, the FRCP, runaway verdicts, cost, or the reason du jour. The fact of the matter is that the civil jury trial is on the endangered species list. The primary distinguishing feature of the American civil justice system is our jury trial. Access to the jury trial in federal court must not only be preserved, it must be improved. If we truly view Atticus Finch as a role model, then it is time for lawyers to step up, shoot some bluejays, and save the mockingbird. How do we do this?

On May 10th and 11th, the Committee on Rules of Practice and Procedure of the Judicial Conference of the United States sponsored a conference at Duke University School of Law. In addition to the judicial participants, over 70 panelists and speakers came together to discuss major substantive revisions to the FRCP and, ultimately, access to jury trials in federal court. Matters under consideration included, but were not limited to, pleadings, discovery, protective orders, and cost and delay. Our firm’s lawyers, through their membership in Lawyers for Civil Justice, have been involved in and supportive of this movement to improve the FRCP and access to the civil jury trial. Follow this link [PDF] to the LCJ White Paper submitted at the Duke Conference. If you have yet to get behind this effort, now is the time to make your position known. If you would like more information about this exciting opportunity for change, please contact us.

And remember, while, “[i]t’s a sin to kill a mockingbird,” it may be a greater sin to let one die when there is an opportunity to revive it!

A Modest Proposal: Abolish Strict Liability

As I prepare to leave for Las Vegas to attend the annual DRI Products Liability Conference, I have been thinking about the current state of products liability law in the United States. As everyone knows, our current products liability law consists of separate laws – including a myriad of statutes, codes and case law – in every state, some of which conflict and some of which overlap, supplemented by various federal laws, rules and regulations. As a result of this conflicting system, U.S. products manufacturers face increasingly complex and expensive litigation which has expanded exponentially over the years. With a couple of possible exceptions, one would be hard-pressed to find an area of litigation that has become more expensive than products liability.

There is also no question that the manufacturers who produce products for use in the United States constitute the most regulated, legislated, and litigated industry in the world. The question is whether there is too much regulation and litigation and, if so, what can and/or should be done to ease this burden so as to ensure that U.S. products manufacturers can compete in the global economy. It is obvious that relief is needed. We have all read the news, and it is not good. Jobs are being lost daily, the United States industrial and manufacturing community is shrinking rapidly, if not dying, and products manufacturers face substantial litigation exposure and expense, all of which makes it extremely difficult for them to compete.

This burden needs to be substantially reduced. So what to do? Some would say the answer is to take products liability law out of the hands of the states and place it under the control of the federal government in the name of uniformity and consistency. God forbid that this occur. While allowing states to generally control products law does lead to some problems and inconsistencies, the federal government has done nothing worthwhile in the legislative arena in the last several decades and what it has done generally creates more problems than it solves. The current health care fiasco will, I believe, prove this point conclusively. That legislation will most assuredly lead us down the path of substantially higher health care costs, increased taxes and decreased quality of care. Turning control of the health care system in America over to the likes of Congress, including congressmen who are afraid Guam might tip over, and whoever might be in the White House at any given moment is a terrible idea and allowing it to take over products liability law would be just as bad, if not worse.

Another, and I would submit much more appropriate, remedy is to abolish the doctrine of strict liability. Strict liability laws were introduced at a time when products manufacturers needed regulating. These laws have clearly served their purpose of requiring U.S. manufacturers to make the safest products in the world. That they do so is really without question. To borrow a phrase – planes, trains and automobiles – as well as toys, food, electronics, pharmaceutical products, medical devices, you name it – if it is designed and manufactured to be sold here, it is the safest product in the world. However, the doctrine of strict liability is no longer used to ensure reasonable safety; rather, it has gone beyond reasonableness to the point where a degree of “defensive design and manufacturing,” akin to the concept of defensive medicine, is required. This has driven up costs, both on the design and manufacturing side and in the back-end cost of defending litigation involving strict liability claims.

Assuming this to be the case, one answer is to do away with strict liability laws. Would doing so result in manufacturers suddenly abandoning the concept of making safe products? I think not. Would it result in a multitude of defective products being dumped into the marketplace? I think not. Would it result in manufacturers being able to make sensible decisions in designing and manufacturing products without having to worry about the concept of “defensive design and manufacturing,” thus lowering costs? I think so. Would it result in fewer frivolous claims being filed and litigation costs being driven down substantially? I think so. Is this a bad thing? Absolutely not!

Let me hasten to say that I do not believe that manufacturers should be insulated from liability where they are negligent and/or grossly negligent in connection with the design or manufacture of products. If they are negligent and they cause harm, they should pay reasonable actual damages. If they are reckless and consciously indifferent in their conduct, they should be liable for reasonable punitive damages. However, should they be liable after having used all due and reasonable care in the design and manufacturing process simply because some paid expert somewhere says that he or she thinks the product is defective or unreasonably dangerous? It seems to me that the time for that cause of action has come and gone.

As society changes, laws which, when enacted, fulfilled a valid societal purpose become unnecessary. It is no longer necessary for us to legislate the manufacture and use of buggy whips. Times change, and the need for laws changes as well. Has the time to do away with the concept of strict liability arrived? I think so.