Artificial intelligence doesn’t equal artificial perfection. I have argued for a while now both on this blog and in a forthcoming law review article here that lawyers (and the investigators who work for them) have little to fear and much to gain as artificial intelligence gets smarter.

Computers may be able to do a lot more than they used to, but there is so much more information for them to sort through that humans will long be required to pick through the results, just as they do now. Right now we have no quick way to word-search the billions of hours of YouTube videos and podcasts, but that time is coming soon.

The key point is that some AI programs will work better than others, but even the best ones will make mistakes or will only get us so far.

So argues British math professor Hannah Fry in a new book previewed in her recent essay in The Wall Street Journal, here. Instead of placing blind faith in algorithms and artificial intelligence, Fry argues, the best applications are those we admit work well but imperfectly, and that require collaboration with human beings.

That’s collaboration, not simply implementation. Who has not been infuriated at the hands of some company, only to complain and be told, “that’s what the computer’s telling me”?

The fault may be less with the computer program than with the dumb company that doesn’t empower its people to work with and override computers that make mistakes at the expense of their customers.

Fry writes that some algorithms do great things – diagnose cancer, catch serial killers and avoid plane crashes. But, beware the modern snake-oil salesman:

Despite a lack of scientific evidence to support such claims, companies are selling algorithms to police forces and governments that can supposedly ‘predict’ whether someone is a terrorist, or a pedophile based on his or her facial characteristics alone. Others insist their algorithms can suggest a change to a single line of a screenplay that will make the movie more profitable at the box office. Matchmaking services insist their algorithm will locate your one true love.

As importantly for lawyers worried about losing their jobs, think about the successful AI applications above. Are we worried that oncologists, homicide detectives and air traffic controllers are endangered occupations? Until there is a cure for cancer, we are not.

We just think these people will be able to do their jobs better with the help of AI.

We’ve had a great response to an Above the Law op-ed here that outlined the kinds of skills lawyers will need as artificial intelligence increases its foothold in law firms.

The piece makes clear that without the right kinds of skills, many of the benefits of AI will be lost on law firms because you still need an engaged human brain to ask the computer the right questions and to analyze the results.

But too much passivity in the use of AI is not only inefficient. It also carries the risk of ethical violations. Once you deploy anything in the aid of a client, New York legal ethics guru Roy Simon says you need to ask,

“Has your firm designated a person (whether lawyer or nonlawyer) to vet, test or evaluate the AI products (and technology products generally) before using them to serve clients?”

We’ve written before about ABA Model Rule 5.3 that requires lawyers to supervise the investigators they hire (and “supervise” means more than saying “don’t break any rules” and then waiting for the results to roll in). See The Weinstein Saga: Now Featuring Lying Investigators, Duplicitous Journalists, Sloppy Lawyers.

But Rule 5.3 also pertains to supervising your IT department. It’s not enough to have a salesperson convince you to buy new software (AI gets called software once we start using it). The lawyer or firm paying for it should do more than rely on the vendor’s claims.

Simon told a recent conference that you don’t have to understand the code or algorithms behind the product (just as you don’t have to know every feature of Word or Excel), but you do need to know what the limits of the product are and what can go wrong (especially how to protect confidential information).

Beyond leaking information it shouldn’t, what kinds of things could you learn about how a program works that might affect the quality of the work you do with it?

  • AI can be biased: Software works based on the assumptions of those who program it. You can never know in advance what a program’s biases may do to its output until you use the program. The idea is related to, though far more advanced than, the old saying “garbage in, garbage out”: a computer makes thousands of decisions based on definitions a person supplies, either before the product comes out of the box or during the machine-learning process, when people refine results with new, corrective inputs.
  • Competing AI programs can do some things better than others. Which programs are best for Task X and which for Task Y? No salesperson will give you the complete answer. You learn by trying.
  • Control group testing can be very valuable. Ask someone at your firm to do a search for which you know the results and see how easy it is for them to come up with the results you know you should see. If the results they come up with are wrong, you may have a problem with the person, with the program, or both.
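The control-group idea in the last bullet can be sketched in a few lines of Python. This is illustrative only: `search_tool` is a hypothetical stand-in for whatever product your firm is evaluating, and the records are invented for the example.

```python
# A minimal sketch of known-answer ("control group") testing for a research
# tool. search_tool is a hypothetical stand-in for the vendor's product.

def search_tool(query):
    # Placeholder: in real use this would call the product under evaluation.
    # Here it simulates a tool that misses one record we know exists.
    canned = {
        "John C. Wong liens": ["2016 NY tax lien", "2017 DE judgment"],
    }
    return canned.get(query, [])

def known_answer_test(query, expected):
    """Compare the tool's results against records you already know exist."""
    found = set(search_tool(query))
    missed = set(expected) - found   # records the tool failed to surface
    extra = found - set(expected)    # records you did not expect to see
    return {"missed": sorted(missed), "extra": sorted(extra)}

report = known_answer_test(
    "John C. Wong liens",
    expected=["2016 NY tax lien", "2017 DE judgment", "2015 TX lawsuit"],
)
print(report)  # → {'missed': ['2015 TX lawsuit'], 'extra': []}
```

If the report comes back with missing records, you know the tool (or the person running it) has a gap before you ever rely on it in a live matter.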

The person who should not be leading this portion of the training is the sales representative of the software vendor. Someone competent at the law firm needs to do it, and if that person is not a lawyer, a lawyer needs to be up on what’s happening.

[For more on our thoughts on AI, see the draft of my paper for the Savannah Law Review, Legal Jobs in the Age of Artificial Intelligence: Moving from Today’s Limited Universe of Data Toward the Great Beyond, available here: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3085263].

 

One lawyer we know has a stock answer when clients ask him how good their case is: “I don’t know. The courts are the most lawless place in America.”

What he means is that even though the law is supposed to foster predictability so that we will know how to act without breaking our society’s civil and criminal rules, there is a wide variety of opinion among judges even in the same jurisdictions about the matters that make or break a case on its way to a jury.

Our friend’s answer came to mind while reading an interesting weekend roundup of views from experienced trial lawyers about why the trial of Bill Cosby outside Philadelphia resulted in a deadlocked jury and mistrial, announced on Saturday.

In the New York Times, the attorneys mostly fell into two camps: those who thought lead witness Andrea Constand presented the jury with credibility problems because of inconsistent testimony, and those who thought the judge’s decision to limit the admission of evidence of many other similar allegations substantially weakened the prosecution’s case.

My view is that the two reasons are linked: the credibility problem could easily have been overcome if the jury had been able to hear from the many other women who alleged that Cosby had drugged and had sexual contact with them too.

In another case with identical facts and a different judge, the other accusers’ testimony might have made it in. It’s a great example of two things we tell clients all the time:

  1. Persuasive evidence is good, but admissible evidence is what you really want when you know you’re going to trial.
  2. A lot of legal jobs are now being done by computers, but while there are human judges they will differ the way humans always do: in a way that is never 100% predictable.

Admissibility

When we are assigned to gather facts in civil or criminal matters, all of the evidence we get must always be gathered legally and ethically. Otherwise it could easily turn out to be inadmissible. But even if you do everything right, admissibility is sometimes out of your control. The whole case can turn on it.

If all you are doing is trying to get as much information as you can, without any thought of taking it to trial, then admissibility may not be much of a concern. Think about using hearsay evidence to decide whether someone is rich enough to be worth suing; or finding personally damaging information that might ultimately be excluded as prejudicial, but that the other side would dread even arguing a motion about. Either could increase the chance of a more favorable settlement for you.

In the Cosby case the information in question would have been very helpful to the prosecution.

Ordinarily the justice system doesn’t like to see evidence of other bad acts used in a case to paint a picture of a defendant’s character. Rule 404(b) of the Federal Rules of Evidence excludes this kind of thing, but allows admission of evidence of another act “as proving motive, opportunity, intent, preparation, plan, knowledge, identity, absence of mistake, or lack of accident.”

So the prosecution could have argued that all the other accusers making similar claims that they were drugged and subjected to sexual contact were evidence of Cosby’s intent, or a lack of accident, and may even have been seen as preparation for the time Constand went to Cosby’s home and was drugged.

But the judge wouldn’t let any of that in. In Pennsylvania, the rules in this section are tougher on the prosecution than are the federal rules. The state’s Rule 404(b)(2) “requires that the probative value of the evidence must outweigh its potential for prejudice. When weighing the potential for prejudice of evidence of other crimes, wrongs, or acts, the trial court may consider whether and how much such potential for prejudice can be reduced by cautionary instructions.”

It seems the judge feared that testimony from the other accusers would have prejudiced the jury even with a cautionary instruction that their accounts alone did not constitute proof of Cosby’s guilt in this matter with Constand.

Unpredictability

The legal world is justifiably occupied in trying to figure out how to reduce costs by automating as many tasks as possible. Gathering of some facts can be automated, but not always, for the simple reason that facts are infinitely variable and therefore not wholly predictable.

Implicit in fact gathering is evaluating the facts you get, as you gather them. You are constantly evaluating because you can’t look everywhere, so promising leads get follow-up, the others don’t. Machines can scan millions of documents using optical character recognition because there are only so many combinations of letters out there. But the variety of human experience is limitless.

If machines can’t be trusted to properly evaluate someone’s story, imagine the problems if that story has never been written down. Think about all the things you would not want the world to know about you. How much of all of that has been written down? Probably very little. It was human effort alone that developed the other witnesses the prosecution wanted to call.

The only way a computer might have helped in this case would have been to predict – based on prior cases – which way the judge would rule on excluding the other evidence. Even that would be a tough program to write, because these decisions turn on so many unique factors. And since judges are assigned at random, it wouldn’t have helped shape the decision about whether to charge Cosby.

Want to know more about our firm?

  • Visit charlesgriffinllc.com and see our two blogs, this one and The Divorce Asset Hunter;
  • Look at my book, The Art of Fact Investigation (available in free preview for Kindle at Amazon);
  • Watch me speak about Helping Lawyers with Fact Finding, here.
  • If you are a member of the ABA’s Litigation Section, see my piece in the current issue of Litigation Journal, “Five Questions Litigators Should Ask: Before Hiring an Investigator (and Five Tips to Investigate It Yourself).”

Lawyers need to find witnesses. They look for assets to see if it’s worth suing or if they can collect after they win. They want to profile opponents for weaknesses based on past litigation or business dealings.

Every legal matter turns on facts. Most cases don’t go to trial, fewer still go to appeal, but all need good facts. Without decent facts, cases face dismissal or never even get to the complaint stage.

Better innovation in law firms

Do law schools teach any of these skills? Ninety-nine percent do not. Good fact-finding requires something not taught at a lot of law schools: innovation and creativity. Of course, good judges can maneuver the law through creative decisions, and good lawyers are rightly praised for creative ways to interpret a regulation or to structure a deal.

But when it comes to fact gathering, the idea for most lawyers seems to be that you can assign uncreative, non-innovative people to plug data into Google, Westlaw or Lexis, and out will come the data you need.

This is incorrect, as anyone with a complex matter who has tried just Googling and Westlaw research will tell you.

The innovative, creative fact finder follows these three rules:

  1. Free Yourself from Database Dependency. If there were a secret trove of legally obtained information, you would be able to buy it, because this is America, where good products get packaged and sold if there is sufficient demand for them. And Google won’t do it all: most documents in the U.S. are not online. For any given person, there could be documents sitting in paper form in any of the more than 3,000 counties in this country.
  • If you use a database, do you know how to verify the output? Is your John C. Wong the same John C. Wong who got sued in Los Angeles? How will you tell the difference? You need a battle plan. Can your researcher arrange to have someone go into a courthouse 2,000 miles away from your office?
  • How will you cope with conflicting results when one source says John C. Wong set up three Delaware LLCs last year, and another says he set up two in Delaware and two in New York?
  2. Fight Confirmation Bias. Ask, “What am I not seeing?” Computers are terrible at the kind of thought that comes naturally to people. No risk management program said about Bernard Madoff, “His auditor can’t be up to the task because his office is in a strip mall in the suburbs.”
  • For your researchers, find people who can put themselves in the shoes of those they are investigating. Not everyone can say, “This report must be wrong. If I were in the high-end jewelry business, I wouldn’t run it out of a tiny ranch house in Idaho. Either this is a small business or Idaho’s not the real HQ.” If someone doesn’t notice a discrepancy as glaring as this, they are the wrong person to be doing an investigation that requires open-mindedness.
  3. Don’t Paint by Numbers. Begin an investigation on a clean sheet of paper. Don’t base your investigation on what someone’s resume says he did. Verify the whole thing.
  • Look not just at what’s on the resume, but for what was left off: jobs that didn’t go well, and people who don’t like the person.
  • Despite what your client tells you, clients don’t know everything (if they did, they wouldn’t hire you). If your client thinks you will never find a subject’s assets outside of Texas, look outside of Texas anyway. You owe it to your client.

Want to know more?

  • Visit charlesgriffinllc.com and see our two blogs, The Ethical Investigator and the Divorce Asset Hunter;
  • Look at my book, The Art of Fact Investigation (available in free preview for Kindle at Amazon);
  • Watch me speak about Helping Lawyers with Fact Finding, here.

Step one: don’t have a manual. That’s the message in an information-packed new book about the inner workings of the SEC just after the Madoff fraud and the now largely forgotten (but just as egregious) Allen Stanford fraud.

In his memoir of five years at the agency, former SEC Director of Investment Management Norm Champ (now back in private practice) writes that he was stunned to arrive in public service in 2010 and find that examiners had no set procedures, either for examining regulated entities or for following up on their findings.

“If SEC inspectors ever arrived at a financial firm for an examination and discovered that the firm had no manual about how to comply with federal securities laws, that firm would immediately be cited for deficiencies and most likely subject to enforcement action,” he writes in Going Public (My Adventures Inside the SEC and How to Prevent the Next Devastating Crisis).

Among his proudest achievements were instituting such procedures at the SEC and holding people there accountable for following up on whistle-blower reports – the kind the Commission ignored in the Madoff and Stanford cases.

We’ve written and spoken lots about our methodology for due diligence. You start from scratch and look not just to verify what you’ve been handed, but for information the person or company doesn’t want you to see. You don’t close investigative doors prematurely, even though human nature makes you want to do just that.

Starting from scratch means that you assume nothing. You don’t assume Madoff had all those assets under management unless you check. It would have been easy to do, but nobody asked. Anyone who was suspicious of the absence of an independent custodian or a major auditor similarly let it slide.

This is what we refer to as a Paint-by-Numbers investigation: the forms and relationships are all taken as givens, and all you get to do is decide on color. In Madoff’s case, the “forms” (the existence of invested money) were illusory. Who cares about the color (say, the risk profile of the “securities”) of something that doesn’t exist?

In Stanford’s case, there was lots of information he wouldn’t have been proud of. An April 2007 FINRA report on the Stanford Group Company said the firm had been found to be operating a securities business while failing to maintain its required minimum net capital. A former employee of Stanford’s alleged in an April 2006 complaint in Florida state court that Stanford was operating a Ponzi scheme.

Without internal accountability procedures in place, did all of the people at the SEC just sit there? No. Champ (who arrived post-Madoff and Stanford) describes an agency packed with a lot of dedicated professionals but with a good bit of deadwood immune to the disciplines of the private-sector job market. As we read about the federal budget proposals that seek to cut funding at a variety of agencies, this book contains two other pertinent messages:

  1. If you could fire people in government the way you can in the private sector, it would be easier for the government to save money.
  2. That battle is so tough that most people (including Champ) just try to work with the good people they can find and leave personnel reform for someone else.

Champ makes no promises that there won’t be more Ponzi schemes, but hopes that his organizational reforms will reduce the chances. As in any due diligence, you can’t promise that you will always catch everything – only that if there are repeated indications of a problem staring you in the face (complete with former employees blowing whistles), you will follow up.

Among Champ’s recommendations for blunting the damage of the next crisis, one is especially welcome: eliminate the scandalous government sponsorship of lotteries. Lotteries are the world’s worst investment, and yet the poorest members of society spend like crazy on them, all prompted by a lot of misleading and predatory government advertising “far beyond what private businesses are allowed.”

Champ asks us to imagine what could be done with all that money people waste if it were properly invested and devoted to investor education.

We agree. The millionaires who lost money with Madoff could at least have afforded $2,000 of due diligence on their investment. The poor who play the lottery, and who should be saving their money, are the ones who most need help from the SEC and from state governments, which need to find a less repugnant way to raise revenue.

Want to know more?

  • Visit charlesgriffinllc.com and see our two blogs, The Ethical Investigator and the Divorce Asset Hunter;
  • Look at my book, The Art of Fact Investigation (available in free preview for Kindle at Amazon);
  • Watch me speak about Helping Lawyers with Fact Finding, here.

 

What to do when the databases you rely on start stripping out the very data you are paying for?

Word in today’s Wall Street Journal that the main credit reporting firms will be removing many civil judgments and tax liens from credit reports prompts us to restate one of our core beliefs:

Not only do databases routinely mix people up, they are far from complete in the information they contain.

Now they will be even further from complete, because the credit reporting companies want several identifiers on each piece of adverse information before they include it in a credit report. Even if there is only one person in the United States with a particular name, if his address and Social Security number are not included in a court filing against him, that filing may never make it onto his report. From what we’ve seen, there are almost no SSNs in most of the filings we review.

As a result of this new policy, the credit scores of a lot of people are about to go up, says the Journal.

To answer the question posed at the top of this posting: what you do is go after the information yourself. You (or a competent pro you hire) look at databases and courthouse records for liens, litigation and other information people use every day to evaluate prospective associates, counterparties and debtors. If there’s enough money at stake, you may want to conduct interviews, not only with references but with people not on the resume.

The idea that databases are missing a lot is old news to anyone who stops to take a careful look.

The next time you are searching in a paid database, you may notice a little question mark somewhere around the box where you enter your search terms. Click on that and prepare to be shocked.

“Nationwide” coverage of marriage licenses may include only a handful of states, because such licenses are not public information in many jurisdictions. In other cases, the information is public but the database doesn’t include it because it’s too expensive to gather data that has not been scanned and stored electronically.

Of course, sending someone to a courthouse costs more than a few clicks performed while sitting at your desk. But does it cost more than lending to the wrong person who defaulted on a big loan six months ago?

Want to know more?

  • Visit charlesgriffinllc.com and see our two blogs, The Ethical Investigator and the Divorce Asset Hunter;
  • Look at my book, The Art of Fact Investigation (available in free preview for Kindle at Amazon);
  • Watch me speak about Helping Lawyers with Fact Finding, here.

A story in the Wall Street Journal, “Google Uses Its Search Engine to Hawk Its Products,” serves as a useful reminder of something we tell clients all the time: Google is there to make money, and if your ideal search result won’t make Google money, you may get a less-than-useful result.

Google is an indispensable tool when searching for facts, but Google is not a disinterested party, like a good reference librarian. Google is in business to make money.

The story reports that Google buys some of its own ads, so when you search for a particular thing that Google’s parent company Alphabet sells, guess what? Alphabet’s products have a way of turning up at the top of the list.

One of the first things ever written on this blog more than five years ago was an entry called Google is Not a Substitute for Thinking, and it was one of the most read entries we’ve ever posted.

Among the arguments advanced there as to why a Google search is hardly ever going to suffice in any factual inquiry, we argued that Google’s search results are stacked in favor of the ones that are paid for or that Google judges to be commercially advantageous. A Google entry about a dry cleaner in Joplin, Missouri that has no website would not be very profitable for Google, but if that dry cleaner owes you $50,000, you would want him at the top of page one.

The best way to think about Google is to treat it as a meta-search engine. Imagine not that Google will be able to give you the final answer, but a clue as to where to find the final answer.

If your dry cleaner has no website, Google may point you to a site such as Yelp that rates a different dry cleaner in Joplin. Yelp may then have the dry cleaner you want, but that Yelp listing won’t necessarily come up on Google. Or, you may notice via Google that Joplin or the state of Missouri may require a permit to operate a dry cleaner. Google can help you find where to look up such a permit.

Remember, any dolt at the public library can use Google. It takes a person with the capacity to think creatively to use Google to its greatest potential.

Want to know more?

  • Visit charlesgriffinllc.com and see our two blogs, The Ethical Investigator and the Divorce Asset Hunter;
  • Look at my book, The Art of Fact Investigation (available in free preview for Kindle at Amazon);
  • Watch me speak about Helping Lawyers with Fact Finding, here.

Due diligence is all about following up on red flags, but if you don’t find them, there’s nothing on which to follow up.

Thus, our tireless refrain: turn over every piece of public record information you can about a person, and don’t leave it to others.

We were reminded of this by the story this weekend in the Wall Street Journal, which found that the brokerage industry regulator, FINRA, leaves a lot of red flags concerning members off its BrokerCheck website. While it’s laudable that FINRA recommends that investors check to see if brokers have ever been subjected to disciplinary action, that check is of limited use if bankruptcies, state-level actions and litigation are left off of BrokerCheck.

We have been writing for years about the need to do thorough searches when conducting due diligence on anyone – pre-employment, pre-deal, or during litigation. In Avoiding Due Diligence Failure: Following Up on Red Flags, we dealt with the problem of the Semmelweis reflex in due diligence.

This is when you want to confirm that someone who is supposed to be squeaky clean really is, and so you write off what looks like a problem in his past to database error. You can also see confirmation bias, in which people rely too heavily on bad or incomplete evidence that leads them to their desired conclusion.

But, before you even get to battling Semmelweis and bias in mishandling red flags, you have to see the flags in the first place. For that, we provide a non-exhaustive checklist to our clients of the kinds of sources we will check. We wrote about it here.  

It’s critically important to note that much of the time these sources are checked not online but on site. That’s because a lot of information at the county or state level is not available on the internet. You need a good network of on-site retrievers to pull it at the courthouse and send it back to you. You then need to double-check to make sure your retriever didn’t miss anything: while not everything is online, abstracts of some matters may be, so when you get your pile of documents back, make sure that everything you found online is represented in the results.

 

We recently blogged here about surprising facts we’d found when doing diligence on expert witnesses. We’ve looked into so many people and companies that we’re rarely taken aback when we find that someone left something off a resume or lied about a degree. But sometimes we come across a news story that manages to catch us a little by surprise, like a recent article we read about the founder of the Suzuki violin method, which has been studied by millions.

According to several news articles, Shinichi Suzuki, the now-deceased music teacher behind the world-renowned Suzuki method of violin, has been called “the biggest fraud in musical history” for fabricating his training and background in violin.

According to the articles, Suzuki never had any proper violin training, despite his claims that he studied at the Berlin Conservatory under a renowned teacher. The articles state that Suzuki was rejected from the Berlin Conservatory in 1923, was essentially self-taught and never played in any orchestra.

The revelation that Suzuki may not have had the musical background he touted does not necessarily undermine the efficacy of his teaching method, but it does highlight the fact that even well-reputed people fabricate or omit things in their backgrounds. We recall that in 2007 the former Dean of Admissions at M.I.T. was forced to resign when the school discovered that she’d made up her education credentials.

We should also note that, while it is important to do thorough background checks, it is equally important to pay attention to the sources of background information. With Suzuki, the source of the negative information is Mark O’Connor, a violinist with a competing violin teaching method. The Suzuki organization is fighting back, stating that it “can only speculate as to why Mr. O’Connor, who publishes and sells his own approach to violin playing, is so eager to discredit Shinichi Suzuki and why he has chosen to manipulate media at this time.” It also claims that Suzuki did not fabricate his credentials, and has put forth some evidence to support that claim.

When we do a background check, particularly one that includes interviews, we always make sure to speak with a balanced group of people, because we are aware of the potential for bias. If we speak to a negative reference, like a litigation opponent, we will also speak with a probable positive reference, such as someone who successfully did business alongside the person we are researching for many years.