Do you ever wonder why some gifted small children play Mozart, but you never see any child prodigy lawyers who can draft a complicated will?

The reason is that the rules of how to play the piano have far fewer permutations and judgment calls than deciding what should go into a will. “Do this, not that” works well with a limited number of keys in each octave. But the permutations of a will are infinite. And by the way, child prodigies can play the notes, but usually not as soulfully as an older pianist with more experience of the range of emotions an adult experiences over a lifetime.

You get to be good at something by doing a lot of it. You can play the Mozart over and over, but how do you know what other human beings may need in a will, covering events that have yet to happen?

Not by drafting the same kind of will over and over, that’s for sure.

Reviewing a lot of translations done by people is the way Google Translate can manage rudimentary translations in a split second. Reviewing a thousand decisions made in document discovery and learning from mistakes picked out by a person is the way e-discovery software looks smarter the longer you use it.
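That feedback loop is, roughly, the supervised-learning idea behind technology-assisted review: a person codes a sample, the machine ranks the rest, and each human correction becomes more training data. Here is a minimal sketch in Python (using scikit-learn, with invented documents and labels purely for illustration) of that loop; real e-discovery platforms use far richer features and workflows.

```python
# A minimal sketch of the feedback loop behind technology-assisted review.
# The documents, labels, and scores here are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Round 1: a human reviewer has coded a small seed set (1 = relevant).
seed_docs = [
    "email scheduling the merger diligence call",
    "lunch order for the holiday party",
    "draft term sheet attached for review",
    "fantasy football league standings",
]
seed_labels = [1, 0, 1, 0]

vectorizer = TfidfVectorizer()
model = LogisticRegression()
model.fit(vectorizer.fit_transform(seed_docs), seed_labels)

# The model ranks the unreviewed pile; a human checks the top of the list,
# and each correction gets folded back into the training set.
unreviewed = [
    "revised term sheet with updated purchase price",
    "reminder: bring a dish to the holiday party",
]
scores = model.predict_proba(vectorizer.transform(unreviewed))[:, 1]
for doc, score in sorted(zip(unreviewed, scores), key=lambda x: -x[1]):
    print(f"{score:.2f}  {doc}")

# The software "looks smarter the longer you use it" only because a person
# keeps telling it where it went wrong.
```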

But you would never translate a complex, nuanced document with Google Translate, and you sure wouldn’t produce documents without having a partner look it all over.

The craziness that can result from the mindless following of rules is an issue at the forefront of law today, as we debate how much we should rely on artificial intelligence.

Who should bear the cost if AI makes a decision that damages a client? The designers of the software? The lawyers who use it? Or will malpractice insurance evolve enough to spread the risk around so that clients pay in advance in the form of a slightly higher price to offset the premium paid by the lawyer?

Whatever we decide, my view is that human oversight of computer activity is something society will need far into the future. The Mozart line above was given to me by my property professor in law school and appeared in the preface of my book, The Art of Fact Investigation.

The Mozart line is appropriate when thinking about computers, too. And in visual art, I increasingly see parallels between the way artists and lawyers struggle to get at what is true and what is an outcome we find desirable. Take the recent exhibition at the Metropolitan Museum here in New York, called Delirious: Art at the Limits of Reason, 1950 to 1980.

It showed that our struggle with machines is hardly new, even though it would seem so given the flood of scary stories about AI and “The Singularity” that we get daily. The show was filled with the worries of artists 50 and 60 years ago about what machines would do to the way we see the world, find facts, and remain rational. It seems funny to say that, because computers seem to be ultra-rational in their production of purely logical “thinking.”

But starting from what seems to be a sensible or logical premise doesn’t mean you’ll end up with logical conclusions. On a very early AI level, consider the databases we use today that were the wonders of the world 20 years ago. LexisNexis and Westlaw are hugely powerful tools, but what if you don’t supervise them? If I put my name into Westlaw, it thinks I still live in the home I sold in 2011. All other reasoning Westlaw produces based on that “fact” will be wrong. Noise complaints brought against the residents there have nothing to do with me. A newspaper story about disorderly conduct resulting in many police visits to the home two years ago is also irrelevant when talking about me.[1]
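A toy illustration of the point, with entirely made-up records (in the spirit of footnote [1]): once a stale address is treated as true, every record joined to that address gets attributed to the wrong person.

```python
# Hypothetical records for illustration only -- the people, addresses, and
# incidents are invented. The point: one stale "fact" (an old address)
# silently taints every conclusion joined to it.

people = {"P. Author": {"address": "12 Elm St"}}   # sold in 2011, never updated

incident_reports = [
    {"address": "12 Elm St", "incident": "noise complaint, 2015"},
    {"address": "12 Elm St", "incident": "disorderly conduct, 2016"},
    {"address": "98 Oak Ave", "incident": "parking violation, 2014"},
]

def incidents_for(person: str) -> list[str]:
    """Naive join on address: correct only if the address itself is correct."""
    addr = people[person]["address"]
    return [r["incident"] for r in incident_reports if r["address"] == addr]

# Unsupervised output: both incidents get pinned on someone who moved away
# years before they happened.
print(incidents_for("P. Author"))
# ['noise complaint, 2015', 'disorderly conduct, 2016']
```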

The idea of suppositions running amok came home when I looked at a sculpture last month by Sol LeWitt (1928-2007) called 13/3. At first glance, this sculpture would seem to have little relationship to delirium. It sounds from the outset like a simple idea: a 13×13 grid from which three towers arise. What you get when it’s logically put into action is a disorienting building that few would want to occupy.

As the curators commented, LeWitt “did not consider his otherwise systematic work rational. Indeed, he aimed to ‘break out of the whole idea of rationality.’ ‘In a logical sequence,’ LeWitt wrote, in which a predetermined algorithm, not the artist, dictates the work of art, ‘you don’t think about it. It is a way of not thinking. It is irrational.’”

Another wonderful work in the show, Howardena Pindell’s Untitled #2, makes fun of the faith we sometimes have in what superficially looks to be the product of machine-driven logic. A vast array of numbered dots sits uneasily atop a grid, and at first, the dots appear to be the product of an algorithm. In the end, they “amount to nothing but diagrammatic babble.”

Setting a formula in motion is not deep thinking. The thinking comes in deciding whether the vast amount of information we’re processing results in something we like, want or need. Lawyers would do well to remember that.

[1] Imaginary stuff: while Westlaw does say I live there, the problems at the home are made up for illustrative purposes.

Adam Davidson recently wrote “Making Choices in the Age of Information Overload” for The New York Times Magazine, where he explained how consumer choices have changed in the Information Age. With so much data about a potential purchase—from price comparisons to reviews by ostensibly objective consumers—we are drowning in a sea of information. Consumers often feel overwhelmed by the mounds of data they have to sift through. This is the proverbial “information overload” we assume is unique to our information age, but which media historians point out has been a constant throughout media revolutions.

Does all this information help shape our choices? Well, not really.

Davidson explains that in order to make a decision, consumers routinely tune out all this noise and instead rely on “signals”—cues that companies use to indicate their market prowess—to make their decisions.  For example, endorsements by high-profile celebrities send the subconscious signal that the product must be in high demand because otherwise the company wouldn’t be able to pay the celebrity’s hefty endorsement fee. 

As business professor Hemant Bhargava explains in the article, if you don’t have the time to research a product, or just can’t make heads or tails of all the information you can find about it online, you can set all that aside and trust the product’s signals instead.

In other words, consumers aren’t drowning in information overload because they’ve found a way of effectively and efficiently filtering out the information they don’t need.  As media analyst Brian Solis of Altimeter Group recently explained in his blog, information overload is a fallacy.  That feeling of being overwhelmed by information is nothing more than a failure to filter, an unwillingness to really focus on what’s important. We all have the ability to filter out the information we don’t need and instead focus on what’s most significant to us before making a decision.  

This is certainly the case in fact investigation.  The odds are that if a client has hired an investigator, then the information they need is not readily available, or there is too much information and they don’t have the time or expertise to sift through it all.  Sometimes our searches lead us to scores of facts that we then have to analyze. Sometimes the facts don’t provide us with what we need, but they get us close. Other times we uncover information we weren’t expecting, but that after more digging turns out to be useful to our client nonetheless.  The process is not linear, and it’s often more time-consuming than our client anticipated.  

We have an obligation to work as efficiently as possible, and the right filters ensure maximum efficiency. 

For example, we recently did an asset search on a man who had declared bankruptcy but who we had reason to believe might have hidden assets.  At first glance, there weren’t that many facts to sort through:  The man had been savvy enough to avoid hiding assets in his name or via the various corporations he ran.  But there was no shortage of people associated with him.  This man had a number of relatives and associates he’d been in business with throughout the years.  Might he be hiding any assets through them?  Investigations of his family and his most recent business partners were fruitless.  So then we investigated the spouses of his ex-partners.  Sure enough, the wife of his most recent partner had incorporated a business less than a month after the partners had shut down their latest venture.  And our guy was linked to the business via a trademark application filed by the new company.  In this instance, filtering through the dozens of people linked to this man had yielded just the sort of information our client was looking for.
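The filter that cracked that case can be stated almost mechanically. Here is a hypothetical sketch of it in Python; the names, dates, and data structures are invented, and the real work lies in assembling the underlying registry records in the first place.

```python
# Hypothetical sketch of the filter described above: flag any associate (or
# associate's spouse) who incorporated a new entity shortly after the
# subject's venture shut down. All names and dates below are invented.
from datetime import date, timedelta

VENTURE_SHUTDOWN = date(2010, 3, 15)
WINDOW = timedelta(days=60)   # "shortly after" -- an assumption, tune as needed

associates = [
    {"name": "Ex-partner A", "spouse": "Spouse A"},
    {"name": "Ex-partner B", "spouse": "Spouse B"},
]

# New incorporations pulled from state registries (again, invented here).
incorporations = [
    {"officer": "Spouse A", "company": "Newco LLC", "filed": date(2010, 4, 2)},
    {"officer": "Ex-partner B", "company": "Oldco Inc", "filed": date(2008, 1, 9)},
]

def suspicious_filings():
    watch_list = {a["name"] for a in associates} | {a["spouse"] for a in associates}
    for inc in incorporations:
        if inc["officer"] in watch_list and \
           VENTURE_SHUTDOWN <= inc["filed"] <= VENTURE_SHUTDOWN + WINDOW:
            yield inc

for hit in suspicious_filings():
    print(f'{hit["officer"]} incorporated {hit["company"]} on {hit["filed"]}')
# Spouse A incorporated Newco LLC on 2010-04-02
```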

The next time you feel like you’re drowning in a sea of data, remember, there is no information overload.  As an investigator it’s all about coming up with the right filter.  Take a step back and make sure you have the correct filters in place.  With creativity, experience, and trial and error, it’s possible to dig out from underneath all that data and find the information your client needs.

The world is getting smaller in many ways, including for fact finders looking to get information about companies.

Sometimes, the company across the street will file more information about itself halfway around the world than it will in its own jurisdiction. With a computer or a good person on the ground far away, the information can be yours in a matter of minutes or hours.

Most people know the way this usually works: a company from a country with rotten disclosure wants to raise money in the U.S., and so becomes subject to the rigorous reporting requirements of the Securities and Exchange Commission. Foreign companies can be forced to disclose executive pay packages, and that can sometimes give you the name of a private company to which an executive’s pay is sent. Great stuff, and only available because the government forces the information out of the company.

But what about the other direction: companies from the U.S. or other jurisdictions with strong disclosure that nonetheless turn over more information overseas than they do at home?

Two cases in point:

Last month the EU unveiled legislation to require “transparency” from “extractive companies,” which means companies that dig or pump stuff out of the ground or chop down trees. Even private companies that ordinarily would have no major reporting requirements to non-shareholders would have to disclose payments they made country-by-country.

A time-honored gift to U.S. investigators is Companies House in the United Kingdom. Private companies from anywhere in the world that want a presence in the U.K. have to register. You get names of directors, addresses, shareholder information and financials.
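Much of that Companies House data can now be pulled programmatically. Here is a sketch in Python against the Companies House public REST API; it assumes you have registered for a free API key, the company name is a placeholder, and you should check the current API documentation before relying on the exact endpoints and field names shown.

```python
# Sketch of pulling company and director details from Companies House's
# public REST API (api.company-information.service.gov.uk). The API key and
# search term are placeholders; field names reflect the publicly documented
# responses but may change, so treat this as an illustration.
import requests

API_KEY = "your-companies-house-api-key"          # placeholder
BASE = "https://api.company-information.service.gov.uk"

def search_companies(name: str) -> list[dict]:
    resp = requests.get(f"{BASE}/search/companies",
                        params={"q": name},
                        auth=(API_KEY, ""))        # key as username, blank password
    resp.raise_for_status()
    return resp.json().get("items", [])

def officers(company_number: str) -> list[dict]:
    resp = requests.get(f"{BASE}/company/{company_number}/officers",
                        auth=(API_KEY, ""))
    resp.raise_for_status()
    return resp.json().get("items", [])

if __name__ == "__main__":
    for company in search_companies("Example Trading Ltd")[:3]:
        print(company.get("title"), company.get("company_number"))
        for officer in officers(company["company_number"])[:5]:
            print("   ", officer.get("name"), "-", officer.get("officer_role"))
```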

So the next time you have to look up information on a company, ask not only where the company is incorporated. Ask also: “where does it do business?”