
Pulling Back the Curtain on Legal AI for Attorneys.




Ah, the dazzling world of legal AI – a realm where the line between cutting-edge technology and clever marketing gimmicks is as thin as a subpoena on a diet! Today, let's embark on a digital odyssey to unmask some of these so-called "AI" tools that are about as intelligent as a box of rocks.


First, let's talk about generative AI. It's the hot new thing, right? Everyone's talking about it, but here's the kicker: a lot of it isn't AI at all. It's like calling a tricycle a Tesla just because they both have wheels. (I have owned both a tricycle and a Tesla, and I can verify that one is significantly faster than the other).


Take, for instance, sites like legalquestions.help. I posed a simple question:


"Is it illegal to throw rocks at police cars in Chicago?"


And what did I get? A digital shrug that said, "Uh Oh. Better call a lawyer. Something's gone wrong."


Fluency Does Not Equal Accuracy

Worse are sites that claim to offer solid legal advice and are now in need of it themselves, because they're the subject of class action lawsuits, like DoNotPay.com!

It's like asking a magic 8-ball for legal advice: you'll get an answer, you just don't want to count on it for accuracy. As my friend Barak Turovsky (VP of AI for Cisco) is fond of saying, "Fluency does not equal accuracy."


So today, I'll wave a red flag and advise that you steer clear of sites that claim to be safe, secure and AI-driven but are not. Be especially wary of platforms built on OpenAI (maker of ChatGPT). Oh, the horror! It's like inviting a paparazzo to a confidential meeting.

ICYMI, Samsung had its source code leaked to the world when a few of its programmers decided it would be a brilliant idea to input said code into ChatGPT to see if it could be improved upon. Guess who's not getting a holiday bonus? Those guys.


But wait, there's more. I recently saw a post from a lawyer who's been using ChatGPT for legal matters – without telling his clients! That's like a chef secretly using peanuts in dishes prepared for people who are allergic to them. No bueno.


The Shelton & Steele Golden Rule for AI and the Law:


When it comes to AI, both transparency and HEF must rule.


You MUST inform clients in advance when you're using AI, what specifically you're using it to do, and that HEF (Human Eyes Failsafe) will always be the final step. For example, Spellbook and Luminance are perfectly good tools for simple, first-draft contract generation. Clearbrief can save hours of time finding facts, precedents and links to REAL cases (as opposed to the ones hallucinated by ChatGPT in the Avianca Airlines brief). It can also automate your Tables of Authorities and all of your exhibits (for a hundred bucks a month!).


But if you're using them, tell your clients upfront, and make clear that HEF will always be the final step in producing any legal product or strategy.


Yes, let's embrace the wonders of technology and AI, but let's not be fooled by the masquerade. Real AI will revolutionize the profession and the future of pricing, but while it may be able to pass the bar exam (which it has!), let's view it like a first-year associate. Everything it produces needs to be supervised, reviewed and, oftentimes, corrected.


Frederick Shelton is the CEO of Shelton & Steele, a national legal recruiting and consulting firm. He can be reached at fs@sheltonsteele.com 
