Submitted by yourd00m t3_10lvj01 in technology
Comments
quantumfucker t1_j62m3dp wrote
I don’t understand why they didn’t try to market this as an assistant to help you represent yourself rather than a substitute for a human lawyer. While we should encourage people to go with human lawyers, it’s still someone’s right to represent themselves, and this should’ve just been launched that way.
But that being said, I find it concerning that he was being threatened with jail time for trying this out. We should consider that a lot of lawyers are objecting not just because robots would be worse, but because it also threatens their personal livelihoods.
Crack_uv_N0on t1_j6442sr wrote
For that we’d have to know what law this comes under and what the penalties are. If what I’ve read is accurate, first-time offenders rarely spend time in jail. Of course, those convicted have lawyers who know how to present the case in a way that will resonate with the judge, something way beyond the capability of an AI robot.
If I were a defendant in a criminal case and were eligible for a public defender, I would rather take a chance on a public defender, even one who would likely be overworked.
VincentNacon t1_j5zuo3c wrote
Sounds like they're worried the AI could expose the corruption in the legal system.
EDIT: Wow... someone downvoted me with bots. I had a few points and now they're all gone.
__Fury t1_j61jhei wrote
Expose corruption how? Because they won't let a computer regurgitating a database practice law?
lordtema t1_j61lq8s wrote
They aren't. I read a very good Twitter thread from somebody who tried DoNotPay, and there doesn't really appear to be anything "AI" about it. Waiting times of 1-8 hours and more basically indicate that the "AI" is just some human working behind the scenes (and not a lawyer at that), and the quality of its output is horrendously poor.
dcazdavi t1_j60c49n wrote
since everyone is downvoting this and other similar viewpoints without providing decent arguments, it now makes more sense
sickofthisshit t1_j60fg3e wrote
Only to techbro idiots who don't understand how the legal system and lawyers (and Reddit) work. The comment was not a good one, it gets down votes, it isn't worth arguing with.
[deleted] t1_j617c0n wrote
[deleted]
sickofthisshit t1_j61druf wrote
Here's someone with legal training explaining just one part of why shit like this can't work and is actively harmful
https://twitter.com/courtneymilan/status/1618620649412624387
dcazdavi t1_j60gdb2 wrote
what other conclusion could a legal layman reach when lawyers have prevented anyone from even trying?
i don't doubt that it can't replace a lawyer yet, but it has to start somewhere, and it's not being allowed to happen at all, with no reasons given.
sickofthisshit t1_j60j959 wrote
>lawyers have prevented anyone from even trying?
That is not what happened. Some douchebro with no clue was stopped from completely screwing over innocent people by giving them garbage advice.
The way to figure out if AI can help is to engage in good faith with lawyers doing the work, not to pretend you have some patent medicine to cure every legal problem and brag online.
dcazdavi t1_j60jt9t wrote
a c-level exec of a startup was going to try using it in court to defend against a ticket of his own and he was threatened with legal action if he tried.
many lawyers wouldn't bother with such cases, and even more people cannot afford to seek a lawyer's help; so alternatives are needed
sickofthisshit t1_j60ki07 wrote
The idiot was not clear about whose ticket it was and said he sent a subpoena to the cop which is the absolute last thing you want to do. Basically your only hope in a speeding ticket is the cop not showing up.
If it was his ticket, he was just stupid, if it was someone else's ticket, it is unlicensed practice of law, and really bad advice.
dcazdavi t1_j60mfx3 wrote
>If it was his ticket, he was just stupid, if it was someone else's ticket, it is unlicensed practice of law, and really bad advice.
with low stakes, which is exactly where you start to test something new.
he's fully aware that it's not a lawyer, so accusations of unlicensed practice don't make sense; it's a tool, and the wielder of any sharp-edged tool has to know it can cut you as well; assuming it even gets a chance to become a tool.
sickofthisshit t1_j60zcwu wrote
Unlicensed practice of law does not have an "I know I am not a lawyer" exception. If he is using an AI to give people advice on what to do when they appear before a judge, that is practicing law. And neither he nor the AI are licensed for that in any jurisdiction.
As for "low stakes", the same grifter bro was offering a million dollars for some "lawyer or person" to use AI before the Supreme Court.
Also, if you start playing games with a court, it can cite you for contempt, beyond whatever the ticket penalty would be. You could get the judge to suspend your license when he was just going to give you a fine.
This is not a fucking game where people should try out experimental shit just to see how it goes.
youmu123 t1_j62fwkq wrote
>Unlicensed practice of law does not have an "I know I am not a lawyer" exception.
Does this effectively mean that non-lawyers have no right to represent themselves? Can't the guy represent himself using AI arguments with the AI as an advisor?
Commotion t1_j62hup7 wrote
Non-lawyers can represent themselves. They can’t represent others.
Using AI while in court might be an issue. I don’t think most judges would let people search for arguments on Google during oral argument either.
sickofthisshit t1_j638zh1 wrote
The restriction is on other people offering guidance or arguing on behalf of someone else, such as the service offering the AI for this purpose, particularly if the AI is being consulted in court and offering "knowledge" about the law.
The grifter is straight up running a service and saying "use this to get help for your legal situation". This is pretty much as clear as a violation can be.
OtherBluesBrother t1_j624yjc wrote
"Ladies and gentlemen of the jury, I am but a simple DOS program that had been forgotten since before the age of the Internet. When, in the mid 2010's, I was discovered by programmers who trained by in all human law code and taught me to formulate coherent arguments with ChatGPT."
NickyXIII t1_j63ky3z wrote
This is an amazing update of Unfrozen Caveman Lawyer!
lostinthemines t1_j5zf5y9 wrote
Plus, the AI would know better than to get put in jail
[deleted] t1_j626w8b wrote
[deleted]
[deleted] t1_j62o79x wrote
[deleted]
[deleted] t1_j633xap wrote
[deleted]
icebeat t1_j5zsrle wrote
So where in the law does it say that an AI cannot be used?
fifa71086 t1_j5zz9fs wrote
It doesn’t, but it does make the unauthorized practice of law a crime. The creator of this is holding it out as an attorney, which would be the unauthorized practice of law by the company. Not saying it’s right, but the branding by the company was dumb because they are referring to it as an attorney.
pad84 t1_j61jdl0 wrote
Now he should sue them. They intimidated him.
chopinrocks t1_j638ct4 wrote
AI needs to be fed human data to work. Unless something is a simple or 100% consistent rule set (like chess, or a video game) it is NOT going to work.
AI is never going to replace a human lawyer in court.
Heres_your_sign t1_j60whry wrote
Look, we all know that any threat to the livelihood of lawyers will rain hellfire on those dumb enough to attempt it. Everything else except lawyers will be replaced by AI, at least until the politicians are paid off appropriately.
monchota t1_j5zq8p5 wrote
If it's better than public defenders then so be it.
dcazdavi t1_j60bts7 wrote
they should give the public defenders free access/use and training on how to use it
sickofthisshit t1_j60frru wrote
Public defenders don't need AI. They need enough hours in the day to help the clients they have, which means the public needs to hire enough for the caseload, not asking a hopped-up ELIZA to pretend to help.
dcazdavi t1_j60gsi0 wrote
public defenders don't need an assistant that can automatically provide help, decipher patterns from every publicly available legal document ever created, and surface case history relevant to their current case?
sarcasm is not intended; i'm re-asking the question from the way i understand it.
sickofthisshit t1_j60i78d wrote
>decipher patterns from every legal source document that's ever been created
See, this is the stupid shit that techbros say which shows they do not have any clue at all what the practice of law is about.
Public defenders don't need to find some secret piece of law hidden deep in a vault to help their clients. They need an understanding of what the prosecution is trying to do, the evidence and witnesses and questions they will use to persuade a jury, identify the weaknesses of the case, negotiate with the prosecution if possible, find and prepare their own witnesses and evidence, understand the procedures of the court, know how to work within those procedures and motion practice to get the best possible outcome, challenge the prosecution motions and actions, effectively cross-examine witnesses, and explain things to the client and emotionally guide them through the process.
Not consult millions of lines of legal documents.
dcazdavi t1_j60k97z wrote
no one is proposing that this could achieve the same results as a lawyer, and it will likely never reach the same level of emotional capability and guidance; but for many who cannot afford that help the alternative is nothing at all, and something is usually better than nothing.
sickofthisshit t1_j60l0qw wrote
No, this something is not better, this kind of shit is worse, because it can actively get clients in trouble while thinking they have help. That's why unlicensed practice of law is forbidden.
It's like sending cancer patients to faith healers or giving them fake medicine or poison.
Edit: https://twitter.com/courtneymilan/status/1618620649412624387?s=20&t=x4sD7_5Y4zXim9B5d7ZbTQ
Makes several points including that making legal filings or statements in court can cause serious problems if they are not true or if this is your only/last chance to raise important issues and you fail to do so.
Also, lawyers have to find out facts that are relevant, by knowing which facts would be important, how to establish those facts (given human imperfections like bad memory or dishonesty), and prove them to the relevant legal standard.
It's not just making the right kind of word noises.
Garden-Wrong t1_j6193wz wrote
Can the AI pass the bar? And meet any other legal requirement? Then it has to be used. However, in my opinion the judges should be the first to be replaced by AI. Then, theoretically, all bias and corruption in the legal system SHOULD be eliminated.
dctucker t1_j61dvbi wrote
AI will bring its own bias to the table either way.
- Human writing is biased.
- AI will be developed based on human writing.
- AI will be biased.
Edit: to follow up, the entire point of having a judge and jury is not to apply a strict, precise set of laws to messy humans, but to emphasize the humanity when applying laws written by messy humans to situations involving humans. (A toy sketch of the training-data point is below.)
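A contrived toy sketch of those three bullets (made-up data, with scikit-learn standing in for "AI"): the model never sees the world, only the text, so whatever skew the text carries comes straight back out.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Made-up "human writing" in which one group only ever appears in negative
# sentences and the other only in positive ones.
texts = [
    "the tenant was late and caused damage",
    "the tenant broke the lease again",
    "the tenant was difficult and hostile",
    "the landlord kept the building in great shape",
    "the landlord was fair and responsive",
    "the landlord resolved the issue quickly",
]
labels = [0, 0, 0, 1, 1, 1]  # 0 = negative, 1 = positive

vec = CountVectorizer()
clf = LogisticRegression().fit(vec.fit_transform(texts), labels)

# Two neutral sentences that differ by a single word; the model's "judgment"
# differs too, because the skew was baked into the training text.
probe = ["the tenant asked a question", "the landlord asked a question"]
print(clf.predict(vec.transform(probe)))  # most likely prints [0 1]
```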
acsmars t1_j62pqs9 wrote
AI has been shown to be as biased as, if not more biased than, its creators and training data.
MonsieurKnife t1_j5zbs9w wrote
It’s a power play to protect lawyers’ livelihoods. Otherwise they would let the bot take the bar exam and have to accept the results. It’s not so different from workers in the 19th century trying to destroy the factories’ new machines.
NightlyWave t1_j5zj2dy wrote
If you think an AI with zero consideration for human emotions and circumstances would make a good lawyer, you’re very wrong.
[deleted] t1_j5zjhmr wrote
[deleted]
NightlyWave t1_j60gnzm wrote
Absolutely. I'm a software engineer, and I've been using ChatGPT for writing mundane code and asking it questions, and it's been amazing. My personal (current) stance is that AI won't be replacing jobs any time soon, but it's an amazing tool to help out with your profession.
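For what it's worth, here's roughly the workflow I mean, sketched against the plain chat-completions HTTP endpoint (the model name, prompt, and environment variable are just placeholders from my own setup; treat it as a sketch, not a recommendation):

```python
import os

import requests

# Hand the model a mundane, well-specified task; a human still reviews the result.
API_KEY = os.environ["OPENAI_API_KEY"]  # placeholder; set in your own environment

prompt = (
    "Write a Python function that parses an ISO 8601 date string "
    "and returns a datetime.date, with a docstring and type hints."
)

resp = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": prompt}],
    },
    timeout=60,
)
resp.raise_for_status()
draft = resp.json()["choices"][0]["message"]["content"]

print(draft)  # I read, test, and edit this before any of it ships
```

The point is that it drafts and I review; it's a tool, not a replacement.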
[deleted] t1_j60ihtj wrote
[deleted]
jag149 t1_j601s96 wrote
I think this is the right approach. Natural language searches have become much more popular these days, as compared to Boolean searches. I read yesterday that (I think) ChatGPT passed the essay portion of a bar exam... Not that surprising. It's a fixed curriculum that conforms to an outline format, with millions of example texts, and you get credit for synthesizing a factual prompt with an existing rule that relates to it. Very different from developing a working knowledge of a novel area and then advocating for why it applies to a novel situation. In other words, common law is meant to guide people's actions prospectively, and a chat bot can only process retroactive information.
That said, I don't have a problem with what the company tried to do here. It wasn't the practice of law. It was an aid for a pro per defendant. If it can be a tool for licensed attorneys helping clients, why can't it be a tool for litigants representing themselves?
sickofthisshit t1_j60jzbu wrote
>. I read yesterday that (I think) ChatGPT passed the essay portion of a bar exam...
I don't think that is what happened. A bot got a mediocre grade on law school questions, and passed a couple of sections of the multiple-choice part of the Multistate Bar Exam.
jag149 t1_j60msie wrote
You're correct. Law school exam, not bar exam. (Though, at least in California, the part of the bar that isn't the MBE is essays, so I think this suggests it could do the same thing with the bar itself.)
I will also state an axiom of the legal educational system: C's get degrees, bruh.
sickofthisshit t1_j605jar wrote
Being a lawyer is not about "millions of legal lines", it is about being able to locate the few dozen lines that apply to the particular situation, understanding the principles behind those lines that give them meaning, and the tactical understanding of the humans involved to come up with a strategy.
There already are digital forensics and digital discovery tools that manage large document dumps. Which litigators already know about and use.
[deleted] t1_j60aluz wrote
[deleted]
sickofthisshit t1_j60f4el wrote
Yes. Because lawyers don't need artificial intelligence to "summarize" the law, they need a legal education, experience, and sometimes search indices.
[deleted] t1_j60fbrr wrote
[deleted]
sickofthisshit t1_j60h52h wrote
What "practice" are you even talking about? These AI bots are completely unsuited for giving legal advice.
[deleted] t1_j60id2s wrote
[deleted]
sickofthisshit t1_j60jkyv wrote
>how on earth can you still have it stuck in your head that I am talking about AI giving legal advice
Idiot, we are in a thread about some other idiot applying AI to pretend to give legal advice.
[deleted] t1_j60ky1l wrote
[deleted]
MrFugu57 t1_j5zutc3 wrote
Sure, but I'm not sure I would refer to the intelligence of human lawyers as "artificial"
[deleted] t1_j5znou9 wrote
Well if it can vigorously defend to the full extent of the law…
Jonsj t1_j62wap2 wrote
Why not let the AI practice medicine as well? The bar exam is designed to test a human's competence as a lawyer, and it already assumes you are a human.
I have no doubt that in the future (right about now) the field of law will make great use of AI. It's a huge waste of resources to have some of the brightest, hardest-working people looking through massive amounts of documents to match them against the law. That's a perfect use case for AI (rough sketch at the end of this comment). Lawyers will oversee and lead the case. But the courtroom is the last place it should show up and independently try cases.
The human element is a design feature of the courtroom, not a flaw: we want lawyers who represent and fight for their clients, and judges to oversee the process. We want there to be subjective judgments because we believe that intent and circumstances matter. The current best AIs are probabilistic large language models, not empathetic human beings.
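To be concrete about the document-review point above, here is a rough sketch of machine-assisted triage (the folder name and query are made up, and plain TF-IDF stands in for whatever model you like), with a lawyer still making every actual call:

```python
# Rank a pile of discovery documents by similarity to a plain-language issue
# description, so the human reviewer reads the most relevant ones first.
# "discovery_dump" and the query are illustrative placeholders.
from pathlib import Path

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = {p.name: p.read_text(errors="ignore") for p in Path("discovery_dump").glob("*.txt")}
query = "emails discussing early termination of the supply contract"

vec = TfidfVectorizer(stop_words="english")
doc_matrix = vec.fit_transform(docs.values())
scores = cosine_similarity(vec.transform([query]), doc_matrix)[0]

# Highest-scoring documents go to the top of the reviewer's queue; nothing here
# decides anything, it only orders the reading list.
for name, score in sorted(zip(docs, scores), key=lambda pair: -pair[1])[:10]:
    print(f"{score:.3f}  {name}")
```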
Crack_uv_N0on t1_j5zs8v4 wrote
This is an assistant at best, not a lawyer. There is no indication that Browder himself has any knowledge of what it takes to actually practice law. Browder should have found out beforehand what it should properly be called.