Ethical considerations in the use of AI

ChatGPT logo and AI Artificial Intelligence words are seen in this illustration taken May 4, 2023. REUTERS/Dado Ruvic/Illustration/File Photo
October 2, 2023 – The burgeoning use of artificial intelligence ("AI") platforms and tools such as ChatGPT creates both opportunities and risks for the practice of law. In particular, the use of AI in research, document drafting and other work product presents a number of ethical issues for lawyers to consider as they contemplate how the use of AI may benefit their practices. In California, as in other states, several ethics rules are particularly relevant to a discussion of the use of AI.
Although some ethical questions may lack clear answers, being mindful of these issues before integrating AI may help lawyers avoid problems later. This article analyzes these questions through the lens of the California Rules of Professional Conduct.
Lawyers have a duty to ensure that their use of AI is compatible with their ethical obligations under the State Bar of California's Rules of Professional Conduct (the "Rules") and applicable law. The Rules are "intended to regulate professional conduct of lawyers" and are "binding upon all lawyers" licensed in California. CRPC 1.0(a).
Competence
California Rule of Professional Conduct 1.1 imposes on lawyers a duty of competence, which, among other things, requires a lawyer to apply the "learning and skill…reasonably necessary" for the representation of a client. CRPC 1.1(b). The comments to Rule 1.1 further explain that the duty of competence "include[s] the duty to keep abreast of the changes in the law and its practice, including the benefits and risks associated with relevant technology." CRPC 1.1, Comment [1]. The use of AI in the practice of law presents at least two competence issues to consider.
First, lawyers have an ethical duty to understand the risks and benefits that AI tools present for both lawyers and clients, and how those tools may be used (or should not be used) to provide competent representation.
Second, lawyers should consider how they can incorporate AI tools into their practices without compromising the competent representation of their clients. Although AI can be a powerful tool, its use may have catastrophic results for both lawyers and clients if lawyers fail to vet AI outputs before using them in their work. For example, two attorneys were sanctioned by a New York federal judge for submitting a brief, drafted with an AI tool, that cited nonexistent case law.
Finally, as AI tools become more sophisticated and their use in the legal profession becomes more widespread, lawyers will need to consider whether the failure to use an available AI tool would itself be a failure to meet the duty of competence.
Communication with clients
Rule 1.2 imposes on lawyers a duty to communicate with their clients about the scope of the lawyer's representation, and Rule 1.4 requires a lawyer to consult with the client about how the lawyer intends to accomplish the client's objectives.
Accordingly, these Rules may require a lawyer to explain how and why the lawyer intends to use AI tools in the course of representing the client, and to discuss with the client such tools' associated benefits and risks. If a lawyer chooses not to use AI, that decision may also need to be communicated to the client.
Fees for legal services
Rule 1.5 establishes the ethical limitations on the reasonable fees a lawyer may charge a client.
Because the factors used in determining the reasonableness of a fee include the time and labor involved, the novelty of the issue, and customary fees, novel fee questions can arise if a lawyer employs AI tools to perform some tasks in the representation of a client. Can a lawyer ethically bill a client for work that an AI tool performed? Can an AI tool have an hourly rate? And how would a lawyer account for the "time" the AI tool "expended" to perform a particular task?
Conversely, if a lawyer could use AI to perform certain tasks — such as completing the first draft of a routine document, or reviewing a contract to ensure defined terms are used consistently — but elects not to do so and instead performs the tasks himself and bills his client for the work at the lawyer's standard hourly rate, has the lawyer charged the client an unconscionable fee in violation of Rule 1.5? The answers to these questions are not clear, but a lawyer may have an ethical obligation to employ available technology to provide legal services to a client more efficiently.
Confidentiality
The duty of confidentiality codified in Business & Professions Code section 6068(e)(1) and Rule 1.6 requires a lawyer to maintain as confidential all information the lawyer learns from a client in the course of representing that client, unless the client authorizes its disclosure.
Some AI tools do not guarantee the confidentiality of user inputs. For example, OpenAI, the creator of ChatGPT, discloses in its Terms of Use and related documents that a user's "conversations may be reviewed" by OpenAI employees to "improve [OpenAI's] system," and OpenAI explicitly warns users not to "share any sensitive information in [their] conversations." (See OpenAI FAQs here: https://bit.ly/3qXj1tm.) Further, OpenAI's terms place the burden of maintaining confidentiality on users: "[Y]ou should take special care in deciding what information you send to us via [ChatGPT]." (See Section 5 of OpenAI's Terms of Use.)
To comply with Rule 1.6, lawyers should ensure that the AI tools they employ have implemented measures to protect client information. Lawyers should review an AI tool's terms of use and privacy policy before using it, and should use a particular tool only when they are confident that the client's confidential information is secure.
Supervision
Rule 5.1 imposes on managerial and supervisory lawyers an obligation to ensure that lawyers working under their supervision comply with the Rules of Professional Conduct. Rule 5.2 requires subordinate lawyers to comply with the Rules of Professional Conduct even when acting at the direction of another lawyer. Finally, Rule 5.3 imposes on law firm managers and supervisory lawyers a similar supervisory obligation with respect to non-lawyers.
Law firm management and supervising partners must ensure that subordinate lawyers and non-lawyers use AI tools in accordance with their professional obligations. Non-supervisory lawyers and non-lawyers have an ethical obligation to use AI tools consistent with the Rules of Professional Conduct and California law. Rules 5.1, 5.2 and 5.3 arguably also impose an obligation on lawyers to "supervise" the work of AI tools lawyers use in their representation of clients. This includes understanding which tasks are appropriate for AI tools and ensuring the accuracy of AI outputs.
Beyond helping lawyers identify potential ethical issues raised by AI tools, the Rules also point to practical steps for incorporating AI into the practice of law.
Lawyers should exercise care when deciding whether a particular AI tool would provide useful assistance in the representation of a client. Lawyers may, at times, need to consult with technology experts to understand an AI tool, how it works, and whether it can be usefully deployed in a particular client matter. Lawyers should also clearly communicate with their clients about the use of AI in the representation, including the risks and benefits of AI.
AI tools may be used as a starting point in generating content, but AI-generated work product should never be presented as finished content or a lawyer's final product. Lawyers have a professional obligation to thoroughly review any AI-generated work product to ensure the results are accurate.
Lawyers should also be cautious when sharing client or firm data with AI tools. If a tool lacks robust confidentiality and data security protections, lawyers should obtain the client's informed written consent before using it. Additionally, lawyers should verify whether any third parties can access the data, to avoid compromising the attorney-client privilege.
Finally, lawyers should not directly quote output from AI tools in work product sent to clients, opposing parties, or the courts. As discussed above, any AI outputs should be reviewed thoroughly before being incorporated into a preliminary draft or version of any attorney work product. This recommendation includes confirming the accuracy of any cases cited to support a particular argument.
Brad Hise, partner and general counsel of Hanson Bridgett LLP, advises the firm's attorneys on conflicts of interest, legal ethics, and other professional responsibility and risk management issues. He is based in San Francisco and can be reached at BHise@hansonbridgett.com.
Jenny Dao, member of Hanson Bridgett LLP's land use practice group, helps clients navigate the complex legal landscape of land use and zoning law. Her work primarily focuses on land use entitlement, permit approvals, and compliance with local and state land use laws. She is based in Walnut Creek, California, and can be reached at JDao@hansonbridgett.com.