Digital Assistants and Professional Responsibility
“Okay Google, can you handle my bills for me?” If only it were that easy.
Voice-controlled digital assistants are transforming the way businesses operate, making daily tasks as simple as a voice command.
Digital voice assistants like Amazon’s Alexa, Apple’s Siri, Microsoft’s Cortana, and Google Assistant are now being deployed in law firms as well.
Scheduling appointments, getting answers to questions, conducting research, billing, and other mundane tasks may now be handled by voice using a desktop computer, smartphone, or one of several Internet of Things (IoT) devices.
Several articles have addressed the use of digital assistants for opening apps on your computer, as well as for dictation. This article addresses new developments in the use of digital assistants for billing, and the unique challenges that use presents.
A Challenge for Voice-Controlled Digital Assistants
Law firms present a significant hurdle for voice-controlled digital assistants: the requirement of strict confidentiality.
ABA Model Rule 1.6(a) protects the confidentiality of all “information relating to the representation of a client,” subject to certain exceptions; disclosure of such information is generally authorized only if the client provides informed consent. According to the American Bar Association, as of September 29, 2017, every state has adopted some form of confidentiality requirement for attorneys.
In addition, ABA Model Rule 1.6(c) requires that an attorney must “make reasonable efforts to prevent the inadvertent or unauthorized disclosure of, or unauthorized access to, information relating to the representation of a client.”
In the context of internet communications, the American Bar Association’s Standing Committee on Ethics and Professional Responsibility recently concluded that “a lawyer may be required to take special security precautions to protect against the inadvertent or unauthorized disclosure of client information when required by an agreement with the client or by law, or when the nature of the information requires a higher degree of security.”
Among other things, ABA Formal Opinion 477R recommends that “[a] lawyer should understand how their firm’s electronic communications are created, where client data resides, and what avenues exist to access that information.” According to at least one commenter, “[t]his unfortunately means reading those EULA agreements [End User License Agreement] we all click past without a second thought.”
A lawyer’s confidentiality obligations can be separated into two categories: 1) preventing third-party disclosure; and 2) keeping client information secure from unauthorized access.
The issue of avoiding third-party disclosure of confidential client information is a significant concern. When a voice-controlled digital assistant is used, everything that is said is sent over the internet to the digital assistant host company for processing, where the user’s speech is often analyzed and stored so that the host company can improve its digital assistant.
The issue of third-party disclosure is well established for voice-to-text services whose transcription feature processes speech into text on remote servers. The same privacy concerns apply to digital assistants.
Moreover, most digital voice assistants, as well as IoT products such as the Amazon Echo or Google Home, require a trigger or “wake” word that activates them to respond to a question or command. That means the microphone must remain on to listen for the trigger word, potentially listening in on everything a lawyer is saying.
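The always-on listening behavior can be illustrated with a minimal, text-based simulation (the wake word, phrase-by-phrase input, and function names here are hypothetical; real devices process raw audio on-device and only transmit what follows the wake word):

```python
# Hypothetical simulation of wake-word gating. The "microphone" stream is
# modeled as a list of spoken phrases; nothing is "sent to the cloud"
# until the wake word is heard, after which the next utterance is forwarded.

WAKE_WORD = "alexa"  # assumed trigger word for this sketch

def process_stream(phrases):
    """Return the utterances that would be sent to the cloud service."""
    sent = []
    awake = False
    for phrase in phrases:
        if awake:
            sent.append(phrase)   # forwarded for remote processing
            awake = False         # return to passive listening
        elif WAKE_WORD in phrase.lower():
            awake = True          # wake word heard; capture the next utterance
    return sent

stream = [
    "let's discuss the settlement terms",    # heard locally, not transmitted
    "Alexa",                                 # wake word detected
    "what's the weather today",              # transmitted to the cloud
    "our client's exposure is significant",  # heard locally, not transmitted
]
print(process_stream(stream))  # prints ["what's the weather today"]
```

The point of the sketch is that every phrase passes through the device’s microphone even though only the post-wake utterance leaves it; the confidentiality question is what the device hears, not just what it transmits.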
If a voice assistant or its host company is listening while a lawyer discusses a case with a client, perhaps because a voice assistant device sits on the desk in the lawyer’s office, the lawyer could be violating the home state’s analogue to ABA Rule 1.6, or even compromising the attorney-client privilege through third-party disclosure.
Security is also a concern because some digital assistants cannot distinguish one user from another. In her article “Amazon Echo Is Both Useful and Risky For Lawyers,” author Anna Massoglia says, “This means anyone within talking distance has access to every single account you’ve linked” to the digital assistant.
However, both Amazon’s Echo and Google Home recently added support for multiple users, allowing their devices to respond differently based on the user’s voice. This feature could presumably be deployed to limit unauthorized access to client information.
Security nevertheless remains a concern for a different reason: many voice-controlled digital assistant services store the attorney’s voice data, whether to improve the quality of the service or otherwise.
Recent years have seen several news reports of data breaches at high-profile and reputable companies and even government agencies, including Yahoo, Equifax, Target, Verizon, Uber, and the U.S. Securities and Exchange Commission.
If a data breach occurs while an attorney’s confidential speech data is stored with the victim company, the confidential data could be compromised.
Therefore, as helpful as a digital assistant can be in getting things done, its use raises confidentiality issues for lawyers, along with a lawyer’s related obligations under Rule 1.6.
Can I Still Do My Billing with a Voice-Controlled Digital Assistant?
Generally, attorneys can still use voice recognition services for their billing. For example, Microsoft’s Speech Recognition app, Apple’s Enhanced Dictation feature, and Nuance’s Dragon Legal software all appear to conduct their voice-to-text processing without sending any speech data to offsite processing services for conversion into text. Accordingly, these solutions should offer sufficient privacy and security to meet an attorney’s obligations under Rule 1.6.
Microsoft has long offered a speech recognition service built into its Windows operating systems. Similarly, Apple offers an enhanced dictation feature that will convert speech to text on your computer without processing it on Apple’s servers. These features should allow you to enter time into your billing system by voice from your desktop computer.
In addition, if you can access your billing system via a smartphone browser, Apple phones allow an offline dictation feature, meaning the conversion of voice to text is done on the phone and without sending information to Apple’s servers. Likewise, Google’s Android smartphones have an offline voice recognition feature that can be used for the same purpose.
Nuance’s Dragon Legal software is another inviting alternative. This software provides speech recognition capabilities that are tailored to legal terminology. Because the software is installed locally, the attorney retains greater control over the privacy and security of the speech data the app handles.
More recently released digital billing assistant apps might also be appealing. For example, Three Matts’ legal voice assistant, “Tali,” uses Amazon’s Echo or other Alexa-powered products to track and record time entries.
The app allows an attorney to ask Alexa to “tell Tali,” and then state the task the attorney is working on. Alexa relays the command to Tali.
The digital billing assistant will then keep track of time until the attorney tells Tali they are finished or to start a new task. Tali can also email the attorney a description of all the things the attorney tracked using the app, and the amount of time spent.
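The workflow described above (start a task, stop it or switch tasks, then get a summary) can be sketched as a simple timer. This is purely illustrative: the class and method names are hypothetical, and Tali’s actual implementation is not public.

```python
from time import monotonic

class BillingTimer:
    """Minimal sketch of a voice-billing timer: start a task, stop it,
    and summarize tracked time. Illustrative only; not Tali's API."""

    def __init__(self):
        self.entries = []     # list of (task, seconds) tuples
        self._task = None
        self._started = None

    def start(self, task):
        # Starting a new task implicitly ends the current one,
        # mirroring "tell Tali to start a new task."
        if self._task is not None:
            self.stop()
        self._task = task
        self._started = monotonic()

    def stop(self):
        if self._task is None:
            return
        elapsed = monotonic() - self._started
        self.entries.append((self._task, elapsed))
        self._task = None

    def summary(self):
        """The kind of report an assistant might email back to the attorney."""
        return "\n".join(f"{task}: {secs:.1f}s" for task, secs in self.entries)

timer = BillingTimer()
timer.start("draft motion, Smith matter")
timer.start("client call, Jones matter")  # implicitly stops the first task
timer.stop()
print(timer.summary())
```

Note that even in this local sketch the task descriptions name client matters; when the same commands are spoken to a cloud-hosted assistant, those descriptions travel to the host company’s servers.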
Similarly, Workspace Assistant by Thomson Reuters “allows the input of time entry and the querying of time statistics via Alexa and connects with the broader Thomson Reuters Workspace and Elite 3E system.” Like Tali, the Workspace Assistant app runs on Amazon Alexa-enabled devices and records time entries. When the attorney using the Workspace Assistant app finishes keeping time, Alexa asks whether the attorney is finished and which matter should be billed for the task. Because Workspace Assistant is compatible with any Alexa-enabled device, it is practical on mobile as well.
As discussed above, because Amazon has the right to process and use the speech data an attorney sends to it, there are potential third-party disclosure concerns. Moreover, because Amazon has the right to retain the speech data, there are also security risks including possible data breach. Attorneys who might use Alexa or Alexa-enabled apps should, therefore, keep in mind the privacy and security risks relating to their confidentiality obligations under Rule 1.6.
Technology is changing, and law firms that want to improve their efficiency will find that automated solutions keep attorneys current and on top of their work.
Machine learning, AI, and IoT are here to help, freeing up more time for billable hours.
However, attorney ethical requirements create interesting challenges for digital billing assistants, and for the attorneys who might want to use them. Of particular importance is an attorney’s obligation to understand where and how his or her client’s confidential data is being stored and used.
This article was first published in Law Technology Today on 2/28/18.