Digital Assistants and Professional Responsibility

“Okay Google, can you handle my bills for me?” If only it were that easy.

Voice-controlled digital assistants are transforming the way businesses operate, making daily tasks as easy as a simple voice command.

Digital voice assistants like Amazon's Alexa, Apple's Siri, Microsoft's Cortana, and Google Assistant are now being deployed in law firms as well.

And with the rise of Artificial Intelligence (AI) and natural language processing, digital assistants are likely to become even more prevalent.

Scheduling appointments, getting answers to questions, conducting research, billing, and other mundane tasks may now be handled by voice using a desktop computer, smartphone, or one of several Internet of Things (IoT) devices.

Several articles have addressed the use of digital assistants for opening apps on your computer, as well as for dictation. This article addresses new developments in the use of digital assistants to conduct billing activities and the unique challenges that use presents in a law firm.

A Challenge for Voice-Controlled Digital Assistants

Law firms present a significant hurdle for voice-controlled digital assistants: the requirement of strict confidentiality.

ABA Model Rule 1.6(a) protects the confidentiality of all “information relating to the representation of a client,” subject to certain exceptions; absent an exception, disclosure of such information is authorized only if the client provides informed consent. According to the American Bar Association, as of September 29, 2017, every state has adopted some form of confidentiality requirement for attorneys.

In addition, ABA Model Rule 1.6(c) requires that an attorney must “make reasonable efforts to prevent the inadvertent or unauthorized disclosure of, or unauthorized access to, information relating to the representation of a client.”

In the context of internet communications, the American Bar Association’s Standing Committee on Ethics and Professional Responsibility recently concluded that “a lawyer may be required to take special security precautions to protect against the inadvertent or unauthorized disclosure of client information when required by an agreement with the client or by law, or when the nature of the information requires a higher degree of security.”

Among other things, ABA Formal Opinion 477R recommends that “[a] lawyer should understand how their firm’s electronic communications are created, where client data resides, and what avenues exist to access that information.” According to at least one commenter, “[t]his unfortunately means reading those EULA agreements [End User License Agreement] we all click past without a second thought.”

A lawyer’s confidentiality obligations can be separated into two categories: 1) preventing third-party disclosure; and 2) keeping client information secure from unauthorized access.

The issue of avoiding third-party disclosure of confidential client information is a significant concern. When a voice-controlled digital assistant is used, everything that is said is sent over the internet to the digital assistant host company for processing, where the user's speech is often analyzed and stored so that the host company can improve its digital assistant.

The issue of third-party disclosure is well established for voice-to-text services whose transcription features process speech into text on remote servers. The same privacy concerns apply to digital assistants.

Moreover, most digital voice assistants and IoT products, such as the Amazon Echo and Google Home, require a trigger or “wake” word that activates them to respond to a question or command.

That means the microphone must remain on to listen for the trigger word, potentially listening in on everything a lawyer is saying.
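To make the mechanism concrete, the following simplified Python sketch shows why a wake-word device keeps its microphone open at all times: every bit of audio is inspected locally for the trigger, and whatever follows the trigger is forwarded to the provider's servers. This is a conceptual illustration only, not any vendor's actual implementation; the wake word, frame count, and upload function are invented for the example.

```python
# Conceptual sketch only -- not any vendor's actual implementation.
# Illustrates why a wake-word device keeps its microphone (here, simulated
# audio "frames") open at all times: every frame is inspected locally, and
# the frames following the wake word are sent to a remote service.

WAKE_WORD = "alexa"      # hypothetical trigger word
CAPTURE_FRAMES = 5       # hypothetical number of frames forwarded after the trigger


def send_to_cloud(frames):
    """Stand-in for streaming captured audio to the provider's servers."""
    print(f"Uploading {len(frames)} frames for remote processing: {frames}")


def listen(audio_stream):
    """Continuously inspect the stream; forward audio heard after the wake word."""
    capture = []
    capturing = False
    for frame in audio_stream:          # the microphone is effectively always on
        if capturing:
            capture.append(frame)
            if len(capture) >= CAPTURE_FRAMES:
                send_to_cloud(capture)  # everything said after the trigger leaves the device
                capture, capturing = [], False
        elif WAKE_WORD in frame.lower():
            capturing = True            # trigger word detected locally


if __name__ == "__main__":
    # Simulated office audio, including privileged conversation
    # spoken before and after the trigger word.
    listen([
        "discussing the Smith matter with the client",
        "alexa, what time is it",
        "the client admits the contract was never signed",
        "we should consider settlement",
        "schedule a call for Thursday",
        "also bill two hours to Smith",
        "thanks",
    ])
```

As the sketch suggests, the device itself decides only when to start capturing; once the trigger fires, whatever is said next, privileged or not, is transmitted off-site for processing.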

If a voice assistant or its host company is listening in while a lawyer discusses a case with a client, perhaps because a voice assistant device is sitting on the desk in the lawyer's office, the lawyer could potentially be violating his or her home state's analogue to ABA Model Rule 1.6, or even compromising the attorney-client privilege through third-party disclosure.

Security is also a concern because some digital assistants cannot distinguish one user from another. In her article “Amazon Echo Is Both Useful and Risky For Lawyers,” author Anna Massoglia says, “This means anyone within talking distance has access to every single account you’ve linked” to the digital assistant.

However, both Amazon’s Echo and Google Home recently added support for multiple users, allowing their devices to respond differently based on the user’s voice. This feature could presumably be deployed to limit unauthorized access to client information.

Security nevertheless remains a concern for a different reason: many voice-controlled digital assistant services store the attorney's voice data, whether to improve the quality of the service or otherwise.

Recent years have seen numerous news reports of data breaches at high-profile, reputable companies and even government agencies, including Yahoo, Equifax, Target, Verizon, Uber, and the U.S. Securities and Exchange Commission.

If a data breach occurs while an attorney’s confidential speech data is stored with the victim company, the confidential data could be compromised.

Therefore, as helpful as a digital assistant can be in getting things done, its use poses confidentiality concerns for lawyers, implicating a lawyer's related obligations under Rule 1.6.

Can I Still Do My Bills with a Voice-Controlled Digital Assistant?

Generally, attorneys can still use voice recognition services for their billing. For example, Microsoft's Speech Recognition app, Apple's Enhanced Dictation feature, and Nuance's Dragon Legal software all appear to conduct their voice-to-text processing without sending any speech data offsite for conversion into text.

Accordingly, these solutions should offer sufficient privacy and security to meet an attorney’s obligations under Rule 1.6.

Microsoft has long offered a speech recognition service built into its Windows operating systems. Similarly, Apple offers an enhanced dictation feature that will convert speech to text on your computer without processing it on Apple’s servers. These features should allow you to enter time into your billing system by voice from your desktop computer.

In addition, if you can access your billing system via a smartphone browser, Apple's iPhones offer an offline dictation feature, meaning the conversion of voice to text is done on the phone without sending information to Apple's servers. Likewise, Google's Android smartphones have an offline voice recognition feature that can be used for the same purpose.

Nuance's Dragon Legal software is another attractive alternative. The software provides speech recognition capabilities tailored to legal terminology. Because the software is installed locally, the attorney retains greater control over the privacy and security of the speech data provided to the app, and thus over maintaining its confidentiality.

Nuance's Dragon Anywhere Group also provides a web-based platform for mobile users, including via iOS and Android smartphone apps. However, use of the Dragon Anywhere Group service requires sending the attorney's speech data to Nuance's servers for processing. Nuance's Privacy Policy states that “We may use the information that we collect for our internal purposes to develop, tune, enhance and improve our products and services and for advertising and marketing consistent with this Privacy Policy.”

Similarly, Workspace Assistant by Thomson Reuters “allows the input of time entry and the querying of time statistics via Alexa and connects with the broader Thomson Reuters Workspace and Elite 3E system.” The Workspace Assistant app runs on Amazon Alexa-enabled devices and records time entries. 

When the attorney using the Workspace Assistant app finishes keeping time, Alexa asks whether the attorney is finished and which matter should be billed for the task. Because Workspace Assistant is compatible with any Alexa-enabled device, it is practical for mobile use as well.
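As a purely hypothetical illustration (not Thomson Reuters' or Amazon's actual code), a voice time-entry workflow ultimately reduces to turning a transcribed utterance into a structured billing record. The matter name, phrasing, and parsing logic below are invented for the example; the key point is that in a cloud-based workflow, the raw transcript, which may name a client and matter, has already passed through the provider's servers before any record like this is created.

```python
# Hypothetical sketch of the kind of structured record a voice time-entry
# assistant produces -- not Thomson Reuters' or Amazon's actual code.
import re
from dataclasses import dataclass


@dataclass
class TimeEntry:
    matter: str
    hours: float
    narrative: str


def parse_time_entry(transcript: str) -> TimeEntry:
    """Parse a transcribed utterance such as
    'bill 0.3 hours to the Smith matter for drafting discovery responses'."""
    match = re.search(
        r"bill ([\d.]+) hours? to the (\w+) matter for (.+)", transcript, re.I
    )
    if not match:
        raise ValueError("could not understand the time entry")
    hours, matter, narrative = match.groups()
    return TimeEntry(matter=matter, hours=float(hours), narrative=narrative)


if __name__ == "__main__":
    entry = parse_time_entry(
        "bill 0.3 hours to the Smith matter for drafting discovery responses"
    )
    print(entry)  # TimeEntry(matter='Smith', hours=0.3, narrative='drafting discovery responses')
```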

However, because Workspace Assistant uses Amazon's Alexa service for speech recognition, the app is subject to Amazon's terms of use and privacy policy with respect to its speech recognition functions.

The Alexa Terms of Use state: “Alexa streams audio to the cloud when you interact with Alexa. Amazon processes and retains your Alexa Interactions, such as your voice inputs, music playlists, and your Alexa to-do and shopping lists, in the cloud to provide and improve our services.” 

In addition, the Alexa Terms of Use provide, “Amazon processes and retains your Alexa Interactions and related information in the cloud in order to respond to your requests (e.g., ‘Send a message to Mom’), to provide additional functionality (e.g., speech to text transcription and vice versa), and to improve our services. We also store your messages in the cloud so that they’re available on your Alexa App and select Alexa Enabled Products.”

As discussed above, because Amazon has the right to process and use the speech data an attorney sends to it, there are potential third-party disclosure concerns. Moreover, because Amazon has the right to retain the speech data, there are also security risks, including the possibility of a data breach.

Attorneys who might use Alexa or Alexa-enabled apps should, therefore, keep in mind the privacy and security risks relating to their confidentiality obligations under Rule 1.6.

Conclusion

Technology is changing, and if law firms are going to improve their efficiency along with it, implementing automated solutions will help keep attorneys up to date and on top of their work. Machine learning, AI, and IoT are here to help, freeing up more time for billable hours.

However, attorney ethical requirements create interesting challenges for digital billing assistants, and for the attorneys who might want to use them. Of particular importance is an attorney’s obligation to understand where and how his or her client’s confidential data is being stored and used.