Previously, my colleague Tanya Forsheit wrote a cautionary tale, “A Big Zooming Mess,” about the Zoom video conferencing service, whose rise in popularity also brought increased scrutiny of its privacy and data security practices. That scrutiny came not just from media outlets and consumers, but also from government agencies such as the New York Attorney General and the New York City Department of Education. The entire FKKS Privacy and Data Security team even had a round-table discussion (over WebEx) to unpack all the issues (recording available here). Now, both the New York Attorney General and the New York City Department of Education have announced that they reached coordinated but independent agreements with Zoom to address various privacy and security issues, paving the way for NYC DOE educators to resume using Zoom for virtual classroom instruction. This post looks at the terms of the NY AG agreement and discusses some of its key takeaways.
On April 29, 2020, Google and Apple released the first version of their COVID-19 contact tracing tools to public health organizations. The tools, first announced by the companies on April 10th, aim to help public health agencies build apps to track and contain the virus. This article discusses how the contact tracing tools work, the planned two-phase implementation for the tools, and some of the privacy questions around the tools.
How Do the Tools Work?
“Contact tracing” is not a new concept. The idea is that a society can limit the spread of a virus by identifying the people with whom someone who has tested positive for the virus has recently come into contact, and notifying those individuals to further prevent the spread of the virus. For example, if John tests positive for the virus and visits a grocery store, part of the contact tracing process would be to find and notify the individuals who came close to him in the grocery store. As you can imagine, contact tracing has historically been a laborious and inaccurate process that requires a manual review of an infected person’s interactions.
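To see why the manual process is so laborious, consider a toy sketch of the matching a human tracer performs: cross-referencing who was at the same place at the same time as an infected person. All names, places, and times below are invented for illustration.

```python
# Toy illustration of manual contact tracing: given a log of who visited
# which location and when, find everyone whose visit overlapped with an
# infected person's visit. All data here is invented.
from datetime import datetime

visits = [
    # (person, location, arrival, departure)
    ("John",  "grocery store", datetime(2020, 4, 20, 10, 0),  datetime(2020, 4, 20, 10, 45)),
    ("Alice", "grocery store", datetime(2020, 4, 20, 10, 30), datetime(2020, 4, 20, 11, 0)),
    ("Bob",   "grocery store", datetime(2020, 4, 20, 12, 0),  datetime(2020, 4, 20, 12, 30)),
    ("Carol", "pharmacy",      datetime(2020, 4, 20, 10, 15), datetime(2020, 4, 20, 10, 40)),
]

def find_contacts(infected, visits):
    """Return people who were at the same place at the same time as `infected`."""
    infected_visits = [v for v in visits if v[0] == infected]
    contacts = set()
    for _, loc, start, end in infected_visits:
        for person, other_loc, other_start, other_end in visits:
            if person == infected or other_loc != loc:
                continue
            # Two time windows overlap if each one starts before the other ends.
            if other_start < end and start < other_end:
                contacts.add(person)
    return contacts

print(find_contacts("John", visits))  # {'Alice'}: Bob came later, Carol was elsewhere
```

In practice, of course, no such log exists: a tracer must reconstruct it from interviews, which is exactly why the process is slow and error-prone.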
Google and Apple’s partnership aims to dramatically improve the contact tracing process by using Bluetooth technology in people’s cell phones to determine whom an infected person has interacted with and to notify those other people. The partnership is particularly notable because it involves the creation of shared standards between two tech giants that rarely allow for any interoperability.
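Very roughly, under the companies’ published design, each phone broadcasts short-lived random Bluetooth identifiers derived from a secret daily key, and nearby phones record the identifiers they hear. If a user tests positive, only their recent daily keys are shared; everyone else’s phone re-derives the identifiers locally and checks for matches on the device, with no location data involved. The sketch below captures that matching idea only: the real protocol derives identifiers with HKDF and AES rather than a bare hash, and the keys and interval counts here are purely illustrative.

```python
# Highly simplified sketch of the decentralized matching idea behind the
# Google/Apple tools. NOT the real protocol: identifier derivation is
# replaced with a plain SHA-256 hash purely for illustration.
import hashlib
import secrets

def daily_key():
    # each phone generates a fresh secret key per day
    return secrets.token_bytes(16)

def rolling_ids(key, intervals=144):
    # derive short-lived broadcast identifiers from the daily key
    return [hashlib.sha256(key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)]

# Phone A broadcasts identifiers derived from its secret daily key.
key_a = daily_key()
ids_a = rolling_ids(key_a)

# Phone B passively records the identifiers it hears over Bluetooth.
heard_by_b = {ids_a[42], ids_a[43]}  # B was near A for two intervals

# A tests positive and uploads only the daily key; B re-derives A's
# identifiers locally and checks for matches on the device.
matches = set(rolling_ids(key_a)) & heard_by_b
print(len(matches))  # 2 intervals of proximity detected
```

Because the daily key alone is published, phones that never encountered the infected person derive identifiers that match nothing they heard, and learn nothing.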
Over the last few months, we’ve witnessed some major developments around SDKs and privacy. In February, the SDK defendants named in the consolidated McDonald/Rushing putative COPPA class action settled with plaintiffs. In late March, Zoom experienced a PR nightmare due, in part, to its inclusion of the Facebook SDK in its platform (discussed further in our Zoom blog). In mid-April, the Ninth Circuit reinstated a lawsuit against Facebook for alleged privacy violations in connection with its use of tracking technologies on third party websites. And this past Wednesday, the US District Court for the District of New Mexico granted a motion to dismiss the privacy claims against ad networks providing SDKs in child-directed apps.
In this blog, we’ll break down the New Mexico District Court order and provide some observations on the decision. We are also using this blog as a springboard for a follow-up webinar that will discuss the state of affairs for SDKs and privacy. More to follow on the webinar soon.
- Background on the New Mexico District Court Case
Authored by Shely Berry and Amy Lawrence.
The creativity with which people around the world have responded, and continue to respond, to this pandemic in addressing the needs of others is remarkable. Virtual educational services, or “EdTech,” address one of the most visible of those needs as schools around the world transition to online learning. Many companies are highlighting the educational aspects of their current products and services or creating entirely new products and services that fall squarely within the EdTech industry. The goal: to assist those who now find themselves trying to figure out how to be safe at home, “teach children,” and focus on the ninety-nine other tasks that have to be completed at the exact same time.
It’s one thing to make your online guitar lessons free for a general audience (thank you, Fender), but quite another to provide products and services for educational purposes: you may find yourself subject to several state and federal privacy laws. At least 40 states have one or more such laws.
This blog post highlights state laws modeled on California’s 2014 law, the Student Online Personal Information Protection Act (“SOPIPA”), that regulate the EdTech industry. Twenty-four states and the District of Columbia have SOPIPA-type laws aimed at limiting the use of personal information (and similarly defined terms) collected from students through EdTech products or services.
The Small Business Administration (SBA) is having some technical issues, to say the least. Small government agencies are notorious for suffering from technological inadequacy and poor information security measures, and the SBA appears to be no exception as it forms a bottleneck between small businesses and federal aid.
As required by law, the SBA sent a “Data Breach” notification to as many as 8,000 Economic Injury Disaster Loan (EIDL) applicants. The SBA recently expanded the EIDL program’s coverage to assist small businesses affected by the fallout of COVID-19. Though the loans were intended to provide quick relief, and funds were supposed to be delivered just a few days after application, many applicants waited weeks and continue to wait. The SBA seemingly did not have the technical processes in place to handle the deluge of applications it received. Unsurprisingly, delays, system crashes, and even a data breach occurred. Specifically, a flaw in the SBA’s loan application portal allowed applicants to see another user’s information if the back button was clicked. The SBA disabled that part of the site and fixed the bug, but not before inadvertent disclosures occurred.
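The SBA has not published a technical post-mortem, so the root cause is unknown; one common class of “back button” leaks, however, involves personalized pages being stored by the browser or an intermediary cache and then served to the wrong user. A minimal, hypothetical sketch of the standard safeguard, marking sensitive responses as uncacheable, looks like this (the paths and header values are illustrative, not the SBA’s):

```python
# Hypothetical sketch: pages containing applicant data should carry headers
# that forbid caching, so a later visitor (or the back button) cannot pull a
# stored copy of someone else's personalized response.

SENSITIVE_PATHS = {"/application", "/application/status"}  # invented paths

def response_headers(path):
    headers = {"Content-Type": "text/html"}
    if path in SENSITIVE_PATHS:
        # Forbid storage in browser and intermediary caches entirely.
        headers["Cache-Control"] = "no-store"
        headers["Pragma"] = "no-cache"  # for legacy HTTP/1.0 caches
    else:
        # Public pages may be cached normally.
        headers["Cache-Control"] = "public, max-age=3600"
    return headers

print(response_headers("/application")["Cache-Control"])  # no-store
```

Whether caching was the actual mechanism here is speculation; the broader point stands that personalized responses require deliberate cache controls.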
The start of 2020 did not just bring us the effective date of the California Consumer Privacy Act (CCPA). It also led to several state legislators introducing their own versions of potentially ground-breaking privacy and data security laws. Each bill has nuances that will likely result in a compliance nightmare, particularly if all or most of the states and territories enact their own laws. However, each also appears on its face to riff on either the EU’s General Data Protection Regulation (GDPR) or the CCPA.
The chart below provides a list (current as of April 14, 2020) of proposed state privacy legislation that could still be enacted this session. The purpose of the chart is to provide the broad strokes of each proposed law, show their similarities, and highlight key differences. The question is whether the GDPR and/or the CCPA actually provide the most appropriate models to emulate. The CCPA is perceived and touted by many as the first and most comprehensive privacy and data security law of its kind in the US, but we can’t help but wonder: does first necessarily mean best?
States that considered but ultimately chose not to pass proposed privacy legislation in 2020 include: Florida, Maryland, Virginia, Washington, and Wisconsin.
On April 6, 2020, the Federal Trade Commission (FTC) announced a settlement with Tapplock, Inc., resolving allegations that the Canadian smart lock manufacturer violated Section 5 of the FTC Act by misrepresenting the security of its lock and of its consumers’ personal information. Following is a closer look at the settlement and underlying complaint, as well as an overview of the current recommendations for IoT device manufacturers issued by the National Institute of Standards and Technology (NIST) in its most recent draft of the “Core Baseline” guide.
Over the last several weeks, while Americans have grown accustomed to working from home, home schooling, and life in lockdown during the COVID-19 pandemic, the Zoom videoconferencing service has surged in popularity for every imaginable form of gathering, professional and personal. Zoom has become the service of choice – from team meetings to kids’ story times; from religious services to happy hours; from corporate onboarding to every manner of more “intimate” get-togethers for individuals who are following government-mandated social distancing guidelines.
The media and then, in quick succession, regulators, plaintiffs’ lawyers, and even Congress began to scrutinize, publicize, and take legal action with respect to perceived privacy and data security flaws in the latest technology darling. The result is a still-evolving case study in the classic reactionary American response to privacy and data security concerns, a phenomenon we have seen again and again in this practice space.
What sins has Zoom actually committed? Are they really so “shocking” from a privacy and data security perspective? In violation of law? Just not best practice? Creepy? And has Zoom’s iterative response served as a wet blanket or fuel for the inferno?
In this post, I explore the who, what, why, when, and how of this, at least as much as we can say as we sit here today. And because I am a hopeless nerd, I have chosen the format required by California’s data breach notification law, California Civil Code § 1798.82(d)(1), as the very best way to tell this story. We are going to use this blog post as a jumping off point for a free live and recorded roundtable discussion webinar (using WebEx [insert winking emoji here]) on April 14, 2020, at 12:30 pm Eastern/9:30 am Pacific. You can register here.
By Nicole Hyland and James Mariani
Every day, clients entrust their lawyers with confidential information. Whether in a matrimonial dispute, high-stakes corporate acquisition, commercial litigation, criminal defense matter, or any other sensitive legal issue, clients rely on their lawyers to safeguard information that could be detrimental or embarrassing to the client if disclosed. A lawyer’s ethical obligation to protect such confidential information is embodied in Rule 1.6 of the Rules of Professional Conduct (“RPCs”), which states in relevant part that “a lawyer shall not knowingly reveal confidential information.” The duty of confidentiality is not limited, however, to intentional disclosures. Rule 1.6(c) also requires a lawyer to “make reasonable efforts to prevent the inadvertent or unauthorized disclosure or use of, or unauthorized access to” confidential information.
Over the past several weeks, the California Attorney General (“AG”) published revisions to its proposed regulations implementing the CCPA (the “Modified Regulations”), and then further revised the Modified Regulations (“Version 2”). Despite earlier warnings to the business community that the AG’s initial draft of the regulations would not materially change, we’ve now seen it happen twice. The full redlines of both the Modified Regulations and Version 2 are available here. This article highlights what’s new, what remains the same, what we expect to have the biggest impact on businesses working toward compliance, and the lack of predictability of next moves given the growing global health crisis.