On June 20, Florida Governor Rick Scott signed SB 1524, which significantly revises and strengthens the state’s data breach notice law, making it among the toughest in the country. The bill shortens the notification timeline, requiring notice to consumers within 30 days of the “determination of a breach.” It also adds a parallel requirement to notify the state attorney general’s office for any incident affecting more than 500 state residents, and provides that consumer notice by email will no longer require E-SIGN consent. The new law clarifies the application of data breach requirements by amending the definition of “covered entity” to mean “a sole proprietorship, partnership, corporation, trust, estate, cooperative, association, or other commercial entity that acquires, maintains, stores, or uses personal information.” The bill also expands the definition of “personal information” to add, as California did last year, a user name or e-mail address, in combination with a password or security question and answer that would permit access to an online account. The bill requires covered entities to take reasonable measures to (i) protect and secure data in electronic form containing personal information and (ii) dispose, or arrange for the disposal, of customer records containing personal information within their custody or control when the records are no longer to be retained. Finally, the bill revises the risk-of-harm provision in two noteworthy ways: (i) as in Connecticut and Alaska, law enforcement must be consulted to employ the exemption to notice; and (ii) the exemption appears to cover only consumer notice, not AG notice. The changes take effect July 1, 2014.
On August 1, the FTC released a staff report on the agency’s review of shopping apps—those used for comparison shopping, to collect and redeem deals and discounts, and to complete in-store purchases. The FTC staff examined information available to consumers before they download the software onto their mobile devices—specifically, information describing how apps that enable consumers to make purchases dealt with fraudulent or unauthorized transactions, billing errors, or other payment-related disputes. The staff also assessed information on how the apps handled consumer data. The FTC staff determined that the apps studied “often failed to provide pre-download information on issues that are important to consumers.” For example, according to the report, few of the in-store purchase apps provided any information prior to download explaining consumers’ liability or describing the app’s process for handling payment-related disputes. In addition, according to the FTC, most linked privacy policies “used vague language that reserved broad rights to collect, use, and share consumer data, making it difficult for readers to understand how the apps actually used consumer data or to compare the apps’ data practices.” The FTC staff recommends that companies that provide mobile shopping apps to consumers: (i) disclose consumers’ rights and liability limits for unauthorized, fraudulent, or erroneous transactions; (ii) clearly describe how they collect, use, and share consumer data; and (iii) ensure that their strong data security promises translate into strong data security practices. The report also includes recommended practices for consumers.
On May 27, the FTC released a report that claims—based on a study of nine data brokers—that data brokers generally operate with a “fundamental lack of transparency.” The FTC describes data brokers as companies that collect personal information about consumers from a wide range of sources and then provide that data for purposes of verifying an individual’s identity, marketing products, and detecting fraud or otherwise mitigating risk. The report is based in part on the nine brokers’ responses to FTC orders that required the brokers to provide information about: (i) the nature and sources of the consumer information the data brokers collect; (ii) how they use, maintain, and disseminate the information; and (iii) the extent to which the data brokers allow consumers to access and correct their information or to opt out of having their personal information sold or shared. The report summarizes the companies’ data acquisition processes, their product development and the types of products they provide, the quality of the data collected and sold, the types of clients to whom the data is sold, and consumer controls over the information. The FTC recommends that Congress consider enacting data broker legislation that would, among other things: (i) require data brokers to give consumers access to their data and the ability to opt out of having it shared for marketing purposes; (ii) require data brokers to clearly disclose that they not only use raw data, but that they also derive certain inferences from the data; (iii) address gaps in FCRA to provide consumers with transparency when a company uses a data broker’s risk mitigation product that limits a consumer’s ability to complete a transaction; and (iv) require brokers who offer people search products to allow consumers to access their own information and opt out of the use of that information, and to disclose the sources of the information and any limitations of the opt out.
On May 13, the European Court of Justice held that an internet search operator is responsible for the processing of personal data that appear on web pages published by third parties, and that an individual has a right to ask a search engine operator to remove from search results specific links to materials that include the individual’s personal information. The court considered the issue in response to questions referred from a Spanish court about the scope of a 1995 E.U. directive designed to, among other things, protect individual privacy rights when personal data are processed. The court determined that “by searching automatically, constantly and systematically for information published on the internet, the operator of a search engine ‘collects’ data within the meaning of the directive,” and further determined that the operator “processes” and “controls” individual personal data within the meaning of the directive. The court held that a search engine operator “must ensure, within the framework of its responsibilities, powers and capabilities, that its activity complies with the directive’s requirements,” including by, in certain circumstances, removing “links to web pages that are published by third parties and contain information relating to a person from the list of results displayed following a search made on the basis of that person’s name,” even when publication of that person’s information on those pages is lawful. Further, the court held that although the search engine operator’s processing operations take place outside of the E.U., the operator is covered by the directive because the operator also has operations in an E.U. member state that were “intended to promote and sell, in the Member State in question, advertising space offered by the search engine in order to make the service offered by the engine profitable.”
On May 7, the CFPB issued a proposed rule that would provide financial institutions an alternative method for delivering annual privacy notices. The Gramm-Leach-Bliley Act (GLBA) and Regulation P require financial institutions to, among other things, provide annual privacy notices to customers—either in writing or electronically with consumer consent. Industry generally has criticized the current annual notice requirement as ineffective and burdensome, with most financial institutions providing the notices by U.S. postal mail. The proposed rule would allow financial institutions, under certain circumstances, to comply with the GLBA annual privacy notice delivery requirements by (i) continuously posting the notice in a clear and conspicuous manner on a page of their websites, without requiring a login or similar steps to access the notice; and (ii) mailing the notices promptly to customers who request them by phone.
On May 1, the White House’s working group on “big data” and privacy published a report on the findings of its 90-day review. In addition to considering privacy issues associated with big data, the group assessed the relationship between big data and discrimination, concluding, among other things, that “there are new worries that big data technologies could be used to ‘digitally redline’ unwanted groups, either as customers, employees, tenants, or recipients of credit” and that “big data could enable new forms of discrimination and predatory practices.” The report adds, “[t]he same algorithmic and data mining technologies that enable discrimination could also help groups enforce their rights by identifying and empirically confirming instances of discrimination and characterizing the harms they caused.” The working group recommends that the DOJ, the CFPB, and the FTC “expand their technical expertise to be able to identify practices and outcomes facilitated by big data analytics that have a discriminatory impact on protected classes, and develop a plan for investigating and resolving violations of law in such cases,” and adds that the President’s Council of Economic Advisers should assess “the evolving practices of differential pricing both online and offline, assess the implications for efficient operations of markets, and consider whether new practices are needed to ensure fairness.” The working group suggests that federal civil rights offices and the civil rights community should collaborate to “employ the new and powerful tools of big data to ensure that our most vulnerable communities are treated fairly.” With regard to privacy, the report states that the “ubiquitous collection” of personal information and data, combined with the difficulty of keeping data anonymous, require policymakers to “look closely at the notice and consent framework that has been a central pillar of how privacy practices have been organized for more than four decades.” Among its policy recommendations, the working group urges (i) enactment of a Consumer Privacy Bill of Rights, informed by a Department of Commerce public comment process, and (ii) the adoption of a national data breach bill along the lines of the Administration’s May 2011 cybersecurity legislative proposal. It also calls for data brokers to provide more transparency and consumer control of data.
On April 23, New York State Department of Financial Services (NYS DFS) Superintendent Benjamin Lawsky became the first state regulator to sue a financial services company to enforce the Dodd-Frank Act’s Title X prohibitions against unfair, deceptive, and abusive practices (UDAAP). Last month, Illinois Attorney General Lisa Madigan filed what appears to be the first suit by a state attorney general to enforce Dodd-Frank’s UDAAP provisions. Although state authorities generally are limited to enforcing Title X against state banks and non-bank financial service companies—except that state attorneys general may enforce rules of the CFPB against national banks and thrifts—these actions bring into sharp focus the full scope and reach of Title X’s enforcement provisions and are likely to inspire similar state actions.
Mr. Lawsky’s complaint accuses a nonbank auto finance company of violating Sections 1031 and 1036 of the Dodd-Frank Act, as well as Section 408 of the New York Financial Services Law and Section 499 of the New York Banking Law by, among other things, “systematically hid[ing] from its customers the fact that they have refundable positive credit balances.” The complaint alleges that the company concealed its customers’ positive account balances—from insurance payoffs, overpayments, trade-ins, and other sources—by programming its customer-facing web portal to shut down a customer’s access to his or her loan account once the loan was paid off, even if a positive credit balance existed. The company allegedly failed to refund such balances absent a specific request from a customer. In addition, the complaint charges that the company hid the existence of positive credit balances by submitting to the New York State Comptroller’s Office false and misleading “negative” unclaimed property reports, which represented under penalty of perjury that the company had no unrefunded customer credit balances.
On April 10, Kentucky Governor Steve Beshear signed into law HB 232 to establish a data breach notice requirement. The new law requires any person or business that operates in the state to provide written or electronic notice to affected state residents of any breach of a security system that exposes unencrypted personally identifiable information. The law requires notification “in the most expedient time possible and without unreasonable delay” upon discovery or notification of a breach, and permits certain substitute forms of notice if the person or business subject to the breach demonstrates that the notice exceeds certain cost or scope thresholds. The law does not require separate notice to the state attorney general, nor does it apply to entities subject to Title V of the Gramm-Leach-Bliley Act or HIPAA. The bill takes effect July 14, 2014. Kentucky’s adoption of a data breach notice law leaves only three states—Alabama, New Mexico, and South Dakota—without such a statutory requirement.
On April 10, the FFIEC issued an alert advising financial institutions of risks associated with “Heartbleed,” a recently discovered material security vulnerability in OpenSSL, a widely used cryptographic software library; the flaw has existed since December 31, 2011. The alert states that the vulnerability could allow an attacker to access a server’s private cryptographic keys, thereby compromising the security of the server and its users, and potentially allowing attackers to impersonate bank services or users, steal login credentials, access sensitive email, or gain access to internal networks. Due to OpenSSL’s popularity, this vulnerability affects websites, e-mail servers, web servers, virtual private networks (VPNs), instant messaging, and other applications. The FFIEC advises financial institutions to (i) ensure that third-party vendors that use OpenSSL on their systems are aware of the vulnerability and take appropriate risk mitigation steps; (ii) monitor the status of their vendors’ efforts; (iii) identify and upgrade vulnerable internal systems and services; and (iv) follow appropriate patch management practices and test to ensure a secure configuration. Patch management, software maintenance, and security update practices are covered by a number of FFIEC IT Examination Handbooks. Finally, the FFIEC states that institutions should operate under the assumption that encryption keys used on vulnerable servers are no longer viable for protecting sensitive information and should therefore strongly consider requiring users and administrators to change passwords after applying the patch.
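As a practical matter, the first triage step the FFIEC describes—identifying vulnerable internal systems—often starts with inventorying OpenSSL versions. The sketch below is a hypothetical helper (not from the FFIEC alert) that flags version strings in the publicly documented vulnerable range, OpenSSL 1.0.1 through 1.0.1f (fixed in 1.0.1g); for simplicity it ignores the 1.0.2 beta releases, which were also affected.

```python
import re

def is_heartbleed_vulnerable(version: str) -> bool:
    """Return True if an OpenSSL version string falls in the Heartbleed-
    vulnerable range (1.0.1 through 1.0.1f, inclusive).

    Simplified sketch: 1.0.2-beta releases, which were also affected,
    are not handled here.
    """
    # Match the 1.0.1 branch, with an optional patch letter (e.g. "1.0.1e").
    m = re.match(r"^1\.0\.1([a-z]?)$", version.strip())
    if not m:
        # Other branches (0.9.8, 1.0.0) never had the bug; 1.0.1g+ is patched.
        return False
    letter = m.group(1)
    # Plain "1.0.1" and letters up through "f" are vulnerable.
    return letter == "" or letter <= "f"
```

A scan script could feed this function the output of `openssl version` collected from each host, flagging any server whose keys should be rotated and whose users should be prompted to change passwords, per the alert’s guidance.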
On April 7, the U.S. District Court for the District of New Jersey denied a hotel company’s motion to dismiss the FTC’s claims that the company engaged in unfair and deceptive practices in violation of Section 5 of the FTC Act by failing to maintain reasonable and appropriate data security for customers’ personal information. FTC v. Wyndham Worldwide Corp., No. 13-1887, 2014 WL 1349019 (D.N.J. Apr. 7, 2014). The company moved to dismiss the FTC’s suit, arguing that the FTC (i) lacks statutory authority to enforce data security standards outside of its explicit data security authority under statutes such as the Gramm-Leach-Bliley Act (GLBA) and FCRA; (ii) violated fair notice principles by failing to first promulgate applicable regulations; and (iii) failed to sufficiently plead certain elements of the unfairness and deception claims. The court rejected each of these arguments. First, the court held that the FTC does not need specific authority under Section 5 to enforce data security standards. The court reasoned that the data-security legislation that followed the FTC Act, such as GLBA and FCRA, provides the FTC additional data security tools that complement, rather than preclude, the FTC’s general authority under Section 5. Second, the court held that, to bring a Section 5 data security claim, the FTC is not required to provide notice of reasonable standards by issuing a new regulation because regulations are not the only means of providing sufficient fair notice. According to the court, industry standards, past FTC enforcement actions, and FTC business guidance provided sufficient notice of what constitutes reasonable security measures. Third, the court held that the FTC properly pled its unfairness and deception claims under the FTC Act.
Data Breach Class Settlement Approved After Eleventh Circuit Held Identity Theft Following Breach Presents Cognizable Injury
Recently, the U.S. District Court for the Southern District of Florida approved a class settlement in a case in which the plaintiffs claimed financial harm from a health care company’s failure to protect their personal information. Resnick v. AvMed Inc., No. 10-24513 (S.D. Fla. Feb. 28, 2014). The settlement follows a September 2012 decision from the U.S. Court of Appeals for the Eleventh Circuit, in which the court reversed the district court’s dismissal of the case and held that because the complaint alleged financial injury, and because monetary loss is cognizable under Florida law, the plaintiffs alleged a cognizable injury. The court explained that the plaintiffs demonstrated “a sufficient nexus between the data breach and the identity theft beyond allegations of time and sequence” because the plaintiffs pleaded that they were careful in protecting their identities and had never been victims of identity theft. The settlement requires the company to pay $3 million, with each class member receiving up to $10 for each year they paid an insurance premium, up to a maximum of $30. The company also agreed to implement new data security measures.
Last week, as part of the White House’s initiative on “big data” and privacy (led by John Podesta), the White House Office of Science and Technology Policy issued a request for information seeking public input regarding broad privacy-related issues. The request defines “big data” as “datasets so large, diverse, and/or complex, that conventional technologies cannot adequately capture, store, or analyze them.” It seeks comments on a number of issues, including: (i) the public policy implications of the collection, storage, analysis, and use of big data; (ii) the types of uses of big data that could measurably improve outcomes or productivity with further government action, funding, or research, and uses of big data that raise the most public policy concerns; (iii) the technological trends or key technologies which will affect the collection, storage, analysis and use of big data, and whether any are particularly promising for safeguarding privacy; (iv) how the policy frameworks or regulations for handling big data should differ between the government and the private sector; and (v) issues raised by the use of big data across jurisdictions. Comments are due by March 31, 2014.
Recently, the CFTC’s Division of Swaps Oversight issued Staff Advisory No. 14-21, which recommends best practices for CFTC-regulated intermediaries to comply with applicable Gramm-Leach-Bliley (GLB) Act privacy requirements, consistent with the Division’s intention to focus more resources on GLB privacy compliance. The advisory states that its recommendations are generally consistent with guidelines and regulations issued by other federal financial regulators, and the majority of the specific best practices are supported with references to prior rules and guidance. A number of the best practices cite the Interagency Guidelines Establishing Standards for Safeguarding Customer Information and Rescission of Year 2000 Standards for Safety and Soundness and a parallel FTC rule. Notably, several of the recommendations rely on a rule proposed by the SEC in 2008 that has not yet been finalized. For example, based on that SEC proposal and the Interagency Guidelines, the CFTC recommends that covered entities establish a breach investigation and notice process to alert potentially impacted individuals and to notify the CFTC. In addition, without referencing any other federal rule or guidance, the Staff Advisory recommends that covered entities engage an independent party at least once every two years to test and monitor the safeguards’ controls, systems, policies, and procedures, maintaining written records of the effectiveness of the controls.
On March 6, the FTC released a memorandum of understanding (MOU) it signed with the UK’s Information Commissioner’s Office (ICO), which is designed to strengthen the agencies’ privacy enforcement partnership. The FTC stated that over the last several years it has worked with the ICO on numerous investigations and international initiatives to increase global privacy cooperation. The MOU establishes a formal framework for the agencies to provide mutual assistance and exchange of information for the purpose of investigating, enforcing, and/or securing compliance with certain privacy violations. The FTC also announced a joint project with the European Union (EU) and Asia-Pacific Economic Cooperation (APEC) economies to map together the requirements for APEC Cross Border Privacy Rules and EU Binding Corporate Rules, which is designed to provide a practical reference tool for companies that seek “double certification” under the APEC and EU systems, and shows the substantial overlap between the two.