On August 12, Delaware Governor Jack A. Markell signed the Digital Access and Digital Accounts Act, the first law in the nation to comprehensively govern access to a person’s digital assets, including social media and email accounts, after the person dies or becomes incapacitated. Under the new law, a Delaware resident’s digital assets will become part of his or her estate after death, and these assets will be accessible to heirs to the same extent as the deceased person’s physical, tangible assets. Digital assets are defined broadly to include data, texts, email, audio, video, images, sounds, social media and social networking content, health care and insurance records, computer codes and programs, software and software licenses, and databases, along with usernames and passwords. The law expressly does not apply to digital accounts of an employer regularly used by an employee in the usual course of business. The law requires any company that controls a person’s digital assets to give the legal fiduciary for the deceased’s estate the usernames, passwords, and any other information needed to gain access to the digital assets upon a valid written request. Any contrary provisions in service agreements or privacy policies that limit a fiduciary’s access to digital accounts are void, although the account owner can specify that the account should remain private after death. The law also grants the company controlling the digital assets immunity for complying with valid requests for account access. The new law takes effect January 1, 2015.
On October 20, the CFPB finalized its amendment to Regulation P, which requires that financial institutions meet specific consumer data-sharing requirements, including the delivery of annual privacy notices. Under the new rule, bank and nonbank institutions under the CFPB’s jurisdiction will now be allowed to post privacy notices online, rather than deliver an annual paper copy. Institutions that choose to post notices online must meet certain conditions, including (i) providing notice to consumers if the institution shares any data with third parties, in addition to providing an opportunity to opt out of such sharing; and, (ii) using the 2009 model disclosure form developed by federal regulatory agencies. Institutions that choose to rely on the new delivery method must (i) ensure that customers are aware of the notices posted online; (ii) provide paper copies within ten days of a customer’s request; and, (iii) make customers aware that the privacy notice(s) are available online—and that a paper copy will be provided at the customer’s request—by inserting a “clear and conspicuous statement at least once per year on an account statement, coupon book, or a notice or disclosure.” As outlined when the proposed rule was issued in May, the CFPB anticipates that the rule will: (i) provide consumers with constant access to privacy notices; (ii) limit the amount of an institution’s data sharing with third parties; (iii) educate consumers on the various types of privacy policies available to them; and, (iv) reduce the cost for companies to provide privacy notices.
Nebraska Federal Court Refuses To Dismiss Suit Claiming Breach of Contract, Violation of State Law for Unauthorized Credit Card Transactions Following Bank Data Breach
On August 20, the U.S. District Court for the District of Nebraska denied motions to dismiss filed by a Nebraska bank and two credit card processing companies in response to a purported class action filed by a merchant alleging that it suffered damages following a data breach at the defendants’ premises. Wines, Vines & Corks, LLC v. First Nat’l of Neb., Inc., No. 8:14CV82 (D. Neb. Aug. 20, 2014). According to the merchant’s complaint, the merchant maintained a credit card processing account with the defendants and, following the breach, had unauthorized credit card transactions processed and fees withdrawn from its account. The merchant alleged breach of contract, negligence, and violations of the Nebraska Consumer Protection Act and the Nebraska Uniform Deceptive Trade Practices Act based on the defendants’ failure to adequately secure and protect account information and refusal to refund the fees. In denying the motions to dismiss, the court determined that the merchant sufficiently pled the existence of a contract and resulting damages in support of its breach of contract claim, as well as a breach of the duty of due care in support of its negligence claim. Also, the court found that the merchant’s state law claims were adequately supported and determined that the defendants’ argument that the economic loss doctrine barred these claims was misplaced.
On August 1, the FTC released a staff report on the agency’s review of shopping apps—those used for comparison shopping, to collect and redeem deals and discounts, and to complete in-store purchases. The FTC staff examined information available to consumers before they download the software onto their mobile devices—specifically, information describing how apps that enable consumers to make purchases dealt with fraudulent or unauthorized transactions, billing errors, or other payment-related disputes. The staff also assessed information on how the apps handled consumer data. The FTC staff determined that the apps studied “often failed to provide pre-download information on issues that are important to consumers.” For example, according to the report, few of the in-store purchase apps provided any information prior to download explaining consumers’ liability or describing the app’s process for handling payment-related disputes. In addition, according to the FTC, most linked privacy policies “used vague language that reserved broad rights to collect, use, and share consumer data, making it difficult for readers to understand how the apps actually used consumer data or to compare the apps’ data practices.” The FTC staff recommends that companies that provide mobile shopping apps to consumers: (i) disclose consumers’ rights and liability limits for unauthorized, fraudulent, or erroneous transactions; (ii) clearly describe how they collect, use, and share consumer data; and (iii) ensure that their strong data security promises translate into strong data security practices. The report also includes recommended practices for consumers.
On June 20, Florida Governor Rick Scott signed SB 1524, which significantly revises and strengthens the state’s data breach notice law, making it among the toughest in the country. The bill shortens the timeline for providing notice of a data breach to require notice to consumers within 30 days of the “determination of a breach.” The bill also adds a parallel requirement to notify the state attorney general’s office for an incident affecting more than 500 state residents. The bill also provides that consumer notice by email will no longer require E-SIGN consent. The new law clarifies the application of data breach requirements by amending the definition of “covered entity” to mean “a sole proprietorship, partnership, corporation, trust, estate, cooperative, association, or other commercial entity that acquires, maintains, stores, or uses personal information.” The bill also expands the definition of “personal information” to add, as was done in California last year, user name or e-mail address, in combination with a password or security question and answer that would permit access to an online account. The bill requires covered entities to take reasonable measures to (i) protect and secure data in electronic form containing personal information and (ii) dispose, or arrange for the disposal, of customer records containing personal information within its custody or control when the records are no longer to be retained. Finally, the bill revises the risk of harm provision in two noteworthy ways: (i) as in Connecticut and Alaska, law enforcement must be consulted in order to rely on the exemption from notice; and (ii) the exemption appears to cover only consumer notice, not AG notice. The changes take effect July 1, 2014.
On May 27, the FTC released a report that claims—based on a study of nine data brokers—that data brokers generally operate with a “fundamental lack of transparency.” The FTC describes data brokers as companies that collect personal information about consumers from a wide range of sources and then provide that data for purposes of verifying an individual’s identity, marketing products, and detecting fraud or otherwise mitigating risk. The report is based in part on the nine brokers’ responses to FTC orders that required the brokers to provide information about: (i) the nature and sources of the consumer information the data brokers collect; (ii) how they use, maintain, and disseminate the information; and (iii) the extent to which the data brokers allow consumers to access and correct their information or to opt out of having their personal information sold or shared. The report summarizes the companies’ data acquisition processes, their product development and the types of products they provide, the quality of the data collected and sold, the types of clients to whom the data is sold, and consumer controls over the information. The FTC recommends that Congress consider enacting data broker legislation that would, among other things: (i) require data brokers to give consumers access to their data and the ability to opt out of having it shared for marketing purposes; (ii) require data brokers to clearly disclose that they not only use raw data, but that they also derive certain inferences from the data; (iii) address gaps in FCRA to provide consumers with transparency when a company uses a data broker’s risk mitigation product that limits a consumer’s ability to complete a transaction; and (iv) require brokers who offer people search products to allow consumers to access their own information and opt out of the use of that information, and to disclose the sources of the information and any limitations of the opt out.
On May 13, the European Court of Justice held that an internet search operator is responsible for the processing of personal data that appear on web pages published by third parties, and that an individual has a right to ask a search engine operator to remove from search results specific links to materials that include the individual’s personal information. The court considered the issue in response to questions referred from a Spanish court about the scope of a 1995 E.U. directive designed to, among other things, protect individual privacy rights when personal data are processed. The court determined that “by searching automatically, constantly and systematically for information published on the internet, the operator of a search engine ‘collects’ data within the meaning of the directive,” and further determined that the operator “processes” and “controls” individual personal data within the meaning of the directive. The court held that a search engine operator “must ensure, within the framework of its responsibilities, powers and capabilities, that its activity complies with the directive’s requirements,” including by, in certain circumstances, removing “links to web pages that are published by third parties and contain information relating to a person from the list of results displayed following a search made on the basis of that person’s name,” even when publication of that person’s information on those pages is lawful. Further, the court held that although the search engine operator’s processing operations take place outside of the E.U., the operator is covered by the directive because the operator also has operations in an E.U. member state that were “intended to promote and sell, in the Member State in question, advertising space offered by the search engine in order to make the service offered by the engine profitable.”
On May 7, the CFPB issued a proposed rule that would provide financial institutions an alternative method for delivering annual privacy notices. The Gramm-Leach-Bliley Act (GLBA) and Regulation P require financial institutions to, among other things, provide annual privacy notices to customers—either in writing or electronically with consumer consent. Industry generally has criticized the current annual notice requirement as ineffective and burdensome, with most financial institutions providing the notices by U.S. postal mail. The proposed rule would allow financial institutions, under certain circumstances, to comply with the GLBA annual privacy notice delivery requirements by (i) continuously posting the notice in a clear and conspicuous manner on a page of their websites, without requiring a login or similar steps to access the notice; and (ii) mailing the notices promptly to customers who request them by phone. Read more…
On May 1, the White House’s working group on “big data” and privacy published a report on the findings of its 90-day review. In addition to considering privacy issues associated with big data, the group assessed the relationship between big data and discrimination, concluding, among other things, that “there are new worries that big data technologies could be used to ‘digitally redline’ unwanted groups, either as customers, employees, tenants, or recipients of credit” and that “big data could enable new forms of discrimination and predatory practices.” The report adds, “[t]he same algorithmic and data mining technologies that enable discrimination could also help groups enforce their rights by identifying and empirically confirming instances of discrimination and characterizing the harms they caused.” The working group recommends that the DOJ, the CFPB, and the FTC “expand their technical expertise to be able to identify practices and outcomes facilitated by big data analytics that have a discriminatory impact on protected classes, and develop a plan for investigating and resolving violations of law in such cases,” and adds that the President’s Council of Economic Advisers should assess “the evolving practices of differential pricing both online and offline, assess the implications for efficient operations of markets, and consider whether new practices are needed to ensure fairness.” The working group suggests that federal civil rights offices and the civil rights community should collaborate to “employ the new and powerful tools of big data to ensure that our most vulnerable communities are treated fairly.” With regard to privacy, the report states that the “ubiquitous collection” of personal information and data, combined with the difficulty of keeping data anonymous, require policymakers to “look closely at the notice and consent framework that has been a central pillar of how privacy practices have been organized for more than four decades.” Among its policy recommendations, the working group urges (i) enactment of a Consumer Privacy Bill of Rights, informed by a Department of Commerce public comment process, and (ii) the adoption of a national data breach bill along the lines of the Administration’s May 2011 cybersecurity legislative proposal. It also calls for data brokers to provide more transparency and consumer control of data.
On April 23, New York State Department of Financial Services (NYS DFS) Superintendent Benjamin Lawsky became the first state regulator to sue a financial services company to enforce the Dodd-Frank Act’s Title X prohibitions against unfair, deceptive, and abusive practices (UDAAP). Last month, Illinois Attorney General Lisa Madigan filed what appears to be the first suit by a state attorney general to enforce Dodd-Frank’s UDAAP provisions. Although state authorities generally are limited to enforcing Title X against state banks and non-bank financial service companies—except that state attorneys general may enforce rules of the CFPB against national banks and thrifts—these actions bring into sharp focus the full scope and reach of Title X’s enforcement provisions and are likely to inspire similar state actions.
Mr. Lawsky’s complaint accuses a nonbank auto finance company of violating Sections 1031 and 1036 of the Dodd-Frank Act, as well as Section 408 of the New York Financial Services Law and Section 499 of the New York Banking Law by, among other things, “systematically hid[ing] from its customers the fact that they have refundable positive credit balances.” The complaint alleges that the company concealed its customers’ positive account balances—arising from insurance payoffs, overpayments, trade-ins, and other sources—by programming its customer-facing web portal to shut down a customer’s access to his or her loan account once the loan was paid off, even if a positive credit balance existed. The company allegedly failed to refund such balances absent a specific request from a customer. In addition, the complaint charges that the company hid the existence of positive credit balances by submitting to the New York State Comptroller’s Office false and misleading “negative” unclaimed property reports, which represented under penalty of perjury that the company had no unrefunded customer credit balances. Read more…
On April 10, Kentucky Governor Steve Beshear signed into law HB 232 to establish a data breach notice requirement. The new law requires any person or business that operates in the state to provide written or electronic notice to affected state residents of any breach of a security system that exposes unencrypted personally identifiable information. The law requires notification “in the most expedient time possible and without unreasonable delay” upon discovery or notification of a breach, and permits certain substitute forms of notice if the person or business subject to the breach demonstrates that providing direct notice would exceed certain cost or scope thresholds. The law does not require separate notice to the state attorney general, nor does it apply to entities subject to Title V of the Gramm-Leach-Bliley Act or HIPAA. The bill takes effect July 14, 2014. Kentucky’s adoption of a data breach notice law leaves only three states—Alabama, New Mexico, and South Dakota—without such a statutory requirement.
On April 10, the FFIEC issued an alert advising financial institutions of risks associated with “Heartbleed,” a recently discovered material security vulnerability in the widely used OpenSSL cryptographic library, which has existed since December 31, 2011. The alert states that the vulnerability could allow an attacker to access a server’s private cryptographic keys, thereby compromising the security of the server and its users, and potentially allowing attackers to impersonate bank services or users, steal login credentials, access sensitive email, or gain access to internal networks. Due to OpenSSL’s popularity, this vulnerability affects websites, e-mail servers, web servers, virtual private networks (VPN), instant messaging, and other applications. The FFIEC advises financial institutions to (i) ensure that third-party vendors that use OpenSSL on their systems are aware of the vulnerability and take appropriate risk mitigation steps; (ii) monitor the status of their vendors’ efforts; (iii) identify and upgrade vulnerable internal systems and services; and (iv) follow appropriate patch management practices and test to ensure a secure configuration. Patch management, software maintenance, and security update practices are covered by a number of FFIEC IT Examination Handbooks. Finally, the FFIEC states that institutions should operate with the assumption that encryption keys used on vulnerable servers are no longer viable for protecting sensitive information and should therefore strongly consider requiring users and administrators to change passwords after applying the patch.
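For institutions inventorying vulnerable systems per the FFIEC’s guidance, the affected releases are OpenSSL 1.0.1 through 1.0.1f (the fix shipped in 1.0.1g). A minimal sketch of a version check, with the helper name and usage illustrative rather than drawn from the alert, might look like:

```shell
#!/bin/sh
# Illustrative helper (not from the FFIEC alert): classify an OpenSSL
# version string against the Heartbleed-affected range, which covers
# OpenSSL 1.0.1 through 1.0.1f; 1.0.1g and later are patched.
heartbleed_status() {
  case "$1" in
    1.0.1|1.0.1[a-f]) echo "VULNERABLE" ;;    # affected releases
    *)                echo "not affected" ;;  # patched or out of range
  esac
}

# In practice the input would come from running `openssl version`
# on each host or asking vendors for their deployed version.
heartbleed_status "1.0.1e"   # prints: VULNERABLE
heartbleed_status "1.0.1g"   # prints: not affected
```

Note that a version check alone is not sufficient remediation: per the alert, institutions should also assume keys on affected servers were compromised, reissue certificates, and rotate passwords after patching.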
On April 7, the U.S. District Court for the District of New Jersey denied a hotel company’s motion to dismiss the FTC’s claims that the company engaged in unfair and deceptive practices in violation of Section 5 of the FTC Act by failing to maintain reasonable and appropriate data security for customers’ personal information. FTC v. Wyndham Worldwide Corp., No. 13-1887, 2014 WL 1349019 (D.N.J. Apr. 7, 2014). The company moved to dismiss the FTC’s suit, arguing that the FTC (i) lacks statutory authority to enforce data security standards outside of its explicit data security authority under statutes such as the Gramm-Leach-Bliley Act (GLBA) and FCRA; (ii) violated fair notice principles by failing to first promulgate applicable regulations; and (iii) failed to sufficiently plead certain elements of the unfairness and deception claims. The court rejected each of these arguments. First, the court held that the FTC does not need specific authority under Section 5 to enforce data security standards. The court reasoned that the data-security legislation that followed the FTC Act, such as the GLBA and FCRA, provides the FTC with additional data security tools that complement, rather than preclude, the FTC’s general authority under Section 5. Second, the court held that, to bring a Section 5 data security claim, the FTC is not required to provide notice of reasonable standards by issuing a new regulation because regulations are not the only means of providing sufficient fair notice. According to the court, industry standards, past FTC enforcement actions, and FTC business guidance provided sufficient notice of what constitutes reasonable security measures. Third, the court held that the FTC properly pled its unfairness and deception claims under the FTC Act.
Data Breach Class Settlement Approved After Eleventh Circuit Held Identity Theft Following Breach Presents Cognizable Injury
Recently, the U.S. District Court for the Southern District of Florida approved a class settlement in a case in which the plaintiffs claimed financial harm from a health care company’s failure to protect their personal information. Resnick v. AvMed Inc., No. 10-24513 (S.D. Fla. Feb. 28, 2014). The settlement follows a September 2012 decision from the U.S. Court of Appeals for the Eleventh Circuit, in which the court reversed the district court’s dismissal of the case and held that because the complaint alleged financial injury, and because monetary loss is cognizable under Florida law, the plaintiffs alleged a cognizable injury. The court explained that the plaintiffs demonstrated “a sufficient nexus between the data breach and the identity theft beyond allegations of time and sequence” because the plaintiffs pled that they were careful in protecting their identities and had never been victims of identity theft. The settlement requires the company to pay $3 million, with each class member receiving up to $10 for each year they paid an insurance premium, up to a maximum of $30. The company also agreed to implement new data security measures.