Cyber Security and Insider Threat Management

Washington State is considering sweeping legislation (SB 5376) to govern the security and privacy of personal data, with requirements similar to those of the European Union’s General Data Protection Regulation (“GDPR”). Under the proposed legislation, Washington residents would gain comprehensive rights in their personal data. Residents would have the right, subject to certain exceptions, to request that data errors be corrected, to withdraw consent to continued processing, and to request deletion of their data. Residents could also require an organization to confirm whether it is processing their personal information and to provide a copy of their personal data in electronic form.

Covered organizations would be required to provide consumers with a conspicuous privacy notice disclosing the categories of personal data collected or shared with third parties and the consumers’ rights to control use of their personal data. Significantly, covered businesses would have to conduct documented risk assessments that identify the personal data to be collected and weigh the risks of collection against the mitigation of those risks through privacy and cybersecurity safeguards.

Washington’s proposal follows the recent enactment of the California Consumer Privacy Act (see EBG’s Act Now Advisory, “California’s Consumer Privacy Act: What Employers Need to Know”). Washington’s legislation, however, would grant rights beyond those contained in the California Act and is more closely aligned with the GDPR’s framework. The heightened protections are grounded in the sponsors’ recognition of the detrimental effect of data breaches and the resulting loss of privacy. The Act cites the GDPR as providing “the strongest privacy protections in the world” and adopts the GDPR’s expansive definition of “personal data” – any information relating to an identified or identifiable natural person.

Businesses that process the personal data of more than 100,000 Washington residents are covered, as well as “data brokers” that derive 50 percent of their revenue from the brokered sale of personal information. Notably, “data sets” regulated by the federal Health Insurance Portability and Accountability Act of 1996 (e.g., Protected Health Information (“PHI”)), the Health Information Technology for Economic and Clinical Health (“HITECH”) Act, or the Gramm-Leach-Bliley Act of 1999 are not covered. Financial and health care institutions may still need to comply, however, as to other personal data not protected under these statutes: if such an institution collects or processes other personal data and meets the thresholds above, it is likely covered.

Employers should take note that data sets maintained only for employment records purposes are excluded. Notably, the Act also excludes from coverage “an employee or contractor of a business acting in their role as an employee or contractor.” The Act would affect organizations that use facial recognition technology to profile consumers, requiring human review before final decisions affecting “employment purposes” and “health care services.” Organizations that contract with facial recognition firms may see pass-through contractual restrictions prohibiting use of the technology for unlawful bias.

There is no private right of action. Enforcement actions may be brought by the Attorney General to obtain injunctive relief and to impose civil penalties. If enacted, the Act, scheduled to become effective December 31, 2020, would have wide-ranging impacts, requiring significant advance planning, risk assessments, and consideration of privacy- and security-by-design principles.

As we previously reported, since 2017 employees have filed dozens of employment class actions claiming violations of Illinois’ 2008 Biometric Information Privacy Act (“BIPA”). In short, BIPA protects the privacy rights of employees, customers, and others in Illinois against the improper collection, usage, storage, transmission, and destruction of biometric information, including biometric identifiers, such as retina or iris scans, fingerprints, voiceprints, and scans of face or hand geometry. Before collecting such biometric information, BIPA requires an entity to: (1) provide written notice to each individual of the collection; (2) obtain a signed release from each individual for the collection of biometric data; and (3) make available a policy that contains a retention schedule and guidelines for the permanent destruction of the biometric data.

One of the unresolved legal issues was whether an entity’s failure to comply with BIPA’s requirements, absent an actual injury, was sufficient to sustain a claim under that law. On January 25, 2019, the Illinois Supreme Court weighed in on this issue in Rosenbach v. Six Flags Entertainment Corp., holding that mere collection of an individual’s biometric information may be enough to state a claim under BIPA.

In Rosenbach, a parent sued on behalf of her child after he was fingerprinted entering a Six Flags theme park. Neither the parent nor the child signed a release, Six Flags did not provide written notice to the child or the parent, and Six Flags did not have a publicly available policy regarding the retention or destruction of the biometric information. At the same time, there were no known data breaches of Six Flags’ systems, and the complaint did not allege any other harm to the parent or her son.

The Illinois Supreme Court found that the legislative intent behind BIPA dictated that a technical violation of the law, such as failure to provide notice or obtain a release, is sufficient to state a claim under the Act. The concept of an “aggrieved” party under BIPA is analogous to the injury-in-fact requirement for standing in federal court, and the Court found that the injury caused by a violation “is real and significant.”

In light of the Rosenbach decision, it is even more important that employers with operations in Illinois consider taking the following action:

(1)  First, determine if your company collects, uses, stores, or transmits any employee’s (or other individual’s) biometric information or identifiers that may be covered by BIPA (e.g., using fingerprint recognition technology for time-keeping purposes or to access company-issued property or devices).

(2)  If your company does collect, use, store, or transmit biometric data/identifiers, you should:

(a)  develop or review existing, written policies concerning the collection, storage, use, transmission, and destruction of that information, consistent with industry standards;

(b)  implement policies concerning proper notice to employees (and other affected individuals) about the company’s use, storage, etc., of such data and obtain written and signed consent forms from all affected persons; and

(c)  establish practices to protect individuals’ privacy against improper disclosure of biometric data/identifiers, using the methods and standard of care that the company would apply to other material deemed confidential and sensitive.

Importantly, providing proper notice includes identifying the specific reason for the collection, storage, and use of the biometric data, as well as how long the employer will use or retain such data. 740 Ill. Comp. Stat. 14/15(a), (b); 14/10.
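
For illustration only, the sketch below shows one way an HR system might record the notice and consent elements discussed above (the specific purpose of collection, the retention period, and a signed written release) before any biometric data is collected. The class and field names are hypothetical and are not drawn from the statute or from any vendor's product.

from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class BiometricConsentRecord:
    """Hypothetical record of BIPA-style notice and written release for one employee."""
    employee_id: str
    data_type: str                        # e.g., "fingerprint" used for timekeeping
    purpose: str                          # specific reason for collection, storage, and use
    retention_until: date                 # date by which the data will be permanently destroyed
    notice_provided_on: Optional[date] = None
    release_signed_on: Optional[date] = None

    def may_collect(self) -> bool:
        """Collection proceeds only after written notice and a signed release."""
        return self.notice_provided_on is not None and self.release_signed_on is not None

# Example: notice was given but no release has been signed, so collection should not proceed.
record = BiometricConsentRecord(
    employee_id="E-1001",
    data_type="fingerprint",
    purpose="timekeeping at the Chicago facility",
    retention_until=date(2021, 12, 31),
    notice_provided_on=date(2019, 3, 1),
)
print(record.may_collect())  # False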

There is a visceral and palpable dynamic emerging in global workplaces: tension.

Tension between what is potentially knowable—and what is actually known.   Tension between the present and the future state of work.  Tension between what was, is, and what might become (and when).  Tension between the nature, function, and limits of data and technology.

The present-future of work is being shaped daily, dynamically, and profoundly by a host of factors—led by the exponential proliferation of data, new technologies, and artificial intelligence (“AI”)—whose impact cannot be overstated.  Modern employers have access to an unprecedented amount of data impacting their workforce, from data concerning the trends and patterns in employee behaviors and data concerning the people analytics used in hiring, compensation, and employee benefits, to data that analyzes the composition of the employee workforce itself.  To be sure, AI will continue to disrupt how virtually every employer views its human capital model on an enterprise basis. On a micro level, employers are already analyzing which functions or groups of roles might be automated, augmented, or better aligned to meet their future business models.

And, yet, there is an equal, counterbalancing force at play—the increased demand for accountability, transparency, civility, and equity.  We have already seen this force playing out in real time, most notably in the #MeToo, pay equity, and data privacy and security movements.  We expect that these movements and trends will continue to gain traction and momentum in litigation, regulation, and international conversation into 2019 and beyond.

We have invited Epstein Becker Green attorneys from our Technology, Media & Telecommunications (“TMT”) service team to reflect and opine on the most significant developments of the year.  In each piece, we endeavor to provide practical insights to enable employers to think strategically through these emergent tensions and business realities—to continue to deliver value to their organizations and safeguard their goodwill and reputation.

Continue Reading Take 5 Newsletter – The Present-Future of Work: 2018 Trends and 2019 Predictions

Join Epstein Becker Green attorneys, Brian G. Cesaratto and Brian E. Spang, for a discussion of how employers can best protect their critical technologies and trade secrets from employee and other insider threats. Topics to be discussed include:

  • Determining your biggest threat by using available data
  • What keeps you up at night?
  • Foreseeing the escalation in risk, from insider and cyber threats to critical technologies
  • New protections and remedies under the Trade Secret Protection Act of 2014
  • Where are your trade secrets located, and what existing protections are in place?
  • What types of administrative and technical controls should your firm consider implementing for the key material on your network to protect against an insider threat?
  • What legal requirements may apply under applicable data protection laws?
  • How do you best protect trade secrets and other critical technologies as information increasingly moves into the cloud?
  • Using workforce management and personnel techniques to gain protection
  • The importance of an incident response plan
  • Developing and implementing an effective litigation response strategy to employee theft

Wednesday, October 3, 2018.
12:30 p.m. – 2:00 p.m. ET
Register for this complimentary webinar today!

We published an article in the NYSBA Labor and Employment Law Journal, titled “Employee Threats to Critical Technologies Are Best Addressed Through a Formalized Insider Threat Risk Assessment Process and Program.” With the New York State Bar Association’s permission, we have linked it here.

Our colleague at Epstein Becker Green has a post on the Health Law Advisor blog that will be of interest to our readers in the technology industry: “NIST Seeks Comments on Cybersecurity Standards for Patient Imaging Devices.”

Following is an excerpt:

The National Institute of Standards and Technology (“NIST”) has announced that it will be seeking industry input on developing “use cases” for its framework of cybersecurity standards related to patient imaging devices. NIST, a component of the Department of Commerce, is the agency assigned to the development and promulgation of policies, guidelines and regulations dealing with cybersecurity standards and best practices.  NIST claims that its cybersecurity program promotes innovation and competitiveness by advancing measurement science, standards, and related technology in ways that enhance economic security and quality of life. Its standards and best practices addressing interoperability, usability, and privacy continue to be critical for the nation. NIST’s latest announcement is directed at eventually providing security guidance for the healthcare sector’s most common uses of data, inasmuch as that industry has increasingly come under attack. …

Read the full post here.

The European Union’s (“EU’s”) General Data Protection Regulation (“GDPR”) goes into effect on May 25, 2018, and it clearly applies to U.S. companies doing business in Europe or offering goods and services online that EU residents can purchase. Given that many U.S. companies, particularly in the health care space, increasingly are establishing operations and commercial relationships outside the United States generally, and in Europe particularly, many may be asking questions akin to the following recent inquiries that I have fielded concerning the reach of the GDPR:

What does the GDPR do? The GDPR unifies European data and privacy protection laws as to companies that collect or process the personally identifiable information (“PII” or, as the GDPR calls it, “personal data”) of European residents (not just citizens).

Who must comply? The GDPR applies to any company that has personal information of EU residents or citizens or that conducts business in the EU, regardless of its home country.

What is the risk of non-compliance? The GDPR mandates documented compliance. The regulation provides for substantial fines for noncompliance of up to €20 million or 4 percent of global annual revenues, whichever is greater. Willful non-compliance is most heavily fined under this tiered system.
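
As a rough, purely illustrative calculation of the fine cap just described (the greater of €20 million or 4 percent of global annual revenues), assuming hypothetical revenue figures:

def gdpr_max_fine(global_annual_revenue_eur: float) -> float:
    """Upper bound of the GDPR's top fine tier: the greater of EUR 20 million
    or 4 percent of global annual revenue (illustration only)."""
    return max(20_000_000.0, 0.04 * global_annual_revenue_eur)

# A company with EUR 2 billion in global revenue faces a cap of EUR 80 million,
# while a company with EUR 100 million in revenue still faces the EUR 20 million cap.
print(gdpr_max_fine(2_000_000_000))  # 80000000.0
print(gdpr_max_fine(100_000_000))    # 20000000.0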

How far along are most companies as to compliance? The consulting firm Gartner estimates that more than half of the companies that are subject to the GDPR will not be in compliance throughout this year. They will be at risk.

Who will adopt the regulation? All 28 EU member states, plus Iceland, Norway, and Liechtenstein (together with the EU members, the “European Economic Area”), and likely the United Kingdom, will adopt the regulation.

Will the regulations be enforced extraterritorially? The GDPR applies worldwide as to any company that offers goods or services (even if they are free) within the EU or collects, processes, or maintains (anywhere) personal data about European residents (again, not just citizens).

How is “personal data” defined? The definition includes any information as to a human being (called a “data subject”) that can directly or indirectly identify him or her, including, but not limited to, names; birthdates; physical addresses; email and IP addresses; and health, biometric, and demographic information.

What constitutes compliance? In general terms, a subject company must limit the use of the retained personal data and maintain it securely.

  • Explicit consent is required for each processing activity as to any covered datum.
  • Access, free of charge, must be afforded to any data subject on request to a “data controller” (a person at the company charged with maintaining data), who, in turn, must assure that any “data processor” (any person or company that takes data from consumers and manipulates or uses it in some way to then pass along information to a third party) is compliant as to the requested action.
  • Data subjects have the right to be “forgotten,” i.e., to have their data expunged, and may revoke consent at will.

What does the GDPR require if there is a data breach? Data breaches that “may” pose a risk to individuals must be reported to the supervisory authority within 72 hours and to affected persons without undue delay.
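
A minimal sketch, assuming a breach has been deemed to pose a risk to individuals, of how a response team might track the 72-hour regulatory deadline described above; the function name and inputs are illustrative, not part of any compliance toolkit.

from datetime import datetime, timedelta

def regulator_notification_deadline(breach_discovered_at: datetime) -> datetime:
    """Return the illustrative 72-hour deadline for notifying the supervisory
    authority, measured from when the breach became known."""
    return breach_discovered_at + timedelta(hours=72)

discovered = datetime(2018, 5, 28, 9, 30)
print(regulator_notification_deadline(discovered))  # 2018-05-31 09:30:00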

This, of course, is only an outline of GDPR requirements and procedures. Specific advice can be provided only with knowledge of an individual company’s circumstances and needs. Note that, as in other areas such as antitrust, the assumptions prevalent within the EU are decidedly different from those in the United States. As a number of commentators have observed, while there is no defined “right of privacy” in the United States, a company is required to preserve information, including PII and personal health information, or PHI, in the event of litigation. In Europe, which has very limited litigation discovery, there is a defined right of privacy, and individuals can cause data describing them to be erased (“forgotten”).

Many of you also know that there is a case pending before the Supreme Court of the United States involving whether the U.S. government can compel Microsoft to produce PII that is collected and stored outside of the United States. An affirmative decision might create a conflict of law that would complicate the data retention abilities of American companies doing business overseas. So stay tuned.

As 2017 comes to a close, recent headlines have underscored the importance of compliance and training. In this Take 5, we review major workforce management issues in 2017, and their impact, and offer critical actions that employers should consider to minimize exposure:

  1. Addressing Workplace Sexual Harassment in the Wake of #MeToo
  2. A Busy 2017 Sets the Stage for Further Wage-Hour Developments
  3. Your “Top Ten” Cybersecurity Vulnerabilities
  4. 2017: The Year of the Comprehensive Paid Leave Laws
  5. Efforts Continue to Strengthen Equal Pay Laws in 2017

Read the full Take 5 online or download the PDF.

Employers continue to incorporate the use of biometric information for several employee management purposes, such as in time keeping and security access systems that use fingerprints, handprints, or facial scans.  Recently, Illinois state courts have seen a substantial increase in the number of privacy class action complaints under the Illinois Biometric Information Privacy Act (“BIPA”), which requires employers to provide written notice and obtain consent from employees (as well as customers) prior to collecting and storing any biometric data.  Under the BIPA, the employer must also maintain a written policy identifying the “specific purpose and length of term for which a biometric identifier or biometric information is being collected, stored, and used.”  740 ILCS 14/15(b)(2).

Although the BIPA was enacted almost 10 years ago, individuals did not start filing lawsuits until 2015.  Since September 2017, there have been over twenty-five new filings in Illinois state courts, including class actions against prominent international hotel and restaurant chains.  These lawsuits tend to target employers utilizing fingerprint recognition machines as part of their time keeping systems.  The claims generally allege that the employer failed to provide proper notice and, where the employer uses a third-party supplier for its time-tracking system, have also included allegations that the employer improperly shared the biometric information with the supplier without obtaining the proper consent.

Though there is no definitive reason for the increase in filings over the past months, the claims may be related to the increased use of biometric information in the workplace since the initial case filings in 2015.  While Texas and Washington also have laws governing employer use of biometric information, Illinois is the only state that currently provides a private right of action, including class actions.  Additionally, potential damages associated with BIPA violations, particularly for class actions, can be extensive, including liquidated damages of $1,000 per negligent violation (or the amount of actual damages, whichever is greater), liquidated damages of $5,000 per intentional or reckless violation (or the actual damages, whichever is greater) and attorney’s fees.
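
To illustrate why class exposure can be extensive, a back-of-the-envelope calculation of statutory liquidated damages using the figures above (ignoring actual damages and attorney’s fees); the class size and violation counts below are hypothetical.

def bipa_statutory_exposure(class_size: int, violations_per_person: int,
                            intentional: bool = False) -> int:
    """Rough statutory exposure: $1,000 per negligent violation or $5,000 per
    intentional/reckless violation (hypothetical inputs, illustration only)."""
    per_violation = 5_000 if intentional else 1_000
    return class_size * violations_per_person * per_violation

# A 500-employee class alleging one negligent violation each: $500,000.
print(bipa_statutory_exposure(class_size=500, violations_per_person=1))                      # 500000
# The same class alleging intentional or reckless violations: $2,500,000.
print(bipa_statutory_exposure(class_size=500, violations_per_person=1, intentional=True))    # 2500000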

What Can Employers Do?

  • Prior to collecting or storing biometric data, employers in Illinois should: (1) create a written policy regarding the retention and destruction of biometric data; (2) obtain written acknowledgment and release from the employees; and (3) store the biometric information securely, similar to other confidential information, such as personal health information or personally identifiable information.
  • Employers who use a third party to assist with the collection or storage of biometric data should include the third party in the acknowledgement and release, which employees execute.
  • Employers also should be aware that most states, including Illinois, have legislation governing how employers respond to data breaches and the required notifications to employees. If a data breach occurs, employers are advised to immediately contact counsel to devise and implement a response plan.
  • In the event of litigation, employers should seek to remove BIPA cases to federal court when possible, particularly where the allegations focus on notice and consent issues, as employers can argue that plaintiffs cannot establish the harm necessary for standing under Spokeo, Inc. v. Robins, 136 S. Ct. 1540 (2016) (requiring more than a “bare procedural violation” to establish harm). Because employees likely will have difficulty establishing actual harm where the biometric data was stored in a confidential and secure manner, employers may be successful in getting such claims dismissed.

As the laws regulating biometric data continue to evolve, employers should monitor this issue closely and consult with counsel as further developments occur to ensure compliance with any relevant regulations.

It is highly likely that the National Association of Insurance Commissioners (“NAIC”) will adopt a model insurance data security law premised largely on the New York State Department of Financial Services (“NYSDFS”) cyber security regulations.  Recently, we discussed the NYSDFS’ proposed extension of its cyber security regulations to credit reporting agencies in the wake of the Equifax breach.  New York Governor Andrew Cuomo has announced, “The Equifax breach was a wakeup call and with this action New York is raising the bar for consumer protections that we hope will be replicated across the nation.”  Upon adoption by the NAIC, the NYSDFS regulations, which require New York financial organizations to have in place a written and implemented cyber security program, will gain further traction toward setting a nationwide standard for cyber security and breach notification.  Indeed, although there are differences, the NAIC drafters emphasized that any Licensee in compliance with the NYSDFS “Cybersecurity Requirements for Financial Services Companies” will also be in compliance with the model law.

The NAIC Working Committee expressed a preference for a uniform nationwide standard: “This new model, the Insurance Data Security Model Law, will establish standards for data security and investigation and notification of a breach of data security that will apply to insurance companies, producers and other persons licensed or required to be licensed under state law. This model, specific to the insurance industry, is intended to supersede state and federal laws of general applicability that address data security and data breach notification. Regulated entities need clarity on what they are expected to do to protect sensitive data and what is expected if there is a data breach.  This can be accomplished by establishing a national standard and uniform application across the nation.”  Other than small licensees, the only exemption is for Licensees certifying that they have in place an information security program that meets the requirements of the Health Insurance Portability and Accountability Act.  According to the Committee, following adoption, it is likely that state legislatures throughout the nation will move to adopt the model law.

The model law is intended to protect against both data loss negatively impacting individual insureds, policy holders, and other consumers, and loss that would cause a material adverse impact to the business, operations, or security of the Licensee (e.g., trade secrets).  Each Licensee is required to develop, implement, and maintain a comprehensive written information security program based on a risk assessment and containing administrative, technical, and physical safeguards for the protection of non-public information and the Licensee’s information system.  The formalized risk assessment must identify both internal threats from employees and other trusted insiders, as well as external hacking threats.  Significantly, the model law recognizes the increasing trend toward cloud-based services by requiring that the program address the security of non-public information held by the Licensee’s third-party service providers.  The model law permits a scalable approach that may include best practices such as access controls, encryption, multi-factor authentication, monitoring, penetration testing, employee training, and audit trails.
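
Purely as a sketch of how a Licensee might inventory the scalable safeguards mentioned in the paragraph above, the hypothetical checklist below maps each safeguard category to an implementation status. The categories mirror those listed above; the structure and status values are invented for illustration and are not prescribed by the model law.

# Hypothetical inventory of the safeguard categories referenced in the model law.
# Status values ("implemented", "planned", "not_started") are illustrative.
security_program_checklist = {
    "risk_assessment": "implemented",
    "access_controls": "implemented",
    "encryption": "planned",
    "multi_factor_authentication": "planned",
    "monitoring": "implemented",
    "penetration_testing": "not_started",
    "employee_training": "implemented",
    "audit_trails": "planned",
    "third_party_service_provider_oversight": "not_started",
}

# Flag any safeguard not yet implemented for follow-up in the written program.
gaps = [name for name, status in security_program_checklist.items() if status != "implemented"]
print(gaps)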

In the event of unauthorized access to, disruption of, or misuse of the Licensee’s electronic information system or non-public information stored on such system, notice must be provided to the Licensee’s home State within 72 hours.  Other impacted States must be notified where the non-public information involves at least 250 consumers and there is a reasonable likelihood of material harm.  The notice must specifically and transparently describe, among other items, the event date, the information breached, how the event was discovered, the period during which the information system was compromised, and remediation efforts.  Applicable data breach notification laws requiring notice to affected individuals must also be followed.
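
A minimal sketch of the notification triggers just described (home-state notice within 72 hours, and notice to other impacted States where at least 250 consumers are involved and material harm is reasonably likely). The function names and inputs are hypothetical and greatly simplified; they do not capture every condition in the model law.

from datetime import datetime, timedelta

def home_state_deadline(event_discovered_at: datetime) -> datetime:
    """Illustrative 72-hour deadline for notifying the Licensee's home State."""
    return event_discovered_at + timedelta(hours=72)

def must_notify_other_state(consumers_affected_in_state: int,
                            material_harm_reasonably_likely: bool) -> bool:
    """Illustrative trigger for notifying another impacted State under the model law."""
    return consumers_affected_in_state >= 250 and material_harm_reasonably_likely

discovered = datetime(2017, 11, 6, 14, 0)
print(home_state_deadline(discovered))        # 2017-11-09 14:00:00
print(must_notify_other_state(300, True))     # True
print(must_notify_other_state(300, False))    # False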