This week, a one-year “revival” window opened under New York’s Child Victims Act, allowing individuals asserting civil claims of child sexual abuse to file suit against institutions and individuals, even if those claims had already expired or had been dismissed as untimely. The premise behind the Child Victims Act is that children are often prevented from disclosing abuse due to the social, psychological, and emotional trauma they experience.

The Child Victims Act also extends the statute of limitations for bringing criminal charges against alleged perpetrators of child sexual abuse and permits alleged victims of these crimes to file civil lawsuits until they reach age 55. This aspect of the legislation will have a significant impact on the volume of criminal cases, and even more so on civil lawsuits, 385 of which were filed in the first hours of the revival period, with hundreds more expected in the upcoming weeks and months. Indeed, the New York State court system has set aside 45 judges specifically to handle the expected crush of cases.

Institutional Changes Following the New Child Victims Act

Religious, educational and other institutions that are committed to providing a safe environment for children should be thinking about how they can implement safeguards against child abuse within their institutions. An important step is keeping internal lines of communication with staff and families open, as well as educating staff and leadership as to their reporting obligations under New York law and on how to provide appropriate support if child abuse is suspected.

The Child Victims Act joins the sexual harassment bill also signed into law by Governor Cuomo as part of a series of significant changes by New York legislators addressing sexual abuse and harassment in New York State.

Our Employee Benefits and Executive Compensation practice now offers on-demand “crash courses” on diverse topics. You can access these courses on your own schedule. Keep up to date with the latest trends in benefits and compensation, or obtain an overview of an important topic addressing your programs.

In each compact, 15-minute installment, a member of our team will guide you through a topic. This on-demand series should be of interest to all employers that sponsor benefits and compensation programs.

In our newest installment, Tzvia Feiertag, a Member of the Firm in the Employee Benefits and Executive Compensation practice in the Newark office, presents “HIPAA Privacy and Security Rule Compliance.”

While employers themselves are not directly regulated by the Privacy and Security Rules of the Health Insurance Portability and Accountability Act (“HIPAA”), most employers that sponsor group health plans have ongoing compliance obligations. This crash course offers a brief overview of who and what is covered by these rules, why employers should care about HIPAA compliance, and five tips to maintain compliance.

Click here to request complimentary access to the webinar recording and presentation slides.

This Employment Law This Week® Monthly Rundown discusses the most important developments for employers in August 2019.

This episode includes:

  • Increased Employee Protections for Cannabis Users
  • First Opinion Letters Released Under New Wage and Hour Leadership
  • New Jersey and Illinois Enact Salary History Inquiry Bans
  • Deadline for New York State Anti-Harassment Training Approaches
  • Tip of the Week

See below to watch the full episode – click here for story details and video.

We invite you to view Employment Law This Week® – tracking the latest developments that could impact you and your workforce. The series features three components: Trending News, Deep Dives, and Monthly Rundowns. Follow us on LinkedIn, Facebook, YouTube, Instagram, and Twitter and subscribe for email notifications.

New York is the latest state to adopt a law requiring businesses that collect private information on New York residents to implement reasonable cybersecurity safeguards to protect that information. New York now joins California, Massachusetts, and Colorado in setting these standards. New York’s law mandates the implementation of a data security program, including measures such as risk assessments, workforce training, and incident response planning and testing. Businesses should immediately begin working toward compliance with the Act’s requirements, which take effect March 21, 2020. Notably, New York’s law covers all employers, individuals, and organizations, regardless of size or location, that collect private information on New York State residents.

The “Stop Hacks and Improve Electronic Data Security Act” (SHIELD Act), signed into law on July 25, 2019, requires implementation of an information security program to protect “private information,” defined as:

  • any individually identifiable information such as name, number or other identifier coupled with social security number, driver’s or non-driver identification card number or account number, credit or debit card number in combination with any security code, access code, password or other information that would permit access to the individual’s financial account, or biometric information (such as fingerprint, voice print, retina or iris image);
  • individually identifiable information coupled with an account number, credit or debit card number if circumstances exist wherein such number could be used to access an individual’s financial account even without additional identifying information, or a security code, access code or password; or
  • a username or email address in combination with a password or security question and answer that would permit access to an online account.
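As a rough illustration of how these three statutory categories combine, one might sketch a classification helper like the following. This is an illustrative sketch only: the field names are hypothetical, and the statutory text, not this logic, determines what qualifies as “private information.”

```python
# Illustrative sketch only. Hypothetical field names; the SHIELD Act's
# statutory definition controls, not this simplified logic.
SENSITIVE_IDENTIFIERS = ("ssn", "drivers_license_number", "biometric_data")
ONLINE_CREDENTIALS = ("password", "security_question_answer")

def is_private_information(record: dict) -> bool:
    has_identity = bool(record.get("name") or record.get("other_identifier"))
    # Category 1: identifying info coupled with an SSN, license number, or biometric data
    if has_identity and any(record.get(k) for k in SENSITIVE_IDENTIFIERS):
        return True
    # Category 2: identifying info plus an account/card number usable without a security code
    if has_identity and record.get("account_number") and record.get("usable_without_security_code"):
        return True
    # Category 3: a username or email address paired with a credential
    if record.get("username_or_email") and any(record.get(k) for k in ONLINE_CREDENTIALS):
        return True
    return False
```

Note that a bare name with no coupled identifier falls outside all three categories, which is why data-mapping exercises typically focus on which fields are stored together.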

The law broadly requires that “any person or business” that owns or licenses computerized data which includes private information of a New York State resident “shall develop, implement and maintain reasonable safeguards to protect the security, confidentiality and integrity of the private information.”

In order to achieve compliance, an organization must implement a data security program that includes:

  • reasonable administrative safeguards that may include designation of one or more employees to coordinate the security program, identification of reasonably foreseeable external and insider risks, assessment of existing safeguards, workforce cybersecurity training, and selection of service providers capable of maintaining appropriate safeguards and requiring those safeguards by contract;
  • reasonable technical safeguards that may include risk assessments of network, software design and information processing, transmission and storage, implementation of measures to detect, prevent and respond to system failures, and regular testing and monitoring of the effectiveness of key controls; and
  • reasonable physical safeguards that may include detection, prevention and response to intrusions, and protections against unauthorized access to or use of private information during or after collection, transportation and destruction or disposal of the information.

Small businesses, defined as those with fewer than 50 employees, less than $3 million in gross revenues in each of the last three fiscal years, or less than $5 million in year-end total assets, may scale their data security programs according to their size and complexity, the nature and scope of their business activities, and the nature and sensitivity of the information collected.
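Because the three thresholds are disjunctive, satisfying any one of them qualifies a business for scaled obligations. A minimal sketch, using hypothetical parameter names (the statutory text controls):

```python
# Illustrative sketch of the SHIELD Act's disjunctive small-business tests.
# Parameter names are hypothetical; consult the statute for the controlling text.
def qualifies_as_small_business(employees: int,
                                gross_revenues_last_3_years: list,
                                year_end_total_assets: float) -> bool:
    """A business qualifies if ANY one of the three tests is met."""
    return (
        employees < 50
        or all(rev < 3_000_000 for rev in gross_revenues_last_3_years)
        or year_end_total_assets < 5_000_000
    )

# A 120-person company with revenues above $3M still qualifies on assets alone
print(qualifies_as_small_business(120, [4e6, 5e6, 6e6], 4_000_000))  # True
```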

Organizations that are covered by and in compliance with the Gramm-Leach-Bliley Act, the Health Insurance Portability and Accountability Act (HIPAA), and/or the New York State Department of Financial Services cybersecurity regulations shall be deemed in compliance with the SHIELD Act.

Failure to implement a compliant information security program is enforced by the New York State Attorney General and may result in injunctive relief and civil penalties of up to $5,000 against an organization and individual employees for “each violation.” Depending on how the Attorney General applies this provision, entities and employees that fail to take the required protective measures could face significant monetary penalties, including when those failures lead to a data breach. Vigorous enforcement can be expected: the Attorney General submitted the SHIELD Act as an agency-sponsored bill to keep pace with the use and dissemination of private information, and, absent future clarification, may seek civil penalties to enforce reasonable cybersecurity safeguards even in the absence of a data breach. Of course, any enforcement activity by the Attorney General’s office will also carry other damaging consequences, such as reputational harm and supply chain issues with business partners.

We have long counseled employers using or contemplating using artificial intelligence (“AI”) algorithms in their employee selection processes to validate the AI-based selection procedure using an appropriate validation strategy approved by the Uniform Guidelines on Employee Selection Procedures (“Uniform Guidelines”).  Our advice has been primarily based on minimizing legal risk and complying with best practices.  A recently updated Frequently Asked Questions (“FAQ”) from the Office of Federal Contract Compliance Programs (“OFCCP”) provides further support for validating AI-based selection procedures in compliance with the Uniform Guidelines.

On July 23, 2019, the OFCCP updated the FAQ section on its website to provide guidance on the validation of employee selection procedures.  Under the Uniform Guidelines, any selection procedure resulting in a “selection rate for any race, sex, or ethnic group which is less than four-fifths (4/5) (or eighty percent) of the rate for the group with the highest rate will generally be regarded by Federal enforcement agencies as evidence of adverse impact,” which in turn requires the validation of the selection procedure.  These validation requirements are equally applicable to any AI-based selection procedure used to make any employment decision, including hiring, termination, promotion, and demotion.

As stated in the Uniform Guidelines, and emphasized in the FAQ, the OFCCP recognizes three methods of validation:

  1. Content validation – a showing that the content of the selection procedure is representative of important aspects of performance on the job in question;
  2. Criterion-related validation – production of empirical data demonstrating that the selection procedure is predictive or significantly correlated with important aspects of job performance; and
  3. Construct validation – a showing that the procedure measures the degree to which candidates possess identifiable characteristics that have been determined to be important in successful performance on the job.

With the exception of criterion-related validation studies, which can be “transported” from other entities under certain circumstances, the Uniform Guidelines require local validation at the employer’s own facilities.

If a selection procedure adversely impacts a protected group, the employer must provide evidence of validity for the selection procedure(s) that caused the adverse impact. Thus, it is crucial that employers considering the implementation of AI-based algorithms in the selection process both conduct adverse impact studies and be prepared to produce one or more validation studies.

The new FAQ also provides important guidelines on the statistical methods utilized by OFCCP in evaluating potential adverse impact.  In accordance with the Uniform Guidelines, OFCCP will analyze the Impact Ratio – the disfavored group’s selection rate divided by the favored group’s selection rate.  Any Impact Ratio of less than 0.80 (referred to as the “Four-Fifths Rule”) constitutes an initial indication of adverse impact, but OFCCP will not pursue enforcement without evidence of statistical and practical significance.  For statistical significance, the OFCCP’s standard statistical tests are the Fisher’s Exact Test (for groups with fewer than 30 subjects) and the Two Independent-Sample Binomial Z-Test (for groups with 30 or more subjects).
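For concreteness, the Impact Ratio and the Two Independent-Sample Binomial Z-Test described above can be sketched in a few lines of Python. This is an illustrative computation only, not OFCCP’s tooling; the Fisher’s Exact Test used for groups under 30 is omitted for brevity.

```python
import math

def adverse_impact(selected_a, total_a, selected_b, total_b):
    """Compute the Impact Ratio and the two independent-sample binomial
    Z-statistic for group A (disfavored) vs. group B (favored)."""
    rate_a = selected_a / total_a   # disfavored group's selection rate
    rate_b = selected_b / total_b   # favored group's selection rate
    impact_ratio = rate_a / rate_b
    # Pooled selection rate and standard error for the Z-test
    pooled = (selected_a + selected_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (rate_a - rate_b) / se
    return impact_ratio, z

# Hypothetical example: 40 of 100 women selected vs. 60 of 100 men
ir, z = adverse_impact(40, 100, 60, 100)
print(f"Impact Ratio = {ir:.2f}")  # 0.67: below 0.80, an initial indication of adverse impact
print(f"Z = {z:.2f}")              # -2.83: |Z| >= 1.96 is significant at the 5% level
```

In this hypothetical, the Impact Ratio fails the Four-Fifths Rule and the result is statistically significant, so the employer would need to be prepared to produce a validation study.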

With the publication of this new FAQ, employers – and particularly federal contractors – should be sure to evaluate their use of AI-based algorithms and properly validate all selection procedures under the Uniform Guidelines.  Moreover, although not addressed in the OFCCP’s new FAQ, employers should also ensure that their AI-based algorithms are compliant with all other state and federal laws and regulations.  Additional recommendations from Epstein Becker & Green’s Artificial Intelligence strategic industry group can be found here and here.

Our colleagues Maxine Neuhauser, Nathaniel M. Glasser, Denise Dadika, and Anastasia A. Regne

Following is an excerpt:

In Wild, which we discussed in a recent client alert, plaintiff Justin Wild (“Wild”) alleged that his employer, Carriage Funeral Holdings (“Carriage Funeral”), failed to reasonably accommodate his disability (cancer) and unlawfully discharged him in violation of the LAD because he used medical marijuana, as legally permitted by CUMMA. Carriage Funeral terminated Wild’s employment after he tested positive for cannabis following an on-duty motor vehicle accident.

The trial court dismissed the lawsuit, holding that Wild’s positive cannabis test constituted a legitimate business reason for his discharge because cannabis use (medical or otherwise) remains prohibited under federal law. In rendering its decision, the trial court relied on a provision stating that CUMMA did not require employers to reasonably accommodate licensed use of medical marijuana in the workplace. The Appellate Division reversed, holding that the fact that CUMMA did not “require” employers to accommodate an employee’s use of medical marijuana in the workplace did not affect an employer’s obligation under the LAD to reasonably accommodate an employee’s disability, which could include an employee’s off-duty and off-site use of medical cannabis. …

Read the full article here.

The recently proposed amendment to the California Consumer Privacy Act (CCPA) should be a wake-up call to those employers who are not already actively planning for the January 1, 2020 compliance deadline.

The amendment reaffirms that employers must (i) provide employees with notice of the categories of personal information collected and the purposes for which the information shall be used at or before collection; and (ii) implement reasonable cybersecurity safeguards to protect certain employee personal information or risk employee lawsuits, including class actions seeking statutory damages, for data breach under a private right of action provision. Employers cannot collect additional employee information or use collected information for different purposes than originally noticed without giving supplemental notice.

Although the amendment would grant a one-year moratorium before certain rights of employees contained in the original legislation are effective (e.g., right by employees to receive a copy of the personal information collected and to deletion in certain circumstances), the private right of action to recover minimum statutory damages or actual damages for unauthorized access and exfiltration due to a failure to maintain reasonable cybersecurity safeguards, and notice of collection requirements, were retained in the employment context.

In June 2018, California enacted the CCPA to protect California residents’ personal privacy from organizations that are in the business of buying and selling personal information or might otherwise collect personal information in their business activities.  For an in-depth analysis of the Act’s provisions, see here. The Act becomes effective on January 1, 2020, so businesses still have time to become compliant. EBG has prepared a compliance flow chart highlighting key thresholds and requirements, see here.

After the Act’s passage, the business community raised objections to certain of the Act’s requirements. Of particular concern was that the Act covered personal information collected in the course of the employment relationship. Employers pushed for relief from the CCPA’s requirements as proposed in the original bill.

Recently, there has been a legislative effort to address these concerns from employers, with a proposed amendment providing that employee personal information collected “solely” for employment purposes is exempt from certain of the Act’s requirements until January 1, 2021.  See 7/8/2019 Senate Judiciary Committee and 4/19/2019 Assembly Committee on Privacy and Consumer Protection Reports. In other words, should this amendment pass, the rights by employees to deletion of and to receive copies of their personal information (see 1798.100(c); 1798.105)) and requirements of the Act other than 1798.100(b) (notice of collection) and 1798.150 (private right of action for data breach) would not apply to solely employment-related data for one additional year.

The legislators, however, retained intact the provision providing employees with a private right of action for data breach, while also emphasizing that the cybersecurity protections apply to the collection of certain employee personal information as defined in Section 1798.81.5 (e.g., social security number, medical information, health insurance information, username and password). Although the exemption from certain of the Act’s requirements is garnering attention, more notable are the reaffirmation of the employer’s “duty to implement reasonable security practices and procedures” and the private right of action with minimum statutory penalties “per consumer per incident” (even in the absence of actual damage) when that failure leads to a data breach. Employers should immediately conduct a risk assessment of their collection and use of employee personal information and implement reasonable cybersecurity safeguards. Employers should also prepare to provide employees with the notices of collection practices required by January 1, 2020, and develop written policies and procedures concerning the collection and use of employee personal information.

Sanchita Bose, a 2019 Summer Associate (not admitted to the practice of law) in the firm’s Washington, DC office, contributed significantly to the preparation of this post.

Our colleague Amanda M. Gomez 

Following is an excerpt:

Additionally, employers that can demonstrate a good faith effort through proactive measures to comply with the Act may be able to mitigate liability should a claim arise. Similar to “safe harbor” provisions in equal pay laws in Massachusetts and Oregon, such proactive measures should include regular audits of compensation practices. While these measures do not create a complete defense, employers that successfully present evidence of a “thorough and comprehensive pay audit” with the “specific goal of identifying and remedying unlawful pay disparities” may avoid liquidated damages. The key word here is “remedying”; employers that conduct pay audits, but then fail to take steps to correct unlawful pay discrepancies revealed by the audit, will not reap the benefits of the “safe harbor” defense and could instead find themselves without the proverbial port in a storm.

Notably, the Act goes further than most other comparable state wage discrimination laws by mandating notification to employees of employment opportunities. Employers must make reasonable efforts to provide notice of internal opportunities for promotion on the same calendar day the opening occurs. These announcements must disclose the hourly or salary compensation, or at the very least a pay range, as well as a description of benefits and other compensation being offered. Failure to comply with these provisions could result in fines of between $500 and $10,000 per violation. …

Read the full post here.

Increasingly companies are using third-party digital hiring platforms to recruit and select job applicants.  These products, explicitly or implicitly, promise to reduce or eliminate the bias of hiring managers in making selection decisions.  Instead, the platforms grade applicants based on a variety of purportedly objective factors.  For example, a platform may scan thousands of resumes and select applicants based on education level, work experience, or interests, or rank applicants based on their performance on an aptitude test – whatever data point(s) the platform has been trained to evaluate based on the job opening.

Video interviews constitute one type of product offered by certain digital hiring platforms.  Video interviews may be offered in a variety of forms – from live interviews conducted by a hiring manager but simultaneously recorded for future audiences, to recorded interviews conducted by the computer program, giving applicants a limited time (e.g., 30 seconds) to record an answer to each question.  In any recorded form, these digital hiring platforms use artificial intelligence (“AI”) to analyze an applicant’s answers.  AI may be used to analyze facial expressions or eye contact, or even the speed of an individual’s response, in order to evaluate the quality of an applicant’s answers.

Such products raise a host of legal issues, including questions about hidden biases, disparate impact, disability accommodation, and data privacy.

One state has taken an initial step to put applicants on notice of the use of these products. Both chambers of the Illinois General Assembly recently passed the Artificial Intelligence Video Interview Act, a bill that creates disclosure requirements for companies that utilize video interview technology relying on AI.  Specifically, the bill, which is expected to be signed into law by Governor J.B. Pritzker but has enough votes in the legislature to survive a veto, requires an employer seeking to use AI-enabled video interviewing technology to do the following before hiring for an Illinois-based position:

  1. Notify each applicant before the interview that AI may be used to analyze the applicant’s video interview and consider the applicant’s fitness for the position;
  2. Provide each applicant with information before the interview explaining how the AI works and what general types of characteristics it uses to evaluate applicants; and
  3. Obtain prior consent from the applicant to be evaluated by the AI program.

The bill also requires employers to take steps to protect applicants’ privacy.  Under the bill, video interview recordings can only be shared “with persons whose expertise or technology is necessary in order to evaluate an applicant’s fitness for a position.”  In addition, upon request from the applicant, employers must destroy all copies of the videos (including backups) no later than 30 days after the applicant requests the company do so.  This destruction requirement may be burdensome for employers, who should work with the vendor to ensure proper storage and timely destruction of any such videos.  Employers should also be prepared for conflicts between this provision and legal requirements to maintain copies of relevant information if litigation relating to such information is reasonably anticipated.
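The 30-day destruction window lends itself to simple automated tracking when coordinating with a vendor. A minimal sketch, with hypothetical function and field names, for computing and flagging the deadline:

```python
from datetime import date, timedelta

# Per the bill: all copies (including backups) must be destroyed no later than
# 30 days after the applicant's request. Names here are hypothetical.
DESTRUCTION_WINDOW = timedelta(days=30)

def destruction_deadline(request_date: date) -> date:
    """Latest date by which all copies of the video must be destroyed."""
    return request_date + DESTRUCTION_WINDOW

def is_overdue(request_date: date, today: date) -> bool:
    """Flag a stored video whose destruction deadline has passed."""
    return today > destruction_deadline(request_date)

print(destruction_deadline(date(2019, 9, 1)))  # 2019-10-01
```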

Illinois has a history of passing expansive laws protecting employees’ privacy, such as its 2008 Biometric Information Privacy Act (“BIPA”).  BIPA was one of the first acts to require notification and consent in collecting employee biometric data, and now the Artificial Intelligence Video Interview Act appears to be a first-of-its-kind law in the nation with similar notification and consent procedures.  While BIPA was an often-ignored statute for almost a decade, there has recently been a slew of litigation involving the statute.  The Artificial Intelligence Video Interview Act could result in a similar wave of lawsuits, if the Act allows for a private right of action (which is not clear, as currently drafted).

Assuming Governor Pritzker signs the bill into law as written, there are many questions left unanswered.  For instance, the bill does not define what AI means.  It also does not provide guidance on the specific information an employer must provide to a candidate to satisfy its obligation to describe “how” the AI works.  The 30-day deletion requirement is similarly vague and may conflict with other legal, statutory, and/or regulatory obligations.  Nevertheless, it is likely that Illinois’ Artificial Intelligence Video Interview Act will not be an outlier.  Other jurisdictions may quickly follow suit.  Accordingly, employers using AI technology for video interviewing should, at a minimum, start considering how to provide notice and obtain consent from applicants before conducting interviews.  Notably, however, compliance with this Illinois law will not absolve an employer from liability for a product that exhibits other legal deficiencies.  Employers are advised to consult with counsel before implementing any type of digital hiring platform.

Our colleagues 

In Diaz, the plaintiff, who asserted she is visually impaired, alleged that the defendant – a supermarket chain based in Ohio – failed to make its website accessible to individuals who were blind. As a result, plaintiff claimed that she was unable to learn about certain products on the site, as well as promotions and coupons.

Defendant sought to dismiss the lawsuit on two grounds: (i) lack of subject matter jurisdiction, because its remediation of the barriers identified in the complaint rendered plaintiff’s claims moot; and (ii) lack of personal jurisdiction, because the Ohio-based defendant does not transact business in New York State, and accordingly, New York’s long-arm statute does not subject it to the court’s review.

Read the full post here.