Employment

EEOC Releases Technical Document on AI and Title VII

On Thursday, May 18, 2023, the Equal Employment Opportunity Commission (“EEOC”) released a new technical assistance document titled “Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964.” The document was released as part of the EEOC’s Artificial Intelligence and Algorithmic Fairness Initiative and outlines considerations for incorporating automated systems into employment decisions. EEOC Chair Charlotte A. Burrows called the technical assistance document “another step in helping employers and vendors understand how civil rights laws apply to automated systems used in employment.”

The document addresses how to assess whether an employer’s selection procedures (the procedures it uses to make employment decisions such as hiring, promotion, and firing) “have a disproportionately large negative effect on a basis prohibited by Title VII.” In essence, the question is whether the employment practice has a disparate impact on the basis of race, color, religion, sex, or national origin, the characteristics protected by Title VII.

The document encourages application of the Uniform Guidelines on Employee Selection Procedures (the “Guidelines”) to determine whether tests and selection procedures, including those that rely on an algorithmic decision-making tool, are lawful under Title VII’s disparate impact analysis.

The Questions and Answers section of the document provides useful insight into the agency’s position. The EEOC takes the position that use of an algorithmic decision-making tool that has an adverse impact will violate Title VII unless the employer can show the use is “job related and consistent with business necessity.” The EEOC states that an employer can be held responsible under Title VII for selection procedures that use an algorithmic decision-making tool if the procedure discriminates on a basis prohibited by Title VII, even if the tool is designed or administered by another entity, such as a software vendor. The EEOC also affirms that if an outside entity has the employer’s authority to administer or use an algorithmic decision-making tool on its behalf, the employer may be held responsible for that entity’s actions.

Employers are therefore encouraged to ask vendors, before using an algorithmic decision-making tool, whether they have evaluated whether use of the tool causes a substantially lower selection rate for individuals with any characteristic protected by Title VII. Even with the vendor’s assurance, however, the employer can still be held liable for disparate impact or disparate treatment discrimination if the vendor’s assessment is incorrect. Further, the document explains that an employer developing a selection tool can take steps to reduce the impact or select a different tool in order to avoid violating Title VII, and an employer may be liable if it fails to adopt a less discriminatory algorithm that was considered during the development process.
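
For readers who want to see the selection-rate comparison in concrete terms, the short Python sketch below illustrates the kind of calculation the Guidelines describe, in which each group’s selection rate is compared against the highest group’s rate (the “four-fifths” rule of thumb). The applicant counts, group labels, and the 0.8 threshold are assumptions chosen for this example rather than figures taken from the EEOC document, and a ratio below four-fifths is only a screening heuristic, not a legal conclusion.

# Illustrative sketch only: a selection-rate comparison in the spirit of the
# Uniform Guidelines' four-fifths rule of thumb. The counts and the 0.8
# threshold are assumptions for this example, not figures from the EEOC document.

def selection_rate(selected: int, applicants: int) -> float:
    """Share of applicants in a group who were selected."""
    return selected / applicants

def impact_ratio(group_rate: float, highest_rate: float) -> float:
    """Ratio of a group's selection rate to the highest group's rate."""
    return group_rate / highest_rate

if __name__ == "__main__":
    # Hypothetical applicant pools evaluated by an algorithmic tool.
    groups = {
        "Group A": {"applicants": 200, "selected": 120},
        "Group B": {"applicants": 150, "selected": 54},
    }

    rates = {name: selection_rate(g["selected"], g["applicants"]) for name, g in groups.items()}
    highest = max(rates.values())

    for name, rate in rates.items():
        ratio = impact_ratio(rate, highest)
        note = "flag for adverse-impact review" if ratio < 0.8 else "at or above the 4/5 threshold"
        print(f"{name}: selection rate {rate:.2f}, impact ratio {ratio:.2f} ({note})")

A ratio below the threshold would prompt the kind of follow-up the EEOC describes: showing that the tool is job related and consistent with business necessity, or considering a less discriminatory alternative.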

Notably, the EEOC does not address disparate treatment under Title VII or other stages of the disparate impact analysis, such as whether an algorithmic decision-making tool can measure job-related traits or characteristics.

Editorial Staff
