Bloomberg Law
June 17, 2024, 8:30 AM UTC

Eliminating Law School Admissions Tests Won’t Help DEI Challenges

David Klieger
JD-Next

The US Supreme Court’s decision a year ago banning affirmative action in higher education admissions fed fears, including among key players in the legal education pipeline, of plummeting racial and ethnic minority representation in law schools.

To maintain diversity and validity—the accuracy in identifying which applicants will be successful if admitted—law schools should incorporate more innovative approaches into their admissions processes. Schools should employ new types of assessments and ways of combining admissions information to minimize, if not eliminate, the racial and ethnic score gaps that have long plagued law school admissions.

Diversity Challenges

Law schools already were navigating a complex affirmative action environment with judicial decisions such as Bakke, Hopwood, Schuette, and Fisher. But until the Supreme Court ruling, there hadn’t been a national ban.

Based on data from 1980 to 2021, previous state-level prohibitions reduced law student racial diversity by 20%, with the losses falling exclusively on Black and Hispanic students, a portent of a large national decline.

Decreases in validity increase the risk of misfit between applicants and educational opportunities, harming applicants, law schools, the legal profession, and society.

While test scores historically have been the most accurate predictors of early law school grades, with undergraduate GPA second, there have been persistently large standardized test score and UGPA gaps between White candidates and those from underrepresented groups.

Given the emphasis on standardized test scores and UGPA, research reveals far lower admission rates for students from marginalized groups due to the size of these gaps, especially at elite law schools with the highest score requirements.

Diversity and Validity

Admissions tests predict who will succeed academically in law school significantly better than any other available measure. A lack of valid and reliable testing will make it difficult to maintain equity and diversity in a post-affirmative action world.

While schools can achieve greater diversity by introducing randomness or low-reliability assessments such as personality tests into admissions, these methods reduce accuracy. Several evaluation methods in holistic admissions look reliable and valid on the surface, but they’re as dependable as a roulette wheel.

Considering applicants’ undergraduate institution also worsens the negative impacts of undermatching, in which high-achieving students with lower incomes aren’t matched with competitive colleges, because highly qualified Black and Hispanic applicants attend elite colleges less often than White and Asian students do.

Relying on personal statements, recommendation letters, and interviews in admissions invites bias based on race/ethnicity and socioeconomic status. And these measures generally aren’t as accurate as test scores and UGPA are for predicting academic success.

Schools can improve their approach to personal statements, recommendation letters, and interviews, but real improvements haven’t been deployed in admissions on a large enough scale. The legality, efficacy, and cost-effectiveness of using proxies such as socioeconomic status and geography to increase diverse representation are uncertain at best.

Enhancing Representation

Schools must add valid, diversity-enhancing information into the admissions process. Beyond tools such as standardized tests, assessments that measure domain-specific knowledge and skills hold promise, as do multidimensional forced-choice measures that assess determinants of success such as grit and detail orientation. Unlike most other personality-based assessments, multidimensional forced-choice measures substantially limit candidates’ ability to successfully fake their responses.

There also has been encouraging research on how to better combine admissions data to achieve diversity goals while maintaining validity, using methods such as constrained optimization, Pareto optimization, and policy capturing.
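To illustrate what Pareto optimization means in this setting, consider the following minimal Python sketch. It uses entirely synthetic data and hypothetical predictors (a test score and UGPA), sweeps the weights of a two-predictor composite, and keeps only the weightings for which no alternative offers both higher validity and a smaller group score gap. It is a conceptual illustration of the technique, not an implementation of any actual admissions model or published study.

import numpy as np

# Illustrative sketch with synthetic data: trade off validity (correlation of a
# weighted composite with a performance outcome) against the standardized score
# gap between two hypothetical applicant groups. All names and numbers here are
# assumptions for demonstration only.

rng = np.random.default_rng(0)
n = 5000

group = rng.integers(0, 2, n)                 # 0 = reference group, 1 = focal group
test = rng.normal(-0.6 * group, 1.0, n)       # predictor with a larger group gap
ugpa = rng.normal(-0.2 * group, 1.0, n)       # predictor with a smaller group gap
outcome = 0.5 * test + 0.3 * ugpa + rng.normal(0, 1.0, n)  # performance proxy

def evaluate(w):
    """Return (validity, gap) for the composite w*test + (1-w)*ugpa."""
    composite = w * test + (1 - w) * ugpa
    validity = np.corrcoef(composite, outcome)[0, 1]
    gap = (composite[group == 0].mean() - composite[group == 1].mean()) / composite.std()
    return validity, gap

candidates = [(w, *evaluate(w)) for w in np.linspace(0, 1, 21)]

# A weighting is Pareto-efficient if no other weighting achieves both higher
# validity and a smaller group gap.
pareto = [
    (w, v, g) for w, v, g in candidates
    if not any(v2 > v and g2 < g for _, v2, g2 in candidates)
]

for w, v, g in pareto:
    print(f"test weight {w:.2f}: validity {v:.3f}, standardized gap {g:.3f}")

In practice, researchers can apply the same logic to many more predictors and to diversity metrics defined by an institution’s own goals; the output is a menu of defensible weightings rather than a single “best” formula.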

Law schools view admission of candidates with lower test scores and UGPAs as unduly risky—which is understandable, given its potential negative impact on attrition, bar exam passage rates, and US News rankings.

Law schools can choose to address the affirmative action ban by giving additional academic support to admitted candidates whose lower test scores and UGPAs indicate that they are at risk of attrition, bar exam failure, and/or low grades—and by expanding the metric of law student success beyond grades.

Experts in defining and measuring the performance of law students and attorneys have observed that the lack of racial and ethnic diversity in law school stems in part from a failure to admit applicants based on a broader definition of the knowledge and skills needed to be a successful lawyer. They have identified 26 factors critical to lawyer effectiveness, many of which are unrelated to what standardized tests assess.

These approaches require reliable and valid diagnostic tools—such as tests. Predicting grades will likely remain central to admissions because of the need to assess legal reasoning skills and knowledge acquisition efficiently and cost-effectively, and because many legal employers place significant emphasis on grades in making hiring decisions.

What Comes Next

Eliminating admissions tests altogether doesn’t solve the diversity challenges law schools face with standardized testing and UGPA. Instead, the answer lies in innovation.

If we fail to reliably and validly innovate admissions, we risk undermining both diversity and sound decision-making in legal education. This could have a significant ripple effect for the legal industry at large. We can’t wait until 2025 for admissions data to tell the story.

This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.

Author Information

David Klieger is an attorney and program director at JD-Next, an innovative law school admissions program and entrance exam.


To contact the editors responsible for this story: Jada Chin at jchin@bloombergindustry.com; Melanie Cohen at mcohen@bloombergindustry.com
