Disparate but Not Serious
College is an expensive way of taking an IQ test.

The Wall Street Journal, Friday, May 18, 2007

By all accounts Marilee Jones did an excellent job as dean of admissions at the Massachusetts Institute of Technology. But she was forced to resign last month after it emerged that she had falsely claimed to hold three degrees when she first came to work at MIT 28 years earlier. In fact, she held only an undergraduate degree from an obscure Catholic college.

I feel her pain, for school never agreed with me. I repeatedly found myself in conflict with teachers and professors. I left high school after my sophomore year, and although I spent several years in college, I never bothered to graduate. In my 20s I considered a career in law, but I decided to stick with journalism in large part because the thought of spending three more years in school repelled me.

Ostensibly Ms. Jones was forced out because she committed fraud, but one can make a strong case that MIT had to get rid of her to avoid acknowledging that there is something fraudulent at the heart of American higher education. "If she had done a miserable job as dean, MIT might have been more forgiving," the leftist author Barbara Ehrenreich writes in an essay for the Nation, "but her very success has to be threatening to an institution of higher learning: What good are educational credentials anyway?"

Ms. Ehrenreich argues that "there are ways in which the higher education industry is becoming a racket: Buy our product or be condemned to a life of penury, and our product can easily cost well over $100,000. . . . In the last three decades the percentage of jobs requiring at least some college has doubled, which means that employers are going along with the college racket. A résumé without a college degree is never going to get past the computer programs that screen applications."

What accounts for the increasing insistence on college degrees as a prerequisite for entry-level professional jobs? Ms. Ehrenreich offers this theory: "Employers prefer college grads because they see a college degree chiefly as a mark of one's ability to obey and conform."

To a nonconformist dropout like me, this explanation is emotionally appealing. But I think it's bunk. For one thing, not all white-collar jobs require obedience and conformity. Some employers prize creativity and enterprise--but even they do not generally go out of their way to hire people without degrees. For another, it's hard to believe that employers today value the "ability to obey and conform" twice as highly as they did in the era of "The Man in the Gray Flannel Suit."

I have a better theory. I blame the Supreme Court.

What most professional jobs require is basic intellectual aptitude. And what has changed since the 1970s is that the court has developed a body of law that prevents employers from directly screening for such aptitude. The landmark case was Griggs v. Duke Power Co. (1971). A black power-plant worker claimed discrimination because his employer required a high-school diploma and an intelligence test as prerequisites for promotion to a more skilled position. The court ruled 8-0 in the worker's favor. "Good intent or absence of discriminatory intent does not redeem employment procedures or testing mechanisms that operate as 'built-in headwinds' for minority groups," Chief Justice Warren Burger wrote.

This became known as the "disparate impact" test, and it applies only in employment law. Colleges and universities remain free to use aptitude tests, and elite institutions in particular lean heavily on exams such as the SAT in deciding whom to admit. For a prospective employee, obtaining a college degree is a very expensive way of showing that he has, in effect, passed an IQ test.

But why are employers able to get away with requiring a degree without running afoul of Griggs? Because colleges and universities--again, especially elite ones--go out of their way to discriminate in favor of minorities. By admitting blacks and Hispanics with much lower SAT scores than their white and Asian classmates, purportedly in order to promote "diversity," these institutions launder the exam of its disparity.

Thus the higher-education industry and corporate employers have formed a symbiotic relationship in which the former profits by acting as the latter's gatekeeper and shield against civil-rights lawsuits. Little wonder that in 2003, when the Supreme Court considered the constitutionality of discriminatory admissions policies at the University of Michigan, 65 Fortune 500 companies filed a friend-of-the-court brief urging that they be upheld.

They were. By a 5-4 vote in Grutter v. Bollinger, the court found that universities may use race as "a 'plus' factor" in determining whether "an applicant might contribute to a diverse educational environment." The author of that decision, Justice Sandra Day O'Connor, said she expected that "25 years from now, the use of racial preferences will no longer be necessary."

Michigan voters didn't want to wait. Last November they approved an initiative banning discrimination by the state, including the university. Meanwhile Justice O'Connor has retired, and there is reason to think the man who replaced her, Justice Samuel Alito, will take a harder line against discrimination. Last year he joined Chief Justice John Roberts's dissent in a voting-rights case, which flatly stated: "It is a sordid business, this divvying us up by race."

It's quite possible that legally sanctioned discrimination in university admissions will come to an end sooner than Justice O'Connor expected. But the court cannot overturn the disparate-impact test in Griggs, because Congress codified it into law in the Civil Rights Act of 1991. Thus a reversal of Grutter would make it harder for employers to screen applicants and avoid litigation.

Then again, what do I know? I never went to law school.
