
Picking better college students

Wait list. That was the outcome of my application to Yale. I was eventually admitted, and I later had an opportunity that very few applicants ever have: I got to find out why I had been wait-listed.

My first job after college was in the Yale admissions office, and one day I sneaked into the attic where old records were kept and read my interview report, which described me as having a "flaky personality." I did not read the rest of my admissions file — I felt too guilty — so I cannot say whether it was the interviewer's assessment or some other perceived deficiency that consigned me to the wait list. I do know that when I finally got in, it was through the intervention of the admissions officer for my area, who saw something special in me.

Most students don't benefit from this kind of intervention; SAT scores and GPAs are much of what make or break a college application. Yet, over the course of my years in the Yale admissions office, I was continually surprised by how many of the students we accepted had sky-high SAT scores but seemed to lack basic practical and creative skills, whereas others with more modest scores were stunning successes at Yale, both academically and personally.

Great schools don't always produce great people. But it's not just what happens after students arrive on campus that's the problem. By and large, our best schools don't always pick the best people in the first place. Many students who appear to have tremendous potential at age 17, based on their SAT scores and GPAs, don't look so wonderful 20 years later.

An executive at a major investment bank, looking back on his 25 years on Wall Street, told me that SAT scores predicted quite well who would be good analysts — that is, they predicted the technical skills needed to evaluate investments. What they did not signal, he said, was who could envision where various markets were going, see larger trends and make decisions that go beyond individual stock or bond picks.

We can do a much better job of college admissions if we start thinking about student abilities differently. We should assess and value analytical, creative and practical skills and wisdom, not just the ability to memorize or do well on tests. And we should admit people on the basis of their potential for leadership and active citizenship.

Many admissions offices try to do this already, through essays and the like, but their applications nonetheless remain anchored on test scores and grades. This is in part because scores and grades can be quantified and therefore get more weight than more abstract, seemingly "fluffy" qualities.

There is, however, a way to test these other important skills. And here is the surprising part: students selected by this new method, which puts more emphasis on qualities other than grades, go on to earn higher GPAs in college. I know, because this is what has happened at Tufts University.

• • •

"Use one of the following topics to create a short story: a. The Spam Filter, b. Seventeen Minutes Ago . . . , c. Two by Two, d. Facebook, e. Now There's the Rub . . . , f. No Whip Half-Caf Latte, g. The Eleventh Commandment."

This was one of seven questions that appeared on the Tufts undergraduate application for the Class of 2013. How did it get there? The short answer is that it was crafted by a clever group of Tufts admissions officers, led by dean of admissions Lee Coffin.

The long answer goes back a few years. I became a psychology professor at Yale and in 1997 proposed a theory of successful intelligence, based on the idea that people are meaningfully intelligent only to the extent that they can formulate and achieve their goals by synthesizing their creative, analytical and practical skills and their wisdom. People need creative skills to generate new ideas, analytical skills to determine if they are good ideas, practical skills to implement their ideas and wisdom to ensure that their ideas help achieve a common good. This theory inspired me to design two projects to improve college admissions.

In the early 2000s, I collaborated with teachers and researchers at two high schools and 13 colleges and universities on a study we called the Rainbow Project. Our goal was to determine whether including a mix of creative, analytical and practical questions on an admissions test might benefit the admissions process. It did: Incorporating the results of our tests made predictions of freshman grade-point average twice as accurate as those based on the SAT alone, and 50 percent better than those based on SATs and high school grades combined. We also found that differences between ethnic groups were substantially smaller on our questions than on the SAT.

In 2005, I became dean of arts and sciences at Tufts and helped start the Kaleidoscope Project, which added to the Tufts application optional questions designed to assess creative, analytical and practical skills and general wisdom. Above is one example of a "creative" question; others might ask students to draw something, such as a design for a new product; to post a video on YouTube; or to imagine an alternate history (what if the Nazis had won World War II?). An analytical question might ask a student what his favorite book is and why. A practical question might ask a student how he convinced a friend of an idea. And a wisdom-oriented question might ask him how a high school passion might be turned toward the common good later in life.

This model provides a simple way of quantifying important qualities so they can become more central to the selection process. How could we evaluate answers to questions that seem so subjective? Through well-developed scoring rubrics. For example, one can score creative responses based on how original and compelling they are and how appropriately they accomplish the task at hand.

This system has been in place for five years, with about two-thirds of Tufts's roughly 15,500 annual applicants choosing to answer one of the optional questions. My collaborators and I have published a study in the journal College and University on the results. Among key findings: After controlling for high school grades and SATs, Tufts's new admissions questions, like those posed by the Rainbow Project before them, improved prediction of college grades. They also helped forecast which students would shine as active campus citizens and leaders, and virtually eliminated the admissions edge enjoyed by some ethnic groups.

Our approach is one that any college can adopt by merely adding a few questions to its application. But some schools, in their rush to improve their U.S. News & World Report rankings, are moving in the opposite direction. They are stripping their applications to the bare bones to make them easier to fill out. They hope to thereby increase application numbers, and thus rejection rates and the appearance of "selectivity." But they should ask themselves how, exactly, this approach makes their schools any better.

It certainly doesn't make the world better. Many of the major messes confronting us today — in corporate boardrooms and on Wall Street, in politics and even in churches — have been created by people who tested very well and earned high grades at prestigious institutions. They are smart, but foolish. The world might improve if we deliberately and systematically selected students not only for their knowledge and analytical skills, but also for their creative and practical skills — and their wisdom.

Robert J. Sternberg, provost and senior vice president of Oklahoma State University, is the author of "College Admissions for the 21st Century."

Published Dec. 11, 2010.

Copyright: For copyright information, please check with the distributor of this item, Washington Post.
    
