Value-Added Tests Leave Out An Important Factor: Motivation


How can you tell if students are learning anything in college? One method might be to have them take tests of their knowledge and skills when they start college and when they graduate, and then compare the two. Using this approach, some studies have concluded that students aren't learning much at all. But as Scott Jaschik of Inside Higher Ed notes, this method overlooks an important factor. Jaschik writes:

These test results may be high-stakes for colleges, many of which need to show accreditors and others that they are measuring student learning. But for the students taking the exams, the tests tend to be low stakes—no one must pass or achieve a certain score to graduate, gain honors or to do pretty much anything.

A new study by three researchers at the Educational Testing Service—one of the major providers of these so-called “value-added” exams—raises questions about whether the tests can be reliable when students have different motivations (or no motivation) to do well on them. The study found that student motivation is a clear predictor of student performance on the tests, and can skew a college’s average value-added score.

The ETS researchers gave the ETS Proficiency Profile to 757 students from three institutions: a research university, a master’s institution and a community college.

To test the impact of motivation, the researchers randomly assigned students to groups that received different consent forms. One group of students received a consent form that indicated that their scores could be linked to them and (in theory) help them: ‘[Y]our test scores may be released to faculty in your college or to potential employers to evaluate your academic ability.’ The researchers referred to those in this group as having received the ‘personal condition.’ After the students took the test, and a survey, they were debriefed and told the truth, which was that their scores would be shared only with the research team.

The study found that those with a personal motivation did ‘significantly and consistently’ better than other students—and reported in surveys a much higher level of motivation to take the test seriously. Likewise, these student groups with a personal stake in the tests showed higher gains in the test—such that if their collective scores were being used to evaluate learning at their college, the institution would have looked like it was teaching more effectively. (Read more here.)

This makes a lot of sense, and it also underscores how important motivation is to achievement. When we perform under the impression that a test or other event doesn't matter, it shows.


One Response to “Value-Added Tests Leave Out An Important Factor: Motivation”

  1. wellevk says:

    Yet, this is the approach we use in testing most students in the K-12 system. Their scores are not tied to grades, advancement, scholarship opportunities, etc. At most, school-wide pizza parties might reward good group performance. And we assume their scores are accurate reflections of their achievement?
