
Rethinking Assessment: An Interview with Dr. James Purpura

This past summer, I met with several graduate students in the Applied Linguistics program, all of whom have made notable strides in their disciplines. Two of them, Fred Tsutagawa and Saerhim Oh, Ph.D. students, are now completing their third year in an internship program with Educational Testing Service (ETS). A third student presented at conferences before peers and experts in the field of assessment while earning his Ed.M.

What or who was leading these students to make such accomplishments during their time at TC?

The students all agreed that their trajectories were sparked by Dr. James Purpura, Associate Professor of Linguistics and Education in the TESOL and Applied Linguistics program.

Dr. Purpura has been at the forefront of two groundbreaking theories: Learning Oriented Assessment and Scenario-Based Assessment. He also serves on the Committee of Examiners at ETS. His work and easygoing demeanor have helped him build a relationship between ETS and Teachers College that has brought recognition to both institutions.

I sat down with Dr. Purpura in the atrium of Low Memorial Library at Columbia University to discuss his two theories, his current work redesigning a placement test for the Community English Program, and his views on standardized testing.

On how he developed Learning Oriented Assessment and, subsequently, Scenario-Based Assessment:

“When I first came to TC, I was very interested in quantitative analysis and the relationship between different types of variables outside the test construct itself that had an impact on the test construct. So for example, my own work looked at the cognitive dimensions of language assessment and how cognition helped or hindered people from doing well on a variety of different language tests. Because of that work, simultaneously, I became interested in the assessment of grammatical ability,” said Dr. Purpura.

“More recently, I wanted to look at how both cognition and grammar were developed and functioned in the context of a second language classroom,” he said. “In other words, there are a lot of times in the classroom where teachers give formal tests and they give informal tests like pop quizzes, just quizzes here and there, and there are a lot of times people embed types of assessment in their teaching,” he said.

Dr. Purpura wanted to know what was actually happening in a second language classroom when people learn grammar in order to speak, read, write and listen about a certain topic. In order to do that, he videotaped intermediate level classrooms in the Teachers College Community English Program to observe teachers and students.

These observations piqued Dr. Purpura’s interest to the point that he began focusing on assessment in the context of a classroom, or Classroom-Based Assessment. Although research has been done on Formative Assessment, or Assessment for Learning, two other terms for Classroom-Based Assessment, the topic seemed to be examined only through the lens of the instructor and not the learner.

What Dr. Purpura wanted to know was what happens after a teacher identifies an error—is there feedback? Is it ignored? Is there follow up with the student?

He found that much assessment occurs in the classroom spontaneously. “The more I looked at data from these classrooms, I realized a whole lot was going on,” he said. “There was a knowledge base from the teacher to be able to manage the whole event. There’s also a knowledge base from the student to be able to manage their own learning.”

On dimensions that affect the accuracy of assessment:

Dr. Purpura claims there are many dimensions that affect assessment and its accuracy, and these are what led him to Learning Oriented Assessment. There is an affective dimension, which concerns how students feel about their own learning and, at times, their own identity as language learners.

“For example, if a student in the classroom thinks they’re not a good language learner, and in the context of a normal conversation, the teacher corrects them every time they make a mistake, even though the mistake doesn’t need to be corrected at that particular point in time—they just want to get their ideas across—then this kind of reinforces the student’s idea that, ‘I can’t really communicate because I have all these mistakes, and therefore I am a bad language learner, so do I continue or don’t I continue?’” he said.

Another dimension is proficiency, which relates to the content the students are learning. The learning, cognitive, and interactional dimensions are others that Dr. Purpura takes into consideration when creating Scenario-Based Assessments.

“By Learning Oriented Assessment, I’m talking about all of the different types of things that could impact what is trying to be learned in the context of a classroom. That’s theoretically an approach to assessment embedded in classrooms,” he said. “I was trying to think how can we do this, you have an assessment or a standardized assessment, and how can you embed learning into assessment?”

To test this theory, Dr. Purpura gave a group of students a writing task based on a general biological process. With very little instruction, he asked the students to write an essay about it. Many students, even those with a science background, struggled to write a response simply because they did not possess the necessary vocabulary, whether they were first or second language learners.

Dr. Purpura then asked students how they felt about how they did on the essay. This was part of the assessment. Based on that assessment, he gave the same group of students several more tasks where he incrementally provided more assistance, like giving them the needed vocabulary. With each increase in assistance, he received better responses.

“And all of the information was based on an approach to learning oriented assessment. I asked them a question on those different dimensions. One of the questions in the proficiency dimension was, ‘Was this too hard?’ The learning dimension was, ‘Did this information help you, and how?’ The affective dimension was, ‘How did you feel about this time as opposed to the last time?’” he explained.

This led Dr. Purpura to become a trailblazer in developing an entirely new testing genre. He asked whether he could take all of this new information and generate a standardized test where he could get information about student proficiency in reading, writing, speaking and listening while helping them learn at the same time. He also wanted to embed the test into a social context, like a classroom.

As part of this test, he wanted to know, “Can I design a test that really simulates what we do in classrooms when a group of us are trying to solve a coherent problem?”

On Scenario-Based Assessment leading to his relationship with ETS:

For the last two years, Dr. Purpura and the students in his internship course have been developing a new set of assessments for the Community English Program (CEP) at Teachers College.

He explains that within Scenario-Based Assessment, a test should prompt the test taker to display knowledge of the language and the ability to communicate content accurately, and he or she should also learn something in the process.

“We spent the last two years in my internship class reading the work in Scenario-Based Assessment. There is not very much work in it. The only work that’s been done in it, that I know of, is with Educational Testing Service, with John Sabatini and Tenaha O’Reilly, who’ve been doing Scenario-Based Assessment in the context of reading comprehension. Or in the division called CBAL, and that’s at ETS once again, with Randy Bennett.”*

In October 2014, Dr. Purpura hosted the Teachers College, Columbia University Roundtable in Second Language Studies (TCCRISLS), a three-day roundtable on the topic of Scenario-Based Assessment. As the faculty sponsor for the event, he invited people from around the world to discuss learning and assessment, including Mr. Sabatini and Mr. O’Reilly from ETS. Because of that roundtable, the three experts decided to collaborate.

The roundtable then led to the first-ever TC-ETS forum, where experts from ETS and students from Dr. Purpura’s internship course came together to discuss their respective research efforts.

“Basically, what we’re doing is designing assessment through a scenario but being informed by a Learning Oriented approach to the types of things of how we design the actual assessment. This is completely brand new, we’re the only ones I think that I know who are really trying to expand this whole idea of assessment doesn’t have to be go and sit and take a TOEFL exam and here’s your score.”

On the current Scenario-Based Assessment Prototype in the works:

Dr. Purpura and his students have already mapped out the structure of a Scenario-Based Assessment and are now perfecting the technological aspects of a new placement exam for the Community English Program.

In their scenario, test takers will use computers to access and complete the test. They are then given resources about a certain topic, like the life cycle of a jellyfish. Students must then take a prior knowledge test to measure how much vocabulary they know related to the life cycle. A virtual teacher then gives students a prompt for a test. Working with two other virtual partners, the student must describe what happens in the life cycle by accessing the Internet and reading hand-picked articles, controlled by the test architects. The student is provided with three articles and is allowed to choose which one he or she would like to read. The other virtual partners read the remaining articles. Finally, the group members must reconnect and write what they’ve learned. Here is where the first assessment of writing is recorded.

The second part of the test might include watching a set of YouTube videos, also divided among the group. Each member must summarize what they’ve just heard, which is assessed as the listening component. In the third assessment, the group would have to create a pros and cons list based on their readings. After the group has organized its information, the test taker must deliver a short 3-5 minute video pitch to the teacher, which serves as the speaking component.

“This is what we really do in real life. All the way across this, what you can track when they’re writing their summaries and stuff, are they incorporating some of the language that they’re learning? How are they developing the content? Is the content being elaborated upon? Is the content more nuanced and organized? In the end, what are they able to do in speaking?” Dr. Purpura explains.

At the end of the exam, the virtual partners regroup and the test taker is re-tested on the same vocabulary words from the beginning of the test, on which performance would most likely have improved.

Although the test has not launched, Dr. Purpura and his internship students have already developed the first prototypical scenario for the CEP, but they are having difficulty transferring the necessary information onto an appropriate online platform.

“What we haven’t developed is the platform—we’re still looking at platforms—this is almost two years we’ve been doing this– that will allow us to embed in the context of one coherent assessment, videos and readings and capturing texts and capturing spoken and allowing these things to be linked,” he said. “The whole linking thing for us—it doesn’t seem to be rocket science in technology but there are no programs out there that actually allow us to do this. So we’re a little bit stuck on this.”

On the current climate of testing:

For several years now, educators, parents and even students have become harsh critics of the deluge of standardized testing thrown their way, most noticeably last year when thousands of students walked out on Common Core-related exams. Dr. Purpura’s hope is that critics of assessment become more informed about assessment and the potential it has to deliver important information about a learner. Few seem to want testing to become more representative of students’ knowledge more than Dr. Purpura, a panelist on the Defense Language Testing Advisory in Washington, D.C.

“There are some bad things that can happen with assessment, but there are a lot of good things that can happen from assessment. Assessment, really, the heart and soul of assessment is trying to get information. It’s trying to get information, in my view, in order to help people develop more. And I think we can do this, even with the SAT, this can be designed in terms of helping to develop more.

“I don’t think the end of assessment should be to give them a summative grade,” he said. Dr. Purpura believes the end of assessment should be to provide information about the learner’s ability to display knowledge, skills and abilities on a given task, and that information should then be provided to the learner so he or she can improve.

“This is where I believe assessment should go,” he said.

On his students:

What Dr. Purpura offers is research that paves the way for a new kind of standardized testing, but he is also actively training his students in the Assessment track of Applied Linguistics to realize their professional abilities.

In his internship course, he asks students to do their best work but to also take risks. Although their work might be heavily critiqued, Dr. Purpura assures them that all of their ideas, even the rough ones, are a part of the process.

Those working in highly coveted positions at ETS, gaining recognition at professional conferences based on Dr. Purpura’s work, or participating in his internship class are not only carrying on a legacy but also spearheading the field of assessment in their own right.

This Spring, Dr. Purpura, John Sabatini and Tenaha O’Reilly will offer a weekend workshop. In June 2016, at their professional conference, the three will also offer a pre-conference workshop on designing and developing Scenario-Based Assessments.

*John Sabatini is the Managing Principal Research Scientist in the Center for Global Assessment at ETS.

*Tenaha O’Reilly is a Senior Research Scientist in the Center for Global Assessment at ETS.

*CBAL is an ETS initiative geared toward advancing research and creating new knowledge to improve educational assessment at the K-12 level.


Nori Kato is a Staff Writer and Office Assistant for the Department of Arts and Humanities. She is also a second-year M.A. student in the International Educational Development program at Teachers College.
