Sean McDonald is a Project Manager with telc in Frankfurt. He works with telc’s European partners to create, manage and deliver tests for language learners. He has recently been a driving force behind ClarityEnglish’s Dynamic Placement Test. telc (The European Language Certificates) is a non-profit subsidiary of the German Adult Education Association. For 50 years, telc has offered tests based on international standards: fair, transparent and reliable.
Sean McDonald (SM) of telc catches up with Adrian Raper (AR) at the IATEFL Conference in Glasgow. Sean discusses his philosophy of testing, and the steady move from paper-based exams towards digital language assessment.
AR: Sean, what first attracted you to test development?
SM: I started with the German Bundessprachenamt, the Federal Office of Languages, which is part of the Ministry of Defence. Our job was to teach Air Force officers and to bring their English up to the level expected by NATO. As part of that we started developing examinations for pilots, so that we had proof of their language skills. So that’s where we started our research: learning the parameters of test development, and trying to come up with tests that were valid, reliable and objective. The reason we started was that someone’s entire career depends on the outcome. Up until then it had depended on one English teacher, who was neither an officer nor a pilot, deciding these people’s fates. And we thought, well, we should be a little more fair. So that’s how I got in, from the side, as it were. Then I moved on to telc, which specialises in language testing, and I’ve been there ever since.
AR: But telc isn’t just about English, is it? telc creates tests in ten languages. Do you approach each one differently?
SM: We approach language testing the same way for all the languages. We have the same specifications, guidelines and quality standards. We also use the same statistical assessment tools for our tests. Of course there are some differences. There are some grammatical structures, for example, which sit at A1 in English but A2 in German, but these are very minor things.
AR: So I suppose you’ll say that for English, the same test is valid in countries where dialect and culture are different, such as the US, Australia, Ireland?
SM: I would say, why not? We are basing our assessment on the CEFR, and there are very clear ‘can do’ statements, and these ‘can do’ statements don’t change for different countries. Look, you’re English and I’m American. People like you and me both agree that we speak English, and we have no trouble speaking to each other. And we can both order a cup of coffee here in Scotland, and take a taxi. So I don’t see why there should be different tests.
AR: Let’s move on to the topic of our conference talk* — digital tests. What are the strengths and weaknesses of paper-based versus digital tests?
SM: I guess the first strength of paper tests is that we have them, they’re proven, and they work. Then there’s the security question. I’m not going to say that paper tests have a security advantage, but at the moment we can manage the security better with paper. We can lock a room, with no phones and no computers, and have an invigilator watch the candidates do the test so they don’t copy from each other. Even writing answers on their bodies is not possible because we have task-based tests, and it’s all about how you use the language.
In the future, I think there’s a huge advantage for digital testing. I can send tests all over the world, they can be marked almost instantaneously — maybe in the future also for essay questions and the speaking tests. So there’s a huge speed advantage and we save shipping costs. At telc we send out paper tests for ten languages all over the world and we have enormous shipping costs.
AR: You’ve worked on at least two digital tests. How did you conceptualise them?
SM: That’s been the hardest part, of course. The very first time we did a digital test, the concept was to make it as close to the paper test as possible. That was a mistake. I think with digital tests we have opportunities to do new things — new test items and new test structures that I hope will be more effective, and will certainly be more fun, more interesting for the students and more engaging.
There are some steps before you design the test. You have to know who your stakeholders are, what you’re looking for and exactly why you are making this test. And you build the test on that. But it’s been fun designing the electronic test because we wanted, as far as possible, to start from scratch. That means not using exercise types that we’ve done before, and trying to come up with completely new and original ideas, while keeping the quality that we’ve always had. There’s only so much you can do on paper.
When it comes to productive skills, it’s a tough one and I feel there are arguments either way. ETS (publisher of TOEFL and TOEIC) in particular believes that tests can be valid without speaking or writing elements. I think it is open to question, and we need to collect research data. One thing that is true is that we come from a European culture where any given student tends to be at more or less the same level in all the skills, whereas in other parts of the world this can be very different.
AR: Do you see digital tests completely replacing paper and pencil?
SM: I think it’s really a financial question, for two reasons. From telc’s experience in central Europe, no matter how wonderful the electronic test we make, most of the institutions we work with don’t have the hardware to run it. They probably have the hardware to run five tests at one time, but that doesn’t suit their needs.
On the other hand, if we had the hardware — if, for example, we could run the tests on the students’ own devices — we could offer the tests much more cheaply. We can see this already with electronic speaking tests: candidates speak into a camera for ten minutes, and that greatly reduces the cost. At telc we still offer the face-to-face oral interview, which is high quality and much preferred by the students. It helps the students relax and it’s a better experience. But again, the quality costs a lot of money. There’s always going to be this tension between cost and quality.
AR: Do you have a vision for testing in the future?
SM: One thing I’m starting to notice is the question of the relevance of testing. In the past we had more formalised, structured systems around us, and a test carried a lot more weight. Now I’m starting to see people asking, ‘What is the relevance of this test? Why should I take this English test?’ And it comes down to, ‘Am I being forced to take a test?’ Most people feel they are being coerced into it, which I feel is too bad.
I mean, our whole idea with telc 50 years ago was to offer somebody a source of pride. You know: ‘Hey, I took a course, I learned Japanese. And look at this! This is a certificate that proves I can speak Japanese now.’ But it’s evolved into a coercion tool, and I’m sorry to say that at the moment people don’t take tests for fun. It costs money, it takes time, people are afraid of tests. Those are the things I would like to change about testing, because I think, you know, what is wrong with showing how good you are?
I think it’s fair to make the comparison with sport. My daughter’s a swimmer, and you get all the kids together and they have a race and they’re all excited and they see how everybody does. I think that’s a very positive thing. Some people say, well children shouldn’t be competing with each other, but I think if we could get a little bit of that in there, it would really help language learning. That’s the washback effect — and that’s an important part of our philosophy, you know? To have fun.
*This post is based on Sean and Adrian’s talk at IATEFL Glasgow 2017 – ‘Easier said than done: Using mobile phones for a test’
No more marking. Run your test on their devices. The Dynamic Placement Test from ClarityEnglish and telc