People applying for Australian permanent residency or work visas have described how they met the government’s required English language standards not by improving their language proficiency but by learning to outsmart the computer.
It follows the story of an Irish veterinarian with degrees in history and politics, Louise Kennedy, who failed the oral fluency section of the Pearson Test of English (PTE) Academic, one of five tests accepted by Australia’s immigration department.
Unlike other test providers, Pearson uses voice recognition technology to test speaking ability, and the audio recordings are then scored by a computer. Despite English being her native language, Kennedy failed the test with a score of 74. The immigration department requires a score of at least 79 out of a possible 90.
Others have shared similar experiences.
Alice Xu, a childcare worker from China who obtained a master of education in Australia and who speaks fluent English, took the PTE test last year in order to apply for permanent residency. She was shocked when she failed.
“I got a score of 41, so low that I felt so discouraged about trying again,” she told Guardian Australia. She said she was frustrated as she could already read, write and speak English fluently, had successfully completed university, and was unsure how to improve.
One year later she decided to try again, this time enlisting the help of a tutor who mentored her for two weeks. She said this helped her to pass the test with a perfect score of 90.
“I didn’t improve my English, I just changed the way I took the test,” the 34-year-old said.
“I did it by learning how the computer worked, I don’t think my English skills or ability improved in any way. This exam is really about your test-taking skills, it’s not about your speaking or language ability.”
Clive Liebmann is an English-language assessor with seven years’ experience helping students to prepare for English language tests, including the PTE. He tutored Xu, and described her English as “excellent” even before he began tutoring her.
He said many students with strong English came to him complaining they had failed the PTE.
“In my opinion it’s not so much about the computer marking them low because of an accent,” he said. “I think getting the required score is about giving the computer things to make it happy. If people speak in a natural voice they are less likely to do well.
“So I encourage students to exaggerate intonation in an over-the-top way, and that means making less sense with grammar and vocabulary and instead focusing more on what computers are good at, which is measuring musical elements like pitch, volume and speed.
“I’m a real grammar nerd, but I also have a duty to my students to help them achieve as high a score as possible. In preparing people for PTE I’ve had to focus less on teaching English and more on exaggeration of sounds.”
He said that while voice-recognition technology had potential, it was not yet at a level that made it suitable for testing English proficiency.
On an English-language blog, a Canadian described how they “excitedly opened my PTE result document … expecting to feel self-satisfied and proud”.
“I FAILED the PTE speaking section by a lot; 43 out of 90, to be exact,” they wrote. “Just to reiterate: I failed a speaking test in my first language. How did this happen?”
Former test-takers also shared their experiences on a website containing study material for the PTE. The website includes advice from someone from India who passed the test despite not understanding some of the test information.
“In fact, without realising at first, I spoke on mute for two of the questions in [the] repeat sentence section and mostly spoke off-topic for one of the ‘re-tell lecture’ questions (because I didn’t understand anything from the lecture – but of course I did note down important words and used them while ‘re-telling’ the lecture). I definitely wasn’t expecting anything above 75 but I scored a full 90!”
Sasha Hampson, the head of English for Pearson Asia Pacific, said the test was highly accurate and there was an ongoing quality-assurance process.
“We have experts in the language field constantly reviewing the test,” she said.
“It is not one test, there are numerous versions as we test on a global scale and for security purposes we need a range of different versions. Each version and each question are monitored to ensure it is performing as it should and if candidates were consistently scoring too high or low that question would be withdrawn from the pool.”
She said the benefit of a computer-operated test was that it removed biases and complaints about discrimination based on race. She added that the score required by the department was a high one, and that even some of her Pearson colleagues who had taken the test had failed to achieve a perfect mark.
“It all comes down to how you perform on the test day,” Hampson said. “The analogy I’d draw is even if you have been driving a car for a long time and you are a capable driver, it doesn’t mean you won’t make a mistake and fail to observe a stop sign. If you did that during a driving test, you’ll fail. It doesn’t mean you’re not a good driver, you’ve failed to meet the standard on one day.”
A Department of Immigration spokeswoman said the government had not been made aware of any complaints about the voice-recognition technology used in the test.
She did not respond to questions about whether the department would review its use of the system.
“PTE Academic is the only English-language test provider accepted by the department using voice-recognition technology,” she said. “The department is not involved in the administration or operation of the English-language tests. Individuals can request to have their score reviewed by the provider.”
from Artificial intelligence (AI) | The Guardian http://ift.tt/2vOMoPB