COMPUTER-BASED FOREIGN LANGUAGE EXTERNAL ASSESSMENT IN THE ESTONIAN CONTEXT – OPPORTUNITIES AND CHALLENGES
Kristel Kriisa Foundation Innove
In the globalised world where everyone is expected to speak at least one foreign language in addition to their mother tongue, the ways of evaluating people’s foreign language skills are becoming increasingly important. At the same time, rapid technological developments, as well as a recent worldwide pandemic have led to innovative approaches to how language tests are prepared, administered, assessed and analysed. This has also been the case in Estonia.
Up until now, the foreign language tests used in external assessment in Estonia (e.g. national standardised tests and exams) have been traditional paper-based (PB) tests, which have been produced following ALTE Minimum Standards as well as EALTA Guidelines for Good Practice in Language Testing and Assessment. The Year 9 and 12 examinations are linked to the Common European Framework of Reference for Languages (CEFR) and measure all four basic language skills: listening, reading, speaking and writing.
To implement the Estonian Lifelong Learning Strategy, the Ministry of Education and Research initiated the Digital Turn programme where it is stated that computer-based (CB) assessment would create “more and easier opportunities for schools to make use of assessment results in teaching and to measure different types of knowledge and skills” (HTM, 2018). Based on the above, CB standardised tests and CB exams will soon be developed and introduced.
The Ministry wants to develop CB foreign language tests that measure all four skills and are prepared, administered and assessed in the Examination Information System (EIS). Test results can be accessed by different stakeholders (e.g. students, parents, teachers, school management, school administrators, ministry, testing organisation) and should help them make decisions concerning teaching, learning and assessment processes.
In 2017, the Ministry and Foundation Innove started cooperating closely to establish an e-assessment system for foreign languages. A document giving an overview of this innovative approach in foreign language assessment was developed in collaboration with several foreign language experts. It concluded, firstly, that students should receive more systematic feedback on the progress of their foreign language learning, and secondly, that CB tests to evaluate as well as indirectly support students’ foreign language learning pathways should be available at the end of Years 6, 9 and 12.
Although students’ foreign language proficiency has been evaluated externally in Estonia for more than 20 years, the number of people who can be considered experts in the field of foreign language assessment is relatively small and not many national studies have been carried out in the field. Therefore, a master’s thesis in educational technology was written at Tallinn University in 2019 (Kriisa, 2019), which studied the suitability of CB tests for implementation in the national external evaluation system based on the first pilot CB test in English. Another master’s thesis will be written next year to analyse how CB tests could be used in the Estonian context to evaluate students’ speaking skills.
Overview of pilot tests
Since 2018, five different CB tests in English have been piloted. Interest in the pilot tests has been remarkable: in total, 2,102 students from 81 schools have taken part in the pilot sessions. Three of the tests were at A2 level, one was at B1 level and one was a bilevel (B1/B2) test. Four of the tests assessed reading, listening and writing skills, while one measured students’ speaking skills. Most of the tasks in the pilot tests were automatically marked by computer. Writing tasks were double marked using marking scales: by schoolteachers and by assessors trained at Innove. In total, 128 different teachers of English have been involved in the assessment of the five pilot tests. At the end of each test, a short questionnaire consisting of mostly multiple-choice questions was added to collect feedback from students. Feedback has also been collected from those involved in administering and assessing the tests (e.g. teachers, educational technologists and trained assessors).
CB tests: opportunities
The pilot tests have shown that compiling tasks for CB tests in the EIS environment gives item writers the opportunity to approach a task from a different angle, because several rules that apply to paper-based tests are no longer relevant. However, following the recommendations of testing experts, the convenience of the test taker must always be kept in mind when compiling tasks, both on paper and in an e-environment.
The EIS also offers more creative and interactive solutions than paper-based tests allow. For example, images, sound files, graphs and videos can be used to illustrate a text without adding any extra cost. CB tests replace boring gap-filling and matching tasks with much more interactive solutions, such as drop-down menus, text boxes or drag-and-drop answers.
The EIS also offers additional options which help increase test security. For example, multiple-choice tasks can be programmed so that students see the answer options in different orders. This reduces the risk of students collaborating with one another whilst sitting the test.
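To illustrate the idea, here is a minimal Python sketch of one common way to show each student the answer options in a different but reproducible order. It is not the EIS implementation; the function and identifier names are invented for illustration.

```python
# Illustration only: deriving a per-student option order from a student
# and item identifier, so that the same student always sees the same order
# but different students see different orders.
import hashlib
import random

def shuffled_options(student_id: str, item_id: str, options: list[str]) -> list[str]:
    """Return the answer options in an order that is stable for a given
    student and item, but typically differs between students."""
    seed = int(hashlib.sha256(f"{student_id}:{item_id}".encode()).hexdigest(), 16)
    rng = random.Random(seed)
    shuffled = options[:]  # copy, so the original key order stays intact
    rng.shuffle(shuffled)
    return shuffled

# Example: two students sitting the same item see the options differently.
options = ["always", "never", "sometimes", "often"]
print(shuffled_options("student-001", "item-12", options))
print(shuffled_options("student-002", "item-12", options))
```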
The median time spent on completing the test and the feedback collected from students suggest that taking CB tests is faster, more exciting and more convenient than taking traditional PB tests. In contrast to the common practice of all students completing the listening component together and hearing each audio file twice, students were able to complete the listening tasks at their own pace, which reduced the average time spent. Because the audio files were played individually through headphones, each student could start them whenever they were ready and decide for themselves whether they needed to listen to each file once or twice.
The writing tasks also look more authentic in the EIS environment than in a PB test. In the latter, one option is to ask students to write a letter to a pen friend, which seldom happens in real life, or to write an email on paper using a pen, which is an equally illogical activity. In a CB test the environment is natively digital, which lends itself to a natural user experience for such task types.
The fact that the number of words written in the writing task is automatically counted is certainly helpful to students, as they do not have to spend extra time checking whether they have met the length requirement. In the case of open-ended questions, students can delete and retype a misspelled answer, whereas in a PB test any corrections made will always remain visible and can look messy.
The pilot tests demonstrated that automated assessment can be used with most tasks, which is obviously much faster than marking PB tests. With PB tests, the risk of human error in assessment is clearly much higher than with e-tests, so e-assessment makes it possible to significantly reduce the level of subjectivity. For matching and multiple-choice tasks, the key is generally easy to compile and assessment is completely automatic. Responses do not have to be keyed in manually, and the writing part can be assessed by several assessors at the same time, so there is no need to scan or transport the papers. The computer automatically scores blank answers with 0 points and counts the number of words written, which significantly reduces the assessors’ workload. Support materials for assessors (e.g. the marking scale and sample scripts) can also be uploaded to the portal, and assessors can easily ask the Assessment Team Leader for advice during the assessment period. When it comes to PB tests, assessors often struggle because it is not always easy to decipher students’ handwriting. With e-tests, difficult-to-read or even illegible handwriting does not affect assessment.
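As a hedged illustration of the kind of routine described above (not the actual EIS code), the short Python sketch below shows automatic zero-scoring of blank answers, simple key matching for closed items and automatic word counting; the function names and example responses are invented.

```python
# Minimal sketch of two marking aids mentioned in the text:
# blank answers are scored 0 automatically, and word counts are
# computed so assessors do not have to check length requirements.

def score_closed_item(response: str, key: str, marks: int = 1) -> int:
    """Blank responses get 0 automatically; otherwise compare with the key."""
    if not response.strip():
        return 0
    return marks if response.strip().lower() == key.strip().lower() else 0

def word_count(text: str) -> int:
    """Count the words in a writing-task response."""
    return len(text.split())

print(score_closed_item("   ", "sometimes"))        # 0 – blank answer
print(score_closed_item("Sometimes", "sometimes"))  # 1 – matches the key
print(word_count("Dear Anna, thank you for your email."))  # 7
```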
CB tests: challenges
Unfortunately, the EIS occasionally imposes limitations, and the use of digital solutions sometimes results in unexpected technical problems. Since 2018, software issues in the EIS have had to be addressed either before or after each pilot test.
Also, the graphical user interface of the EIS portal takes some time to get used to, and registering test takers online or taking a CB test can be a bit tricky when done for the very first time. Sometimes logging in to the EIS environment using a password, ID card, Mobile-ID or Smart-ID can create problems that do not occur when students are sitting a PB test.
In addition, e-assessment limits the types of tasks that can be used. In the case of open-ended tasks, the assessment criteria need to be thought through very carefully and more experts need to be involved before testing. Assessing open-ended listening tasks also poses some challenges. Such tasks require additional work during the test development period and a careful pre-testing process to ensure that all correct answers are automatically marked as correct. In PB listening tasks, small typographical errors in responses have been acceptable so far. However, because the computer cannot decide independently which errors may be considered minor, and because it is impossible to include every potential misspelled response in the assessment algorithm, misspelled answers are automatically marked as incorrect.
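The hypothetical Python sketch below illustrates why this happens: if marking relies on exact matching against a pre-approved list of accepted variants, any response outside that list is scored as incorrect, however close it may be. The variant list and function name are invented for illustration only.

```python
# Invented example: open-ended listening item marked by exact matching
# against the variants approved during test development.

ACCEPTED = {"on saturday", "saturday", "on sat."}  # pre-approved answer variants

def mark_open_listening_item(response: str) -> bool:
    """Case-insensitive exact match against the accepted variants."""
    return response.strip().lower() in ACCEPTED

print(mark_open_listening_item("On Saturday"))   # True
print(mark_open_listening_item("On Saterday"))   # False – minor typo, still scored 0
```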
Feedback
The pilot tests have shown that students generally like taking CB tests, probably because these allow the use of digital solutions and because images, sound files and interactive features make the whole process of taking the test more interesting.
The pilot tests have also revealed that depending on Internet bandwidth, technical problems may occur, and these can obviously have a negative effect on students’ performance. Fortunately, most of the students stated that there had been no technical problems during the test.
Teachers’ feedback suggested that students were not used to typing text and tended to make more spelling mistakes than they would have in PB tests. An analysis of responses to the writing tasks confirmed the same issue. The English and Estonian alphabets are relatively similar; however, for Russian-speaking students (especially younger ones), having to type on a keyboard with an unfamiliar alphabet can cause serious problems, as it is completely different from what they are used to.
Next steps
The pilot tests carried out so far have shown that it is possible to design, administer, sit and assess CB English tests in the EIS environment. Compared to PB assessment, CB assessment offers several innovative solutions and will probably replace PB tests in the future. However, it also presents several challenges that need to be overcome before high-stakes national PB exams can be replaced by CB ones.
In the autumn of 2020, more pilot tests in English will be carried out. While the first trials focused more on designing tasks and administering tests in the EIS environment, future ones will aim to provide reliable and understandable feedback to students, teachers and schools.
WORKS CITED
HTM. (2018). Digipöörde programm 2019–2022. www.hm.ee/sites/default/files/2_digiprogr_2019_22_seletuskiri_28dets18.pdf
Kriisa, K. (2019). Arvutipõhine võõrkeeleoskuse välishindamine Eesti kontekstis – võimalused ja kitsaskohad (master’s thesis). https://www.etera.ee/s/V21X41JIEG