Browsing English language learning and assessment by Journal
A new test for China? Stages in the development of an assessment for professional purposes
It is increasingly recognised that attention should be paid to investigating the needs of a new test, especially in contexts where specific purpose language needs might be identified. This article describes the stages involved in establishing the need for a new assessment of English for professional purposes in China. We first investigated stakeholders' perceptions of the target language use activities and the necessity of the proposed assessment. We then analysed five existing tests and six language frameworks to evaluate their suitability for the needs of the proposed assessment. The resulting proposal is for an advanced-level English assessment capable of providing a diagnostic evaluation of the proficiency of potential employees in areas of relevance to multinationals operating in China. The study demonstrates the value of following a principled procedure to investigate the necessity for, and the needs of, a new test at the very beginning of the test development process.
Preparing for admissions tests in English
Test preparation for admission to education programmes has always been a contentious issue (Anastasi, 1981; Crocker, 2003; Messick, 1982; Powers, 2012). For Crocker (2006), 'No activity in educational assessment raises more instructional, ethical, and validity issues than preparation for large-scale, high-stakes tests' (p. 115). Debate has often centred on the effectiveness of preparation and how it affects the validity of test score interpretations; equity and fairness of access to opportunity; and impacts on learning and teaching (Yu et al., 2017). The focus has often been on preparation for tests originally designed for domestic students, for example, SATs (e.g., Alderman & Powers, 1980; Appelrouth et al., 2017; Montgomery & Lilly, 2012; Powers, 1993; Powers & Rock, 1999; Sesnowitz et al., 1982) and state-wide tests (e.g., Firestone et al., 2004; Jäger et al., 2012), but the increasing internationalisation of higher education has added a new dimension. To enrol in higher education programmes which use English as the medium of instruction, increasing numbers of international students whose first language is not English are now taking English language tests, academic specialist tests administered in English, or both. The papers in this special issue concern how students prepare for these tests and the roles played in this process by the tests themselves and by the organisations that provide them.
Video-conferencing speaking tests: do they measure the same construct as face-to-face tests?
This paper investigates the comparability of the video-conferencing and face-to-face modes of the IELTS Speaking Test in terms of scores and the language functions generated by test-takers. Data were collected from 10 trained IELTS examiners and 99 test-takers who took two speaking tests, one under face-to-face and one under video-conferencing conditions. Many-facet Rasch Model (MFRM) analysis of test scores indicated that the delivery mode did not make any meaningful difference to test-takers' scores. An examination of language functions revealed that both modes elicited the same language functions, with one exception: asking for clarification. More test-takers made clarification requests in the video-conferencing mode (63.3%) than in the face-to-face mode (26.7%). Drawing on these findings and their practical implications, we extend emerging thinking about video-conferencing speaking assessment and the features of this modality in its own right.
Washback to learning outcomes: a comparative study of IELTS preparation and university pre-sessional language courses
This study investigated whether dedicated test preparation classes gave learners an advantage in improving their writing test scores. Score gains following instruction on a measure of academic writing skills, the International English Language Testing System (IELTS) academic writing test, were compared across language courses of three types, all designed for international students preparing for entry to UK universities. The course types included those with a test preparation focus, those designed to introduce students to academic writing in the university setting, and those combining the two. In addition to IELTS academic writing test scores, data on differences in participants and practices across courses were collected through supplementary questionnaire and test instruments. To take account of the large number of moderating variables and the non-linearity in the data, a neural network approach was used in the analysis. Findings indicated no clear advantage for focused test preparation.