• European language testing in a global context: proceedings of the 2001 ALTE conference.

      Milanovic, Michael; Weir, Cyril J. (Cambridge University Press, 2004-01-01)
      The ALTE conference, European Language Testing in a Global Context, was held in Barcelona in 2001 in support of the European Year of Languages. The conference papers presented in this volume represent a small subset of the many excellent presentations made at that event. They have been selected to provide a flavour of the issues that the conference addressed.
    • Examiner interventions in oral interview tests: what are the listening demands they make upon candidates?

      Nakatsuhara, Fumiyo; Field, John; University of Bedfordshire (2012-11-18)
    • Examining writing: research and practice in assessing second language writing

      Shaw, Stuart D.; Weir, Cyril J. (Cambridge University Press, 2007-07-01)
      This publication highlights the need for test developers to provide clear explanations of the ability constructs which underpin tests offered in the public domain. An explanation is increasingly required if the validity of test score interpretation and use is to be supported both logically and with empirical evidence. The book demonstrates the application of a comprehensive test validation framework which adopts a socio-cognitive perspective. The framework embraces six core components which reflect the practical nature and quality of an actual testing event. It examines Cambridge ESOL writing tasks from the following perspectives: Test Taker, Cognitive Validity, Context Validity, Scoring Validity, Criterion-related Validity and Consequential Validity. The authors show how an understanding and analysis of the framework and its components in relation to specific writing tests can assist test developers to operationalise their tests more effectively, especially in relation to criterial distinctions across test levels.
    • Exploring language assessment and testing: language in action

      Green, Anthony (Taylor and Francis, 2013-10-01)
      This book is an indispensable introduction to the areas of language assessment and testing, and will be of interest to language teachers as well as postgraduate and advanced undergraduate students studying Language Education, Applied Linguistics and Language Assessment.
    • Exploring language assessment and testing: language in action

      Green, Anthony (Routledge, 2020-12-30)
      Exploring Language Assessment and Testing offers a straightforward and accessible introduction that starts from real-world experiences and uses practical examples to introduce the reader to the academic field of language assessment and testing. Extensively updated, with additional features such as reader tasks (with extensive commentaries from the author), a glossary of key terms and an annotated further reading section, this second edition provides coverage of recent theoretical and technological developments and explores specific purposes for assessment. Including concrete models and examples to guide readers into the relevant literature, this book also offers practical guidance for educators and researchers on designing, developing and using assessments. Providing an inclusive and impartial survey of both classroom-based assessment by teachers and larger-scale testing, this is an indispensable introduction for postgraduate and advanced undergraduate students studying Language Education, Applied Linguistics and Language Assessment.
    • Exploring language frameworks: proceedings of the ALTE Kraków Conference, July 2011

      Galaczi, Evelina D.; Weir, Cyril J. (Cambridge University Press, 2013-02-01)
      This volume explores the impact of language frameworks on learning, teaching and assessment, viewed from the perspective of policies, procedures and challenges. It brings together a selection of edited papers, based on presentations given at the 4th International Conference of the Association of Language Testers in Europe (ALTE) held in Kraków, Poland, in July 2011. The selected papers focus on the conference's core themes as follows: the effect of frameworks on teaching, learning and assessment; the value of frameworks for teachers, learners and language policymakers; the contribution of frameworks towards describing particular languages.
    • Exploring performance across two delivery modes for the IELTS Speaking Test: face-to-face and video-conferencing delivery (Phase 2)

      Nakatsuhara, Fumiyo; Inoue, Chihiro; Berry, Vivien; Galaczi, Evelina D. (IELTS Partners, 2017-10-04)
      Face-to-face speaking assessment is widespread as a form of assessment, since it allows the elicitation of interactional skills. However, face-to-face speaking test administration is also logistically complex, resource-intensive and can be difficult to conduct in geographically remote or politically sensitive areas. Recent advances in video-conferencing technology now make it possible to engage in online face-to-face interaction more successfully than was previously the case, thus reducing dependency upon physical proximity. A major study was, therefore, commissioned to investigate how new technologies could be harnessed to deliver the face-to-face version of the IELTS Speaking test. Phase 1 of the study, carried out in London in January 2014, presented results and recommendations of a small-scale initial investigation designed to explore what similarities and differences, in scores, linguistic output and test-taker and examiner behaviour, could be discerned between face-to-face and internet-based video-conferencing delivery of the Speaking test (Nakatsuhara, Inoue, Berry and Galaczi, 2016). The results of the analyses suggested that the speaking construct remains essentially the same across both delivery modes. This report presents results from Phase 2 of the study, which was a larger-scale follow-up investigation designed to: (i) analyse test scores obtained using more sophisticated statistical methods than were possible in the Phase 1 study; (ii) investigate the effectiveness of the training for the video-conferencing-delivered test, which was developed based on findings from the Phase 1 study; (iii) gain insights into the issue of sound quality perception and its (perceived) effect; (iv) gain further insights into test-taker and examiner behaviours across the two delivery modes; (v) confirm the results of the Phase 1 study. Phase 2 of the study was carried out in Shanghai, People’s Republic of China in May 2015.
Ninety-nine (99) test-takers each took two speaking tests under face-to-face and internet-based video-conferencing conditions. Performances were rated by 10 trained IELTS examiners. A convergent parallel mixed-methods design was used to allow for collection of an in-depth, comprehensive set of findings derived from multiple sources. The research included an analysis of rating scores under the two delivery conditions, test-takers’ linguistic output during the tests, as well as short interviews with test-takers following a questionnaire format. Examiners responded to two feedback questionnaires and participated in focus group discussions relating to their behaviour as interlocutors and raters, and to the effectiveness of the examiner training. Trained observers also took field notes from the test sessions and conducted interviews with the test-takers.  Many-Facet Rasch Model (MFRM) analysis of test scores indicated that, although the video-conferencing mode was slightly more difficult than the face-to-face mode, when the results of all analytic scoring categories were combined, the actual score difference was negligibly small, thus supporting the Phase 1 findings. Examination of language functions elicited from test-takers revealed that significantly more test-takers asked questions to clarify what the examiner said in the video-conferencing mode (63.3%) than in the face-to-face mode (26.7%) in Part 1 of the test. Sound quality was generally positively perceived in this study, being reported as 'Clear' or 'Very clear', although the examiners and observers tended to perceive it more positively than the test-takers. There did not seem to be any relationship between sound quality perceptions and the proficiency level of test-takers. While 71.7% of test-takers preferred the face-to-face mode, slightly more test-takers reported that they were more nervous in the face-to-face mode (38.4%) than in the video-conferencing mode (34.3%).  
All examiners found the training useful and effective, the majority of them (80%) reporting that the two modes gave test-takers equal opportunity to demonstrate their level of English proficiency. They also reported that it was equally easy for them to rate test-taker performance in face-to-face and video-conferencing modes.  The report concludes with a list of recommendations for further research, including suggestions for further examiner and test-taker training, resolution of technical issues regarding video-conferencing delivery and issues related to rating, before any decisions about deploying a video-conferencing mode of delivery for the IELTS Speaking test are made.
    • Exploring performance across two delivery modes for the same L2 speaking test: face-to-face and video-conferencing delivery: a preliminary comparison of test-taker and examiner behaviour

      Nakatsuhara, Fumiyo; Inoue, Chihiro; Berry, Vivien; Galaczi, Evelina D. (The IELTS Partners: British Council, Cambridge English Language Assessment and IDP: IELTS Australia, 2016-11-10)
      This report presents the results of a preliminary exploration and comparison of test-taker and examiner behaviour across two different delivery modes for an IELTS Speaking test: the standard face-to-face test administration, and test administration using internet-based video-conferencing technology. The study sought to compare performance features across these two delivery modes with regard to two key areas: • an analysis of test-takers’ scores and linguistic output on the two modes and their perceptions of the two modes • an analysis of examiners’ test management and rating behaviours across the two modes, including their perceptions of the two conditions for delivering the speaking test. Data were collected from 32 test-takers who took two standardised IELTS Speaking tests under face-to-face and internet-based video-conferencing conditions. Four trained examiners also participated in this study. The convergent parallel mixed-methods research design included an analysis of interviews with test-takers, as well as their linguistic output (especially types of language functions) and rating scores awarded under the two conditions. Examiners provided written comments justifying the scores they awarded, completed a questionnaire and participated in verbal report sessions to elaborate on their test administration and rating behaviour. Three researchers also observed all test sessions and took field notes. While the two modes generated similar test score outcomes, there were some differences in functional output and examiner interviewing and rating behaviours. This report concludes with a list of recommendations for further research, including examiner and test-taker training and resolution of technical issues, before any decisions about deploying (or not) a video-conferencing mode of IELTS Speaking test delivery are made.
    • Exploring the potential for assessing interactional and pragmatic competence in semi-direct speaking tests

      Nakatsuhara, Fumiyo; May, Lyn; Inoue, Chihiro; Willcox-Ficzere, Edit; Westbrook, Carolyn; Spiby, Richard; University of Bedfordshire; Queensland University of Technology; Oxford Brookes University; British Council (British Council, 2021-11-11)
      To explore the potential of a semi-direct speaking test to assess a wider range of communicative language ability, the researchers developed four semi-direct speaking tasks – two designed to elicit features of interactional competence (IC) and two designed to elicit features of pragmatic competence (PC). The four tasks, as well as one benchmarking task, were piloted with 48 test-takers in China and Austria whose proficiency ranged from CEFR B1 to C. A post-test feedback survey was administered to all test-takers, after which selected test-takers were interviewed. A total of 184 task performances were analysed to identify interactional moves utilised by test-takers across three proficiency groups (i.e., B1, B2 and C). Data indicated that test-takers at higher levels employed a wider variety of interactional moves. They made use of concurring concessions and counter views when seeking to persuade a (hypothetical) conversational partner to change opinions in the IC tasks, and they projected upcoming requests and made face-related statements in the PC tasks, seemingly to pre-empt a conversational partner’s negative response to the request. The test-takers perceived the tasks to be highly authentic and found the video input useful in understanding the target audience of simulated interactions.
    • Exploring the use of video-conferencing technology in the assessment of spoken language: a mixed-methods study

      Nakatsuhara, Fumiyo; Inoue, Chihiro; Berry, Vivien; Galaczi, Evelina D.; University of Bedfordshire; British Council; Cambridge English Language Assessment (Taylor & Francis, 2017-02-10)
      This research explores how internet-based video-conferencing technology can be used to deliver and conduct a speaking test, and what similarities and differences can be discerned between the standard and computer-mediated face-to-face modes. The context of the study is a high-stakes speaking test, and the motivation for the research is the need for test providers to keep under constant review the extent to which their tests are accessible and fair to a wide constituency of test takers. The study examines test-takers’ scores and linguistic output, and examiners’ test administration and rating behaviors across the two modes. A convergent parallel mixed-methods research design was used, analyzing test-takers’ scores and language functions elicited, examiners’ written comments, feedback questionnaires and verbal reports, as well as observation notes taken by researchers. While the two delivery modes generated similar test score outcomes, some differences were observed in test-takers’ functional output and the behavior of examiners who served as both raters and interlocutors.
    • Exploring the use of video-conferencing technology to deliver the IELTS Speaking Test: Phase 3 technical trial

      Berry, Vivien; Nakatsuhara, Fumiyo; Inoue, Chihiro; Galaczi, Evelina D.; IELTS (IELTS Partners: British Council, Cambridge Assessment English and IDP: IELTS Australia, 2018-01-01)
      This report presents Phase 3 of the study, which was carried out with test-takers in five cities in Latin America. This phase focused only on the video-conferencing mode of delivery of the IELTS Speaking test. The primary aims were to: trial a new platform to deliver video-conferencing tests across different locations; and further investigate the scoring validity of the video-conferencing test.
    • Exploring the value of bilingual language assistants with Japanese English as a foreign language learners

      Macaro, Ernesto; Nakatani, Yasuo; Hayashi, Yuko; Khabbazbashi, Nahal; University of Oxford; Hosei University (Routledge, 2012-04-27)
      We report on a small-scale exploratory study of Japanese students’ reactions to the use of a bilingual language assistant on an EFL study-abroad course in the UK, and we give an insight into the possible effect of using bilingual assistants on speaking production. First-year university students were divided into three groups, all taught by a monolingual (native) speaker of English. Two teachers had monolingual assistants to help them; the third group had a bilingual (Japanese–English) assistant. In the third group, students were encouraged to ask the assistant for help with English meanings and to provide English equivalents for Japanese phrases, especially during student-centred activities. Moreover, the students in the third group were encouraged to code-switch rather than speak hesitantly or clam up in English. In the first two groups, the students were actively discouraged from using Japanese among themselves in the classroom. The data from an open-ended questionnaire suggest that attitudes to having a bilingual assistant were generally positive. Moreover, the ‘bilingual’ group made the biggest gains over the three-week period in fluency and in overall speaking scores, although these gains were not statistically significant. Suggestions for further research are explored, particularly in relation to whether a bilingual assistant may provide support with the cross-cultural challenges faced by EFL learners.
    • Farewell to holistic scoring?

      Hamp-Lyons, Liz (Elsevier, 2016-01-26)
    • Focus on assessment: book review

      Green, Anthony; University of Bedfordshire (2016-10-24)
      Book review: Focus on Assessment E. E. Jang Oxford University Press, 2014, ISBN 9780194000833
    • Fostering the future: the micro-educational impact of a language assessment course

      Green, Anthony; Solnyshkina, Marina; University of Bedfordshire; Kazan Federal University (2013-11-17)
    • The future of JEAP and EAP

      Hamp-Lyons, Liz (Elsevier Ltd, 2015-12-12)
    • General Language Proficiency (GLP): reflections on the "issues revisited" from the perspective of a UK examination board

      Taylor, Lynda (Taylor & Francis, 2014-05-21)
      Looking back to the language testing world of the 1980s in the United Kingdom, we need to be aware that how we perceive or remember ourselves to have been then—whether as individual language testing academics or as corporate language testing organisations—will be shaped by multiple influences. Although we may have been present at and shared in the 1980 discussions, our recollections of how things were then and our views on how they have (or have not) changed will vary. What follows in this article offers a predominantly personal perspective. It is the view as I perceive it, in light of my own journey as a UK-based language teacher and tester over the past 30 years, seen from where I stand now as a consultant to a large international examining board in the United Kingdom. It is also therefore an institutional perspective, drawing on a long association with one particular language testing organisation. Just as my perspective is from the position of only one language testing institution, I am also only one individual from within that institution. There will undoubtedly be other stances, voices, and perspectives that are equally valid and relevant from within the same institution.