    • Development of empirically driven checklists for learners’ interactional competence

      Nakatsuhara, Fumiyo; May, Lyn; Lam, Daniel M. K.; Galaczi, Evelina D.; University of Bedfordshire; Queensland University of Technology; Cambridge Assessment English (2019-03-27)
    • Interactional competence in the workplace: challenges and opportunities

      Galaczi, Evelina D.; Taylor, Lynda; Cambridge Assessment English; University of Bedfordshire (2018-11-25)
    • Learning oriented feedback in the development and assessment of interactional competence

      Nakatsuhara, Fumiyo; May, Lyn; Lam, Daniel M. K.; Galaczi, Evelina D.; Cambridge Assessment English; University of Bedfordshire; Queensland University of Technology (Cambridge Assessment English, 2018-01-01)
      This project developed practical tools to support the classroom assessment of learners’ interactional competence (IC) and to provide learning-oriented feedback in the context of Cambridge English: First (now known as B2 First). To develop a checklist, accompanying descriptions, and recommendations for teachers to use when giving feedback on learners’ interactional skills, 72 stimulated verbal reports were elicited from six trained examiners who were also experienced teachers. They produced verbal reports on 12 paired interactions with high, mid, and low interactive communication scores. The examiners were asked to comment on features of the interaction that influenced their rating of candidates’ IC and, based on the features of performance they noted, to provide feedback to candidates. The verbal reports were thematically analysed using NVivo 11 to inform a draft checklist and materials, which were then trialled by four experienced teachers to refine these resources further. The final product comprises (a) a full IC checklist with nine main categories and over 50 sub-categories that further specify positive and negative aspects, with a detailed description of each area and feedback to learners, and (b) a concise version of the IC checklist with fewer categories and ‘bite-sized’ feedback to learners, to support use by teachers and learners in real time. As such, this research addressed meaningful feedback to learners on IC, an essential component of communicative language ability that cannot be effectively addressed via digital technologies and therefore requires substantial teacher involvement. In line with the Cambridge English Learning Oriented Assessment (LOA) approach (e.g. Hamp-Lyons and Green 2014; Jones and Saville 2014, 2016), this study took a first step towards offering teachers practical tools for feedback on learners’ interactional skills. Additionally, these tools have the potential to be integrated into the learning management system of the Empower course, aligning classroom and standardised assessment.
    • Video-conferencing speaking tests: do they measure the same construct as face-to-face tests?

      Nakatsuhara, Fumiyo; Inoue, Chihiro; Berry, Vivien; Galaczi, Evelina D.; University of Bedfordshire; British Council; Cambridge Assessment English (Routledge, 2021-08-23)
      This paper investigates the comparability of the video-conferencing and face-to-face modes of the IELTS Speaking Test in terms of scores and language functions generated by test-takers. Data were collected from 10 trained IELTS examiners and 99 test-takers who took two speaking tests, one under face-to-face and one under video-conferencing conditions. Many-facet Rasch Model (MFRM) analysis of test scores indicated that the delivery mode did not make any meaningful difference to test-takers’ scores. An examination of language functions revealed that the two modes elicited the same language functions to a similar extent, with the exception of asking for clarification: more test-takers made clarification requests in the video-conferencing mode (63.3%) than in the face-to-face mode (26.7%). Drawing on the findings, we discuss practical implications and extend emerging thinking about video-conferencing speaking assessment and the associated features of this modality in its own right.