Evaluating locally-developed language testing
A predictive study of ‘direct entry’ language programs at an Australian University
Nicholas Cope
Centre for Macquarie English, Macquarie University
The study reported here investigates the predictive validity of language assessments awarded by ‘Direct Entry’ programs at an Australian university – programs developed on site for non-English-speaking-background international students, principally to provide (i) pre-entry academic and language preparation and (ii) language assessment for university admissions purposes. All 138 students in the sample had entered degree studies via one of the three programs that made up the locally-developed Direct Entry pathway. Inferential statistics (correlation and regression) showed that the assessments awarded by two of the programs satisfactorily predicted academic outcomes, while predictive validity for the third was not demonstrated. Descriptive statistics (mean pass rates and academic averages) then revealed a pattern of relatively poor academic performance in certain university disciplines to which particular Direct Entry programs were dedicated. Informed by principles of language program evaluation, the study’s outcomes were treated as both summative and formative, and remedial strategies are accordingly recommended. While the findings bear most directly on the institutional context in which the study was conducted, the study instantiates a perspective on language assessment validation of broader relevance in Australia, where locally-developed Direct Entry programs – about which the research literature is largely silent – are increasingly widespread.
Keywords: language tests, test validity and reliability, English for academic purposes, Australia, higher education, academic achievement