Vol. 10:1 (2024), pp. 10–34
Assessing pronunciation using dictation tools
The use of Google Voice Typing to score a pronunciation placement test
Language institutions need efficient and reliable placement tests to ensure students are placed in appropriate classes. One way to achieve this is to automate the scoring of pronunciation tests using speech recognition, whose reliability has been shown to be comparable to that of human raters. However, this technology can be costly, as it requires development and maintenance, placing it beyond the means of many institutions. This study investigates the feasibility of assessing second language (L2) English pronunciation in placement tests using a free automatic speech recognition tool, Google Voice Typing (GVT). We compared human-rated and GVT-rated scores of 56 pronunciation placement tests. Our results indicate a strong correlation between the two sets of scores, both for the final rating and for each criterion on the rubric used by human raters. We conclude that leveraging this free speech technology could increase the test usefulness of language placement tests.
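The comparison described above amounts to correlating two score series, one per rater type, across the same set of tests. A minimal sketch of that computation is shown below; the score values are hypothetical illustrations, not the study's data, and the plain Pearson coefficient is only one of several statistics such a validation study might report.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical final ratings for six test-takers (not the study's data):
human_scores = [3.0, 4.5, 2.0, 5.0, 3.5, 4.0]  # human raters
gvt_scores   = [2.8, 4.6, 2.2, 4.9, 3.4, 4.1]  # GVT-derived

r = pearson_r(human_scores, gvt_scores)
print(f"Pearson r = {r:.3f}")
```

A value of r close to 1 would indicate the strong positive association between human-rated and GVT-rated scores that the study reports; in practice one would run this per rubric criterion as well as for the final rating.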
Article outline
- 1. Introduction
- 2. Background
- 2.1 Human rater biases when assessing pronunciation
- 2.2 Automatic Speech Recognition (ASR)
- 2.3 Current use of automated assessment of pronunciation
- 2.4 ASR-based dictation tools and L2 pronunciation
- 2.5 Google Voice Typing
- 2.6 Test usefulness
- 3. The study
- 4. Method
- 4.1 Overview
- 4.2 Research context and participants
- 4.3 Data collection materials
- 4.3.1 Pronunciation samples
- 4.3.2 Rubric
- 4.4 Procedure
- 4.4.1 Pronunciation samples
- 4.4.2 Human-rated scores and analysis
- 4.4.3 GVT-rated scores and analysis
- 4.5 Data analysis
- 5. Results
- 6. Discussion
- 6.1 Reliability
- 6.2 Validity
- 6.3 Practicality
- 7. Conclusion
- Notes
- References
https://doi.org/10.1075/jslp.23033.joh