A Valid and Reliable English Language Proficiency Exam for a University Language Program in Iran


  • Gholam-Reza Abbasian, Imam Ali University and Islamic Azad University, South Tehran Branch
  • Elmira Hajmalek, Islamic Azad University, Kish International Branch




Proficiency exams are used for numerous purposes, and several commercially produced proficiency exams are available on the market today. However, these exams are costly, offered only at limited times, and may not be appropriate for the needs of some programs. As a result, many universities are in the process of creating their own language proficiency exams, yet there are few models for educational institutions to follow. The purpose of this paper is to present the procedures a university followed to create a language proficiency exam with appropriate validity, high reliability, and strong correlations with established standardized exams. Methodology: First, the paper outlines the procedures that were followed to create the three sections of the exam (grammar, reading, and listening). Next, the steps that were used to determine validity and estimate reliability are presented. Conclusion: Finally, the paper concludes with a discussion and explanation of changes to the test specifications intended to better reflect the English ability of current university students in Iran. It is hoped that this paper will serve as a model for other schools that want to create their own language proficiency exams.
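The paper does not specify which reliability estimate was used, but for a multiple-choice exam of this kind internal consistency is commonly estimated with Cronbach's alpha (equivalent to KR-20 for dichotomous items). The sketch below, with hypothetical toy data, shows one way such an estimate could be computed from an examinee-by-item score matrix; it is an illustration of the general technique, not the authors' procedure.

```python
import numpy as np

def cronbach_alpha(scores):
    """Estimate internal-consistency reliability (Cronbach's alpha).

    scores: 2-D array, rows = examinees, columns = items
    (0/1 for dichotomous items, as on a multiple-choice exam).
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical toy data: 5 examinees answering 4 dichotomous items
responses = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
print(round(cronbach_alpha(responses), 3))  # → 0.8
```

Values above roughly 0.8 are conventionally taken to indicate acceptable reliability for a proficiency exam, though the threshold depends on the stakes of the decision being made.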

