Comparing Two Language Version Of Science Achievement Tests Using Differential Item Functioning

At the national level, the Ministry of Education in Malaysia assesses the achievement of primary school students in reading and writing, mathematics and science. The results of the assessments are used for selection decisions as well as for grading students. Since the implementation of the new language policy of teaching science and mathematics in English, both Malay and English have been used as the language of assessment. The validity of interpreting test results across different language versions is an important issue that needs to be investigated. Translating a test from a source language to a target language does not necessarily produce two psychometrically equivalent tests. The purpose of this study is to identify items in translated achievement tests that may function differently across languages. Differential Item Functioning (DIF) analysis is useful for revealing items whose psychometric characteristics have been altered by translation. Two statistical analyses were conducted to identify and evaluate DIF items. The simultaneous item bias test (SIBTEST), a nonparametric statistical method of assessing DIF in an item, was used. The result obtained was then compared with a one-parameter logistic model analysis, conducted using BILOG-MG V3.0, in assessing DIF in the translated items. Both statistical analyses identified approximately 50% of the science items as displaying DIF. This result suggests that substantial psychometric differences exist between the two language versions of the science test at the item level.

Bibliographic Details
Main Author: Ong, Saw Lan
Format: Article
Language:English
Published: Penerbit Universiti Sains Malaysia 2007
Subjects:
Online Access:http://eprints.usm.my/34335/
http://eprints.usm.my/34335/1/JPP_03_ONG_SAW_LAN_ART_3_%2845-59%29.pdf
_version_ 1848877315210084352
author Ong, Saw Lan
author_facet Ong, Saw Lan
author_sort Ong, Saw Lan
building USM Institutional Repository
collection Online Access
description At the national level, the Ministry of Education in Malaysia assesses the achievement of primary school students in reading and writing, mathematics and science. The results of the assessments are used for selection decisions as well as for grading students. Since the implementation of the new language policy of teaching science and mathematics in English, both Malay and English have been used as the language of assessment. The validity of interpreting test results across different language versions is an important issue that needs to be investigated. Translating a test from a source language to a target language does not necessarily produce two psychometrically equivalent tests. The purpose of this study is to identify items in translated achievement tests that may function differently across languages. Differential Item Functioning (DIF) analysis is useful for revealing items whose psychometric characteristics have been altered by translation. Two statistical analyses were conducted to identify and evaluate DIF items. The simultaneous item bias test (SIBTEST), a nonparametric statistical method of assessing DIF in an item, was used. The result obtained was then compared with a one-parameter logistic model analysis, conducted using BILOG-MG V3.0, in assessing DIF in the translated items. Both statistical analyses identified approximately 50% of the science items as displaying DIF. This result suggests that substantial psychometric differences exist between the two language versions of the science test at the item level.
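The description above refers to SIBTEST and a one-parameter logistic model fitted in BILOG-MG; neither of those exact analyses is reproduced here. As an illustrative sketch only, the same idea of screening an item for DIF can be shown with the simpler Mantel-Haenszel procedure on synthetic data: examinees from the two language groups are stratified by total score, and a common odds ratio far from 1 flags the item as a possible DIF item.

```python
# Illustrative sketch only: the study used SIBTEST and BILOG-MG V3.0.
# This shows the related Mantel-Haenszel DIF statistic on synthetic data.

def mantel_haenszel_dif(reference, focal):
    """Estimate the Mantel-Haenszel common odds ratio for one item.

    reference / focal: lists of (total_score, item_correct) pairs for the
    two (hypothetical) language groups. Examinees are stratified by total
    score, and a 2x2 table (group x correct) is formed within each stratum.
    An odds ratio near 1 suggests no DIF; values far from 1 flag the item.
    """
    strata = {s for s, _ in reference} | {s for s, _ in focal}
    num = den = 0.0
    for s in strata:
        a = sum(1 for t, c in reference if t == s and c == 1)  # ref correct
        b = sum(1 for t, c in reference if t == s and c == 0)  # ref wrong
        f1 = sum(1 for t, c in focal if t == s and c == 1)     # focal correct
        f0 = sum(1 for t, c in focal if t == s and c == 0)     # focal wrong
        n = a + b + f1 + f0
        if n == 0:
            continue
        num += a * f0 / n
        den += b * f1 / n
    return num / den if den else float("inf")

# Synthetic example: in every score stratum the reference group answers the
# item correctly more often, so the odds ratio exceeds 1 (possible DIF).
reference = [(s, 1) for s in (1, 2, 3) for _ in range(8)] + \
            [(s, 0) for s in (1, 2, 3) for _ in range(2)]
focal     = [(s, 1) for s in (1, 2, 3) for _ in range(5)] + \
            [(s, 0) for s in (1, 2, 3) for _ in range(5)]
print(mantel_haenszel_dif(reference, focal))  # prints 4.0
```

Stratifying by total score is what separates DIF from a simple group difference in ability: the comparison is made only between examinees of matched overall proficiency, which is also the logic underlying SIBTEST's matching on the valid subtest.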
first_indexed 2025-11-15T17:13:29Z
format Article
id usm-34335
institution Universiti Sains Malaysia
institution_category Local University
language English
last_indexed 2025-11-15T17:13:29Z
publishDate 2007
publisher Penerbit Universiti Sains Malaysia
recordtype eprints
repository_type Digital Repository
spelling usm-343352017-05-19T02:53:34Z http://eprints.usm.my/34335/ Comparing Two Language Version Of Science Achievement Tests Using Differential Item Functioning Ong, Saw Lan LB2300 Higher Education At the national level, the Ministry of Education in Malaysia assesses the achievement of primary school students in reading and writing, mathematics and science. The results of the assessments are used for selection decisions as well as for grading students. Since the implementation of the new language policy of teaching science and mathematics in English, both Malay and English have been used as the language of assessment. The validity of interpreting test results across different language versions is an important issue that needs to be investigated. Translating a test from a source language to a target language does not necessarily produce two psychometrically equivalent tests. The purpose of this study is to identify items in translated achievement tests that may function differently across languages. Differential Item Functioning (DIF) analysis is useful for revealing items whose psychometric characteristics have been altered by translation. Two statistical analyses were conducted to identify and evaluate DIF items. The simultaneous item bias test (SIBTEST), a nonparametric statistical method of assessing DIF in an item, was used. The result obtained was then compared with a one-parameter logistic model analysis, conducted using BILOG-MG V3.0, in assessing DIF in the translated items. Both statistical analyses identified approximately 50% of the science items as displaying DIF. This result suggests that substantial psychometric differences exist between the two language versions of the science test at the item level.
Penerbit Universiti Sains Malaysia 2007 Article PeerReviewed application/pdf en http://eprints.usm.my/34335/1/JPP_03_ONG_SAW_LAN_ART_3_%2845-59%29.pdf Ong, Saw Lan (2007) Comparing Two Language Version Of Science Achievement Tests Using Differential Item Functioning. The Asia Pacific Journal of Educators and Education (formerly known as Journal of Educators and Education), 22 (1). pp. 1-15. ISSN 2289-9057 http://apjee.usm.my/APJEE_22_2007/JPP%2003%20ONG%20SAW%20LAN%20ART%203%20(45-59).pdf
spellingShingle LB2300 Higher Education
Ong, Saw Lan
Comparing Two Language Version Of Science Achievement Tests Using Differential Item Functioning
title Comparing Two Language Version Of Science Achievement Tests Using Differential Item Functioning
title_full Comparing Two Language Version Of Science Achievement Tests Using Differential Item Functioning
title_fullStr Comparing Two Language Version Of Science Achievement Tests Using Differential Item Functioning
title_full_unstemmed Comparing Two Language Version Of Science Achievement Tests Using Differential Item Functioning
title_short Comparing Two Language Version Of Science Achievement Tests Using Differential Item Functioning
title_sort comparing two language version of science achievement tests using differential item functioning
topic LB2300 Higher Education
url http://eprints.usm.my/34335/
http://eprints.usm.my/34335/1/JPP_03_ONG_SAW_LAN_ART_3_%2845-59%29.pdf