To compare the effects of the new and old versions of the curriculum guidelines on junior high school students' performance, and to identify the factors that affect the growth of students' learning ability, there is an urgent need to establish a scale that measures students' cognitive abilities and provides educational interpretations for stakeholders in related fields. However, differential item functioning (DIF) is inevitable in the measurement tools of large-scale educational surveys, and DIF in test items reduces the precision of those tools. This study therefore addresses DIF in the achievement tests of the academic disciplines and in the survey questionnaires. The investigation will be carried out on seventh graders in September 2018 (the last cohort taught under the old curriculum guidelines) and in September 2019 (the first cohort taught under the new curriculum guidelines), and the academic performance of these two cohorts will be tracked continuously throughout junior high school. Two-stage stratified cluster sampling will proceed as follows: the population is stratified by the results of the Comprehensive Assessment Program for Junior High School Students, and schools and then classes are randomly selected within each stratum. In the selected classes, the student and parent questionnaires are administered; the homeroom and subject teachers complete the teacher questionnaires; and the school principal completes the school questionnaire. The study will examine DIF effects arising from both manifest and latent variables across five disciplines, namely Chinese, English, Mathematics, Science, and Social Studies, as well as across the survey questionnaires for students, parents, teachers, and schools. The variation of DIF in the measurement tools, the stability of the discipline tests and survey questionnaires, and the factors that give rise to DIF will also be investigated.
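
For illustration, the two-stage stratified cluster design could be sketched as follows. This is a minimal sketch, not the project's actual sampling program: the school roster, the stratum labels derived from Comprehensive Assessment Program results, the column names, and the per-stratum sample sizes are all hypothetical.

```python
import pandas as pd

# Hypothetical roster: one row per class; each school carries a stratum label
# derived from its Comprehensive Assessment Program results (labels illustrative).
roster = pd.DataFrame({
    "school_id": ["S01", "S01", "S02", "S02", "S03", "S03", "S04", "S04", "S05", "S05", "S06", "S06"],
    "class_id":  ["C1", "C2", "C1", "C2", "C1", "C2", "C1", "C2", "C1", "C2", "C1", "C2"],
    "stratum":   ["high", "high", "high", "high", "mid", "mid", "mid", "mid", "low", "low", "low", "low"],
})

SEED = 2018
N_SCHOOLS_PER_STRATUM = 1   # stage 1 sample size (hypothetical)
N_CLASSES_PER_SCHOOL = 1    # stage 2 sample size (hypothetical)

# Stage 1: randomly select schools within each stratum.
schools = (roster[["school_id", "stratum"]]
           .drop_duplicates()
           .groupby("stratum")
           .sample(n=N_SCHOOLS_PER_STRATUM, random_state=SEED))

# Stage 2: randomly select intact classes within each sampled school;
# every student in a sampled class enters the survey (cluster sampling).
classes = (roster[roster["school_id"].isin(schools["school_id"])]
           .groupby("school_id")
           .sample(n=N_CLASSES_PER_SCHOOL, random_state=SEED))

print(classes)
```

Because whole classes are sampled at the second stage, the resulting data are clustered within schools and classes, which any subsequent DIF or growth analysis would need to take into account.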
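Similarly, one common way to screen a dichotomous item for DIF between the 2018 and 2019 cohorts is logistic-regression DIF analysis, in which nested models with and without a group term and a group-by-score interaction are compared by likelihood-ratio tests. The abstract does not commit to this particular method, so the sketch below is illustrative only and runs on simulated data with a built-in uniform DIF effect.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(2019)

# Simulated responses to one dichotomous item for two cohorts
# (0 = 2018 / old guidelines, 1 = 2019 / new guidelines). Synthetic data only.
n = 400
total = rng.normal(0.0, 1.0, size=2 * n)      # matching variable, e.g. rest score
group = np.repeat([0, 1], n)
logit = 0.8 * total + 0.5 * group             # 0.5-logit uniform DIF built in
item = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

df = pd.DataFrame({"item": item, "total": total, "group": group})

def fit(formula):
    """Fit a binary logistic model; return its log-likelihood and model df."""
    res = sm.Logit.from_formula(formula, data=df).fit(disp=0)
    return res.llf, res.df_model

# Nested models: matching only, + group (uniform DIF), + interaction (non-uniform DIF).
ll0, df0 = fit("item ~ total")
ll1, df1 = fit("item ~ total + group")
ll2, df2 = fit("item ~ total + group + total:group")

def lr_test(ll_small, ll_big, df_diff):
    """Likelihood-ratio chi-square test between two nested models."""
    chi2 = 2.0 * (ll_big - ll_small)
    return chi2, stats.chi2.sf(chi2, df_diff)

print("uniform DIF:     chi2=%.2f, p=%.4f" % lr_test(ll0, ll1, df1 - df0))
print("non-uniform DIF: chi2=%.2f, p=%.4f" % lr_test(ll1, ll2, df2 - df1))
```

A significant improvement from adding the group term indicates uniform DIF; a further significant improvement from the interaction term indicates non-uniform DIF. Analogous checks would apply to each discipline test and questionnaire scale in the study.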