Programme for International Student Assessment (PISA): A case example of interpreting findings

Karl-Göran Karlsson

International comparative studies of educational outcomes have been performed for almost 50 years. For a long time this arena has been dominated by the IEA, which has studied mathematics and science achievement through TIMSS and its predecessors, as well as reading through studies such as PIRLS. In the last decade another major stakeholder entered the scene – the OECD. This powerful organization had long been interested in education, an interest manifested, for instance, in the annual publication Education at a Glance, which compares a large number of characteristics of the educational systems in the 30 member countries. This publication, however, is mainly concerned with input into the systems.

In the late 1990s it was decided that the OECD should start a programme for assessing the outcomes of the educational systems in the member countries. The result of this decision was the Programme for International Student Assessment (PISA). The topics chosen for assessment were reading, mathematics and science, partly because these areas were considered important, and partly because competences in these areas are reasonably culturally independent and therefore yield results that are comparable across countries.

The PISA design

PISA is a joint construction of the countries engaged in the OECD and was designed to be administered to students aged 15, who are typically in their last year of compulsory school. The test measures cognitive competences (literacy) and collects background data on each student and each participating school through a student questionnaire and a school questionnaire. Unlike many other studies (e.g. TIMSS), PISA is not closely linked to the curricula of the participating countries. Instead, PISA is designed to measure competences needed to function as an informed citizen in modern society, competences generally called literacy. Descriptions of PISA’s interpretation of the test areas can be found on the official PISA website (http://www.pisa.oecd.org/) and in the framework for the assessment (OECD, 2003); in short, the domains have the following definitions:

Reading literacy is the ability to understand, use, and reflect upon written texts in order to achieve one’s goals, develop one’s knowledge and potential, and participate in society.

Mathematical literacy is an individual’s capacity to identify and understand the role that mathematics plays in the world; to make well-founded judgements; and to use and engage mathematics in ways that serve the needs of individuals to be a constructive, concerned and reflective citizen.

Scientific literacy is the capacity to use scientific knowledge, to identify questions and to draw evidence-based conclusions in order to understand and help make decisions about the natural world and the changes made to it through human activity.

PISA is administered every three years, starting in 2000. In each administration there is a dominant domain (reading, mathematics or science) that comprises approximately two thirds of the test items; the remaining items are distributed between the other two literacy areas. In the first cycle (2000) reading literacy was the major domain, in the second (2003) it was mathematical literacy, and in the recently conducted third cycle (2006) it was scientific literacy. The number of participating students is between 4,500 and 10,000 per country. In 2000, 31 countries participated in the survey; in 2003 the number had increased to 41, and in 2006 to 59.

PISA is a very rigorous study. All test items are piloted in smaller field trials in all participating countries one year before the main study, and items showing strong cultural or gender bias are removed. The questionnaires are piloted at the same time. Sampling rules are very strict in order to obtain representative results from each country, and there are also strict limits on the permitted rates of exclusion and non-response.

The results of PISA

Initial results of the two PISA studies conducted so far have been published in two international reports (OECD, 2001; 2004). In these reports overall results for each domain and each country are presented and analysed. In addition, a substantial number of thematic reports have been produced, covering topics such as factors that make school systems perform well, students’ learning strategies, their sense of belonging in school, and the situation of immigrant students. Further publications are listed on the PISA website, and many national reports have also been produced.

The impact of PISA has generally been great. In a number of participating countries, educational reforms based on the findings of PISA have been initiated. It is highly likely that PISA will become increasingly important as more data accumulate, allowing more sophisticated analyses, including trend analyses. In Sweden, analyses are conducted to examine trends in student literacy in relation to educational programming and policy. The following is an example of one such trend and of how the data are being examined. The example is provided both to suggest ways in which the data can support a deeper analysis of school development and to give some insight into educational trends in Sweden at the present time.

One Swedish example

A cornerstone of Swedish educational policy is that all students should have equal opportunities. Among other things, this means that less able students should receive extra attention. The Swedish school system is a goal-based system with a high degree of local freedom (at school level). The overall national goals are set out by the Swedish Parliament and Government in the Education Act, the Curriculum for the Compulsory School System and the course syllabi for compulsory school. The National Agency for Education draws up and takes decisions on general recommendations and grading criteria for all types of Swedish schools.

To support teachers in their grading work and to ensure equivalence, compulsory national tests are given in the core subjects Swedish, English and Mathematics. The main purpose of the tests is to help teachers assess to what extent pupils have attained the goals set out in the syllabi and to support teachers in awarding grades. Since a grading reform in 1999, a student must obtain a pass grade in the core subjects Swedish, English and Mathematics to qualify for a national programme in upper secondary school.

In Sweden, no significant changes between PISA 2000 and PISA 2003 occurred in any of the three test domains (reading literacy, mathematical literacy and scientific literacy) with regard to the country means. Moreover, the Swedish scores were significantly above the OECD average in all three domains. Seen at that scale, the Swedish results do not seem very exciting.

What happened to science results?

However, when viewed in more detail, there are some interesting findings. Figure 1 displays a comparison between Swedish results and the OECD averages for students of different ability. For selected performance percentiles, the vertical scale gives the difference between the Swedish result and the OECD average in PISA 2000.

Figure 1. Comparison between Swedish and OECD results for different domains in PISA 2000.

As is quite obvious from the figure, low-performing students score better than their peers in the OECD, whereas the differences at the high-performing end are small. One interpretation is that the Swedish school system succeeds in giving low-ability students a good start, which, as mentioned above, is an important ambition. Figure 2 displays the corresponding results in PISA 2003. The curves for reading and mathematics are quite similar to those of PISA 2000, but the science curve is radically different. According to this curve, Swedish students no longer have a significant advantage over the OECD average at any percentile. The drop at the low-performing end between PISA 2000 and 2003 is significant at the 1% level, which indicates a real decline. Such a large change in only three years is very uncommon in education, where changes are usually slow. So what has happened?
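To make the construction of Figures 1 and 2 concrete, the sketch below illustrates one simplified way such percentile differences could be computed from a student-level data file. The file name and column names are hypothetical, and the calculation ignores the five plausible values per domain and the survey and replicate weights that an official PISA analysis would use; it is a sketch of the idea, not the actual PISA methodology.

```python
import numpy as np
import pandas as pd

# Hypothetical flat file with one row per student. A real PISA analysis
# would combine the five plausible values for each domain and apply the
# student and replicate weights; this simplified sketch ignores both.
data = pd.read_csv("pisa2003_students.csv")  # assumed columns: country, science_score

percentiles = [5, 10, 25, 75, 90, 95]

# Swedish score at each selected percentile of the performance distribution.
swe_scores = data.loc[data["country"] == "SWE", "science_score"]
swe = np.percentile(swe_scores, percentiles)

# "OECD average" taken here as the unweighted mean, over countries, of each
# country's own percentile scores, so that every country counts equally.
country_pcts = np.vstack([
    np.percentile(scores, percentiles)
    for _, scores in data.groupby("country")["science_score"]
])
oecd = country_pcts.mean(axis=0)

# Positive differences mean that Swedish students outperform the OECD
# average at that point of the distribution (cf. Figures 1 and 2).
for p, d in zip(percentiles, swe - oecd):
    print(f"P{p}: {d:+.1f} score points")
```

Testing whether the drop between cycles is significant, as reported above, would additionally require standard errors estimated from the replicate weights, which is why this simplified sketch stops at the point estimates.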

Already in the Swedish national PISA report (Skolverket, 2004), the poorer performance of the weaker students was discussed. At that time there was not much additional support for the idea presented in that report - that the decline could be due to the extra emphasis put on the ‘core subjects’ Swedish, English and Mathematics after the grading reform of 1999. In a recent study, Eriksson et al. (2004) discuss the effects of the requirement that every student wanting to enter a national programme in upper secondary school must have a pass grade in those three subjects. The authors interviewed a substantial number of teachers. For instance, one teacher says:

“And then maybe this demand of eligibility in only three subjects to get into upper secondary school, it is also a risk. You need to get passed in maths, then we take something else away and push in more maths for example.” (ibid., p. 41)

The authors of the report conclude:

“Teachers interpret their task as guaranteeing a three-subject school, where their mission is to make sure that students get at least a pass grade in Swedish, English and Mathematics.” (ibid., p. 43f)

We feel that the report strongly supports the idea put forward in the national PISA report (Skolverket, 2004): that less time and effort is devoted to subjects other than the three ‘core subjects’, and that weakly performing students suffer from this. The lowest achievers show a slightly increased score in mathematics from PISA 2000 to PISA 2003. More interesting, however, is the fact that the proportion of students not reaching the pass goals on the national mathematics test was much smaller in 2003 than in 2000. This supports the idea that high priority is given to mathematics.

Figure 2. Comparison between Swedish and OECD results for different domains in PISA 2003.

Immigrant students

Seen as a group, immigrant students often perform worse than native students on various tests. Of course, this does not mean that all immigrant students are low performers. We have investigated PISA results for Swedish native students and for first-generation immigrant students, i.e. students who were themselves born in Sweden but whose parents were both born in another country. For the three domains – reading literacy, mathematical literacy and scientific literacy – we have plotted the differences between native and immigrant students in PISA 2000 and PISA 2003. The result is shown in Figure 3.

Figure 3. Differences between native students and first-generation immigrant students in PISA 2000 and PISA 2003 for each of the three test domains.

The result is very clear. The differences between the student groups have decreased considerably in reading and mathematics, while the difference has increased in science. Again, our interpretation points to the strong focus on core subjects such as Swedish and mathematics.
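As an illustration of the group comparison behind Figure 3, the sketch below computes the native versus first-generation gap for each domain and cycle from a simplified student-level file. The file names, column names and the coding of the background variable are all hypothetical, and plausible values and weights are again left out for brevity.

```python
import pandas as pd

domains = ["reading", "mathematics", "science"]

for cycle in (2000, 2003):
    # Hypothetical flat file per cycle; 'background' is assumed to be coded
    # 'native' or 'first_generation' (born in Sweden, both parents born abroad).
    data = pd.read_csv(f"pisa{cycle}_sweden.csv")
    means = data.groupby("background")[domains].mean()

    # A positive gap means native students score higher than first-generation
    # immigrant students in that domain (cf. Figure 3).
    gap = means.loc["native"] - means.loc["first_generation"]
    print(f"PISA {cycle} native vs first-generation gap (score points):")
    print(gap.round(1), end="\n\n")
```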

Conclusions

It seems clear that low-performing students do worse in science in PISA 2003 than in 2000, while results in reading and mathematics are unchanged or better in 2003 than in 2000. We also see a decreased difference between native and immigrant students in reading and mathematics, but an increased difference in science. We interpret these findings in terms of the extra focus put on the core subjects Swedish, English and Mathematics, an interpretation supported by other studies.

We are somewhat concerned by these findings. Since pass grades in the core subjects are necessary for eligibility for national programmes in upper secondary school, a strategy focused on these subjects will lead to more students entering those programmes. The problem is what happens next. Once in upper secondary school, a student will undoubtedly meet subjects other than the core subjects. If students have been allowed to pay less attention to these subjects during their earlier school career, it is likely that they will face problems in upper secondary school. Further studies will show whether these apprehensions turn out to be justified.

References

Eriksson, I., Orlander, A. A. & Jedemark, M. (2004). Att arbeta för godkänt – timplanens roll i ett förändrat uppdrag. Centrum för studier av skolans kunskapsinnehåll i praktiken. Stockholm: HLS Förlag.

OECD (2001). Knowledge and Skills for Life. First Results from PISA 2000. Paris: OECD.

OECD (2003). The PISA Assessment Framework. Mathematics, Reading, Science and Problem Solving Knowledge and Skills. Paris: OECD.

OECD (2004). Learning for Tomorrow’s World. First Results from PISA 2003. Paris: OECD.

Skolverket (2004). PISA 2003 – Svenska femtonåringars kunskaper och attityder i ett internationellt perspektiv. Rapport 254. Stockholm: Skolverket.