Thursday, November 29, 2007

PISA results 2006 (Science)

OECD has just released the tables for the 2006 PISA survey on the science skills of 15-year-olds (see the table on the left). I am also attaching below an extract of the relevant OECD news release.
Although some people criticise this kind of ranking, I consider the exercise an important tool in the effort to improve an educational system.

The PISA survey, based on tests carried out in 2006 in 57 countries that together account for nearly 90% of world GDP, is the most comprehensive and rigorous international yardstick of secondary-school students' attainments. After focusing in 2000 on reading skills and in 2003 on mathematics, PISA 2006 tested students on how much they knew about science and their ability to use scientific knowledge and understanding to identify and address questions and resolve problems in daily life.
Comparisons between the results of the 2006 tests and those of previous years are not strictly valid.

PISA is a three-yearly survey of the knowledge and skills of 15-year-olds in OECD member countries and partner countries and economies. The product of collaboration between participating countries through the Organisation for Economic Co-operation and Development (OECD), it draws on leading international expertise to develop valid comparisons across countries and cultures.
In 2006, PISA assessed the competencies of 15-year-old students with an extensive two-hour test. More than 400,000 students from 57 countries, together making up close to 90% of the world economy, took part. The focus was on science, but the assessment also included reading and mathematics and collected data on student, family and institutional factors that can help to explain differences in performance.
The table summarises the performance of 15-year-olds in science.
It shows three main pieces of information:
  • The average score of the country on the science assessment. Across the 30 OECD countries the scores are statistically standardised to have an average of 500 points. The scores are also standardised so that approximately two-thirds of the students score between 400 and 600; that is, the standard deviation is 100. The colour coding in the list of the countries in the first column gives an indication of whether the country's score is statistically significantly above, below or not different from the OECD average.

  • The rank of the country compared to other OECD countries. When a sample of students represents all students in a country, it is not always possible to state with 100% accuracy what the exact rank of the country is compared with other countries. For this reason, OECD calculates, with 95% confidence, a range of ranks that the country falls within. For example, in the list above, OECD is 95% confident that New Zealand ranks between 2nd and 5th of all the OECD countries.
  • The rank of the country compared to all the countries which participated in PISA 2006. The same 95% level of confidence is applied when comparing a country's position to all the other countries. For example, the list above shows, with 95% confidence, that Croatia ranks between 23rd and 30th position of all the countries which participated in PISA 2006.
For further background reading, see:
Assessing Scientific, Reading and Mathematical Literacy, A Framework for PISA 2006,
Sample test questions from the PISA 2006 assessment
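As a side note on the mechanics, the standardisation and the confidence intervals described above can be sketched in a few lines of Python. The raw scores below are invented purely for illustration; the real PISA scaling uses item-response-theory models and replicate weights, so this only shows the basic idea.

```python
import statistics

def standardise(raw, target_mean=500, target_sd=100):
    """Linearly rescale scores to a chosen mean and standard deviation."""
    m = statistics.mean(raw)
    s = statistics.pstdev(raw)  # population standard deviation
    return [target_mean + target_sd * (x - m) / s for x in raw]

# Invented raw test scores, for illustration only.
raw = [52, 61, 47, 70, 58, 49, 64, 55]
scaled = standardise(raw)
print(round(statistics.mean(scaled)))    # 500
print(round(statistics.pstdev(scaled)))  # 100

# A 95% confidence interval for a sample mean, using the normal
# approximation: mean +/- 1.96 * standard error. PISA applies an
# analogous idea to turn a country's estimated score into a range
# of plausible ranks rather than a single rank.
n = len(scaled)
se = statistics.stdev(scaled) / n ** 0.5
mean = statistics.mean(scaled)
low, high = mean - 1.96 * se, mean + 1.96 * se
```

Because two countries' confidence intervals can overlap, their ranks cannot always be ordered with certainty, which is why the table reports a range of ranks instead of a single position.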
