Two major recent international assessments, PISA and ICILS, raise serious concerns about students’ capacity to critically assess information found online
by Pisana Ferrari – cApStAn Ambassador to the Global Village
“Reading is no longer mainly about extracting information; it is about constructing knowledge, thinking critically and making well-founded judgements.” “Students need to be able to read complex texts, distinguish between credible and untrustworthy sources of information, and between fact and fiction, and question or seek to improve the accepted knowledge and practices of our times.” These are quotes from the recently published report on PISA 2018, the seventh three-year cycle of the OECD’s Programme for International Student Assessment, which began in 2000. (1) PISA 2018 assessed around 600,000 15-year-old students in 79 countries and economies on reading, science and mathematics. Each PISA cycle focuses mainly on one of these three domains: for PISA 2018 it was reading literacy.
The results of PISA 2018 in this respect are alarming: fewer than 1 in 10 students in OECD countries were able to distinguish between fact and opinion based on implicit cues pertaining to the content or source of the information. The share of low performers, among both girls and boys, also increased on average between 2009, the last time reading literacy was the main PISA domain, and 2018. Regarding the methodology, the report explains that the assessment tasks were designed around texts composed of several smaller units, each written by different authors or at different times; examples include an online forum with multiple posts and a blog that links to a newspaper article. Computer delivery made it possible to use digital navigation tools, such as hyperlinks or tabs, and to present tasks in realistic scenarios in which the number of available text sources increases as the student progresses through the assessment. To see what some of these tasks look like, go to www.oecd.org/pisa/test/
The results of the IEA’s International Computer and Information Literacy Study (ICILS), announced recently, are just as disconcerting: over 40% of students were found to have only a minimal ability to critically assess information found online. Eighteen percent failed to reach even the lowest level of the computer and information literacy (CIL) scale, and 25% scored at the lowest level. (2) The study, which involved 46,000 grade 8 students and 26,000 teachers from some 2,200 schools in fourteen countries across Europe, Asia and North America, was designed to address the critical question of how well students are prepared for study, work and life in a digital world. The type of literacy analysed refers to “students’ ability to use computers to investigate, create, and communicate in order to participate effectively at home, at school, in the workplace, and in the community”. One of the aspects analysed was “the capacity to find, retrieve and make judgements about the relevance, integrity, and usefulness of computer-based information”.
Much more could be said about these results, of course: country and gender variance, the effect of exposure to books at home on reading proficiency (in PISA), access to technology, the correlation between students’ (and schools’) socioeconomic status and their performance, and so on. On the latter point, as might be expected, both PISA and ICILS find that socioeconomic status is associated with student performance, although PISA identified pockets of excellence even among disadvantaged students and schools, leading the PISA researchers to express mild optimism that, to use their words, disadvantage need not become destiny. Factors that PISA shows to be positively associated with “academic resilience” include support from parents, a positive school climate and having a growth mindset.
The key finding, however, in our view, is young people’s apparent inability to critically assess information found online. We cannot but agree with the PISA report that, in the current “post-truth” climate, where quantity seems to be valued more than quality when it comes to information, “education is no longer just about teaching people something, but about helping people build a reliable compass and the navigation tools to find their own way through an increasingly volatile, uncertain and ambiguous world”.
Studies conducted by the OECD and the IEA are extremely important in this respect because they make it possible to understand and compare education ecosystems across the world, and they provide meaningful information for policy makers and education stakeholders. At cApStAn Linguistic Quality Control we are proud to have been asked to verify the translations of the ICILS data collection instruments, with a view to maximizing comparability across language versions. We have had the privilege of working with the IEA for 19 years, including on other flagship projects such as TIMSS (Trends in International Mathematics and Science Study) and PIRLS (Progress in International Reading Literacy Study). PISA 2018 is the seventh PISA cycle in which cApStAn was charged with maximizing cross-language and cross-country comparability; we started our collaboration with PISA in 2000. For the PISA 2018 programme we verified 106 translated or adapted versions of the survey instruments.
Footnotes
Photo credit: Cover of “PISA 2018: Insights and interpretations”, by Andreas Schleicher