The Programme for International Student Assessment, better known as PISA, is an international study run by the OECD that measures how well 15-year-old students can use what they know in mathematics, reading, and science when they face unfamiliar, real-world tasks. It is not built as a memory test of one national curriculum. It asks a narrower and more demanding question: can students transfer knowledge, reason with it, and use it under new conditions? That single design choice explains why PISA is cited so often in debates on school quality, equity, curriculum reform, teacher preparation, digital learning, and long-run human capital.[a][b]
What PISA Measures in Practice
- Target group: students aged 15 years and 3 months to 16 years and 2 months, enrolled in grade 7 or above.[c]
- Core subjects: mathematics, reading, and science in every cycle.[a]
- Cycle pattern: a focal domain rotates by cycle; PISA 2022 focused on mathematics, while PISA 2025 focuses on science.[d][e]
- Assessment mode: a computer-based test plus background questionnaires for students and school leaders; some files also include parent and teacher data.[f][g]
- Latest published core results: PISA 2022, released on 5 December 2023.[d]
- Current forward path: PISA 2025 results are due in 2026 for core domains and in 2027 for the innovative digital-world component; PISA 2029 is already being prepared with media and AI literacy as an innovative domain.[a][e][h]
Why PISA Exists and Why 15-Year-Olds Are Tested
PISA was created because comparing education systems only by years of schooling, graduation rules, or national exam pass rates gives an incomplete picture. Countries differ in curriculum sequence, school structure, tracking, compulsory years, and grading practices. A student can spend the same number of years in school in two countries and still leave with very different levels of literacy, numeracy, and scientific reasoning. PISA therefore uses an age-based sample rather than a grade-based sample as its anchor. That makes international comparison more stable, because it tests students at roughly the point when most are nearing the end of compulsory education.[b][c]
This age choice matters. PISA is not mainly trying to tell whether a country’s tenth-grade textbook is harder than another country’s ninth-grade textbook. It is trying to estimate whether young people have reached a level of transferable competence that will help them continue learning, enter further education, and function in adult life. That is why PISA tasks often ask students to interpret graphs, compare routes, reason through evidence, or evaluate claims instead of only reproducing formulas or definitions.[b][i]
How PISA Is Built
The assessment is designed as a large-scale sample survey, not a census in most systems. OECD guidance states that the target population includes students in the defined age band who are enrolled in an educational institution at grade 7 or higher. Countries draw school samples and then student samples inside those schools. Sampling weights are later applied so that the tested students represent the wider national cohort. In most participating systems, PISA covers more than 80% of all 15-year-olds, but coverage can be much lower in places where many adolescents are outside school. That means PISA is both a learning study and, indirectly, a signal about school participation.[c]
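The weighting logic described above can be sketched in a few lines. This is a minimal illustration with made-up scores and weights, not real PISA data: each tested student carries a weight equal to the number of students in the national cohort they represent, and estimates are computed against those weights rather than against the raw sample.

```python
# Minimal sketch of how sampling weights work (illustrative numbers,
# not real PISA data). Each tested student carries a weight equal to
# the number of 15-year-olds in the cohort they represent.
students = [
    {"score": 520.0, "weight": 180.0},  # represents 180 students
    {"score": 455.0, "weight": 310.0},
    {"score": 610.0, "weight": 95.0},
    {"score": 480.0, "weight": 240.0},
]

# The unweighted mean treats every tested student equally ...
unweighted = sum(s["score"] for s in students) / len(students)

# ... while the weighted mean estimates the cohort-level average.
total_weight = sum(s["weight"] for s in students)
weighted = sum(s["score"] * s["weight"] for s in students) / total_weight

print(f"unweighted mean: {unweighted:.1f}")
print(f"weighted mean:   {weighted:.1f}")
```

In this toy example the two means differ because the lower-scoring students represent larger slices of the cohort; official PISA estimation is more elaborate (replicate weights, plausible values), but the basic idea is the same.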
The technical architecture is stricter than many popular summaries suggest. PISA 2022 required at least 150 participating schools in a national sample in standard cases, and it also required at least 80% of selected students in participating schools to sit the test at the national level. These rules exist to reduce bias and protect comparability. Without that discipline, a high score could reflect selective participation rather than stronger learning outcomes.[j]
Each cycle includes a two-hour computer-based test and background questionnaires. The questionnaires collect information on student attitudes, socio-economic background, school climate, resources, instructional practices, and the learning environment. That is why PISA is discussed far beyond rankings. Its database lets researchers study links between performance and variables such as disciplinary climate, digital device use, parent-school relations, sense of belonging, teacher support, and learning time.[k][f][g]
| Feature | How PISA Handles It | Why It Matters |
|---|---|---|
| Population | 15-year-olds in grade 7 or above | Supports cross-country comparison despite different school structures |
| Subjects | Math, reading, science in every cycle | Tracks core learning outcomes over time |
| Cycle design | One focal domain rotates each cycle | Allows deeper measurement in the major area |
| Mode | Computer-based assessment | Captures modern task formats and process data |
| Questionnaires | Student and school context data; some cycles add other respondent groups | Links outcomes to equity, resources, climate, and learning conditions |
| Sampling | Nationally representative school and student samples with weights | Prevents simple score reading without population context |
| Reporting | Mean scores, proficiency levels, subgroup gaps, contextual indicators | More useful than a single league table |
What PISA Actually Reports
The public conversation often compresses PISA into one number per country. That is the least interesting use of the study. PISA reports at least four layers of evidence.
- Average performance in mathematics, reading, and science.
- Distribution of performance, including the share of low performers and top performers.
- Equity indicators, including how strongly socio-economic status relates to outcomes.
- Contextual indicators, drawn from questionnaires on school experiences, learning resources, motivation, well-being, and system features.[d][l]
This structure changes how serious readers interpret results. A country can post a high average score and still show wide inequality. Another can sit near the OECD average yet combine modest scores with small social gaps. A third can improve participation in schooling, bring more 15-year-olds into the tested population, and temporarily face score pressure because the measured cohort has become more inclusive. Looking only at rank is like trying to judge an entire school by the sound of one bell. The signal is real, but it is not the whole building.[c][m]
Proficiency Levels Matter More Than Rank Alone
PISA uses proficiency levels to describe what students can do, not only how many points they scored. In mathematics, the OECD treats Level 2 as a baseline of basic proficiency. Students at or above that level begin to show the ability and initiative to use mathematics in simple real-life situations. In PISA 2022, 31% of students across OECD countries performed below Level 2 in mathematics, meaning only 69% reached at least basic proficiency. At the top end, 9% of students across OECD countries reached Levels 5 or 6 in mathematics. Only 16 of 81 participating countries and economies had more than 10% of students at those highest levels.[n]
That distribution tells a sharper story than a raw rank. A system with many students below baseline has a broad foundational problem. A system with few low performers but very few top performers may have succeeded at basic inclusion but still struggle to develop advanced reasoning. A system with strong averages and a sizable high-performance group may offer both depth and breadth. PISA lets analysts separate those patterns instead of treating all score gaps as the same phenomenon.[n]
A More Useful Way to Read a PISA Release
- First ask how many students are below baseline.
- Then ask how many reach advanced performance.
- Then check whether results are socially broad or highly stratified.
- Only after that does the league table become informative.
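The first two steps of that reading sequence can be made concrete. The sketch below classifies synthetic weighted scores against approximate PISA mathematics cut points, assuming roughly 420 points for the lower bound of Level 2 and roughly 607 for Level 5; the exact bounds are published in the OECD technical documentation, and the scores here are invented for illustration.

```python
# Hedged sketch: share of students below baseline and at advanced levels,
# using approximate PISA mathematics cut points (~420 for Level 2,
# ~607 for Level 5) and synthetic weighted scores, not real data.
BASELINE_CUT = 420.0   # approximate lower bound of Level 2
ADVANCED_CUT = 607.0   # approximate lower bound of Level 5

scores_and_weights = [
    (390.0, 120.0), (445.0, 200.0), (515.0, 260.0),
    (580.0, 180.0), (625.0, 90.0), (670.0, 50.0),
]

total = sum(w for _, w in scores_and_weights)
below_baseline = sum(w for s, w in scores_and_weights if s < BASELINE_CUT) / total
advanced = sum(w for s, w in scores_and_weights if s >= ADVANCED_CUT) / total

print(f"below baseline (below Level 2): {below_baseline:.1%}")
print(f"advanced (Level 5 or 6):        {advanced:.1%}")
```

Reading a release in this order, distribution first and rank last, is exactly what the proficiency-level tables in each OECD volume support.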
What PISA 2022 Showed
PISA 2022 remains the latest full release for the core domains, and its headline finding was a sharp decline in performance after years of disruption. Across OECD countries, average performance fell by almost 15 score points in mathematics and about 10 points in reading compared with 2018, while science stayed broadly stable. The mathematics drop was described by the OECD as roughly three-quarters of a year of learning and as far larger than any previous change between consecutive assessments. That makes PISA 2022 a benchmark not only for student achievement but also for how school systems handled crisis, interruption, remote learning, and recovery.[o][p]
The scale of participation also matters. About 690,000 students in 81 countries and economies took part in PISA 2022, representing roughly 29 million 15-year-olds. For international education research, that is unusual breadth. It gives PISA enough range to compare high-income systems, middle-income systems, small states, federal systems, city-based participants, and economies with very different demographic and institutional structures.[q][r]
PISA 2022 also reminded readers that the top performers are not identical across all dimensions. Some systems stood out for mean performance, some for small shares of low performers, and some for resilience or fairness. OECD summaries highlighted that 18 countries and economies performed above the OECD average in mathematics, reading, and science at the same time. That kind of cross-domain consistency is rarer than one-off strength in a single subject.[d]
| Indicator | OECD-Wide Figure | What It Suggests |
|---|---|---|
| Mathematics change, 2018 to 2022 | Almost -15 points | A large drop in foundational reasoning and problem solving |
| Reading change, 2018 to 2022 | About -10 points | Noticeable decline in comprehension and interpretation |
| Science change, 2018 to 2022 | Broadly stable | Less immediate movement than math or reading |
| Students below Level 2 in mathematics | 31% | A sizable minority lacks baseline proficiency |
| Students at Level 2 or above in mathematics | 69% | Roughly seven in ten reached basic proficiency |
| Top performers in mathematics | 9% | Advanced problem solving remains concentrated |
| Disadvantaged students in top quarter of math performers | 10% | Social background matters, but it does not fully determine outcomes |
Why Policymakers, Schools, and Researchers Care So Much
PISA sits at the intersection of education quality, economic competitiveness, and social mobility. Governments use it to benchmark school outcomes. Ministries examine it when reviewing curriculum, assessment, and teacher policy. Researchers mine the public database to test questions on inequality, school climate, digital use, and learning conditions. Universities use it in comparative education and public policy analysis. Newsrooms often reduce it to rankings, but policy units usually read it as an evidence package on system performance and distribution.[f][s][t]
It also influences reform language. Terms such as foundational literacy, student resilience, socio-economic gradient, digital learning conditions, and learning loss circulate more widely because PISA provides cross-national evidence attached to those ideas. In PISA 2022, the OECD pointed to traits shared by more resilient systems: they kept schools open longer for more students, reduced barriers to remote learning, and strengthened parent-school partnerships. That turns PISA from a scoreboard into a policy laboratory.[l]
Equity Is Not a Side Topic in PISA
A common mistake is to treat PISA as a pure merit ranking. In fact, one of its strongest contributions is the study of equity. The OECD reports how outcomes relate to socio-economic status, immigrant background, gender, school characteristics, and student attitudes. In PISA 2022, 10% of disadvantaged students across OECD countries still scored in the top quarter of mathematics performance in their own countries. That is a small share, but it matters. It shows that disadvantage raises the odds of lower performance without making low performance inevitable.[m]
This is where PISA often adds value beyond national exams. National exams usually tell a country how students performed within that country’s own structure. PISA can reveal whether high-performing systems also manage to spread opportunity more evenly, or whether strong average scores hide a steep social gradient. For ministries that are serious about quality with inclusion, that distinction is not optional. It is central.[m]
What PISA Does Not Do
PISA is influential, but it has limits that readers should keep in view. First, it does not test every child. It tests a sample of students who are still enrolled in school. In countries where many 15-year-olds are out of school, coverage can be far below universal. This means a score cannot be interpreted without asking who is included in the measured population.[c]
Second, it is not a direct measure of everything schools aim to build. PISA gives strong evidence on applied literacy, numeracy, science understanding, and selected innovative domains. It does not capture the full range of civic education, arts learning, moral development, physical education, local language complexity, or community knowledge. Readers who use PISA as if it were a complete philosophy of education usually overreach.
Third, rankings can exaggerate tiny differences. Countries close together in score are not always meaningfully different once sampling and statistical uncertainty are considered. Serious interpretation asks whether score gaps are statistically and substantively meaningful, not just whether one jurisdiction sits two places above another in a media graphic.[u]
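That caution about tiny differences can be illustrated with a simple sketch. The numbers below are invented: two country means a few points apart, each with a standard error. Official PISA comparisons use replicate weights and add linking error, but the core logic is the same: a gap is only meaningful relative to its uncertainty.

```python
import math

# Illustrative sketch with made-up numbers: two countries separated by a
# few score points, each mean carrying a standard error (SE).
mean_a, se_a = 492.0, 2.4
mean_b, se_b = 488.0, 2.6

gap = mean_a - mean_b
# SE of the difference between two independent estimates.
se_gap = math.sqrt(se_a**2 + se_b**2)
z = gap / se_gap

# A 95% confidence interval for the difference between the two means.
lo, hi = gap - 1.96 * se_gap, gap + 1.96 * se_gap

print(f"gap = {gap:.1f} points, 95% CI [{lo:.1f}, {hi:.1f}]")
print("distinguishable at the 5% level" if abs(z) >= 1.96
      else "not distinguishable at the 5% level")
```

Here a four-point gap produces a confidence interval that spans zero, so the two countries cannot be separated statistically even though one sits above the other in a ranking graphic.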
Fourth, cultural and institutional context still matters. A PISA item is internationally developed, translated, adapted, and quality-checked, yet no comparative assessment can remove context entirely. That is one reason the OECD publishes technical standards, databases, questionnaires, and manuals alongside headline reports. The system invites scrutiny rather than asking readers for blind trust.[k][u]
How PISA Has Expanded Beyond Math, Reading, and Science
One reason PISA still shapes education debates after more than two decades is that it has moved beyond the three core domains without abandoning them. Earlier cycles explored areas such as problem solving, global competence, and creative thinking. In the PISA 2022 cycle, creative thinking was the innovative domain, and the OECD later published a dedicated volume on it in June 2024. That volume examined how well students can generate diverse, original, and useful ideas across different contexts, extending the discussion from routine performance to idea production and improvement.[v][w]
PISA 2022 also included an optional financial literacy assessment, with separate results published in June 2024. That release linked stronger financial literacy to more responsible financial behaviour, including greater likelihood of saving money and comparing prices before purchases. This matters because it shows how PISA has become a platform for measuring practical capabilities that sit near daily life, not only classroom achievement.[x]
The next step is even more revealing. PISA 2025 includes Learning in the Digital World as its innovative domain. The OECD defines it as students’ capacity to build knowledge and solve problems with computational tools through self-regulated learning and inquiry practices. Results for that innovative domain are expected in December 2027. Then PISA 2029 is set to add Media and Artificial Intelligence Literacy, aimed at how students evaluate digital content, understand AI-mediated systems, and act responsibly in mediated environments. That direction aligns closely with current curriculum debates on digital competence, AI use in schools, and information credibility.[e][h]
Why PISA Still Matters in 2026
As of 1 April 2026, the education sector is in a transition period between the release of PISA 2022 results and the upcoming publication of PISA 2025 core findings later in 2026. That timing matters. The sector is no longer reacting only to pandemic disruption; it is now asking which systems turned disruption into adaptation, how digital learning should be measured, and whether assessment should capture more than content recall. PISA remains central because it is one of the few global instruments already moving in that direction while keeping a stable trend line for the core subjects.[a][e]
This is also why PISA appears in current debates on AI in education, digital pedagogy, curriculum modernization, and student agency. The OECD's plan for the 2029 Media and Artificial Intelligence Literacy (MAIL) domain does not treat the topic as a side issue. It treats it as a measurable educational competence. That is a notable shift. A decade ago, many systems discussed digital literacy as an add-on. By 2026, the direction of travel is clear: digital judgment, source evaluation, and responsible AI interaction are moving toward the center of what global assessment considers educational readiness.[h]
What Readers Should Watch Next
- PISA 2025 core results in 2026: these will show whether the post-2022 picture points to recovery, stagnation, or further divergence.[a]
- Digital-world findings in 2027: these will add direct evidence on self-regulated learning and computational inquiry.[e]
- PISA 2029 preparation: the reading focus and media/AI literacy domain show where cross-national assessment is heading.[h]
- Better use of public data: the OECD’s data explorer and dashboard let readers move beyond headlines and inspect patterns by country, subgroup, and indicator.[s][t]
How to Interpret a Country’s PISA Story Without Oversimplifying It
When readers open a national fact sheet or country profile, the most useful sequence is straightforward. Start with the average score. Then check the share below baseline and the share at advanced levels. After that, look at socio-economic gradients, gender patterns, and the context indicators on school climate and learning conditions. Finally, ask how much of the 15-year-old population the sample represents. That order prevents one of the most common errors in public debate: treating a ranking position as if it were a complete diagnosis.[c][f]
This matters because PISA is strongest when it is used to ask better questions. Why do some systems combine high performance with narrower inequality? Why do others sustain a solid baseline but struggle to produce advanced performers? Which school conditions seem linked to resilience during disruption? Where does digital learning support stronger outcomes, and where does it merely add screen time without learning gain? Those are the questions that make the assessment useful to ministries, schools, and researchers alike.[l][r]
Seen in that light, PISA is neither a magic verdict on school systems nor a shallow media ranking exercise. It is a recurring, technically structured attempt to measure whether education systems are helping adolescents develop usable knowledge, foundational proficiency, and the capacity to keep learning in a changing world. That is why the term PISA keeps returning to the center of global education analysis, and why its future domains now reach into digital inquiry, media judgment, and AI literacy as naturally as they once centered only on reading, mathematics, and science.[a][h]
Sources
- [a] PISA: Programme for International Student Assessment | OECD — official overview of purpose, domains, cycle timing, and current release schedule.
- [b] What is PISA?: PISA 2022 Results (Volume I) | OECD — official explanation of what the assessment is designed to measure.
- [c] PISA Frequently Asked Questions (FAQs) | OECD — official definitions for target age, coverage, inclusion, and participation logic.
- [d] PISA 2022 Results (Volume I) | OECD — core 2022 results, release date, and cross-domain summary findings.
- [e] PISA 2025 Learning in the Digital World | OECD — official description of the 2025 innovative domain and its expected publication timing.
- [f] PISA data and methodology | OECD — official access point for indicators, explorer tools, and methodological documentation.
- [g] PISA 2022 Database | OECD — official dataset page with student, school, teacher, and parent response files.
- [h] PISA 2029 Media and Artificial Intelligence Literacy | OECD — official introduction to the planned 2029 innovative domain.
- [i] PISA: Programme for International Student Assessment | OECD — official wording on real-life application of knowledge and skills.
- [j] The PISA target population, the PISA samples, and the definition of schools | OECD — technical discussion of sampling, school counts, and participation thresholds.
- [k] PISA 2025 Technical Standards | OECD — official technical standards for test design and administration.
- [l] PISA 2022 Results (Volume II) | OECD — official report on learning during disruption and system resilience.
- [m] Equity in education in PISA 2022 | OECD — official chapter on inequality and academically resilient students.
- [n] What can students do in mathematics, reading and science? | OECD — official chapter on proficiency levels, baseline proficiency, and top performers.
- [o] Full Report: PISA 2022 Results (Volume I) | OECD — full report including the OECD’s comparison of 2018 and 2022 performance changes.
- [p] Executive Summary: PISA 2022 Results (Volume I) | OECD — official summary of the 2022 downturn and its interpretation.
- [q] PISA 2022 Results (Volume II) | OECD PDF — official PDF noting participation scale in 2022.
- [r] PISA 2022: Insights and Interpretations | OECD — official interpretive summary with cross-country context.
- [s] OECD Data Explorer — official OECD platform for exploring indicators and downloadable statistics.
- [t] PISA Dashboard | OECD — official dashboard for trends, comparisons, and selected indicators.
- [u] PISA 2022 Technical Report | OECD — official technical report on validity, reliability, and comparability.
- [v] PISA 2022 Results (Volume III) | OECD — official report on creative thinking results.
- [w] New PISA results on creative thinking | OECD — official summary of the creative thinking release.
- [x] PISA 2022 Results (Volume IV) | OECD — official report on financial literacy results and behaviour links.