Programme for International Student Assessment (PISA)
Detailed information for 2012
Every 3 years
Programme for International Student Assessment (PISA) is an international assessment of the skills and knowledge of 15 year-olds which aims to assess whether students approaching the end of compulsory education have acquired the knowledge and skills that are essential for full participation in society.
Data release - December 3, 2013
PISA is developed jointly by member countries of the Organisation for Economic Co-operation and Development (OECD).
The survey gathers cross-sectional data, using a new sample of 15-year-olds for each cycle. PISA assessments take place every three years and focus on three domains: reading literacy, mathematical literacy and scientific literacy. While all three domains form the core of each cycle, two-thirds of the assessment time in each cycle is devoted to a "major" domain.
An international dataset, which includes the Canadian data, and full documentation for this dataset can be found at www.pisa.oecd.org.
Reference period: School year (September - May)
Collection period: April to May
- Education, training and learning
Data sources and methodology
The survey population comprised students who were 15 years of age and attending any form of schooling in the ten provinces of Canada. Schools on Indian reserves were excluded, as were various types of schooling for which it would be infeasible to administer the survey, such as home schooling and special-needs schools. These exclusions represent less than 4% of 15-year-olds in Canada.
A number of students from each selected school were randomly selected to participate in a PISA paper-based assessment in reading, mathematics and science and to answer some questions about their school, their interests, their experiences at school and work (such as extracurricular activities, volunteer work, and summer employment), and their family (such as parental occupation, and parental involvement in the education of their youth). Additionally, a sub-sample of students was randomly selected to participate in a computer-based assessment of mathematics, reading and problem-solving.
In PISA 2012 a rotated test design was used to assess student performance in reading, mathematical and scientific literacy, as well as problem solving. This type of test design ensured wide coverage of content while keeping the testing burden on individual students low. For the paper assessment, thirteen different test booklets were distributed at random to students. These booklets included questions assessing reading, mathematical and scientific literacy, but not all booklets assessed the same domains. While mathematics (the major domain for PISA 2012) was assessed in all booklets, reading and science (the minor domains) were assessed in selected booklets. Students were randomly assigned a testing booklet within each of the sampled schools.
This is a sample survey with a cross-sectional design.
Over 65 countries participated in PISA 2012. In most countries, between 4,000 and 10,000 15-year-olds participated in PISA.
In Canada, approximately 20,000 15-year-old students from more than 850 schools participated. The large Canadian sample was needed to produce reliable estimates for each province and for both English and French language school systems in British Columbia, Alberta, Manitoba, Ontario, Quebec, New Brunswick and Nova Scotia.
The PISA 2012 sample for Canada was based on a two-stage stratified sample. The first stage consisted of sampling individual schools in which 15-year-old students were enrolled. Schools were sampled systematically with probabilities proportional to size, the measure of size being a function of the estimated number of eligible (15-year-old) students enrolled in the school. The second stage of the selection process sampled students within sampled schools. Once schools were selected, a list of all 15-year-old students in each sampled school was prepared. From this list, up to 35 students were then selected with equal probability. All 15-year-old students were selected if fewer than 35 were enrolled. Additionally, in Newfoundland and Labrador, Prince Edward Island, Nova Scotia, New Brunswick and Quebec, and in the French-language school systems in Manitoba and Alberta, more than 35 students were selected where possible in order to meet sample size requirements. In addition, in each participating school, a sub-sample of approximately 15 students was randomly selected to respond to the electronic components of PISA in mathematics, reading and problem solving after the completion of the core paper-based components.
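The two stages described above can be illustrated with a minimal sketch. This is not the operational sampler used by Statistics Canada; the function names and the school/roster structures are hypothetical, and real selection is carried out within strata on sorted school frames.

```python
import random

def pps_systematic_sample(schools, n_sample):
    """Stage one (sketch): systematic sampling with probability
    proportional to size, the size measure being each school's
    estimated 15-year-old enrolment. Very large schools may be hit
    by more than one selection point; in practice such schools are
    taken with certainty."""
    total = sum(s["size"] for s in schools)
    step = total / n_sample
    start = random.uniform(0, step)
    points = iter(start + k * step for k in range(n_sample))
    selected, cum = [], 0.0
    p = next(points, None)
    for s in schools:
        cum += s["size"]
        while p is not None and p < cum:
            selected.append(s)
            p = next(points, None)
    return selected

def sample_students(roster, max_n=35):
    """Stage two (sketch): take all 15-year-olds if 35 or fewer are
    enrolled, otherwise an equal-probability sample of 35."""
    if len(roster) <= max_n:
        return list(roster)
    return random.sample(roster, max_n)
```

For example, a frame of 40 schools sampled with `n_sample=5` yields five selections, and a roster of 20 students is taken in full while a roster of 100 is subsampled to 35.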
Responding to this survey is voluntary.
Data are collected directly from survey respondents.
The PISA session took place in the school and consisted of a two-hour written PISA assessment, a PISA Student questionnaire (40 minutes) and a one-hour electronic assessment. In addition, school administrators were asked to complete the PISA School Administrator's questionnaire, which took 30 minutes to complete.
A series of verifications took place to ensure that the records were consistent and that collection and capture of the data did not introduce errors. Reported data were examined for completeness and consistency using automated edits coupled with manual review.
This methodology type does not apply to this statistical program.
The estimation of population characteristics from a survey is based on the premise that each sampled unit represents, in addition to itself, a certain number of non-sampled units in the population. A basic survey weight is attached to each record to indicate the number of units in the population that are represented by that unit in the sample. This basic weight is derived from the sample design.
An initial weight was derived based on the two-stage sample design used for this survey. A number of non-response adjustments are applied in order to obtain final weights. More than one adjustment is required because non-response can occur at various levels (e.g. schools and students).
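The weighting logic above can be sketched in a few lines. This is an illustrative simplification, not the actual PISA weighting program: the function names are hypothetical, and the real procedure applies several adjustment factors (trimming, school and student non-response classes) beyond the single adjustment shown here.

```python
def design_weight(p_school, p_student_given_school):
    """Basic weight (sketch): the inverse of the overall inclusion
    probability under the two-stage design, i.e. the number of
    population units the sampled student represents."""
    return 1.0 / (p_school * p_student_given_school)

def adjust_for_nonresponse(weight, response_rate):
    """Non-response adjustment (sketch): redistribute the weight of
    non-respondents over respondents within an adjustment class,
    applied separately at the school and student levels."""
    return weight / response_rate
```

For instance, a student in a school selected with probability 0.3, then sampled with probability 35/200 within that school, starts with a basic weight of about 19; an 80% response rate in the student's adjustment class inflates each respondent's weight by a factor of 1.25.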
To ensure high quality data, International Survey Administration Guidelines were followed and supplemented by adherence to Statistics Canada's own internal policies and procedures.
As a condition of participation in the international study, it was required to capture and process files using procedures that ensured logical consistency and acceptable levels of data capture error.
Quality and consistency of the coding of items across countries were validated. Persons charged with coding the assessment received intensive training on coding open-ended responses using the PISA coding manual. To help maintain scoring accuracy and comparability between countries, the PISA survey used an electronic bulletin board where countries could post their coding questions and receive coding decisions from the domain experts. This information could be seen by all participating countries, which could then adjust their scoring. To further ensure quality, the Consortium conducted a cross-country reliability study to verify that coders were applying the same criteria when coding the items. Two hundred booklets in English and 100 booklets in French were each coded four times by four different coders in order to compare reliability. Canada had a within-country agreement above 97 per cent across items.
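The within-country agreement figure cited above is a multi-coder percent-agreement statistic. A minimal sketch of one common way to compute it (simple pairwise percent agreement across the four coders; the actual PISA reliability analysis may use a different formula) is:

```python
from itertools import combinations

def pairwise_agreement(codes_per_booklet):
    """Percent agreement (sketch): for each booklet item, compare every
    pair of coders' assigned codes and average the match rate over all
    pairs and all booklets."""
    matches = total = 0
    for codes in codes_per_booklet:
        for a, b in combinations(codes, 2):
            matches += (a == b)
            total += 1
    return matches / total
```

For example, one booklet coded identically by four coders and a second where one coder disagrees gives an agreement of 9/12 = 0.75.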
Statistics Canada is prohibited by law from releasing any information it collects that could identify any person, business, or organization, unless consent has been given by the respondent or as permitted by the Statistics Act. Various confidentiality rules are applied to all data that are released or published to prevent the publication or disclosure of any information deemed confidential. If necessary, data are suppressed to prevent direct or residual disclosure of identifiable data.
Revisions and seasonal adjustment
These data are preliminary and will be revised on a monthly basis.
Data quality is affected by both sampling and non-sampling errors. Non-sampling errors were minimized through testing (focus group, pilot survey and main survey); training of regional office staff; and observation by head office personnel. Quality assurance measures were implemented at each step of the data collection and processing cycle to monitor data quality.
Each country participating in PISA attempted to maximize the coverage of PISA's target population within the sampled schools. Within each sampled school, all eligible students, namely those 15 years of age regardless of grade, were first listed. Sampled students who were to be excluded by the school still had to be included in the sampling documentation, with a list drawn up stating the reason for their exclusion. Students could be excluded under three international categories: i) students with a functional disability - the student has a moderate to severe permanent physical disability such that he/she cannot perform in the PISA testing situation; ii) students with an intellectual disability - the student has a mental or emotional disability and is cognitively delayed such that he/she cannot perform in the PISA testing situation; and iii) students with limited proficiency in the assessment language - the student is unable to read or speak any of the languages of the assessment in the country and would be unable to overcome the language barrier in the testing situation (typically a student who has received less than one year of instruction in the language of the assessment may be excluded).
The weighted student exclusion rate for Canada overall was 5.5%.
In order to minimize the potential for response bias, data quality standards in PISA require minimum participation rates for schools and students. At the national level, a minimum response rate of 85% was required for schools initially selected. School response rates were also considered acceptable where the initial school response rate was between 65% and 85% and replacement schools were selected to achieve a school response rate of 85% or higher. Schools with student participation rates between 25% and 50% were not counted as participating schools, but data for these schools were included in the database. Schools with student participation rates of less than 25% were not counted as participating and their data were excluded from the database.
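The school-level decision rule described above can be summarized as a small classifier. This is an illustrative sketch of the stated thresholds (the function name is hypothetical, and the treatment of borderline rates exactly at a threshold is an assumption):

```python
def classify_school(student_rate):
    """Sketch of the PISA school-participation thresholds, where
    student_rate is the within-school student participation rate
    expressed as a fraction."""
    if student_rate < 0.25:
        return "excluded"           # not counted; data dropped from database
    if student_rate < 0.50:
        return "not_participating"  # not counted, but data retained
    return "participating"
```

So a school where only 20% of sampled students took part is dropped entirely, one at 40% contributes data without counting toward the response-rate standard, and one at 90% counts as participating.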
PISA 2012 also requires a minimum student participation rate of 80% within all participating schools combined (original sample and replacements) at the national level.
In Canada, 907 schools were selected to participate in PISA 2012, and 828 of these initially selected schools participated. Rather than calculating school participation rates by dividing the number of participating schools by the total number of schools, school response rates were weighted based on 15-year-old enrolment numbers in each school. Across Canada, the school response rate was 93%. At the student level, Canada's response rate after replacement was 81%.
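The enrolment-weighted response rate works as follows: each school counts in proportion to its 15-year-old enrolment rather than as one unit. A minimal sketch (hypothetical function and field names, not the official calculation program):

```python
def weighted_school_response_rate(schools):
    """Enrolment-weighted school response rate (sketch): the share of
    eligible 15-year-old enrolment accounted for by responding schools."""
    responded = sum(s["enrolment"] for s in schools if s["participated"])
    eligible = sum(s["enrolment"] for s in schools)
    return responded / eligible
```

For example, if a responding school enrols 100 eligible students and a non-responding school enrols 300, the weighted rate is 25% even though half the schools responded; this is why large non-responding schools depress the rate more than small ones.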
Data quality indicators are presented in all PISA publications. It is recommended that any estimate based on fewer than 30 observations or with a CV greater than 33.3% not be released.
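The release guideline above amounts to a two-condition suppression check. A sketch of that rule (the function name is hypothetical; the CV is expressed as a fraction, so 33.3% is 0.333):

```python
def releasable(n_obs, cv):
    """Sketch of the release guideline: withhold any estimate based on
    fewer than 30 observations or with a coefficient of variation
    greater than 33.3%."""
    return n_obs >= 30 and cv <= 0.333
```

Under this rule an estimate from 50 observations with a CV of 10% is releasable, while one from 25 observations, or one with a CV of 40%, is not.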