Programme for International Student Assessment (PISA)

Detailed information for 2003

Status: Inactive

Frequency: Every 3 years

Record number: 5060

Programme for International Student Assessment (PISA) is an international assessment of the skills and knowledge of 15 year-olds which aims to assess whether students approaching the end of compulsory education have acquired the knowledge and skills that are essential for full participation in society.

Data release - December 6, 2004

Description

Programme for International Student Assessment (PISA) is an international assessment of the skills and knowledge of 15 year-olds which aims to assess whether students approaching the end of compulsory education have acquired the knowledge and skills that are essential for full participation in society. PISA is developed jointly by member countries of the Organisation for Economic Co-operation and Development (OECD).

The survey gathers cross-sectional data, using a new sample of 15 year-olds for each cycle. PISA assessments take place every three years and focus on three domains: reading literacy, mathematical literacy and scientific literacy. While all three domains are covered in each cycle, two-thirds of the assessment time is devoted to the cycle's "major" domain.

An international dataset, which includes the Canadian data, and full documentation for this dataset can be found at www.pisa.oecd.org.

Statistical activity

PISA/YITS is a single project consisting of two parallel survey programs: the Programme for International Student Assessment (PISA) and the Youth in Transition Survey (YITS).

PISA is an international assessment of the skills and knowledge of 15 year-olds which aims to assess whether students approaching the end of compulsory education have acquired the knowledge and skills that are essential for full participation in society.

YITS is designed to examine the patterns of, and influences on, major transitions in young people's lives, particularly with respect to education, training and work. Human Resources and Skills Development Canada and Statistics Canada have been developing the YITS in consultation with provincial and territorial ministries and departments of labour and education. Content includes the measurement of major transitions in young people's lives, covering virtually all formal educational experiences and most labour-market experiences, as well as achievement, aspirations and expectations, and employment experiences. The implementation plan encompasses a longitudinal survey of each of two groups, ages 15 and 18-20, to be surveyed every two years.

The 15 year-old respondents to the Reading Cohort (conducted in 2000) participated in both PISA (record number 5060) and YITS (record number 5058). Starting in 2002, they were followed up longitudinally by YITS (record number 4435). In 2009, a sub-sample of this cohort also participated in the Reading Skills Reassessment.

Reference period: School year (September - May)

Collection period: April to May

Subjects

  • Education, training and learning
  • Literacy

Data sources and methodology

Target population

The survey population comprised students who were 15 years of age and attending any form of schooling in the ten provinces of Canada. Schools on Indian reserves were excluded, as were types of schooling for which administering the survey would not be feasible, such as home schooling and schools for students with special needs. These exclusions represent less than 4% of 15-year-olds in Canada.

Instrument design

In PISA 2003, a rotated test design was used to assess student performance in mathematical, reading and scientific literacy as well as problem solving. This type of test design ensured wide coverage of content while keeping the testing burden on individual students low. Twelve different test booklets were distributed at random to students within each of the sampled schools. These booklets included questions assessing mathematical, reading and scientific literacy as well as problem solving, but not all booklets assessed the same domains. While mathematics (the major domain for PISA 2003) was assessed in all booklets, reading, science and problem solving (the minor domains) were assessed in selected booklets. (For further information, refer to Measuring Up: Canadian Results of the OECD PISA Study: The Performance of Canada's Youth in Mathematics, Reading, Science and Problem Solving, 2003, Vol. 2, available free through the Online Catalogue as number 81-590-XIE.)
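As a rough illustration of this booklet rotation, the sketch below randomly assigns one of 12 booklets to each sampled student in a school. It is a minimal sketch only: the booklet labels, student identifiers and assignment routine are hypothetical and do not reproduce the actual PISA rotation scheme.

import random

# Hypothetical booklet identifiers; PISA 2003 used 12 distinct test booklets.
BOOKLETS = list(range(1, 13))

def assign_booklets(student_ids, seed=None):
    """Randomly assign one of the 12 test booklets to each sampled student in a school."""
    rng = random.Random(seed)
    return {sid: rng.choice(BOOKLETS) for sid in student_ids}

# Example: assign booklets to 30 sampled students in one (hypothetical) school.
school_sample = [f"S{i:03d}" for i in range(1, 31)]
assignments = assign_booklets(school_sample, seed=1)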

The national component consisted of a student questionnaire and a parent questionnaire. The YITS-specific 30-minute Student questionnaire was developed to cover items not addressed by PISA. These items gather information on transition experiences, school engagement, attrition and activity (possible drop-out and reasons why), academic streaming, workload, programs for work preparation, labour force participation, education barriers, stressful experiences, career aspirations, early formative influences, deviant behaviour, family relationships, living and learning conditions and other background variables. The Student questionnaire was designed for paper-and-pencil collection. As the questions were developed, the associated logical flows into and out of the questions were specified so that the questionnaire would be easy to follow; this included specifying the default question, or the next applicable question, based on edits and checks.

Focus group testing and one-on-one interviews were held to test the YITS Student questionnaire. A pilot survey was conducted on a national sample of 1,800 students, and this field study helped improve the survey instruments and procedures for the main survey held in 2003. Where applicable, questions used in other Statistics Canada surveys were adopted in YITS to improve comparability across surveys.

The YITS Parent questionnaire, a 30-minute interview administered by telephone, was used to collect information on the parents and their household in order to obtain more reliable data on socio-economic status. This component was developed using computer-assisted interviewing.

The Parent questionnaire was designed for collection as a computer-assisted interview (CAI). As the questionnaire specifications were written, the associated logical flows into and out of the questions were programmed. This included specifying the type of answer required, the minimum and maximum values, the hard and soft on-line edits associated with the questions, and what was required in case of non-response. Testing of the application identified problems with question wording and ordering and errors in questionnaire format and instructions, and verified response entry lengths, flows, ranges, hard and soft edits and interview length. As well, the assignment of proper in-progress and final codes was verified.
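The sketch below illustrates the kind of range checks and hard and soft edits described above. The question, field name, ranges and messages are hypothetical; they do not reproduce the actual YITS questionnaire specifications.

def check_hours_worked(value):
    """Apply a hard edit (reject impossible values) and a soft edit (flag unusual ones)."""
    if value is None:
        return "non-response: apply the non-response rules"
    if not 0 <= value <= 168:        # hard edit: cannot exceed the hours in a week
        return "hard edit failure: value must be corrected"
    if value > 84:                   # soft edit: unusually high, confirm with the respondent
        return "soft edit warning: confirm value"
    return "accepted"

print(check_hours_worked(40))    # accepted
print(check_hours_worked(120))   # soft edit warning: confirm value
print(check_hours_worked(200))   # hard edit failure: value must be corrected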

Sampling

This is a sample survey with a cross-sectional design.

Forty-two countries participated in PISA 2003. In most countries, between 4,000 and 10,000 15 year-olds participated in PISA.

In Canada, approximately 30,000 15 year-old students from more than 1,000 schools participated. The large Canadian sample was needed to produce reliable estimates for each province and for both English and French language school systems in Manitoba, Ontario, Quebec, New Brunswick and Nova Scotia.

Data sources

Data collection for this reference period: 2003-04-01 to 2003-05-31

Responding to this survey is voluntary.

Data are collected directly from survey respondents.

The PISA/YITS session took place in the school and consisted of a two-hour written PISA assessment, a PISA Student questionnaire (40 minutes) and a Youth in Transition Survey questionnaire (20 minutes).

In addition, school administrators were asked to complete the PISA School Administrator's questionnaire, which took 30 minutes to complete.

Following the administration at school, parents of selected students were asked to participate in the YITS parent interview. This 30-minute interview was administered over the telephone by regional office interviewers.


Error detection

A series of verifications took place to ensure that the records were consistent and that collection and capture of the data did not introduce errors. Reported data were examined for completeness and consistency using automated edits coupled with manual review. Some responses reporting uncommon values or characteristics were processed manually.

Imputation

This methodology does not apply.

Estimation

The estimation of population characteristics from a survey is based on the premise that each sampled unit represents, in addition to itself, a certain number of non-sampled units in the population. A basic survey weight is attached to each record to indicate the number of units in the population that are represented by that unit in the sample. This basic weight is derived from the sample design.

An initial weight was derived based on the two-stage sample design used for this survey. A number of non-response adjustments were then applied to obtain the final weights. More than one adjustment was required because non-response can occur at various levels (e.g. schools and students).
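The sketch below illustrates this weighting logic in simplified form: a design weight from the two-stage sample followed by a non-response adjustment within an adjustment class. It is a minimal sketch under assumed inputs, not the actual PISA/YITS weighting program; the selection probabilities and adjustment classes are hypothetical.

def basic_weight(school_selection_prob, student_selection_prob):
    """Design weight from the two-stage sample: the inverse of the overall inclusion probability."""
    return 1.0 / (school_selection_prob * student_selection_prob)

def nonresponse_adjustment(weights, responded):
    """Inflate respondent weights within an adjustment class so that they also
    represent the non-respondents in that class."""
    total = sum(weights)
    responding_total = sum(w for w, r in zip(weights, responded) if r)
    factor = total / responding_total
    return [w * factor if r else 0.0 for w, r in zip(weights, responded)]

# Example: three sampled students in one adjustment class, one of whom did not respond.
design_weights = [basic_weight(0.10, 0.25)] * 3            # each design weight is 40
final_weights = nonresponse_adjustment(design_weights, [True, True, False])
# the two respondents now carry a weight of 60 each; the non-respondent carries none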

Quality evaluation

Partial Non-response:
Partial non-response can occur when a respondent is unwilling to answer sensitive questions, accidentally skips part of the questionnaire, becomes fatigued, or when operational difficulties arise. Variables within a section tend to share a common subject matter and/or are used together for deriving variables about the same subject matter. If information is needed for only one variable at a time, the codebook should be consulted. Overall, item non-response is lower for the Parent questionnaire than for the Student questionnaire, with most variables having less than 1% missing data. The question asking parents the month in which their spouse started working at their current job (PF48M) has the highest overall non-response rate, at 30.1%.
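For reference, an item non-response rate such as the 30.1% cited above can be computed as the share of records with no usable answer for the variable. The sketch below shows this calculation on invented data; the values and missing-data code are hypothetical.

def item_nonresponse_rate(values, missing_codes=(None,)):
    """Share of records for which the variable was not answered."""
    missing = sum(1 for v in values if v in missing_codes)
    return missing / len(values)

responses = [5, 7, None, 3, None, 9, 2, 8, 6, None]
print(f"{item_nonresponse_rate(responses):.1%}")   # 30.0%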

Comparing counts to Census 2001:
When dealing with survey data, the sum of the final sample weights for a particular domain of the population gives an estimate of the population size for that domain. These totals were estimated for the gender and province domains and then compared to known population counts obtained from Census 2001 data for those same domains (the cohort that was 13 years old in 2001). These census totals are not exactly representative of the 15 year-old population in 2003, but they serve as a good approximation. Because a census count of this cohort, born in 1987, would include individuals who are not part of the target population (e.g. home-schooled children, special needs students), the estimated totals based on the YITS weights should be less than the census totals. Overall, it is estimated that YITS covers close to 83% of all 15-year-olds in the population. Note that the sum of the weights differs from the total enrolment in the national defined target population because some of the sampled students turned out to be out of scope for the survey.
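The coverage check described above can be sketched as follows: sum the final weights within a domain and compare the estimated total to the corresponding census count. The records and the census figure below are invented for illustration, and the variable names are hypothetical.

def estimated_total(records, province):
    """Population estimate for a domain = the sum of the final weights of in-scope respondents."""
    return sum(r["weight"] for r in records if r["province"] == province)

records = [
    {"province": "ON", "weight": 55.0},
    {"province": "ON", "weight": 48.5},
    {"province": "QC", "weight": 60.0},
]
census_count_on = 120.0   # hypothetical Census 2001 count for the same domain

coverage = estimated_total(records, "ON") / census_count_on
print(f"Estimated coverage for ON: {coverage:.0%}")   # 86%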

Disclosure control

Statistics Canada is prohibited by law from releasing any information it collects that could identify any person, business, or organization, unless consent has been given by the respondent or as permitted by the Statistics Act. Various confidentiality rules are applied to all data that are released or published to prevent the publication or disclosure of any information deemed confidential. If necessary, data are suppressed to prevent direct or residual disclosure of identifiable data.

In order to prevent any data disclosure, confidentiality analysis is done using the Statistics Canada Generalized Disclosure Control System (G-Confid). G-Confid is used for primary suppression (direct disclosure) as well as for secondary suppression (residual disclosure). Direct disclosure occurs when the value in a tabulation cell is composed of or dominated by a few respondents, while residual disclosure occurs when confidential information can be derived indirectly by piecing together information from different sources or data series.
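As a generic illustration of primary suppression only (the actual rules and parameters implemented in G-Confid are not reproduced here), the sketch below flags a tabulation cell when it has too few contributors or when a single contribution dominates the cell total; the thresholds are hypothetical.

def primary_suppression(contributions, min_count=3, dominance_share=0.85):
    """Flag a cell for suppression if it has fewer than min_count contributors
    or if the largest contribution dominates the cell total."""
    if len(contributions) < min_count:
        return True
    total = sum(contributions)
    return total > 0 and max(contributions) / total > dominance_share

print(primary_suppression([120.0, 3.0]))              # True: too few contributors
print(primary_suppression([500.0, 10.0, 5.0]))        # True: one contribution dominates
print(primary_suppression([40.0, 35.0, 30.0, 25.0]))  # False: cell can be published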

Revisions and seasonal adjustment

This methodology does not apply to this survey.

Data accuracy

Data quality is affected by both sampling and non-sampling errors. Non-sampling errors were minimized through testing (focus group, pilot survey and main survey); training of regional office staff; observation by head office personnel; tabulations of initial data; and adjustment of questionnaire specifications for future cycles. Quality assurance measures were implemented at each step of the data collection and processing cycle to monitor data quality. For sampling error, data reliability guidelines were established based on coefficient of variation (CV). It is recommended that any estimate based on fewer than 30 observations or with a CV greater than 33.3% not be released.
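The release guideline described above can be expressed as a simple check on an estimate, as sketched below. The function assumes the coefficient of variation has already been computed (for example, from a bootstrap or other variance estimate); it does not compute the variance itself, and the example values are invented.

def releasable(n_observations, cv):
    """Apply the guideline: do not release estimates based on fewer than 30
    observations or with a coefficient of variation above 33.3%."""
    return n_observations >= 30 and cv <= 0.333

print(releasable(250, 0.05))   # True: reliable enough to release
print(releasable(25, 0.05))    # False: too few observations
print(releasable(250, 0.40))   # False: coefficient of variation too high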

The table below provides an indication of data quality for the estimated average mathematics score for 15-year-olds by province. Additional data quality indicators are presented in all PISA publications.
