Programme for International Student Assessment (PISA)

Detailed information for 2000

Status:

Inactive

Frequency:

Every 3 years

Record number:

5060

The Programme for International Student Assessment (PISA) is an international assessment of the skills and knowledge of 15-year-olds. It aims to determine whether students approaching the end of compulsory education have acquired the knowledge and skills that are essential for full participation in society.

Data release - December 4, 2001

Description

The Programme for International Student Assessment (PISA) is an international assessment of the skills and knowledge of 15-year-olds. It aims to determine whether students approaching the end of compulsory education have acquired the knowledge and skills that are essential for full participation in society. PISA is developed jointly by member countries of the Organisation for Economic Co-operation and Development (OECD).

The survey gathers cross-sectional data, using a new sample of 15-year-olds for each cycle. PISA assessments take place every three years and focus on three domains: reading literacy, mathematical literacy and scientific literacy. While the three domains form the core of each cycle, two-thirds of the assessment time in each cycle is devoted to a "major" domain.

An international dataset, which includes the Canadian data, and full documentation for this dataset can be found at www.pisa.oecd.org.

Statistical activity

PISA/YITS is one project, which consists of two parallel survey programs: the Programme for International Student Assessment (PISA) and the Youth in Transition Survey (YITS).

PISA is an international assessment of the skills and knowledge of 15-year-olds. It aims to determine whether students approaching the end of compulsory education have acquired the knowledge and skills that are essential for full participation in society.

YITS is designed to examine the patterns of, and influences on, major transitions in young people's lives, particularly with respect to education, training and work. Human Resources and Skills Development Canada and Statistics Canada have been developing the YITS in consultation with provincial and territorial ministries and departments of labour and education. Content includes measurement of major transitions in young people's lives, including virtually all formal educational experiences and most labour market experiences, achievement, aspirations and expectations, and employment experiences. The implementation plan encompasses a longitudinal survey of each of two groups, aged 15 and 18-20, to be surveyed every two years.

The 15-year-old respondents to the Reading Cohort (conducted in 2000) participated in both PISA (record number 5060) and YITS (record number 5058). Starting in 2002, they were followed up longitudinally by YITS (record number 4435). In 2009, a sub-sample of this cohort also participated in the Reading Skills Reassessment.

Reference period: School year (September - May)

Collection period: April to May

Subjects

  • Education, training and learning
  • Literacy

Data sources and methodology

Target population

The survey population consisted of students who were 15 years of age and attending any form of schooling in the ten provinces of Canada. Schools on Indian reserves were excluded, as were various types of schools in which it would not have been feasible to administer the survey, such as home schooling and schools for students with special needs. These exclusions represent less than 4% of 15-year-olds in Canada.

Instrument design

In PISA 2000, a rotated test design was used to assess student performance in reading, mathematical and scientific literacy. This type of test design ensured wide coverage of content while keeping the testing burden on individual students low. Nine test booklets were distributed at random to students. These booklets included questions assessing reading, mathematical and scientific literacy, but not all booklets assessed the same domains. While reading (the major domain for Cycle 1) was assessed in all booklets, mathematics and science (the minor domains) were assessed in selected booklets only. Students were randomly assigned a test booklet within each of the sampled schools. (For further information, refer to the Manual for the PISA 2000 Database.)

Focus group testing and one-on-one interviews were held to test the YITS Student questionnaire portion of the survey. A pilot survey was conducted with a sample of 1,800 students nationally; this field study helped improve survey instruments and procedures for the main survey conducted in 2000.

Sampling

This is a sample survey with a cross-sectional design.

The sample design for the reading cohort entailed two-stage probability sampling: a stratified sample of 1,200 schools was selected at the first stage, and a sample of eligible students was selected within each sampled school at the second. The initial student sample size for the reading cohort, conducted in 2000, was 38,000.

Stratification: Two types of stratification were used in the design of this survey: explicit and implicit. Explicit stratification consists of building separate school lists, or sampling frames, according to the set of explicit stratification variables under consideration. Implicit stratification consists essentially of sorting the schools within each explicit stratum by a set of implicit stratification variables. This type of stratification is a very simple way of ensuring a strictly proportional sample allocation of schools across all implicit strata. Province, language of schooling and enrolment size were used as explicit strata. To create implicit strata within each explicit stratum, schools were classified by indicator variables for public/private schools and urban/rural class. Implicit strata were used in the systematic sample selection of schools and also in the weighting process.
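As an illustration only, the implicit-stratification mechanism described above can be sketched as follows. The school records, variable names and sample sizes here are hypothetical, not the actual PISA 2000 frame: the key idea is that sorting the frame by the implicit stratification variables before systematic selection spreads the sample proportionally across those strata.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Hypothetical school frame for one explicit stratum (e.g. one
# province/language/size cell); implicit strata are just sort keys.
schools = [
    {"id": i,
     "sector": random.choice(["public", "private"]),
     "urban": random.choice([True, False])}
    for i in range(200)
]

def systematic_sample(frame, n, sort_keys):
    """Sort the frame by the implicit stratification variables, then take
    every k-th school from a random start. Sorting first is what makes the
    allocation across implicit strata approximately proportional."""
    ordered = sorted(frame, key=sort_keys)
    step = len(ordered) / n          # sampling interval
    start = random.uniform(0, step)  # random start within the first interval
    return [ordered[int(start + i * step)] for i in range(n)]

sample = systematic_sample(schools, 20,
                           lambda s: (s["sector"], s["urban"]))
print(len(sample))  # 20 distinct schools
```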

Data sources

Data collection for this reference period: 2000-04-01 to 2000-05-31

Responding to this survey is voluntary.

Data are collected directly from survey respondents.

The PISA/YITS session took place in the school and consisted of: a two-hour written PISA assessment; a PISA Student Questionnaire (40 minutes); and a Youth in Transition Survey questionnaire (30 minutes).

In addition, school administrators were asked to complete the PISA School Administrator's Questionnaire, which took 30 minutes to complete.

Following the administration at school, parents of selected students were asked to participate in the YITS parent interview. This 30-minute interview was administered over the telephone by regional office interviewers.


Error detection

A series of verifications took place to ensure that the records were consistent and that collection and capture of the data did not introduce errors. Reported data were examined for completeness and consistency using automated edits coupled with manual review. Some responses reporting uncommon values or characteristics were processed manually.

Imputation

No imputation was done for the PISA assessment or PISA questionnaires in 2000.

Estimation

The estimation of population characteristics from a survey is based on the premise that each sampled unit represents, in addition to itself, a certain number of non-sampled units in the population. A basic survey weight is attached to each record to indicate the number of units in the population that are represented by that unit in the sample. This basic weight is derived from the sample design.
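The weighting premise described above can be illustrated with a toy example; all scores and weights below are hypothetical values for illustration, not PISA data.

```python
# Toy sample: each record carries a survey weight equal to the number of
# population units that respondent represents (hypothetical values).
sample = [
    {"score": 520.0, "weight": 150.0},
    {"score": 480.0, "weight": 200.0},
    {"score": 510.0, "weight": 100.0},
]

# Estimated population size: the sum of the weights.
n_hat = sum(r["weight"] for r in sample)

# Weighted mean score: each score counts in proportion to the number of
# students its respondent represents.
mean_hat = sum(r["score"] * r["weight"] for r in sample) / n_hat

print(n_hat)     # 450.0
print(mean_hat)  # 500.0
```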

An initial weight was derived based on the two-stage sample design used for this survey. The components of this survey are the PISA/YITS student and the YITS parent; weights were calculated for these components and for mathematics and science. A number of non-response adjustments were applied in order to obtain final weights. More than one adjustment was required because non-response can occur at various levels (e.g., schools, students, parents, non-consent of parents for students). Balanced repeated replication (BRR) and bootstrap replicate weights were derived to allow users to estimate CVs and standard errors for estimates.
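As a rough sketch of how replicate weights are used, the following toy example derives a standard error and CV for a weighted mean from a handful of hypothetical bootstrap replicate weights. The real PISA/YITS files carry many more replicates, and this is not the survey's actual variance formula, just the general replicate-weight idea.

```python
# Hypothetical scores, full-sample weights, and bootstrap replicate weights.
scores = [520.0, 480.0, 510.0, 495.0]
full_w = [150.0, 200.0, 100.0, 120.0]
replicates = [
    [160.0, 190.0, 105.0, 115.0],
    [140.0, 210.0, 95.0, 125.0],
    [155.0, 195.0, 110.0, 110.0],
]

def wmean(scores, weights):
    """Weighted mean of scores."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

theta = wmean(scores, full_w)                  # full-sample estimate
reps = [wmean(scores, w) for w in replicates]  # one estimate per replicate

# Bootstrap variance: average squared deviation of the replicate
# estimates from the full-sample estimate.
var = sum((t - theta) ** 2 for t in reps) / len(reps)
se = var ** 0.5
cv = 100 * se / theta  # CV expressed as a percentage

print(round(se, 3), round(cv, 3))
```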

Quality evaluation

Partial Non-response:
Partial non-response can occur when a respondent is unwilling to answer sensitive questions, accidentally skips part of the questionnaire, or experiences respondent fatigue, or when operational difficulties arise. Variables within a section tend to share a common subject matter and/or are used together to derive variables about the same subject matter. If information is needed for only one variable at a time, the codebook should be consulted. Overall, item non-response is lower for the Parent questionnaire than for the Student questionnaire, with most variables having less than 1% missing data. The question asking parents the month in which their spouse started working at their current job (PF48A) has the highest overall non-response rate, at 22.6%.

Comparing counts to Census 2001:
When dealing with survey data, the sum of the final sample weights for a particular domain of the population gives an estimate of the population size for that domain. These totals were estimated for the gender and province domains and then compared with known population counts obtained from Census 2001 data for the same domains. Because a census count of 15-year-olds born in 1984 would include individuals who are not part of the target population (e.g., home-schooled children, students with special needs), the estimated totals based on the YITS weights should be less than the census totals.

Disclosure control

Statistics Canada is prohibited by law from releasing any information it collects that could identify any person, business, or organization, unless consent has been given by the respondent or as permitted by the Statistics Act. Various confidentiality rules are applied to all data that are released or published to prevent the publication or disclosure of any information deemed confidential. If necessary, data are suppressed to prevent direct or residual disclosure of identifiable data.

Revisions and seasonal adjustment

These data are preliminary and will be revised on a monthly basis.

Data accuracy

Data quality is affected by both sampling and non-sampling errors. Non-sampling errors were minimized through testing (focus group, pilot survey and main survey); training of regional office staff; observation by head office personnel; tabulations of initial data; and adjustment of questionnaire specifications for future cycles. Quality assurance measures were implemented at each step of the data collection and processing cycle to monitor data quality. For sampling error, data reliability guidelines were established based on coefficient of variation (CV). It is recommended that any estimate based on fewer than 30 observations or with a CV greater than 33.3% not be released.
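The release guideline stated above can be expressed as a simple check. The function name is ours for illustration; only the two thresholds come from the text.

```python
def releasable(n_obs, cv_percent):
    """Apply the guideline from the text: do not release any estimate
    based on fewer than 30 observations or with a CV above 33.3%."""
    return n_obs >= 30 and cv_percent <= 33.3

print(releasable(45, 12.0))  # True: enough observations, acceptable CV
print(releasable(25, 12.0))  # False: too few observations
print(releasable(45, 40.0))  # False: CV too high
```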

The table below provides an indication of data quality for the estimated average reading score for 15-year-olds by province. Additional data quality indicators are presented in all PISA/YITS publications.
