Survey of Principals (SOP)

Detailed information for 2004-2005

Status: Inactive

Frequency: One Time

Record number: 5065

Data release - June 26, 2006 (in the publication "Education Matters")

Description

The main objective of this survey is to evaluate the impact of changes observed in education, such as curriculum changes, budget reductions and new policy directives, on teaching and on the work of principals in Canadian schools. The survey collects information on principals, their situations and professional practices, and on the transformations affecting their training, their competencies, their daily work and their interactions with students and other educational partners.

The survey is conducted jointly by Statistics Canada and a team of researchers from faculties of education in universities across the country. The survey is part of a research project sponsored by the Social Sciences and Humanities Research Council of Canada (SSHRC). SSHRC is an arm's-length federal agency that promotes and supports university-based research and training in the social sciences and humanities. The data will be useful to researchers, policy-makers, provincial and territorial departments or ministries of education, school boards and districts, principals and teachers.

Reference period: Academic year (September 2004 to June 2005)

Collection period: Late October 2004 to Early February 2005

Subjects

  • Education, training and learning
  • Teachers and educators

Data sources and methodology

Target population

The target population of the Survey of Principals consists of all principals of elementary and secondary schools in Canada who held their positions at the start of the 2004-05 school year, excluding principals of continuing education/adult day schools, trade/vocational schools, language and cultural education schools, home schools, community education centres, social service centres, distance education centres, virtual schools and schools located in First Nations communities. The population includes schools in all provinces and territories. The list of schools from which the sample was drawn is compiled and kept up to date by Statistics Canada's Centre for Education Statistics. A sample of 4,800 schools was invited to participate in the survey.

Instrument design

The questionnaire was designed as a mail-out/mail-back paper-and-pencil instrument. It was developed in 2003 in collaboration with a group of researchers from the faculties of education at the University of Sherbrooke, the University of Montreal, the University of Toronto and Simon Fraser University, who provided expertise in developing questions on different aspects of education. The Council of Ministers of Education, Canada (CMEC) and the Canadian Teachers' Federation (CTF) were also consulted. In addition, Statistics Canada's Questionnaire Design Resource Centre (QDRC) assisted with the development of scales.

Consultations took place in the fall of 2003 with 30 principals (individually and through focus groups) in Moncton, Montréal and Toronto to test the questionnaire. The composition of the focus groups took the following characteristics into consideration: language of the school, school level, concentration of visible minorities, school funding, and the gender and years of experience of the principal. In addition, about 200 principals from different locations in Canada were invited to participate in a pilot survey in the spring of 2004. Changes to the questionnaire were made following the focus groups and the pilot test; they focused on reducing respondent burden and improving data quality.

Sampling

This is a sample survey with a cross-sectional design.

The survey frame consisted of a list of all public and private elementary and secondary schools that were part of the target population at the start of the 2003/04 school year.

The schools listed on the frame were the sampled units, but the school principals themselves were the responding units. Where a principal was in charge of more than one school selected in the sample, the principal had to respond separately for each selected school, since certain questions are school-specific.

The sample was stratified according to the region of the school, as well as its level of instruction. The regions used were the Atlantic Provinces (Newfoundland and Labrador, Prince Edward Island, Nova Scotia, and New Brunswick), Québec, Ontario, the Prairie provinces (Manitoba, Saskatchewan, and Alberta), British Columbia, and the Territories (Yukon, the Northwest Territories, and Nunavut). The levels of instruction used were: elementary, secondary, or mixed (both elementary and secondary). Within each stratum, schools were sorted by their size (small, medium or large), their primary language of education (English or French), their funding (public, private or a mixed scheme) and their geographic location (urban or rural).

The allocation of the sample to the regions was done proportionally to the square root of the number of schools in each region. Within each region, the sample was then allocated to each level of instruction proportionally to the number of schools in the region with that level of instruction. Within each stratum, the sample was selected systematically; because the schools had been sorted as described above, the sorting variables acted as implicit stratification.
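To make the allocation and selection steps concrete, here is a minimal Python sketch of square-root-proportional allocation followed by systematic selection within a stratum. The frame counts, function names and the random start are assumptions for illustration, not figures or procedures taken from the survey documentation.

```python
import math
import random

def allocate_by_sqrt(region_counts, total_sample):
    """Allocate the total sample to regions in proportion to the
    square root of each region's number of schools."""
    roots = {r: math.sqrt(n) for r, n in region_counts.items()}
    scale = total_sample / sum(roots.values())
    return {r: round(w * scale) for r, w in roots.items()}

def systematic_select(sorted_frame, n):
    """Select n schools systematically from a stratum's sorted frame;
    the prior sorting (size, language, funding, location) then acts
    as implicit stratification."""
    step = len(sorted_frame) / n
    start = random.uniform(0, step)
    return [sorted_frame[int(start + i * step)] for i in range(n)]

# Hypothetical frame counts by region, for illustration only.
frame_counts = {"Atlantic": 2400, "Quebec": 3100, "Ontario": 5300,
                "Prairies": 3900, "BC": 2000, "Territories": 150}
print(allocate_by_sqrt(frame_counts, 4800))
```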

Sample and population counts by stratum are outlined in the following table:

Data sources

Data collection for this reference period: 2004-10-21 to 2005-02-04

Responding to this survey is voluntary.

Data are collected directly from survey respondents.

The Survey of Principals was conducted as a mail-out, mail-back survey. Before the questionnaires were sent, school boards were informed of the purpose of the survey. Response by proxy was not accepted; the questionnaire had to be completed by the school principal.

A fax reminder was sent 21 days after the questionnaires were mailed to schools that had not yet responded. Telephone follow-up began 10 days after the fax reminder for schools that still had not responded. At that point, returning the completed questionnaire by fax (rather than by the regular mail-back option) was offered in order to increase the response rate; approximately 275 responding principals (12.5%) took advantage of this option. The final resolved rate was 56%, with a 47% response rate (2,226 questionnaires with data).

Answers provided on the questionnaires were captured using electronic imaging with text recognition software. The only exceptions were questions where principals could write in their own answers as text; these were keyed manually on microcomputers. All captured answers, whether keyed or imaged, were subject to double verification.

View the Questionnaire(s) and reporting guide(s).

Error detection

Edit rules were derived to check for logical inconsistencies as well as invalid and missing values in the survey data. This error detection was performed during data processing. For the most part, the edit rules were applied to the data provided in Section 1 of the questionnaire, as this was the section most likely to contain errors. There were two possible sources of error: a principal making an error in completing the questionnaire, or a data capture error in the imaging process.

The SAS System was used to detect potential errors at the micro level, examining one questionnaire at a time. Missing values were detected for the socio-demographic questions about the principal (questions 1 to 7). Inconsistencies in the numeric questions (questions 8 to 15) were identified with several types of edits. Equality edits verified that certain sums added up (e.g., percentages summing to 100%, cells summing to a previously provided total). Inequality edits verified that certain cells were larger or smaller than related cells (e.g., that the number of principals in the school was not greater than the number of teachers). Ratio edits verified that two related cells stayed within pre-determined bounds (e.g., the ratio of students to teachers). Consistency edits verified that when the same concept or piece of information occurred in multiple questions, the answers provided to each question in the set were consistent.
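The production edits were implemented in SAS; purely as an illustration of the four edit types just described, here is a hedged Python sketch. The field names and the ratio bounds are hypothetical, not taken from the survey's edit specifications.

```python
def check_record(rec, tol=0.5):
    """Flag one captured questionnaire record against the four edit
    types described above. Returns a list of human-readable flags."""
    flags = []
    # Equality edit: reported percentages must sum to 100%.
    if abs(sum(rec["lang_pct"].values()) - 100) > tol:
        flags.append("equality: language percentages do not sum to 100")
    # Inequality edit: one cell must not exceed a related cell.
    if rec["n_principals"] > rec["n_teachers"]:
        flags.append("inequality: more principals than teachers")
    # Ratio edit: two related cells must stay within preset bounds
    # (the 5-40 range below is an illustrative assumption).
    ratio = rec["n_students"] / rec["n_teachers"]
    if not 5 <= ratio <= 40:
        flags.append(f"ratio: student-teacher ratio {ratio:.1f} out of bounds")
    # Consistency edit: the same concept reported twice must agree.
    if rec["enrolment_q10"] != rec["enrolment_q14"]:
        flags.append("consistency: enrolment differs between Q10 and Q14")
    return flags
```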

Answers provided by principals in "Other/Specify" cells were all examined for every question possessing such an answer category. When appropriate, answers provided in these "Other/Specify" cells were recoded to one of the existing categories.

A code of "not stated" was assigned whenever the respondents did not provide any information to questions that should have been answered.

Imputation

For each missing or inconsistent answer found, the electronic image of the questionnaire was consulted and used to correct the errors.

For the few principals who did not provide their gender, their name was used to impute their gender manually. For principals who reported an aberrant number of ethnic origins, the reported combination of origins corresponded exactly to that of the school's students provided in question 14; in these cases, only the first declared ethnic origin was kept.

For question 9, the sum of full-time employees was adjusted to include any cell values left out by mistake and to correct calculation errors. Data reported in the wrong categories were moved to the correct ones, and the total was adjusted if necessary.

When a school's enrolment was not provided in question 10, the missing figure was imputed with the value provided in question 14. If this value was also missing or seemed nonsensical, the enrolment was imputed with frame data. In cells 1280-1281, the converse of what was asked for was often reported; obvious cases were corrected accordingly.

Percentages provided for questions 11, 12A and 15 that did not add up to 100% were pro-rated whenever the error did not appear to be due to rounding.
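A minimal sketch of this kind of pro-rating follows, assuming a tolerance (here 2 percentage points) below which a discrepancy is attributed to rounding; the tolerance value is an assumption, not a documented survey parameter.

```python
def prorate(percentages, tol=2.0):
    """Rescale reported percentages to sum to 100 when the discrepancy
    exceeds plausible rounding error (tol, in percentage points)."""
    total = sum(percentages)
    if total == 0 or abs(total - 100) <= tol:
        return percentages  # rounding-level discrepancies are left alone
    return [p * 100.0 / total for p in percentages]

print(prorate([50, 30, 30]))  # -> [45.45..., 27.27..., 27.27...]
```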

Answers to question 13 provided as percentages instead of student counts were converted to counts.

When no allophones were declared in question 12A, the missing values were imputed to 0%; otherwise, they were imputed with the median of the schools in the area. In question 12B, the percentage of students new to Canada within the previous year was sometimes reported as the percentage that had been in Canada for at least a year; these cases were corrected accordingly.
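The zero-or-area-median rule for question 12A could look like the following sketch; the function signature and field handling are assumptions made for illustration.

```python
from statistics import median

def impute_allophone_pct(reported, any_allophones_declared, area_values):
    """Impute a missing allophone percentage: 0% when the school
    declared no allophones, otherwise the median of the values
    reported by schools in the same area."""
    if reported is not None:
        return reported
    if not any_allophones_declared:
        return 0.0
    return median(v for v in area_values if v is not None)
```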

Given the poor quality of the data regarding part-time employees in question 9 and question 14, the data in these questions were not corrected and should not be used for analytical purposes.

Approximately 80% of the records had at least one correction made to the data in Section 1.

Estimation

Non-response is the major source of error for a survey such as the Survey of Principals. Because this class of errors is generally not random, it is important both to prevent it and to derive a proper adjustment strategy to compensate for systematic non-response patterns. Given that slightly over half of the sampled principals did not respond to the survey, it was necessary to modify the sampling weights to compensate for non-response. Ideally, this adjustment should reduce, as much as possible, the non-response bias that arises because respondents can differ systematically from non-respondents.

A logistic regression model was employed to identify the variables that best explained the response pattern. Many variations of this model were considered in order to determine which would be most appropriate for adjusting the weights. After extensive analysis and diagnostic testing, the chosen non-response adjustment scheme was the one that formed homogeneous response groups by region, level of instruction and school size, as reported on the frame. This grouping scheme is a refined version of the sampling stratification, which was implemented by region and level of instruction. Of the methods considered, this weight adjustment reduced the non-response bias the most while providing the best possible coefficients of variation.
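The selected scheme amounts to a standard weighting-class adjustment: within each response group, respondents' design weights are inflated by the inverse of the group's weighted response rate. Here is a sketch under that interpretation, with hypothetical record fields:

```python
from collections import defaultdict

def adjust_weights(records):
    """Weighting-class adjustment: within each response group
    (region x level x size), inflate respondents' design weights so
    they also carry the weight of the group's non-respondents."""
    groups = defaultdict(lambda: {"all": 0.0, "resp": 0.0})
    for rec in records:
        key = (rec["region"], rec["level"], rec["size"])
        groups[key]["all"] += rec["weight"]
        if rec["responded"]:
            groups[key]["resp"] += rec["weight"]
    for rec in records:
        if rec["responded"]:
            g = groups[(rec["region"], rec["level"], rec["size"])]
            rec["adj_weight"] = rec["weight"] * g["all"] / g["resp"]
    return records
```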

The Generalized Estimation System (GES) was used to produce the required weighted estimates as well as the corresponding estimates of the coefficients of variation. A two-phase estimation approach was adopted in order to incorporate the variance due to non-response: the first phase was the sampling phase, and the second the response phase.
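GES is Statistics Canada's internal estimation software, so the following is only a simplified sketch of a weighted point estimate and its coefficient of variation, using a with-replacement variance approximation rather than the survey's actual two-phase formulation.

```python
import math

def weighted_total_and_cv(values, weights):
    """Weighted estimate of a total and an approximate CV. Uses the
    with-replacement variance approximation, a deliberate
    simplification of the two-phase GES estimation described above."""
    n = len(values)                       # assumes n > 1 and a nonzero total
    terms = [w * y for w, y in zip(weights, values)]
    total = sum(terms)
    mean_term = total / n
    var = n / (n - 1) * sum((t - mean_term) ** 2 for t in terms)
    return total, math.sqrt(var) / total
```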

Quality evaluation

The quality of the responses received was analyzed both during collection and after data capture. This included an analysis of response rates for every question, as well as thorough consistency checks that resulted in the editing of the data described earlier in this document. For more details on the strategies adopted to improve response rates, please refer to the Data accuracy section.

An external source was also used to verify output consistency: pupil-educator ratios were compared with output from the 2003/04 Elementary-Secondary Education Statistics Project (ESESP, record number 5102).

Disclosure control

Statistics Canada is prohibited by law from releasing any data that would divulge information obtained under the Statistics Act that relates to any identifiable person, business or organization without the prior knowledge or the consent in writing of that person, business or organization. Various confidentiality rules are applied to all data that are released or published to prevent the publication or disclosure of any information deemed confidential. If necessary, data are suppressed to prevent direct or residual disclosure of identifiable data.

Prior to finalizing the microdata file, certain stratification variables (both explicit and implicit) that could be derived from responses to the questionnaire were attached to records to allow a more detailed analysis of the data. In order to ensure that the addition of these variables did not put any respondents at risk of disclosure, all of the possible three-way crossings of these stratification variables (say, region, grade level and size as in Prairies-Elementary-large, or Quebec-Mixed-small) were examined. The number of times that a given school was alone in a cell among the set of three-way tables was computed to assess which schools had rare combinations of the variables. If one variable contributed a great deal to the number of times certain schools were alone in cells, then it was deemed to be too sensitive to release. The stratification variables were removed in this manner until none of the three-way combinations of the remaining variables compromised the confidentiality of the survey records.
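The uniqueness count driving this rule could be computed along the lines of the sketch below; the variable names and record layout are assumptions for illustration. A variable whose presence accounts for many of the singleton cells would then be the candidate for removal.

```python
from collections import Counter
from itertools import combinations

def uniqueness_counts(schools, variables):
    """For every three-way crossing of candidate stratification
    variables, count how often each school sits alone in a cell."""
    lonely = Counter()
    for trio in combinations(variables, 3):
        cells = Counter(tuple(s[v] for v in trio) for s in schools)
        for s in schools:
            if cells[tuple(s[v] for v in trio)] == 1:
                lonely[s["id"]] += 1
    return lonely
```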

Furthermore, before releasing and/or publishing any estimate from the Survey of Principals, the user should first determine the number of respondents who contributed to the calculation of the estimate. If this number is five or fewer, the weighted estimate should not be released, in order to respect confidentiality policies.

Revisions and seasonal adjustment

This methodology does not apply to this survey.

Data accuracy

Various strategies were put in place during data collection to improve response rates, such as refining the questionnaire following a field pre-test, initiating contact through a letter of introduction, following up with non-respondents and allowing questionnaires to be returned by mail or fax. A total of 4,800 Canadian elementary and secondary schools were sampled for the Survey of Principals. Over the course of the collection period, 124 of the 4,800 sampled schools were determined to be out of scope for the survey. Of the remaining 4,676 schools, 2,230 provided usable information, although only 2,144 of these agreed to share their information. This gives a response rate of 48% and a sharing rate of 46%.

The frame used for the Survey of Principals was based on the frame used for the Information and Communications Technologies in the Schools Survey (ICTSS, record number 5051), which was conducted one year earlier. As noted in the Sampling section, this frame consisted of a list of schools that were open at the start of the 2003/04 academic year. Because of the vintage of the frame, the survey estimates are subject to some degree of coverage error.

At the end of the collection window, respondent weights were adjusted to compensate for the bias introduced by survey non-response. The non-response adjustment method was selected over alternatives because it appeared to be the most effective in reducing non-response bias. As mentioned in the sections on imputation and error detection, extensive procedures were implemented to treat response and processing errors. Variance estimation methods took this weight adjustment for non-response into account.

The quality level of any estimate should be determined prior to its release and/or publication. For weighted estimates based on sample sizes of 5 or more, users should determine the coefficient of variation of the estimate and follow the guidelines below (in the "Additional documentation" link). Estimates of marginal or unacceptable quality must be accompanied by a warning to caution subsequent users.
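As a sketch of how a user might operationalize these checks, the following combines the respondent-count suppression rule from the Disclosure control section with CV-based quality categories. The CV cut-off values are illustrative placeholders only; the actual guidelines are in the additional documentation.

```python
def release_decision(respondent_count, cv):
    """Combine the respondent-count suppression rule with CV-based
    quality categories. The CV cut-offs below are placeholders, not
    the survey's published guidelines."""
    if respondent_count <= 5:
        return "suppress (confidentiality)"
    if cv <= 0.165:          # placeholder threshold
        return "acceptable"
    if cv <= 0.333:          # placeholder threshold
        return "marginal: release with warning"
    return "unacceptable: release only with strong warning"
```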

To illustrate the concept of coefficients of variation, a table of CVs produced for a variety of SOP estimates is also presented below.

Documentation
