Information and Communications Technologies in Schools Survey (ICTSS)
Detailed information for 2003-2004
The Information and Communications Technologies in Schools Survey (ICTSS) collects data on the infrastructure, reach and some usage patterns of information and communications technologies in all elementary and secondary schools in Canada.
Data release - June 10, 2004
- Questionnaire(s) and reporting guide(s)
- Data sources and methodology
- Data accuracy
The main purpose of this survey is to obtain critical benchmark data on the integration of ICT in education. The Information and Communications Technologies in Schools Survey (ICTSS) collects data on the infrastructure, reach and use of information and communications technologies in all elementary and secondary schools in Canada.
The survey was sponsored by Industry Canada's SchoolNet program, which works with Canadian learning partners to increase access to and integration of ICT in the learning environment in order to develop an ICT-skilled population capable of participating in the knowledge economy. Support for the initiative was provided by Library and Archives Canada.
The survey asked a variety of questions about the reach, use, infrastructure and outcomes of ICT being used in schools. The main topics include:
- the current ICT infrastructure in the school;
- the times when students can access computers;
- the locations of computers in the school;
- the types of Internet and intranet connections;
- teacher skills and training in ICT;
- capabilities with regard to online courses and videoconferencing;
- attitudes toward ICT;
- challenges encountered in using ICT.
The data will be used to assess the current status of ICT infrastructure, access and usage patterns in Canadian schools. Survey findings will also guide future policy and program development and provide the basis for future research related to the use of ICT in learning. The survey data will also provide education authorities and the public with information to measure the quantity and quality of ICT infrastructure in Canadian schools and to develop programs and policies aimed specifically at schools.
- Education, training and learning
- Information and communications technology
Data sources and methodology
The target population for ICTSS is all elementary and secondary schools in Canada, excluding continuing education/adult day schools, trade/vocational schools, language and cultural educational schools, home schools, community education centres and social service centres. It includes schools in all provinces and territories, as well as schools located in Aboriginal communities.
The questionnaire was designed as a paper-and-pencil instrument. In the fall of 2002, the questionnaire was developed in collaboration with the Industry Canada SchoolNet program. An advisory committee was created with members from several provincial and territorial ministries and departments of education, education organizations, as well as academic experts. This advisory committee provided expertise to the development team at Statistics Canada and Industry Canada to ensure that the questionnaire was adequate and relevant for measuring ICT access, use and challenges. Library and Archives Canada provided expertise in the development of questions related to school libraries.
A number of consultations took place in the fall of 2002 with several principals (individually and through focus groups) to test the questionnaire. In addition, about 100 principals from different locations in Canada were invited to participate in a pilot survey in the spring of 2003.
The questionnaire was designed as a mail-in/mail-back instrument. In addition, an Electronic Data Reporting (EDR) application was created for this survey, so principals could either complete the paper questionnaire or answer the questionnaire through the Internet.
This survey is a census with a cross-sectional design.
Data are collected for all units of the target population, therefore, no sampling is done.
Data collection for this reference period: 2003-10-09 to 2004-06-30
Responding to this survey is voluntary.
Data are collected directly from survey respondents.
The respondents were school principals who provided both the data available to them as well as their views on ICT. In some cases, however, respondents may have consulted or involved others in their responses.
A combination of a paper questionnaire and an Electronic Data Reporting (EDR) option was provided to respondents for this survey. Each respondent was assigned a unique ID number and EDR password, printed on the questionnaire along with the name, address and telephone number of the school. Respondents were asked either to complete the paper questionnaire and mail it back in the envelope provided, or to complete the EDR application.
Although participation in the survey was voluntary, a reminder fax was sent to respondents who failed to return their questionnaires on time, followed by telephone calls to encourage their participation. For cases in which the timing of the interviewer's call was inconvenient, an appointment was arranged to call back at a more convenient time.
View the Questionnaire(s) and reporting guide(s).
Errors affecting a survey such as ICTSS can be random, or they can occur systematically, in which case the estimates produced could be inaccurate even for very large subpopulations. Proper planning was done for ICTSS to minimize errors. The collection process included initial contacts with provincial and territorial ministries or departments, school boards, other agencies relevant to the target school population and, of course, the school principals. The questionnaires were tested on several occasions with school principals to ensure that the concepts were understood and relevant. The collection process was closely monitored to detect total non-response (i.e. when no questionnaire was returned) or partial non-response (i.e. when a questionnaire was returned but some questions were either not answered or conflicted with other data provided by the respondent). Follow-up activities were in place to resolve these problems. The collection window was extended by one additional month to improve the response rate, and the option of answering only a subset of critical questions (i.e. questions 2, 3, 6, 15, 16 and 49) was offered to respondents as a last resort.
Responses to survey questions were captured using two methods: Electronic Data Reporting (EDR) and Intelligent Character Recognition (ICR). If the EDR option was used, the respondent entered their data directly into the application and transmitted it back to Statistics Canada via a secure File Transfer Protocol (FTP) site. If the paper questionnaire was completed, the data were captured using ICR. ICR technology combines automated data entry (which uses optical character, mark and image recognition) with supplementary manual capture by operators who 'key from image' some of the survey information using a heads-up data capture approach.
To ensure the quality of the data captured using ICR, all write-in fields were double keyed for accuracy, and a 20% quality control procedure was employed: from every batch of questionnaires captured, 20% were sampled and the captured data were compared against the scanned images.
Range edits were programmed into both the EDR and ICR capture processes. If information entered fell outside the range of expected values (too large or too small), or was inconsistent with other responses, the data were verified and either corrected or rejected.
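To make the idea of a range and consistency edit concrete, here is a minimal sketch in Python. The field names and bounds are hypothetical illustrations, not the actual ICTSS edit rules.

```python
# Illustrative range/consistency edit; field names and the bound of
# 2,000 computers per school are invented for this example.

def range_edit(record):
    """Return a list of problems found in a captured record."""
    problems = []
    # Range edit: flag values outside the expected bounds.
    if not 0 <= record["computers_total"] <= 2000:
        problems.append("computers_total out of range")
    # Consistency edit: connected computers cannot exceed
    # the total number of computers reported.
    if record["computers_connected"] > record["computers_total"]:
        problems.append("computers_connected exceeds computers_total")
    return problems

# A record failing an edit would be sent back for verification
# rather than accepted as-is.
flags = range_edit({"computers_total": 40, "computers_connected": 55})
```

A record with no flagged problems would pass straight through to the edited data file.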
This methodology does not apply.
Non-response is the major source of error for a survey such as ICTSS. As this class of errors is not generally random, it is important both that it be prevented and that a proper adjustment strategy be derived to compensate for systematic non-response patterns. Based on the analysis of response rates and on operational constraints, the non-response patterns and the adjustment strategy were investigated using a two-phase approach:
Phase 1: Focussed on only the critical questions for all respondents.
Phase 2: Focussed on all questions for only those respondents that answered beyond the critical questions.
Table 1, presented in the Data accuracy section, provides the sample and response rate distribution by province/territory for each phase. These figures are based on "usable" questionnaires (i.e. questionnaires containing a minimum set of critical information), a subset of the 7,311 returned questionnaires (a return rate of 47%).
These data sets were determined to have low partial non-response rates for the majority of the critical questions (Phase 1) and the majority of questions (Phase 2) respectively. Consequently, a weighting methodology based on key auxiliary information available on the frame (province/territory, language of school, instructional level of school, location of school, administration of school and size category) was adopted for each phase to correct for total non-response. In each phase, the weight assigned to each school represents the number of other schools in the population with similar characteristics.
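The weighting idea described above can be sketched briefly: within each adjustment cell defined by the frame variables, every responding school receives a weight equal to the number of in-scope schools it represents. The cell labels and counts below are invented for illustration.

```python
# Minimal sketch of cell-based non-response weighting, assuming
# hypothetical adjustment cells (e.g. province crossed with size
# category). Not the actual ICTSS weighting system.
from collections import Counter

def nonresponse_weights(frame_cells, respondent_cells):
    """Weight per respondent: population count in the respondent's
    cell divided by the respondent count in that cell."""
    pop = Counter(frame_cells)        # all in-scope schools, by cell
    resp = Counter(respondent_cells)  # responding schools, by cell
    return [pop[c] / resp[c] for c in respondent_cells]

# 4 frame schools in cell "ON-large", 2 responded: each responding
# school represents 2 schools with similar characteristics.
frame = ["ON-large"] * 4 + ["NU-small"] * 3
resp = ["ON-large", "ON-large", "NU-small"]
weights = nonresponse_weights(frame, resp)
```

The weights in each cell sum to the cell's population count, so weighted totals reproduce the frame counts for the auxiliary variables.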
Various strategies were put in place during data collection to improve response rates, such as respondent follow-up, allowing the return of surveys via mail, Internet or fax, and extending the collection period. At the end of the collection window, respondent weights were adjusted to compensate for the bias introduced by survey non-response.
An extensive analysis was performed on the quality of the responses received, including an analysis of response rates for every question, the detection of outlying values for all numeric questions (e.g. number of computers), the detection of outlying values of relationships between numeric variables (e.g. student-to-computer ratio), and the analysis of data gathered under non-standard collection processes.
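The source does not specify the outlier-detection rule used on derived ratios such as students per computer, so the following is only one plausible illustration, using the common interquartile-range (Tukey) fences.

```python
# Hedged sketch of an outlier screen for a derived ratio; the
# IQR-fence rule and the sample ratios are illustrative only.

def iqr_outliers(values, k=1.5):
    """Return values lying outside the Tukey fences
    [Q1 - k*IQR, Q3 + k*IQR], using simple index quartiles."""
    xs = sorted(values)
    n = len(xs)
    q1 = xs[n // 4]
    q3 = xs[(3 * n) // 4]
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]

# Hypothetical student-to-computer ratios; the extreme value would
# be flagged for manual review, not automatically discarded.
ratios = [4.2, 5.0, 5.5, 6.1, 6.3, 7.0, 48.0]
flagged = iqr_outliers(ratios)
```

Flagged records would typically be followed up with the respondent or compared against other fields before any correction is made.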
External sources were also verified for output consistency. Key estimates were compared with output from the 2000 and 2003 cohorts of PISA (Programme for International Student Assessment -- Survey #5060), as well as the Second Information Technology in Education Study (SITES). Data were also compared to figures published in other countries, such as the United States, Australia and the United Kingdom. Finally, members of provincial and territorial governments and education associations, as well as academic experts, were involved in a thorough review of the outputs from the ICTSS.
For more details regarding the treatment of non-response, or methods of quality evaluation performed for the ICTSS, please refer to Sections 11 and 12 of the ICTSS Microdata User Guide.
Statistics Canada is prohibited by law from releasing any data which would divulge information obtained under the Statistics Act that relates to any identifiable person, business or organization without the prior knowledge or the consent in writing of that person, business or organization. Various confidentiality rules are applied to all data that are released or published to prevent the publication or disclosure of any information deemed confidential. If necessary, data are suppressed to prevent direct or residual disclosure of identifiable data.
Before releasing and/or publishing any estimate from the ICTSS, users should first determine the quality level of the estimate. The standard quality levels are acceptable, marginal and unacceptable. Data quality is typically affected by both sampling and non-sampling errors.
In establishing the standard quality level of an estimate, the user should first determine the number of respondents who contributed to the calculation of the estimate. If this number is less than 5, the weighted estimate should not be released, in order to respect policies regarding confidentiality. For weighted estimates based on sample sizes of 5 or more, users should determine the coefficient of variation of the estimate. These quality level guidelines should be applied to weighted rounded estimates. Any estimate of marginal or unacceptable quality level must be accompanied by a warning to caution subsequent users.
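The release guidelines above can be expressed as a simple decision function. The CV cut-offs separating "acceptable", "marginal" and "unacceptable" are placeholders here; the actual thresholds are defined in the ICTSS documentation.

```python
# Illustrative quality-level check for a weighted estimate before
# release. The CV thresholds (0.165 and 0.333) are assumed
# placeholders, not values taken from the ICTSS guidelines.

def quality_flag(n_respondents, cv,
                 marginal_cv=0.165, unacceptable_cv=0.333):
    """Classify a weighted, rounded estimate before release."""
    if n_respondents < 5:
        # Too few contributors: suppress for confidentiality.
        return "suppress"
    if cv <= marginal_cv:
        return "acceptable"
    if cv <= unacceptable_cv:
        return "marginal"        # release with a warning
    return "unacceptable"        # release only with a strong warning

flag = quality_flag(n_respondents=120, cv=0.21)
```

Any estimate flagged "marginal" or "unacceptable" would be accompanied by a caution to subsequent users, as the guidelines require.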
Revisions and seasonal adjustment
This methodology does not apply to this survey.
A total of 6,676 of the 15,541 schools provided usable information for the survey for a response rate of 43%. Although ICTSS was intended to be a census, weights have been derived in order to correct for the non-response in each province and territory.
In addition, for the release of the data, methods have been implemented to estimate coefficients of variation for several key characteristics. To produce these estimates, standard error estimates were derived assuming an equal chance of response among schools having similar characteristics according to key auxiliary information available on the frame (province/territory, language of school, instructional level of school, location of school, administration of school and size category). The methods adopted take into account the adjustments made in producing the survey weights based on this auxiliary information. Most estimates derived from this survey involve ratios of two totals (for example, the percentage of Internet-connected computers, derived as the ratio of the total number of Internet-connected computers to the total number of computers in the school). When a ratio of two totals is estimated, a linear approximation of the ratio (Taylor series linearization) is used to estimate its standard error.
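To illustrate the Taylor-series linearization of a ratio of two weighted totals (such as Internet-connected computers over all computers), here is a simplified sketch. It uses a basic with-replacement variance approximation and ignores the two-phase weighting adjustments that the actual ICTSS methodology accounts for; the school values and weights are invented.

```python
# Hedged sketch of a linearized standard error for R = Y/X,
# where Y and X are weighted totals over schools. Simplified
# relative to the actual ICTSS variance methodology.
import math

def ratio_and_se(y, x, w):
    """Estimate R = sum(w*y)/sum(w*x) and a linearized SE from
    per-school values y, x and survey weights w."""
    Y = sum(wi * yi for wi, yi in zip(w, y))
    X = sum(wi * xi for wi, xi in zip(w, x))
    r = Y / X
    # Linearized residual for each school: z_i = y_i - r * x_i.
    z = [yi - r * xi for yi, xi in zip(y, x)]
    n = len(z)
    var = (n / (n - 1)) * sum((wi * zi) ** 2
                              for wi, zi in zip(w, z)) / X ** 2
    return r, math.sqrt(var)

# Three hypothetical schools: (connected computers, total computers,
# weight). The ratio estimates the share of connected computers.
r, se = ratio_and_se(y=[10, 20, 15], x=[12, 25, 20], w=[2.0, 2.0, 3.0])
```

The coefficient of variation reported for such an estimate is then simply `se / r`.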
The application of these methods differed depending on whether the estimation was based only on critical questions (i.e. questions 2, 3, 6, 15, 16 and 49), or also on non-critical questions. When only critical questions were needed to produce an estimate, the standard error estimate was derived in one phase based on all respondents to the survey. However when non-critical questions were used, it was necessary to proceed in two phases. The first phase involved all respondents, and the second phase only the subset of respondents that answered beyond the critical questions. For example, the standard error for the average number of internet-connected computers per school (questions 15 and 16) was derived in one phase. However the standard error for the percentage of schools with word processing software(s) available to students (question 18) required a two-phase approach.
To illustrate the concept of coefficients of variation, two tables of CVs produced for a variety of ICTSS estimates are presented in the additional documentation link below. Table 1 presents estimates of the coefficient of variation for several key characteristics by province and territory. Table 2 presents estimates of the coefficient of variation for several key characteristics by type of school.
- Microdata User Guide: Information and Communications Technologies in Schools Survey, 2003/04