Aboriginal Peoples Survey (APS)
Detailed information for 2001
The purpose of the Aboriginal Peoples Survey was to provide data on the social and economic conditions of Aboriginal people in Canada. More specifically, its purpose was to identify the needs of Aboriginal people and focus on issues such as health, language, employment, income, schooling, housing, and mobility.
Data release - September 24, 2003
The Aboriginal Peoples Survey (APS) provides data on the social and economic conditions of Aboriginal people in Canada. The survey was designed and implemented in partnership with national Aboriginal organizations.
- Aboriginal peoples
- Population characteristics
Data sources and methodology
The target population comprises adults and children living in private dwellings in the 10 provinces and three territories who are North American Indian, Métis or Inuit, and/or are a Treaty Indian or a Registered Indian as defined by the Indian Act of Canada and/or are members of an Indian Band or First Nation and/or who have Aboriginal ancestry. All residents of collective dwellings are excluded from the survey. (Collective dwellings include lodging or rooming houses, hotels, motels, tourist homes, nursing homes, hospitals, staff residences, communal quarters (military camps), work camps, jails, missions, group homes and so on.)
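The inclusion criteria above combine several "and/or" conditions with an exclusion for collective dwellings. As a hedged sketch only, with hypothetical flag names that simply mirror the text:

```python
# Illustrative sketch: these flag names are hypothetical stand-ins for
# the inclusion criteria described in the text, not actual APS fields.

def in_target_population(person):
    """True if a person meets any of the Aboriginal-group criteria and
    lives in a private dwelling (collective dwellings are excluded)."""
    eligible = (
        person.get("north_american_indian_metis_or_inuit", False)
        or person.get("treaty_or_registered_indian", False)
        or person.get("band_or_first_nation_member", False)
        or person.get("aboriginal_ancestry", False)
    )
    # Residents of collective dwellings are out of scope regardless
    # of how the group criteria are answered.
    return eligible and person.get("private_dwelling", False)

print(in_target_population({"aboriginal_ancestry": True, "private_dwelling": True}))   # True
print(in_target_population({"aboriginal_ancestry": True, "private_dwelling": False}))  # False
```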
The 2001 APS collected data on both adults and children. The adult questionnaire was administered to respondents aged 15 and over while the children's questionnaire was directed at children and youth aged 0 to 14 years. The adult questionnaire consisted of a "core" portion, which was administered to all Aboriginal participants, as well as a Métis component and an Arctic component.
The questionnaires were developed based on a proposed list of topics that was developed by the Implementation Committee for the survey. The 1991 APS questionnaire as well as other STC questionnaires were reviewed and used in formulating the questions. The development process included several rounds of testing. Draft questionnaires were qualitatively tested (round one in December of 1999 and round two in late May and June of 2000). A Phase I pilot test was carried out in the spring of 2000 and a Phase II pilot in the fall of 2000.
This is a sample survey with a cross-sectional design.
For a description of the sample design, please refer to the link below.
Data collection for this reference period: 2001-10-01 to 2002-06-30
Responding to this survey is voluntary.
Data are collected directly from survey respondents.
Collection for the 2001 APS was conducted in two phases. Phase I, which took place from October to December 2001, focused on the "Aboriginal identity population": all individuals who gave a positive answer to the Aboriginal identity question (question 18), Band/First Nation membership (question 20) or Registered Indian status (question 21) on the 2001 Census. Phase II, which took place from April to June 2002, focused on people who reported Aboriginal ancestry in question 17, but who did not report Aboriginal identity in question 18, Band/First Nation membership in question 20, or Treaty or Registered Indian status in question 21. This population is referred to as the "Aboriginal origin only population". A small portion of the Aboriginal identity population was also covered in Phase II.
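The phase assignment above is a set of boolean conditions on Census answers. A hedged sketch, with hypothetical field names standing in for 2001 Census questions 17, 18, 20 and 21:

```python
# Hedged sketch of the population classification described above; the
# field names are hypothetical, not actual Census variable names.

def survey_population(census_record):
    """Classify a Census record into the two APS target populations."""
    identity = (
        census_record.get("q18_aboriginal_identity", False)
        or census_record.get("q20_band_member", False)
        or census_record.get("q21_registered_indian", False)
    )
    if identity:
        return "Aboriginal identity population"    # mainly Phase I
    if census_record.get("q17_aboriginal_ancestry", False):
        return "Aboriginal origin only population" # Phase II
    return "out of scope"

print(survey_population({"q17_aboriginal_ancestry": True}))  # Aboriginal origin only population
```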
The adult questionnaire consisted of a "core" portion, which was administered to all Aboriginal participants, as well as a Métis component and an Arctic component. The Métis component was administered only to the adult population living in non-reserve areas and outside of Inuit communities who self-identified as Métis and/or who have Métis ancestry. The Arctic component was administered to the adult population residing in Inuit communities.
All components of the survey were interviewer-administered and, in all cases, a paper-and-pencil methodology was employed. Interviews were generally conducted in two steps: selected respondents were first contacted by telephone in order to conduct the screening portion of the questionnaire, and a personal interview was then conducted with persons who were screened in. (A proportion of interviews were completed by telephone.) Initial refusals were followed up by senior interviewers in an attempt to convince the respondents to participate.
Interviews by proxy were allowed. The interviews for the children's questionnaire were conducted with the "person most knowledgeable" about the child. The interviews were conducted using paper and pencil questionnaires. Optical character recognition, optical mark recognition and key entry were used to capture the questionnaires.
View the Questionnaire(s) and reporting guide(s).
The first stage of error detection was done during the data collection. Interviewers were asked to check their questionnaires to ensure that everything had been filled in correctly and clearly and that skips had been followed correctly. When problems were found, they were instructed to contact the respondent again to obtain the missing information.
The second stage involved editing all the survey records according to pre-specified edit rules to check for errors, gaps and inconsistencies in the survey data. Validity checks on each variable were made to ensure, for example, that numerical answers to certain questions fell within acceptable logical ranges and that invalid multiple responses to certain questions were identified. Checks were also made to ensure that the questionnaire flows were followed properly. Inconsistencies between sections of the questionnaire or with the Census were not corrected. It was felt that it would be inappropriate for STC to choose one response over the other.
Where errors were found, the erroneous information was either blanked out, replaced by a "not stated" or "invalid" code, or corrected based on the answers to other questions. Although the corrections were generally done in an automated way, analysts reviewed some problematic situations.
Finally, a macro-level verification was done by analyzing frequency distributions to identify anomalies (e.g., missing categories or unusually large frequencies).
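The validity-check stage of this editing can be sketched as follows. This is illustrative only: the actual APS edit rules are not reproduced in this document, so the field names, ranges and codes below are hypothetical.

```python
# Illustrative sketch only: the APS edit rules are not published here,
# so the fields, ranges, and codes below are hypothetical.

NOT_STATED = -9  # hypothetical code for a blanked-out invalid answer

# Hypothetical validity ranges for numeric questions.
VALID_RANGES = {
    "age": (0, 120),
    "hours_worked_per_week": (0, 168),
}

def apply_validity_edits(record):
    """Blank out values that fall outside their logical range."""
    edited = dict(record)
    for field, (low, high) in VALID_RANGES.items():
        value = edited.get(field)
        if value is not None and not (low <= value <= high):
            edited[field] = NOT_STATED  # replace; do not guess a correction
    return edited

print(apply_validity_edits({"age": 245, "hours_worked_per_week": 40}))  # age out of range -> NOT_STATED
```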
For the APS, generally the only type of imputation performed was deterministic imputation. Questions related to each other were edited simultaneously. Valid responses were imputed for missing responses if sufficient information was available in the related questions; otherwise, they were coded to "not stated". If a question with a missing answer (coded to "not stated") should have been used to determine whether subsequent questions were to be asked, those subsequent questions were set to "path not known", because it was not possible to determine whether or not they should have been asked.
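The "not stated" and "path not known" logic can be sketched as follows. The question names, codes and the particular filter relationship are hypothetical; only the general mechanism comes from the text.

```python
# Hedged sketch of the deterministic-imputation logic described above.
# Question names, codes, and the filter relationship are hypothetical.

NOT_STATED = -9
PATH_NOT_KNOWN = -8

def impute_employment_block(record):
    """Edit a filter question and its follow-ups together."""
    rec = dict(record)
    # Deterministic imputation: infer the filter from a related question
    # when possible (e.g. a reported employer implies being employed).
    if rec["worked_last_week"] is None and rec.get("employer_name"):
        rec["worked_last_week"] = "yes"
    # If the filter is still missing, its follow-ups cannot be classified
    # as either "asked" or "legitimately skipped".
    if rec["worked_last_week"] is None:
        rec["worked_last_week"] = NOT_STATED
        for followup in ("hours_worked", "occupation"):
            rec[followup] = PATH_NOT_KNOWN
    return rec
```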
However, the filter questions (i.e., questions 1 to 4 in the Identification section) and the date of birth and sex variables were sometimes imputed from the Census. The filter questions were NOT imputed for records corresponding to non-respondents. For respondents, question #1 was always cleaned using Census information. (This decision was based on the design of this question. Many interviewers only checked the applicable response leaving the other questions blank.) If a record had enough of the questionnaire completed to keep it BUT did NOT have a "yes" to one of the filters AND one or more of the filters were blank, Census responses were imputed (both "yes" and "no") in all blank questions.
Census data were also used to correct missing or invalid entries in the date of birth and sex fields. Note: there were about 50 records in which date of birth was imputed AFTER the tables had already been prepared for the September 24, 2003 release. The small number of cases should not affect the percentages released in September and will have only a minor impact on the numbers for the age breakdowns.
For a description of this methodology, please refer to the link below.
Where possible, results from the APS were compared with data from other sources (e.g., the Census, the General Social Survey and the Canadian Community Health Survey) in an attempt to identify large inconsistencies.
Statistics Canada is prohibited by law from releasing any information it collects which could identify any person, business, or organization, unless consent has been given by the respondent or as permitted by the Statistics Act. Various confidentiality rules are applied to all data that are released or published to prevent the publication or disclosure of any information deemed confidential. If necessary, data are suppressed to prevent direct or residual disclosure of identifiable data.
Data based on a count of fewer than 10 respondents are suppressed to ensure confidentiality of respondents. To further reduce risk of disclosures, all estimates are rounded to the nearest 10 units.
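The two disclosure-control rules above are concrete enough to sketch. Note that the exact rounding algorithm applied by Statistics Canada is not specified here, so simple half-up rounding to the nearest 10 is assumed for illustration.

```python
# Sketch of the two disclosure-control rules stated above. The rounding
# method (half-up) is an assumption; the document only says "nearest 10".

SUPPRESSED = None  # hypothetical marker for a suppressed cell

def protect_count(n_respondents, estimate):
    """Suppress cells based on fewer than 10 respondents; otherwise
    round the estimate to the nearest 10 units."""
    if n_respondents < 10:
        return SUPPRESSED
    # Half-up rounding for non-negative counts (avoids Python round()'s
    # round-half-to-even behaviour on values like 45).
    return int(estimate / 10 + 0.5) * 10

print(protect_count(8, 1234))    # None (suppressed)
print(protect_count(250, 1234))  # 1230
```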
Two types of errors occur in surveys, namely sampling and non-sampling errors. The difference between the estimates obtained from the sample and those that would result from a complete census taken under similar conditions is called the sampling error of the estimates. Errors that occur during the survey process which are not related to sampling are referred to as non-sampling errors. Examples of such errors include: interviewers misunderstanding instructions, respondents making errors in answering questions, answers being incorrectly entered on the questionnaire, errors during processing and so on. Actions were taken to reduce these errors to a minimum; the measures taken for that purpose are described below.
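The definition of sampling error can be illustrated with a toy simulation (this is not the APS sample design, just a demonstration of the concept): draw a sample from a known population and compare the sample estimate with the "census" value.

```python
# Toy illustration of sampling error: the gap between a sample-based
# estimate and the value a complete census of the same population gives.
import random

random.seed(1)  # fixed seed so the demonstration is reproducible

# Synthetic population of 100,000 people with a 0/1 characteristic.
population = [random.choice([0, 1]) for _ in range(100_000)]
census_rate = sum(population) / len(population)

# A simple random sample of 1,000 people from that population.
sample = random.sample(population, 1_000)
sample_rate = sum(sample) / len(sample)

# The sampling error of this particular sample's estimate.
sampling_error = sample_rate - census_rate
print(f"census rate {census_rate:.3f}, sample rate {sample_rate:.3f}")
```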
Several rounds of testing were carried out before the survey to evaluate the entire survey process from questionnaire content to data capture and processing.
The sample selection procedures were tested using questionnaires from the 1998 National Census Test in order to identify difficulties that would be faced by the sample selection clerks in the Census field collection units (FCUs). In addition, quality control procedures were used during the sample selection in these FCUs.
High response rates are essential for data quality. To reduce the number of non-respondents, Aboriginal interviewers were hired wherever possible. Further, all interviewers were trained by Statistics Canada staff, provided with detailed interviewer manuals, and placed under the direction of interviewer supervisors. A number of attempts were made to contact persons not at home, and refusals were followed up to encourage respondents to participate in the survey.
In addition, measures were taken to identify and correct errors that could result from a respondent misinterpreting a question or from a wrong flow being followed in the questionnaire. Following the interviews, interviewers reviewed the questionnaires and called respondents back if need be. Supervisors also reviewed the completed questionnaires. A detailed set of edit rules was then used during data processing to identify errors in the responses provided.