This text was included in: PIMA BULLETIN NO 47 SPECIAL ISSUE HONOURING CHRIS DUKE, April 2023
Co-Editors: Heribert Hinzen, Phuoc Khau, Dorothy Lucardie, Maria Slowey, Shirley Walters.
Technical Assistance: Leslie Cordie
Dear Chris: I know you like stories. I am sure you will enjoy this one.
In 1988-1990 I directed the National Literacy Campaign "Monsignor Leonidas Proaño" in Ecuador. The day after the campaign ended, many journalists came to my office asking for one figure: how much had the campaign reduced the illiteracy rate in the country.
I kindly explained to them that:
1. Illiteracy and literacy data (in Ecuador and in many countries) are not reliable, since they reflect self-perceptions in response to the question "Do you know how to read and write? YES/NO" rather than any kind of evaluation or verification.
2. A literacy process yields three main figures: a) the number of persons registered, b) the number of persons who completed the programme, and c) the number of persons who learned (and how much they learned). Very often the figures presented as final results are registration and/or completion data.
In Nicaragua, the National Literacy Crusade started on March 23, 1980 with an estimated illiteracy rate of 50.2% (722,431 illiterate persons). I was at the Plaza de la Revolución in Managua when the Sandinistas shared the Crusade results: 406,456 persons had become literate and the illiteracy rate had been reduced to 12%. In reality, as I realized later, those were registration or completion data, not learning data.
3. In the campaign in my own country I decided to place learning at the centre, to conduct and widely disseminate a final evaluation of the campaign, and to differentiate the three figures: registration, completion, and learning. Literacy learners were asked to read a short text out of the primer and to write a short letter to their literacy teacher. For the literacy teachers - most of them students from the last two years of secondary school, whom we trained for several months and assisted through radio during the campaign - we prepared a written questionnaire asking their opinions on the various components of the campaign and on their experience in it.
4. It would take us several months to collect the data since it was a national campaign, covering urban and rural areas. Before leaving, literacy teachers were asked to collect the required information and deliver it to the brigade coordinators or to the campaign personnel in the various places.
5. I decided to accept adolescents between 12 and 15 years of age into the campaign. Since illiteracy statistics usually start at 15 years of age, we would have to exclude the participants below 15 from the final figures.
6. Besides the internal evaluation conducted by the pedagogical team of the campaign, we requested UNESCO-Santiago to assist us with an independent external evaluation. We would publish both reports as well as an integrated one.
7. The campaign in indigenous languages, in charge of the National Directorate of Intercultural Bilingual Education (DINEIB), would be evaluated following its own parameters. Its final evaluation would be published in a separate report.
My explanations were useless. The journalists insisted on obtaining a number. Today, not in six months or a year! The next day, newspapers and other media reported that the campaign and the Ministry of Education were hiding information.
Almost a year later, in August 1990, when we published and distributed the final evaluation report (246 pages), nobody was waiting for it and nobody was interested in knowing what had been learned by the nearly 300,000 people who completed the campaign, or by their literacy teachers.