The Drum Beat 381 - Participatory Communication: The Case for Quantitative Assessment
***
This Drum Beat is one of a series of commentary and analysis pieces. In this piece, Tom Jacobson examines the issue of assessing participatory communication programmes and strategies. He suggests a method of quantitative assessment through measuring participatory dialogue based on Habermas' theory of communicative action. Jacobson here asserts that such assessments could provide the kind of hard data that donors need to justify their support.
We continue to feature a range of critical analysis commentaries on the communication for change field. These appear regularly on the first Monday of most months and are meant to inspire dialogue throughout the month. Though we cannot guarantee to feature your commentary, as we have a limited number of issues to publish each year, if you wish to contribute please contact Deborah Heimann at dheimann@comminit.com. Many thanks!
***
Participatory Communication: The Case for Quantitative Assessment
The Problem of Assessing Participation
The participatory approach to social change for development has been around for some time now. It represents a move away from programme planning and implementation in which programme goals are determined beforehand and communicated to "beneficiaries" via mass media campaigns using print media, posters, leaflets, radio, or television. Many development agencies, non-governmental organisations, and individual field workers have become sensitised to the desirability of leaving decision-making power in the hands of the communities that may be in need. There is still an important role for preplanned programmes employing mass media. However, it is widely understood today that extensive community participation should be employed wherever possible throughout the planning, implementation, and evaluation of programmes.
Wide recognition of the desirability of community participation represents a significant step forward in terms of what is known about development processes and how to facilitate them. And beyond general recognition of the importance of participatory processes, specific approaches have been developed for planning and implementing participatory programmes, including Participatory Monitoring and Evaluation (PM&E), Participatory Rural Appraisal (PRA), and Participatory Action Research (PAR), among others. These approaches have developed useful techniques for facilitating community involvement in, and even control of, change efforts, including focus group discussions, preference ranking, mapping and modeling, seasonal and historical diagramming, and many others.
Some progress may still be needed, however, in the area of programme evaluation. Participatory projects are often seen as being difficult to evaluate. The PM&E, PRA, and PAR approaches each offer evaluation techniques. These include counting numbers of community members attending programme meetings, assessing the nature of leadership processes, analysing the structure of decision-making processes, evaluating the ease with which both genders are able to contribute to discussion, and more. And community members play a central role in all evaluation activities. Generally, these methods employ qualitative research techniques on the principled belief, among at least some advocates, that quantitative assessment tools are unsuitable for evaluating participation processes.
Additional tools are needed in the form of evaluation methods that can concisely and quantitatively estimate the contribution of participation to targeted programme outcomes. Differing opinions were expressed over the need for such "hard" data at the recently concluded World Congress on Communication for Development, held in Rome in October 2006. Some attendees argued that hard data are needed to convince donors of the effectiveness of participatory processes because these are the kind of data donors understand and trust. Others argued that "anecdotal" data and stories are most convincing to donors and decision-makers. This debate has not been resolved, but it seems reasonable to assume that at least some large donor organisations would welcome hard data generated via standard quantitative evaluation techniques, if only participation could be evaluated in this way.
It appears to this writer that one useful approach might be found by focusing intently on dialogic communication processes. Meeting attendance, leadership structures, decision-making processes, and gender balance are important. But from a participatory perspective it is the communication that takes place within these processes that makes them participatory. If the communication within meetings, leadership activities, decision-making processes, and gender relations is itself participatory, then so are those processes. Participatory dialogue is of key importance here because these activities can, and often do, take place without it. From this perspective an efficient evaluation of participation in interventions must focus squarely on the extent to which dialogue takes place. This focus on participatory dialogue may then point the way to a useful quantitative assessment tool designed to indicate participation in many of its forms.
Evaluating Programmes
To explain this approach a little further, and how it might be useful, recall the standard approach to communication evaluation and its use of measured variables. In this standard model experts decide development aims, either alone or in association with local voices. Mass media campaigns are designed to help implement these aims. Evaluation of the media communication programme involves measuring and analysing two things. First, the mass media communications are evaluated by estimating the extent of "exposure" to campaign messages among target beneficiaries. For this purpose media exposure variables such as "message recall" are employed. Second, a targeted programme aim, such as health behaviour change, is measured in terms of visits to health clinics, condom use, or hygiene practices. These are the independent and dependent variables, respectively, in a communication evaluation model. Analysis consists of statistically estimating the correlation between the media exposure and behaviour change variables: where media exposure is higher, behaviour change should also be higher. Significant positive correlations are generally interpreted to mean that the expense of a mass media campaign has been justified by its contribution to achieving planned outcomes.
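The standard model described above can be sketched in a few lines of code. This is a minimal illustration only: the variable names and all numbers are hypothetical, invented for the example, and are not drawn from any actual evaluation.

```python
# Sketch of the standard evaluation model: correlating a media
# exposure variable (the independent variable, e.g. messages
# recalled) with a behaviour change variable (the dependent
# variable, e.g. reported clinic visits). All data illustrative.
from statistics import mean, stdev

exposure = [0, 1, 3, 2, 4, 1, 5, 3, 2, 4]   # messages recalled
outcome  = [0, 1, 2, 1, 3, 0, 4, 2, 2, 3]   # clinic visits reported

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

r = pearson_r(exposure, outcome)
print(f"exposure-outcome correlation r = {r:.2f}")
```

A significant positive r would be read, in the standard model, as evidence that the campaign contributed to the outcome; in practice one would also test the correlation's statistical significance.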
Using this approach to evaluation, if participatory dialogue is to be employed in place of mass media campaigns during change-oriented interventions then the variables measuring the communication intervention must change. Dialogic variables must be developed to replace, or perhaps augment, media exposure variables indicating the communication intervention in evaluation models.
Evaluating Programmes by Assessing Participatory Dialogue
This raises the most important unanswered question: what exactly is dialogue, and how can it be measured in a meaningful way? At one level, asking local citizens whether they felt they were listened to during a given programme intervention should answer this question. They could simply be asked whether they felt that programme officers, decision-makers, and/or community leaders working closely with programme officers "listened to them" throughout the planning, implementation, and evaluation stages of a project. Yet while such a question might capture the essence of participatory communication, a single measure would be unlikely to assess it adequately. A more thorough analysis of the elements of participatory dialogue would be advantageous.
A number of specific aspects of participatory dialogue could usefully be addressed during the evaluation process by directly asking community members:
1) Did you feel you were allowed/empowered to speak as often as you wished?
2) Did you feel that the organisers/facilitators allowed you to raise any proposal or criticism you wished to raise, i.e., was everything "on the table"?
3) Did you feel that every proposition or criticism raised was dealt with fully and to your satisfaction?
4) Did you feel free to challenge the organisers/facilitators' grasp of relevant local facts?
5) Did you feel free to challenge the cultural appropriateness of the organisers/facilitators' behaviour and the way they conducted meetings?
6) Did you feel free to challenge the organisers/facilitators' sincerity, i.e., whether the project was oriented toward solving local problems or just pursuing a donor organisation's goals?
7) Did you clearly comprehend everything the organisers/facilitators were trying to say in their programme materials and processes?
A thorough discussion of participatory dialogue is not possible here, but a methodological proposal can at least be made. To begin, we can agree that there is no simple way to define dialogue. Theorists such as Paulo Freire, Michel Foucault, Hannah Arendt, Jürgen Habermas, and others have all defined it in different ways. But we could nevertheless decide that some means of assessing participatory dialogue should be developed that "gets at" the question of whether community members are "listened to," via an analysis of dialogue that is richer and admittedly somewhat more complex than the simple idea of being listened to.
There is at least one theory I believe can offer a suitable theoretical basis for such a project: the sociologist Jürgen Habermas's theory of communicative action.[1] The seven elements of dialogue specified above reflect four "validity claims" and three "symmetry conditions" that Habermas uses to specify action oriented toward understanding, and action oriented toward understanding is precisely a participatory process of dialogue. Perhaps other theories would be suitable as well. However, more fully conceptualising dialogue, designing measures, and comparing different theoretical approaches is another discussion. The more general point to be considered here is whether dialogic conditions can be measured and, if so, whether such measurements could be useful for assessing participation in social change programmes for the purposes of programme evaluation.
Summary
The proposal I would like to advance is that new assessment tools should be developed focusing on the measurement of dialogue, or more specifically perhaps the measurement of community members' feelings about dialogic conditions. Questions 1 through 7 above, or others like them, could be asked and the answers combined to produce scaled quantitative indicators of participants' evaluations of the extent of participatory dialogic conditions. These tools should be useful regardless of what kinds of communication modalities were implemented in a given programme, whether community meetings, festivals, or media campaigns, because all of these can be conducted with various levels of authentic participation. The measures would aim to evaluate just how participatory those modalities turned out to be.
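One conventional way to combine such answers into a scaled indicator is to score each question on a Likert scale and average the items per respondent. The sketch below assumes a 1-5 response scale and invented data; the scale points, rescaling to 0-100, and all responses are illustrative assumptions, not part of the proposal itself.

```python
# Hypothetical sketch: combining answers to the seven dialogue
# questions (Q1-Q7 above) into a single scaled indicator.
# Responses use an assumed 1-5 Likert scale (1 = "not at all",
# 5 = "fully"). All data are invented for illustration.
from statistics import mean

# One row per respondent, one column per question Q1..Q7.
responses = [
    [4, 5, 3, 4, 4, 3, 5],
    [2, 3, 2, 3, 2, 2, 3],
    [5, 4, 4, 5, 5, 4, 4],
    [3, 3, 3, 2, 3, 3, 3],
]

def dialogue_score(row, n_points=5):
    """Average the Likert items and rescale to 0-100 for readability."""
    return (mean(row) - 1) / (n_points - 1) * 100

scores = [dialogue_score(r) for r in responses]
community_score = mean(scores)
print(f"per-respondent scores: {[round(s) for s in scores]}")
print(f"community dialogue score: {community_score:.1f}/100")
```

In a real instrument one would also check the internal consistency of the seven items (e.g. with a reliability coefficient) before treating their average as a single scale.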
Reflecting the standard evaluation model, if increases in measured participatory dialogue were accompanied by increases in desirable behaviour, then evaluation data would support donor investment in participatory programmes. Perhaps such tools could not fully assess the extent to which Freirean-style empowerment took place as an outcome of "conscientization." But they could help produce meaningful assessments of the extent to which community members felt free to participate during the planning, implementation, and evaluation of programme interventions. (Note: Increases in measured participatory dialogue within communities could also be interpreted as reflecting increased community capacity for collective deliberation, i.e., participatory communication as an outcome, and not just a means.)
The proposed measures could be gathered by means of surveys administered through face-to-face interviews, paper-and-pencil questionnaires, or focus groups. Local community members could be involved in questionnaire design, administration, and interpretation. Such quantitative assessments of participatory dialogue would be inexact and partial, as are all measurements. But they could be employed in association with standard means of estimating error, with the humility all assessment methods require, and alongside qualitative techniques where possible. They might then offer an important addition to the toolkit of participatory communication techniques already available. Specifically, they might help provide the kind of quantitative hard data that at least some donors require to justify their financial support.
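The "standard means of estimating error" mentioned above can be made concrete with a small sketch: a standard error and a rough normal-approximation confidence interval for a community's mean dialogue score. The scores below are invented 0-100 scale values for illustration only.

```python
# Sketch of basic error estimation for a survey-based dialogue
# scale: standard error of the mean and an approximate 95%
# confidence interval (normal approximation, illustrative data).
from math import sqrt
from statistics import mean, stdev

scores = [75, 36, 86, 46, 61, 55, 70, 49, 64, 58]  # hypothetical 0-100 scores

n = len(scores)
m = mean(scores)
se = stdev(scores) / sqrt(n)                   # standard error of the mean
ci_lo, ci_hi = m - 1.96 * se, m + 1.96 * se    # approx. 95% interval

print(f"mean dialogue score = {m:.1f}")
print(f"approx. 95% CI = ({ci_lo:.1f}, {ci_hi:.1f})")
```

Reporting the interval alongside the point estimate is one simple way to present such measures "with humility," as the text urges; with small samples a t-based interval would be more appropriate than the 1.96 normal approximation used here.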
Tom Jacobson
School of Communications and Theater
Temple University
tlj@temple.edu
[1] See Habermas, J. (1984, 1987). The Theory of Communicative Action, volumes 1 and 2. Boston: Beacon Press.
***
This issue of The Drum Beat is meant to inspire dialogue and conversation among the Drum Beat network.
Please engage in dialogue, beginning February 8 2007, through the DrumBeatChat forum.
***
Please participate in a Pulse Poll related to this commentary.
Do you agree or disagree?
Measuring participatory dialogue, as proposed by Tom Jacobson, would provide a good quantitative assessment of participatory communication.
***
This issue of The Drum Beat is an opinion piece and has been written and signed by the individual writer. The views expressed herein are the perspective of the writer and are not necessarily reflective of the views or opinions of The Communication Initiative or any of The Communication Initiative Partners.
***
RESULTS of recent poll:
2006 was a successful year for using communication towards achieving the Millennium Development Goals (MDGs). If you agree, please indicate how? If you disagree, why not? If possible, please provide examples.
38.10% Agree
16.67% Disagree
40.48% Unsure
Total number of participants = 42
***
The Drum Beat seeks to cover the full range of communication for development activities. Inclusion of an item does not imply endorsement or support by The Partners.
Please send material for The Drum Beat to the Editor - Deborah Heimann dheimann@comminit.com
To reproduce any portion of The Drum Beat, see our policy.
To subscribe, click here.