In this photo, everyone, including the baby, appears to be paying attention to the person sitting in the center. Rakesh took this photo in 2007 when he visited his village in India, where he spent the first two years of his life. This was his first visit to the village in nearly 50 years.
“Who is your boss?” asked a senior legislator shortly after I started as the director of the Office of Performance Evaluations more than 12 years ago. I quickly responded, “JLOC.” (JLOC, the Joint Legislative Oversight Committee, assigns evaluation projects to our office.) Pleased with my answer, the legislator said, “Correct, and make sure you never forget that.”
So who are our audiences?
No, I haven’t forgotten that. At the time, it made perfect sense to me. Later, however, I realized that JLOC should not be the only audience for our work. First, the mission of our office is to promote confidence and accountability in state government, which means our audience has to include the general public and the press. Second, if we want to produce useful evaluations, we have to think about the various stakeholders: policymakers, agency and program officials, people directly affected by the program or policy, and lobbyists representing the interest groups with a stake in the evaluation.
There is one more, a kind of latent audience: our evaluation colleagues and peers. We always want to know what they think about our work, because this helps us gauge our professional credibility.
These audiences have varying levels of interest in an evaluation depending on their roles and stakes in it. Not everyone is interested in technical details, although those details are the foundation of our evaluation work. As shown in the table, of the seven audiences we have, only two are primarily interested in technical details. The others care more about the evaluation’s key message and prefer it presented in plain language with clear, simple data visualization.
Tailoring our written products to meet audience needs
To meet the needs of these audiences, our office prepares different products to disseminate evaluation results. As shown in the figure, our experience tells us that the more technical the product, the smaller the number of people it reaches. For example, the technical appendices of an evaluation report will interest only a handful of people compared with a press release distributed to all media outlets.
Traditionally, our reports (both print and electronic versions) have had four components:
- The transmittal letter is the first page inside the report cover. We use a transmittal letter to draw the attention of policymakers and the press to the most important message of our evaluation. These letters are candid in their message and are written using the simplest of language.
- The executive summary covers the entire evaluation, including findings, conclusions, and recommendations. It usually spans two to six pages of the report.
- The main body of the report discusses the evaluation context, explains the program and the policy that are the focus of the evaluation, describes various data analyses, and details findings, conclusions, and recommendations.
- Appendices include the study request(s), study scope, evaluation methodology, selected bibliography, additional details about a particular analysis, and formal responses to the evaluation by the governor and heads of relevant agencies.
In addition to the report, we always prepare a press release for each evaluation we conduct. We distribute our press releases to all media outlets. The purpose of a press release is to keep the public informed about our work so it can judge the value of state policies and programs and hold policymakers and government officials accountable.
In the past couple of years, we have added three other methods to extend our reach to a larger audience and to effectively communicate our evaluation message.
- One-page report highlights, or one-pagers, are easy-to-understand documents detached from the main report. A one-pager communicates the most important information (both quantitative and qualitative) of the evaluation to multiple audiences. Depending on the nature of the evaluation, some one-pagers might reflect more quantitative information than others. Knowing clearly what the main message of the report is and using effective data visualization are critical for producing useful one-pagers. For us, preparing one-pagers is the most difficult part of reporting results, but they are also probably the most used written product of our work. The information can also be presented in a Q&A format.
- Fold-out pages allow us to break out of the traditional 8.5 x 11 reporting format. Don’t be shy about using fold-out pages in your report if you have something to show and need more space. Examples include illustrating complex flows of funds, budget management processes, and organizational relationships.
- Interactive data visualization can sometimes be useful for engaging certain audience members, such as policymakers who are interested in knowing more details and program managers who may want to test the workings of a particular model or analysis. This can be done by providing links to your website and by using the site during a live presentation. Here are examples one and two.
Of course, there will always be a handful of folks in each audience who want to know technical details such as the sample size, sampling methodology, standard deviation, and r-squared. For them, be prepared to answer questions adequately whenever and wherever necessary. But don’t lose the rest of your audience by cluttering your report and presentations with technical details. Remember, this is not about dumbing down your evaluation message for certain audiences. It is about conveying the message in a format that makes sense to the people to whom it matters most.
Rakesh Mohan has been the director of the Office of Performance Evaluations (OPE) since 2002. OPE is an independent, nonpartisan agency of the Idaho State Legislature. Under his leadership, OPE received the 2011 Alva and Gunnar Myrdal Government Evaluation Award from the American Evaluation Association.