Writing for Publication: Transitions and Tools that Support Scholars’ Success - Mary Renck Jalongo, Olivia N. Saracho 2016

Building in an Assessment Plan
From a Single Work to Multiple Scholarly Spin-Offs
Writing as Professional Development

As a doctoral student, I enrolled in a required course on evaluation models and, even though the instructor was well-known in the field, his teaching methods made the class difficult for me to tolerate. He would arrive at class each day with a stack of books marked with post-it notes and proceed to read aloud passages from each book. For the remainder of the class, we were put into groups to work on a program evaluation. Nearly all of the projects were outside my field, so I assumed there was little to be learned. I sold back my book to the bookstore for a few dollars rather than keep it as I had my other texts. Shortly after I was hired as a faculty member, the dean encouraged me to attend a three-day federal grant writing workshop in Washington, DC with three other faculty members. Much to my surprise, most of what was shared had been addressed in that course I wished I could drop. Actually, material from that course probably has been more widely applicable in my professional life than the material from any other course in my program. I even ended up purchasing the latest edition of the textbook (Fitzpatrick, Sanders, & Worthen, 2011) and teaching a similar course to doctoral students myself.

One of the most important—yet frequently shortchanged—aspects of a grant proposal is the evaluation plan. An evaluation plan assures funding agencies that the money will be well spent. A clear assessment strategy that specifies outcomes consistent with the funding agency’s goals and clearly linked to the project’s purpose and objectives is the surest way to convey this information. Increasingly, funding agencies are looking for what are referred to as “theory of change” approaches (Taplin & Clark, 2012), evaluation models (Posavac, 2011; Stufflebeam & Shinkfield, 2007), or “logic models” (Crawley, 2001; Graig, 2016; Knowlton & Phillips, 2012).

Online Tools

To understand logic models, begin with a simple example—the process of buying a home—from Innovation Network. Next, watch the narrated PowerPoint tutorial from Usable Knowledge, LLC, and study How to Write the Evaluation Section of a Grant Proposal.

Those who are awarded a grant will need to write and submit an evaluation report. This can be a particularly high-stakes writing task for multi-year projects. Be sure to gather all of the evaluation data along the way rather than waiting until the deadline for the report. That is the surest way to have what you need to prepare a compelling argument that the money was well invested, that the project was worthwhile, and that it merits continued support. The American Evaluation Association (2004) has established guidelines for writing evaluation reports (Yarbrough, Shulha, Hopson, & Caruthers, 2010). Use the questions in Table 11.7 to guide you in preparing a report.

Table 11.7

Questions to guide writing an evaluation report



Stakeholder identification

Are the audiences clearly defined to include their perspectives? Does the report thoroughly explain how the evaluation information will address their needs? Are the needs of various audiences discussed/juxtaposed?


Report clarity

Does the report clearly and accurately describe the program, including its context, stakeholders, purposes, and curriculum? Are descriptions thorough, elegant, comprehensive, and fully supported by the data?


Values identification

Are the rationale and standards used to guide the evaluation, interpret the findings, and make value judgments insightful, fully justified, and comprehensive?


Evaluation impact

Does the evaluation use compelling evidence to offer clear and appropriate direction for programmatic improvement that would enhance the mission/goals of the program?



Practical procedures

Is there ample evidence that the evaluation was conducted in a practical and efficient way that was responsive to the context/culture?


Resource analysis

Are estimates of time and money detailed and defensible? Does the report include a thoughtful analysis of the available resources?


Management plan

Does the management plan specify dates for various activities and identify potential pitfalls so that stakeholders can track progress and avert problems with keeping the evaluation on schedule?


Ethical issues

Do the methods, data, and narrative indicate that the evaluator exhibited legal, ethical, and due regard for protecting the welfare of those involved in the evaluation?


Description of methods and sources of information

Are the descriptions of methods and sources thorough, elegant, comprehensive, and fully supported by the data?


Valid and reliable information

Are the information gathering methods chosen, developed, and implemented to assure that both the evidence and its interpretation are valid and reliable?


Justified conclusions and recommendations

Does the evaluator draw critical, insightful conclusions and make recommendations that are explicitly justified with connections to the evidence?


Other, e.g., timely delivery

Was the report submitted in advance for corrective feedback, and was the final copy delivered on the due date?

This scoring rubric is based on the American Evaluation Association’s (2004) Guiding Principles for Evaluators.

Too often, the work that is done in conjunction with grants is known only to the participants. Wider dissemination is a convincing way to demonstrate to funders that the project was particularly meritorious. Plan to make a presentation at a major conference and/or to publish an article, book chapter, or book about the grant activity. Generating presentations and publications from grants makes the most of your efforts (Sternberg, 2014).