Winter-Spring 2004

This Evaluation Will Be Televised: An Interview with Chet Davis, Ohio Community Computing VISTA

Chet Davis

From the beginning, volunteerism has been the foundation of the Community Technology Center (CTC) movement. With initial funding scarce to nonexistent, CTCs survived on the strength of their true believers: the volunteers. The federal AmeriCorps*VISTA program has taken that altruistic spirit a step further by supporting hundreds of CTCs with full-time volunteers who dedicate an entire year of their lives to serving these organizations. The Ohio Community Computing Network (OCCN), a statewide alliance of more than 60 CTCs in Ohio, has hosted 140 volunteers over the past six years, many of whom have taken on innovative and unusual projects. Chet Davis is one of them.

Chet serves as an Evaluation Designer at Media Bridges, the local public access station in Cincinnati, OH. He has helped develop, coordinate, and implement an entirely new system of data collection that will serve the organization for years to come. Chet was an especially attractive candidate for VISTA work: he holds a PhD in Sociology from Oklahoma State University (1999). We caught up with him recently, toward the end of his VISTA term of service.

What was your favorite accomplishment of the past year?

CHET: The best thing was probably getting my Web surveys written, tested, posted, and used. These forms, especially the classes and facilities feedback form, will be extremely valuable for ongoing planning. Now I’m working on getting more people to use them.

So, what is evaluation design and why is it important?

CHET: Evaluation research studies the effectiveness of programs, policies, or organizations to help decision-makers choose what to do next in strategic planning or spending. It lets us tell whether our services are really working, what unmet needs exist, and whether a program or service should be continued. On top of that, many funders require grant recipients to have a plan for outcomes-based evaluation; that is, grantees must be able to produce evidence that their programs are having the desired effects on constituents.

Evaluation design is tricky stuff... do you think this kind of work is something that you could train most potential VISTAs to do—and do well?

CHET: Evaluation research has become a highly complex discipline in its own right, so it would be unrealistic to try to provide comprehensive training for new VISTAs. However, the evaluation needs of most sites are relatively simple, and there are some evaluation tools that VISTAs could learn and use at their sites.

At most sites, the staff will be satisfied with surveys that ask about satisfaction, learning goals, experiences, and outcomes (such as improved job skills or greater comfort with browsing the Web). Conducting basic surveys is something that could be covered in VISTA training, including how to ask the right kinds of questions to tease out the real goals of the survey and the questions that need to be asked. Writing survey questions is not as easy as it looks, but for most purposes you only need to remember about 20-25 rules of thumb. VISTAs could also learn to review completed surveys and write a summary report, since most sites won't be interested in statistical analysis or content analysis, and most will have no need for the complex research designs that evaluation research sometimes uses.

So, VISTAs could be trained to use simple survey instruments to collect data and summarize it for the site supervisor or other users. More advanced training than this is certainly possible but there may not be enough time or money for it.
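To make the idea concrete, here is a minimal sketch of the kind of summary a trained VISTA might produce from collected survey data. The function name and the sample ratings are invented for illustration; it simply tallies 1-5 satisfaction ratings and reports counts and an average, the sort of plain summary a site supervisor could use without any statistical analysis.

```python
# Hypothetical example: tally Likert-scale (1-5) satisfaction ratings
# from a class-feedback survey and produce a short text summary.
from collections import Counter

def summarize_responses(responses):
    """Return a plain-text summary: response count, average, and
    a count for each rating from 1 to 5."""
    counts = Counter(responses)
    average = sum(responses) / len(responses)
    lines = [
        f"Responses: {len(responses)}",
        f"Average rating: {average:.1f}",
    ]
    for rating in range(1, 6):
        lines.append(f"  {rating} stars: {counts.get(rating, 0)}")
    return "\n".join(lines)

# Example: ratings collected from one class-feedback form
print(summarize_responses([5, 4, 4, 3, 5, 2, 4]))
```

A report like this answers the basic questions Chet describes — how many people responded and how satisfied they were — without requiring any advanced training.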

What are some of the evaluation resources that VISTAs—and others—might find helpful?

CHET: There are plenty of free resources on the Internet that can help CTCs conduct evaluation research. Try CTCNet, the America Connects Consortium, and Project Star. They have guides, questionnaires, and a self-paced course on evaluation research. There’s a lot more out there but those three sites probably have everything that most CTCs will need.

When you leave Media Bridges, how will it be different from when you arrived?

CHET: I know that Belinda Rawlins [the Executive Director] and Lisa Newbold [Underwriting/Sponsorship Facilitator] have already been using material from my progress reports in grant proposals and in speaking with people. So the evaluation project has already helped promote the organization, raise funds, and inform staff decisions. I'm also leaving behind databases, three Web survey forms, new pen-and-paper evaluation forms, evaluation advice, a pool of miscellaneous questions for future use, and a written plan for ongoing evaluation. It looks like Media Bridges will continue to have, and use, a detailed evaluation strategy.

Chet is one of 20 OCCN VISTAs located at sites around the state. Currently, OCCN VISTAs are developing programs ranging from teaching troubled youth to express themselves through self-produced video (Coshocton), to integrating assistive technology that helps seniors use computers (Toledo), to soliciting book donations to create a library where none existed (Dayton), to raising money to build a skate park to attract kids to the local CTC (Shawnee). OCCN VISTAs are proving that CTCs can be more than just computer labs.

Gabriel Gloden

Gabriel Gloden is an OCCN AmeriCorps*VISTA providing central headquarters support in Columbus, OH.
