

**Part 3**


**Evaluation Method**

To help FWR decide whether to move forward with the DIP program, the proposed evaluation will consist of the following components:

**Expert evaluation of content** - Content will be evaluated by one in-house subject matter expert (SME) together with two outside experts. They will provide feedback on the accuracy and quality of the content for individual modules, individual units, and combinations of units. Our in-house expert will deliver feedback as a written report, and the two outside experts will be interviewed independently after they have had the opportunity to familiarize themselves with the materials. Besides helping FWR decide whether to disseminate the DIP program, this component of the evaluation can serve its secondary purpose, acting as a review and guide for interested school administrators.

**Trial runs** - SCEC will arrange for multiple trial runs of DIP components in either a Bay Area graduate school or school district setting. Each trial run will be observed by an SCEC consultant, and where possible each trial run will include one full unit of the DIP program. The total target observation time is 40 hours, with 6-14 trainees per trial and a minimum of 25 individual participants. Trials will be preceded by pre-tests and followed by both post-tests and surveys.

Our pre- and post-tests will focus on trial run participants’ proposed actions in hypothetical administrative planning scenarios, in alignment with the DIP’s instructional materials. Surveys will seek the participants’ appraisal of the module(s) in terms of format, appeal, and applicability, and will also gauge their interest in continuing to participate in the program. Quantitative, Likert-scale survey components will be used to describe overall reactions and opinions, while SCEC staff will distill responses to qualitative questions into meaningful overall trends.
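As a minimal sketch of how the quantitative Likert-scale responses could be summarized (the item names, 5-point scale, and sample ratings below are hypothetical, purely for illustration):

```python
# Sketch: per-item summary of 5-point Likert survey responses.
# Item names and ratings are hypothetical examples, not real data.
from statistics import mean

def summarize_likert(responses):
    """Return each item's mean rating and percent agreeing (rating >= 4)."""
    summary = {}
    for item, ratings in responses.items():
        summary[item] = {
            "mean": round(mean(ratings), 2),
            "pct_agree": round(100 * sum(r >= 4 for r in ratings) / len(ratings), 1),
        }
    return summary

sample = {
    "format": [4, 5, 3, 4],
    "appeal": [5, 4, 4, 5],
    "applicability": [3, 3, 4, 2],
}
print(summarize_likert(sample))
```

Reporting both a mean and a percent-agreement figure for each item gives readers of the evaluation two complementary views of the same ordinal data.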

SCEC will also conduct guided telephone interviews with trial run coordinators to determine their opinions of the program. The guiding focus of information gathering will be to determine whether trainees have benefited and, if so, what tangible forms those benefits have taken. The primary purpose of these trials is to give FWR information to inform its decision regarding deployment of the program, but they may also yield material useful in communicating with potential clients.


**Attitudes of potential clients** - SCEC will take advantage of its nationwide connections to K-12 educators and graduate educational administration instructors to distribute electronic surveys to, and collect completed surveys from, a minimum of 40 potential clients. These instruments will include items assessing the need for a program like DIP, the expected presentation modality, budget constraints, and degree of fit with current practices in school training or graduate school instruction. Survey data will again be both quantitative and qualitative.


**Part 5**


**Project Personnel**

San Carlos Educational Consulting has been providing evaluation and guidance to educational institutions since 2002. Our clients include schools, school districts, post-secondary institutions, and creators of educational and training resources. Two of our most recent projects were an evaluation of the NSF-funded high school pre-service physics teacher training camp (PPTTC) and a large-scale evaluation of the practicum component of San Francisco State University’s nurse practitioner degree program. We have ample experience evaluating instructional packages, and we are eager to furnish references from our many satisfied clients.


**Lead Project Personnel**

Dr. Barry Janzen is our chief evaluator and the founder of SCEC. He will supervise the evaluation process and act as the primary point of contact with FWR. He holds an Ed.D. degree from Boise State University, and his work has been published in numerous educational journals. Dr. Janzen has authored a highly regarded textbook on educational evaluation, now in its 4th edition.

Michaela Pacesova will design our data collection instruments and analyze our findings. She is an evaluation specialist with 10 years of experience, including multiple evaluations of educational training programs, and she has been working with SCEC since 2004. Ms. Pacesova holds a Master’s Degree in Educational Technology from Boise State University.

J. Raphael Holmes is our in-house subject matter expert on school administration training, and he will also be in charge of observing trial runs. He has been working with SCEC since 2005. He holds an MS in educational administration from Boise State University, and he has worked as an educational administration instructor at Rutgers University.