
=Part 1=

**Introduction**

Far West Laboratory for Educational Research and Development (FWR) has requested evaluation proposals for its training program Determining Instructional Purposes (DIP). This document is an evaluation proposal by San Carlos Education Consulting (SCEC).

=Part 2=


**Description of Program**

The DIP training program was created to train administrators and graduate students in educational administration to plan effective school programs. The package includes three separate units and a Coordinator's Handbook. Each unit consists of 4 to 6 modules, each providing training on several objectives. The modules include reading material correlated to the module skills, individual and group activities to practice the skills, and feedback on the practice activities. The trainees are divided into planning teams to work on realistic problems in a fictitious school district.

The design of the units allows administrators to choose a single unit, any combination of two, or all three, so potential purchasers can access only the unit they require or use the full three-part sequence of instruction. Participants work step-by-step through the materials and activities in each unit to acquire the intended outcomes. The training can be delivered either in a concentrated short-term workshop or in several individual sessions spread over a few days or weeks. Each unit lasts 10 to 18 hours.

A coordinator should be familiar with the material before the training begins, but no prior knowledge of the subject area is required. The coordinator organizes, guides, and monitors the activities in which trainees use the materials and procedures in each unit. All important information regarding the coordinator's role is described in the Coordinator's Handbook.

All materials come in print form. The units are between 155 and 259 pages long and cost $8.95 per unit or $24.95 for the complete set of all three units. The Coordinator's Handbook is available for $4.50 per copy.

=Part 3=

Given that the purpose of the evaluation is to determine the effectiveness of the training program, a variety of data sources and evaluation procedures will be used to develop recommendations regarding continued development of, and investment in, the DIP training program.
**Evaluation Method**

• Expert evaluation of content - Content will be evaluated by one in-house subject matter expert (SME) together with two outside experts. They will provide feedback on the accuracy and quality of the content for individual modules, individual units, and combinations of units. Our in-house expert will give feedback in the form of a written report, and the two outside experts will be interviewed independently after they have had the opportunity to familiarize themselves with the materials. Besides helping FWR decide whether to disseminate the DIP program, this component of our evaluation can help fulfill the secondary purpose of the evaluation, acting as a review and guide for interested school administrators.

• Field test - Print samples of the training program will be sent to active school administrators and graduate students with a written explanation and a request to participate in reviewing the program. A follow-up phone call will be made shortly after to secure a commitment from a minimum of 25 individuals to participate in the three-part training program, to be administered by one of the developers of the program from Far West. Participants will be asked to provide feedback through a survey and one-on-one interviews at the completion of each of the three units. Evaluators will collect achievement data from the participants at the completion of each unit as well as through observation of small-group activities. The test group coordinator will also be interviewed at the completion of the program to gather anecdotal information based on his or her observations during the program.

• Trial runs - SCEC will arrange for multiple trial runs of DIP components, in either a Bay Area graduate school or school district setting. Each trial run will be observed by an SCEC consultant, and if possible, each trial run will include one full unit of the DIP program. The total target observation time is 40 hours, with 6 to 14 trainees per trial. Trials will be followed by surveys asking trainee participants to appraise the module(s) in terms of format, appeal, and applicability, and to gauge their interest in continuing to participate in the program. Quantitative, Likert-scale survey items will be used to describe overall reactions and opinions, while SCEC staff will compile responses to qualitative questions into meaningful overall trends. Participants will also be given a pre- and post-test to compare their knowledge before and after the DIP training program.
SCEC will also conduct guided telephone interviews with trial run coordinators to determine their opinions of the program. The guiding focus of information gathering will be to determine whether trainees feel they have benefited and, if so, what tangible forms they see those benefits taking. The primary purpose of these trials is to give FWR information to inform its decision regarding deployment of the program, but they may also produce material useful for communicating with potential clients.

• Attitudes of potential clients - SCEC will take advantage of its nationwide connections to K-12 educators and graduate educational administration instructors to distribute electronic surveys to, and receive completed surveys from, a minimum of 40 potential clients. These instruments will include items assessing the need for a program like DIP, the expected presentation modality, and the degree of fit with current practices in school training or graduate school instruction. Survey data will again be both quantitative and qualitative.

**Practical Application**

In order to assess the long-term impact of the training sessions, participants will be asked to complete a follow-up survey examining the long-term effectiveness and practical application of the knowledge, skills, and attitudes obtained during the program.

=Part 4=

**Task Schedule**

The full schedule of evaluation tasks is presented in Appendix A.

**Appendix A: Task Schedule**

|| **Task** || **Agency Responsible** || **Deadline Date** ||
|| 1. *Meet with Far West staff to discuss IPTP evaluation proposal and revisions. || SCEC/Far West || October 2, 2011 ||
|| 2. Provide feedback for implementation into evaluation. || Far West || October 5, 2011 ||
|| 3. Development and submission of EPD (evaluator's program description) to Far West. || SCEC || October 14, 2011 ||
|| 4. Expert evaluation of content. || SCEC || October 14, 2011 ||
|| 5. Submit data collection documents, including all surveys and interview protocols for all participating parties, to Far West for review and approval. || SCEC/Far West || October 14, 2011 ||
|| 6. Provide feedback for revision. || Far West || October 19, 2011 ||
|| 7. Identify potential participating administrators and graduate students; mail out sample training packages and information documents. || SCEC || October 24, 2011 ||
|| 8. Follow-up contact made to recipients of the mailing and confirmation of a minimum number in each sample group (25 administrators and 25 graduate students). The test group will also be selected at this time. || SCEC || October 31, 2011 ||
|| 9. Delivery of training program to participants. || SCEC || November 1, 2011 ||
|| 10. Distribution of electronic survey to potential clients. || SCEC || November 1, 2011 ||
|| 11. Survey information collected and telephone interviews conducted with administrators and graduate students. || SCEC || November 21, 2011 ||
|| 12. Interview of test group coordinator. || SCEC || November 21, 2011 ||
|| 13. Collection and analysis of data from potential clients. || SCEC || November 21, 2011 ||
|| 14. *Formative assessment report to Far West based on data gathered to this point in the evaluation. || SCEC/Far West || November 25, 2011 ||
|| 15. Revisions to data collection instruments and recommendations for changes to the evaluation for the trial run. || Far West || December 2, 2011 ||
|| 16. Pretest delivered to trial run participants. || SCEC || January 4, 2012 ||
|| 17. Test group trial run of program begins. || SCEC || January 6, 2012 ||
|| 18. Completion of trial run and collection of data. || SCEC || February 3, 2012 ||
|| 19. Interviews with trial run coordinators. || SCEC || February 3, 2012 ||
|| 20. *Meeting with Far West to discuss data and analysis. || SCEC/Far West || February 15, 2012 ||
|| 21. *Submission of final report and recommendations to Far West. || SCEC/Far West || February 29, 2012 ||
|| 22. *Updated report and recommendations to Far West. || SCEC/Far West || July 10, 2012 ||

*Requires a meeting between the two parties, in person or through video conferencing.

=Part 5=


**Project Personnel**

San Carlos Education Consulting has been providing evaluation and guidance to educational institutions since 2002. Our clients include schools, school districts, post-secondary institutions, and creators of educational and training resources. Two of our most recent projects were an evaluation of the NSF-funded high school pre-service physics teacher training camp (PPTTC) and a large-scale evaluation of the practicum component of San Francisco State University's nurse practitioner degree program. We have ample experience evaluating instructional packages, and we are eager to furnish references from our many satisfied clients.

**Lead Project Personnel**

Dr. Cynthia Transler is our chief evaluator and the founder of SCEC. She will lead the authoring of data collection instruments and will act as the primary point of contact with FWR. She holds an Ed.D. from Boise State University, and her work has been published in numerous educational journals. Dr. Transler has authored a highly regarded textbook on educational evaluation, now in its 4th edition.

J. Raphael Holmes is our in-house subject matter expert on school administration training, and he will also be in charge of observing trial runs. He has been working with SCEC since 2005. He holds an MS in educational administration from Drexel University, and he has worked as an educational administration instructor at Rutgers University.

Michaela Pacesova, an evaluation specialist with 10 years of experience, will be responsible for the Far West Laboratory for Educational Research and Development project. Ms. Pacesova has worked with Far West on many previous projects and is familiar with its situation. She holds a Master's degree in Educational Technology from Boise State University.

Barry Janzen has been an educator in the public school system for over twenty years and holds a Master's degree in Educational Technology from Boise State University.

=Part 6=


**Budget**

**Requested Payment Schedule**