Volume I, Number 1, Spring 1995
Issue Topic: Evaluating School-Linked Services
In 1990, the Kentucky legislature passed the Kentucky Education Reform Act (KERA) to reform all levels of the state's education system. Family Resource Centers (at elementary schools) and Youth Services Centers (at middle and high schools) were established under KERA on the assumption that by reducing “barriers to learning through school-based family support and parent involvement initiatives” (Illback, 1993, p. 4), children would do better in school.
To have a Family Resource or Youth Services Center (FRYSC), a school must have at least 20% of its student body eligible for free or reduced-price lunches; yet all children and families in participating schools may use center services and programs. This year, 478 FRYSCs are operating (Cohen, 1995, p. 6). Family Resource Centers offer preschool and after-school care, parent training, and health services and referrals. Youth Services Centers provide health services and referrals, employment services, substance abuse counseling, and family crisis or mental health counseling. Both types of centers may also opt to offer other services, such as recreation programs, help with housing or other basic necessities, volunteer programs, or coordination with the legal system (Illback, 1993, p. 11).
Each FRYSC operates independently and relies on site-based management. Empowerment is a primary goal of this method of governance and service delivery: services offered are those that families themselves identify as needed. As Phil Rogers, coordinator of the Community Family Resource Center in Scottsville, explains:
Everything that we do comes out of our needs assessment. Our parents identified adult education as a priority, and now we have a part-time adult learning center. We have done the same with child care; we established daycare centers in our school system because, prior to the Center, there was no licensed child care in the entire county. So, we are constantly sensitive to what parents tell us (personal communication, 4/4/95).
This kind of responsiveness to community needs is a strength of the FRYSCs, yet it is this very strength that makes evaluating them so difficult.
FRYSC Evaluation Goals
• A collaborative approach to gathering and interpreting data.
• Measurement of multiple aspects of the program (collaboration, service delivery, parent involvement, student outcomes).
• Consideration of multiple levels of the program: clients served and kinds of services available in each center, as well as aggregate data across centers.
• Inclusion of perspectives from different stakeholders (children, youth, parents, teachers, center coordinators).
• Use of both formative and summative questions.
• A management information system for sharing evaluation and program information with center coordinators.
• Assessment of family support and empowerment.
• Reliance upon multiple and repeated measures over time.
Evaluating the FRYSCs
Robert J. Illback, Executive Director of R.E.A.C.H. of Louisville, Inc., evaluates Kentucky's FRYSCs, with support from grants from the Annie E. Casey Foundation. Although the evaluation has several goals (see box), Illback's primary one is to make the data useful to center coordinators so that they can improve program offerings (Illback, 1993, p. 25). Each center has a management information system (MIS) that builds evaluation into coordinators' daily routines: coordinators enter data regularly, send it to Illback twice a year, and draw on their own data throughout the year for program improvement.
Responding to stakeholders' concerns. As evaluator of the FRYSCs, Illback has two primary audiences, state legislators and local center coordinators, and thus the related but distinct goals of informing policy and providing formative feedback. This requires that he regularly share data with center coordinators, a governor-appointed evaluation subcommittee, and other constituent groups (personal communication, 3/30/95). Such sharing enables Illback to work more closely with both program personnel and policymakers than most evaluators do.
Evaluation challenges. Close contact with center coordinators and state-level policymakers may help invest more people in the evaluation, but it does not solve the problem of how to evaluate an effort whose programs are locally governed, shaped by community needs assessments, and therefore different at every site. These program characteristics make it difficult to define the FRYSC intervention, standardize data across sites, satisfy the information needs of different stakeholders, ensure that evaluations reflect centers' programmatic changes over time, monitor process and outcome variables, and measure goals like “empowerment.” In addition, it is difficult to attribute outcomes to the centers' programs without control groups, comparison groups, or an experimental design; and it is hard to construct a helpful and accurate evaluation when there are multiple or unspecified conceptual assumptions about the relationship between the programs' services and expected student outcomes (Illback, 1993, pp. 27-28; Illback & Kalafat, n.d., p. 6).
This “complex ecology” (Cannon et al., 1994, p. 17) in which FRYSCs operate leaves Illback with the unenviable task of determining what conclusions, if any, can comfortably be drawn about the relationship of outcomes to programs when the programs themselves are so diverse. Beyond the attribution problem posed by the centers' diversity, KERA itself has spawned multiple treatments for the problem of low educational achievement, in particular changes in curriculum, teacher training, and educational funding. Because these have occurred alongside the establishment of the FRYSCs, they further confound efforts to attribute changes in student performance to FRYSC services.
While conducting needs assessments, Dan Clemons, coordinator of the Youth Services Center in Fairdale, Kentucky, found that the biggest need at his school was for health services. Combining county data on teen pregnancy with students' self-reported needs assessments, Clemons sought funding for what is now the only full-time school-based health center in Jefferson County, where the Fairdale Youth Services Center is located. “Without the data generated from the MIS on students' health needs, I could never have expanded our services this way because I would not have had an objective base for establishing need,” Clemons said. The MIS also allows Clemons to show, for instance, that helping a family obtain housing has a positive impact on a youth's school attendance. “As a result of being able to capture data,” he explained, “I'm able to present information in grants and basically double the size of my program. When people are doling out money, they want hard facts, and the MIS helps me provide them” (personal communication, 4/4/95).
Facing Evaluation Challenges
Since attribution of outcomes to FRYSC program offerings is so problematic, Illback has sought to examine implementation of the FRYSCs first, and then preliminary outcomes using pre/post analyses based on teachers' subjective reports of student performance, needs assessments completed by center coordinators or students, and participant surveys (personal communication, 8/23/94). The MIS at each site provides him with descriptive statistics such as participant demographics, numbers of services offered and received, and referral sources. Rather than worry about whether there is a standard FRYSC intervention, Illback uses the MIS to track trends across sites and to encourage coordinators to use data from their own centers to improve services.
In his 1993 evaluation, Illback found that teachers perceived that children who participated in FRYSCs were improving at completing homework, following directions, and staying on task, and that consumers rated the centers very highly (Illback, n.d.). More recently, using Hall and Hord's (1987) innovation configuration analysis, Illback and his colleague Dr. John Kalafat have sought to identify best practices (Illback & Kalafat, n.d., p. 15). They have found that a center's success depends largely on its coordinator's ability to foster collaboration and service delivery: successful coordinators display an unusual degree of flexibility, regularly crossing professional and organizational boundaries in their work.
Helping coordinators use evaluation for self-assessment. The importance of center coordinators' adaptability has forced Illback to seek ways to help coordinators assess their own job performance and begin to work more flexibly. To that end, he has developed a self-evaluation that coordinators can use to see how they manage their program relative to overall program goals and indicators of successful implementation. “Since we've found that most program variation is with coordinators, this evaluation is meant to support them and, by extension, the families they work with,” Illback explained. The self-assessment, said Illback, “forces coordinators to think strategically and managerially about their jobs.”
Illback is also field testing a project where experienced coordinators mentor newer ones. In addition, he is piloting a framework for coordinators to use to assess how their programs are doing. “I'm trying to move coordinators from an evaluation mentality of monitoring and compliance to one of program improvement,” said Illback (personal communication, 3/30/95).
Using the MIS and doing evaluation: Mixed blessings. Although most coordinators agree that the MIS helps them do their jobs better, some, like Marilyn Coffey at the Casey County Family Resource Center, have “mixed feelings” about it (personal communication, 4/3/95). Coffey pointed to several problems with the demands the MIS places on coordinators.
Phil Rogers noted another barrier to using the MIS: there has been slow state response in providing technical assistance to center coordinators. Only this year did the Kentucky Department of Education hire an MIS specialist to help site coordinators with their evaluation and reporting duties (personal communication, 4/3/95).
Even Illback acknowledges that the MIS has not yet been used to its full potential. It takes time, he explained, for coordinators to learn to be “brokers of services rather than direct service providers” (personal communication, 3/30/95). It also takes time to learn to use the MIS efficiently and correctly.
Understanding Diversity and Sharing Information
Rogers reflected the views of several coordinators when he said that the MIS, however burdensome it may be at times, does give coordinators “the ability to share information with local people and with legislators.” Clemons added that the MIS also “captures the changing nature of the American family,” by showing how students define their families.
“Are families blood-related,” asked Clemons, “or people that live together to meet each other's needs?” The ability of the MIS to illustrate different students' experiences of “family” leads Clemons to conclude simply that “evaluation should be part of the way we do business.” Clemons feels that with various definitions of “family” illustrated in students' MIS data, policymakers may better understand why different services and service delivery mechanisms are needed to meet different families' needs.
As Kentucky offers the nation an example of “scaling up” services for children and families beyond a few demonstration sites (Cohen, 1995), Illback, his colleagues, and the FRYSC coordinators offer equally important lessons in participatory evaluation, self-assessment, and evaluation for program and policy improvement.
The Kentucky General Assembly's Office of Educational Accountability has published its 1994 Annual Report. This report evaluates all aspects of KERA, and it offers recommendations for the FRYSCs. For a copy of this useful resource, please contact the Kentucky Office of Educational Accountability at 800-242-0520.
References

Cannon, G. T., Kalafat, J., & Illback, R. J. (1994, June). Implementation of Kentucky Family Resource and Youth Services Center Program: Qualitative analyses, year two: 1994-95 school year. Draft document. Louisville, KY: R.E.A.C.H.

Cohen, D. L. (1995, March 29). Going the extra mile is hallmark of family-resource centers in KY. Education Week, pp. 6-7.

Hall, G. E., & Hord, S. M. (1987). Change in schools: Facilitating the process. Albany, NY: State University of New York Press.

Illback, R. J. (n.d.). Summary of evaluation studies. Louisville, KY: R.E.A.C.H.

Illback, R. J. (1993, October 15). Formative evaluation of Kentucky Family Resource and Youth Service Centers: A descriptive analysis of program trends. Draft document. Louisville, KY: R.E.A.C.H.

Illback, R. J., & Kalafat, J. (in press). Initial evaluation of a school-based integrated service program: Kentucky Family Resource and Youth Service Centers. Special Services in the Schools.
Elaine Replogle, Research Assistant, HFRP