Guiding Improvement: Pennsylvania's Odyssey

by Cheryl Keenan
Pennsylvania has used a combination of direction from the top and innovation at the grassroots level to devise and institute an accountability system that is intimately linked with program improvement.

Four years ago, the Pennsylvania Bureau of Adult Basic and Literacy Education asked 10 agencies to volunteer to pilot a program improvement process for adult education called Educational Quality for Adult Literacy. "EQUAL" seemed an appropriate acronym for a project that sought to provide quality for all adult learners in the Commonwealth, rather than in isolated pockets of programs around the state. We provided the pilot sites with a systematic way to look at their programs and learners and to select problems they wanted to solve. After selecting a problem, each program used a form we developed to guide it through a process of asking a question, gathering information, analyzing that information, and drawing conclusions that would lead to action. These days, participants from the original pilot sites still laugh when they recall that first meeting. They had left it saying, "Yeah, but what does the state really want?"

Thinking Back

In December 1993, I became a new State Director of Adult Education. I struggled to get a handle on the critical issues within adult education, reading General Accounting Office (GAO) reports that disputed the value of the federal investment in adult education and reviewing national evaluations that could not demonstrate positive outcomes for adult learners. My impression was that, within Pennsylvania, adult educators generally felt that they did make a difference in people's lives. I heard practitioners express frustration that adult education had a reputation as an unaccountable and unproved program. I saw program quality vary from place to place and recognized differences in the way the professional development system supported teachers. And I saw the looming cloud of new federal legislation and the implications of a reengineered federal accountability system.

Reflecting back on "what the state really wanted," I can say that we - the state Bureau of Adult Basic and Literacy Education (ABLE) - wanted a way to approach accountability that was connected to improving the quality of services to learners. To me, this meant making significant changes in Pennsylvania, triggering reform of both local and state practices. Goals at the state level included being more responsive to local program needs by improving the timeliness and ease of funding, conducting meaningful evaluation and monitoring of programs, and investing our federal funds targeted to teacher training and experimental programs (Section 353 funds) in activities that supported program improvement. At the local level, it meant engaging teachers, tutors, program coordinators, and administrators in a team approach to solving problems identified through the systematic use of data. Our assumption was that an accountability system built in isolation from quality improvement would simply document poor program results.

Early Years

In the very first years of EQUAL, we in ABLE struggled to keep a balance between identifying problems at the grassroots level and directing areas of focus from the state level. For example, to make informed decisions about accountability at the state level, we needed some sites to focus on assessment. In addition, we learned early on that many of the pilot sites were concerned about the same issues: program improvement teams were reinventing the wheel time and time again. We at the state level had some needs of our own, and we wanted to create some "efficiencies" as future EQUAL sites came on board. However, we did not want to be viewed as directing local programs in the process. The 1992 revisions to the National Adult Education Act mandated that states adopt Indicators of Program Quality and use those Indicators to evaluate and monitor program effectiveness. As the middle ground in the top-down versus bottom-up debate, we developed a guide that used the Indicators of Program Quality as a framework for local program self-assessment. We eventually asked pilots to direct attention toward specific indicator areas, including assessment, to determine what practice issues sites would identify.

With the help of a national consultant, we developed forms that allowed the pilot sites to record any assessment activity used with a sample of learners. The consultant analyzed these records to determine what local assessment practices were being used. We began to see trends that led us to develop specific professional development opportunities concerning assessment. The data collected on these forms became the basis for our current program performance standards. These steps were the beginning of aligning state policy with practice and identifying program improvement needs.

By focusing the pilot sites on assessment, the EQUAL program improvement process helped us approach issues related to program accountability. We discovered that the pilot sites used a variety of assessment tools. Many of the assessments in use, while valuable to teachers and students, were not standardized and could not be used for statewide performance measures. We learned that, where formal assessments were used, they were not being administered and scored properly, rendering the resulting scores invalid. By providing training and technical assistance to the pilot sites on the proper use of standardized tests, we were able to collect valid and reliable data on student learning. We used the data to set performance standards on student learning, which is the heart of our accountability system. Without a program improvement process that focused programs on measuring student learning and supported them in collecting that information, the program performance standards might have been based on data that was not valid and reliable, giving an erroneous measure of expected student performance.

Local Ownership

The three-year pilot experience exceeded all of my expectations. The pilot grew to include more than 20 sites. Beyond the goals we had set for the project, one of the most beneficial outcomes was the extent to which the pilot sites embraced it. Once staff at the pilot sites understood that the state had no hidden agenda, they began to use the program-improvement process to solve other problems. By using data to answer questions rather than making assumptions about their programs, they began to see gaps between what they believed and what was real. By attacking small problems first, they experienced success and moved on to more challenging issues. These factors helped them feel that EQUAL belonged to them.

We developed a core of leadership that embraced the project and has served to carry the initiative into its second year of a scheduled three-year phase-in of all adult education agencies in Pennsylvania. At the close of the pilot, in June 1997, I sat through the "celebration" and reflected on how the level of conversation about practice had changed dramatically over the three-year period. Pilot sites that had framed "retention" as a student issue in their first data-for-decision-making logs were now looking within their own programs for the solution. They talked about why students were leaving based on real data gathered from learners and shared how they used that information to change their program operations. One team had surveyed learners about why they left the program and found that many students felt they were not making progress. When the team looked at data about student learning, they were able to demonstrate to learners that learning has natural peaks and plateaus. The team realized that they did not share information about progress with individual learners as much as they thought they did. They changed their program operations to increase individual student counseling that emphasized the review of progress. They were excited to see retention numbers rise.

Pilot site participants, in turn, became the "opinion leaders" in the state by telling others how EQUAL helped their programs and learners. During the pilot, we brought staff from the sites together several times a year to share their experiences. At first, many participants were nervous because they were not sure whether what they were doing was "right." Once they overcame this fear, they spoke easily about their successes and the value of the EQUAL process. When the state began statewide implementation in September 1997, the training included presentations by representatives from the EQUAL pilot sites. They presented at conferences and spoke at other professional development events. In many cases, they went on to become statewide leaders in the next phase of EQUAL, supporting new sites by drawing on their own successful experiences.

Going to Scale

It is one thing to run a successful pilot of 20 quality adult education programs. It is quite another to spread the practice consistently to all 230 adult education programs in Pennsylvania. The next phase of EQUAL attempted to bring it to scale using the expertise and leadership of our pilot participants. Our goal was to bring on one-third of our agencies each year over a three-year period.

To prepare for the effort, we again took steps to align state policy with program improvement by strategically investing Section 353 funds in a structured training and technical assistance network that drew on pilot personnel and expertise. We began to integrate our existing professional development centers into the EQUAL network. In so doing, we developed a training infrastructure capable of supporting a large-scale reform effort.

What remained a major challenge was how to create an incentive for new agencies to sign on to program improvement without mandating it from the state level. Although I understood that eventually the Bureau of Adult Basic and Literacy Education would need to develop a mandatory participation policy, I did not feel the time was right. We needed more people behind EQUAL to convince others of the intrinsic value of program improvement. The solution came with passage of the federal budget, which contained a 36 percent increase in the adult education appropriation. Our Bureau Management Team began to strategize about how we could use the increase to realize our EQUAL participation goals. We created a new application procedure that tied funding for program expansion to participation in EQUAL program improvement activities. Through that process, we realized our first year participation goal.

We are now in the second year of "going to scale" and no longer have financial incentives on the scale of the first year. However, the success of the first 81 agencies has influenced others to sign on. Program staff now view EQUAL as something that can help them meet the newly implemented program performance standards. Program year 1999-2000 will bring on the last set of programs: we have mandated participation by this final year. The timing appears right. Resistance to new requirements for program improvement, and to the corresponding mandates for data collection and assessment training, is minimal. Program staff realize that Pennsylvania is now well positioned to perform in the new federal environment imposed by the Workforce Investment Act.

Lessons Learned

We have learned much about implementing program improvement as a means to increase quality and, in turn, accountability. First, high-quality training and technical assistance that match actual program improvement needs are critical in helping local programs implement change, and willingness to make the financial investment necessary to support them is essential. Second, building a core of leadership and expertise has been one of the greatest keys to reaching our goals. Third, maintaining the balance between what is directed from the top and what is driven by ground-level practice is a constant struggle. And last, but certainly not least, grounding the program improvement team in issues related to teaching and learning can only be accomplished by involving instructors and tutors in the team process.

We have also learned that given adequate support, the program-improvement process is effecting significant changes in how adult educators do business. We have observed programs improving retention by actively seeking input from learners about why they leave and why they stay; improving decision-making about how to use assessment in their programs; aligning curriculum, instruction, and assessment; and improving the quality of data collection and analysis. Most importantly, it appears that agencies that have embraced EQUAL as a means of continuous improvement are capable of using program information in meaningful ways and are able to respond to state program performance standards.

If someone asked me whether continuous program improvement really makes a difference in implementing statewide program accountability systems, I would answer with a definitive "Yes." But the more important question may be "Why?" What we are trying to measure in Pennsylvania's Performance System is relatively straightforward: do learners enroll in programs, persist in learning, accomplish learning, and achieve goals as a result of that learning? We are beginning to answer these questions. As teams of program staff systematically explore their program operations and student learning, they develop a deeper understanding of their programs and their learners. Equally important, teams become more comfortable collecting and using data. They look at the numbers, recognize where performance is low, and formulate a plan to improve areas of weakness. When local programs use data regularly to inform program operations, they place a higher value on that data. They devote more time and attention to working with teachers and volunteers to collect better-quality information. Local programs and state staff are better able to communicate about performance based on real program information. So, at the state level, we can determine whether standards need adjustment or whether alternative standards are needed for certain target populations.

The Future

It is too soon to have the answers from all the work we have done, but we now have the structure necessary to be accountable and to concentrate on quality. We have the capacity, at the state and local levels, to run the system.

We are continuing to invest in keeping a set of programs engaged in "leading edge" activities. Some sites are assisting us in learning about student articulation, for example, so that we can continue to forge new directions for program improvement activities. Continuous improvement never ends. It just prepares us for the next set of changes.

About the Author

Cheryl Keenan is Director of the Bureau of Adult Basic and Literacy Education in the Pennsylvania Department of Education. She holds bachelor's and master's degrees in special education.