Noncredit Progress Indicator Pilot

February 2012
Noncredit Task Force Chair

The Noncredit Task Force—composed of 42 people from all disciplines and roles in noncredit across the state, representing 17 different institutions—was formed in Spring 2010 to address specific noncredit resolutions about accountability reporting and to oversee a pilot project that is revolutionary for many areas of noncredit: the use of progress indicators for students’ work. Some areas of noncredit have always graded or indicated progress, while other areas have evaluated student work and carefully advised students’ next steps based on their abilities. Regardless of the strategy used locally, all grades submitted to the Chancellor’s Office from noncredit areas have been converted to UG (ungraded) and reported in statewide reports in that form. As a result, success and progress in noncredit have not been reported at a statewide level, creating an opportunity for people to judge the work of noncredit incorrectly, without valuable data representing noncredit success.

The Noncredit Task Force has been working on the following strategies in an effort to craft a means of reporting the great work done in noncredit:

  • Defining progress indicators/grades.
  • Educating faculty and others about the purpose of indicators or grades.
  • Developing a pilot project to document, report, and analyze progress indicators from participating colleges.
  • In conjunction with the Academic Senate’s Noncredit Committee, developing and conducting training for faculty about using indicators and addressing reporting gaps.
  • Sharing and developing methods, based on the pilot college experiences, to help other institutions review and plan their own processes for reporting.
  • Collecting feedback from faculty on the effects of pilot indicators.


Background

Noncredit serves over 300,000 FTES in our system and represents about half of the total basic skills and ESL work in the California community colleges. Noncredit students are significantly more diverse than their credit counterparts and commonly have the greatest socioeconomic needs. Many noncredit students are unlikely to succeed in higher education without the benefits noncredit provides, such as flexible schedules, increased contact hours, opportunities for self-paced learning, and no fees. Helping students through noncredit fulfills an essential role for our state: providing adults with basic life, literacy, and employment skills. This function has become even more important with the reduction of many Adult Education programs in K-12 districts, in spite of California’s growing needs in these areas. Because accountability has become so important and funding is often dependent on documenting student success, noncredit education faces a huge challenge. With no grades in most courses and no documented progress or success beyond career development or college preparatory certificates, high school diplomas, and a few other measures, the good work of noncredit becomes invisible and its funding is easily eliminated. Noncredit has always been funded at a rate far below that of credit instruction, and although noncredit “enhanced” funding became available through SB 361 (Scott, 2006), this funding remains far below the credit funding rate and is tied to documented metrics and annual accountability reporting.

The Task Force is addressing several Academic Senate resolutions related to noncredit which state major concerns of faculty, staff, and administrators. Among these resolutions are 9.01 F09 (Appropriate Noncredit Accountability Measures), 13.01 S08 (Noncredit Accountability Measures), and 13.04 S10 (Improving Noncredit Accountability Reporting through Progress Indicators). This most recent resolution reads as follows:

Resolved, That the Academic Senate for California Community Colleges develop a task force of primarily noncredit faculty and administrators representing all noncredit areas and other representatives, as appropriate, to research options and develop progress indicators and implementation strategies and to prioritize and address accountability issues as soon as possible, continuing into the 2010-2011 academic year;

Resolved, That the Academic Senate for California Community Colleges develop a voluntary pilot using interim noncredit indicators with a goal of beginning in Summer 2010 and continuing into 2010-2011 academic year, with results to be used as research information for the taskforce and others; and

Resolved, That the Academic Senate for California Community Colleges pursue necessary changes in Title 5 and Board of Governors’ policies with a goal of implementation of official noncredit progress indicators beginning in Fall 2011.

Update on Progress

Eight colleges submitted noncredit progress indicator data in Fall 2010, about 11 colleges participated in the Spring 2011 reporting process, and more will participate in the final reporting period of Fall 2011. The data were rich, providing many lessons beyond what we expected, and will inform future efforts to track progress in noncredit courses. The Task Force provided training for a variety of colleges and posted training materials on the Academic Senate BSI3 website. However, gaps in actual reporting were discovered at each level: classroom to administration, administration to MIS data reporting at the local campus, and reporting from the local campus to the Chancellor’s Office. Local researchers were sometimes confused because their practice had been to change any noncredit grades reported to “UG” (ungraded), and some researchers did this automatically even though the pilot had been approved at the college. In many cases the researcher or administration refused to accept progress indicator reports because the college does not include noncredit data in its credit MIS reports; rather, it uses alternative methods (shadow systems) to collect noncredit positive attendance only. Colleges that rely on these positive attendance reports offered little cooperation in including progress indicators. And while most credit grades are reported electronically, noncredit faculty, over 80% of whom are adjuncts, often do not have computer access; several colleges submitted progress indicators via Scantron forms and tallied the results by hand.

On the positive side of the data and gap analysis, we were amazed at the overall results and the outstanding professional development faculty undertook to norm the meaning of NP (no pass), P (pass), and SP (satisfactory progress). We have surveyed over 108 faculty members who participated in the progress indicator/grading pilot and have received overwhelmingly positive feedback, including the following:

Overall, 83.8% of the faculty felt the use of progress indicators was practical. Many responded that documenting progress benefited them, their students, and the evaluation of their curricular work. Some described the benefit of a tangible record of learning that more clearly indicated promotion for students ready to register at the next level. Perhaps more importantly, faculty reported the benefit of having documented areas where students could target improvement. Some felt it motivated students to focus their work, and faculty also indicated that it sharpened attention on each individual student’s needs. On a larger scale, faculty felt that it was important to document and report how students are moving through the noncredit system. Current reporting is inadequate and under-reports the good work of noncredit because of technical problems in the way cohorts are selected and the method by which progress is determined in the absence of any indicators.

The recording of progress indicators in noncredit has yielded numerous positive results at various colleges. Using the data collected, Santa Ana College has been able to identify the average number of hours a student needs to earn an SP or P. Imagine how useful it is to tell positive attendance students that at their specific level of ESL, 108 hours usually translates into a P. In another instance, North Orange CCD – School of Continuing Education incorporated the progress indicator data into its newly developed program review process. This practice allowed the institution to make a variety of important decisions and to base program budgeting on data.

Another analysis showed that success in noncredit is very cost-efficient. Unsuccessful students usually attend few hours, and because noncredit allocation is based on positive attendance, unsuccessful attempts are very inexpensive, unlike credit allocations, which fund an entire semester for both successful and unsuccessful students.

Noncredit is currently experiencing an exciting time. The Task Force will be collecting final data for the Fall 2011 term. At a time when budget crises loom, the collection of data through progress indicators will allow us to report the good work done in noncredit education, a setting whose paradigm addresses student success regardless of the time required.