Program Review Process

The current Program Review Process was piloted in summer 2016, and all RTC programs will have started the 3-year review cycle during the 2016-17 school year.

Program Review Timeline and Reviewers

Program Review Report Template

Program Review Data Guide

Assessment 101:  Who, What, When, Where, Why and How

 

Program Review

Frequently Asked Questions, Helpful Data Glossary, and Easy Instructions

What is a program review?

A program review is like a wellness check with the doctor: it’s an opportunity to evaluate all systems and make sure the program is vibrant and healthy. Program reviews occur in 3-year cycles. Program reviews are useful for:

  • Creating improved pathways for students to succeed
  • Evaluating and updating current practices, credit load and expectations in programs
  • Reflecting on ways to provide equity and access for all students
The three-year cycle unfolds as follows:

  • Year One: This is the primary review, when data is collected and discussed and an initial plan for assessment and growth is developed.
  • Year Two: The faculty/dean team formally regroups to assess student learning through three potential steps:

Step 1: Summarizing collected course assessment data

Step 2: Analysis and evaluation of data

Step 3: Implementation of change(s)

  • Year Three: In the final year of Program Review, the faculty/dean team closes the loop on assessing the course outcome measurements. They also review the work accomplished over the three years, and plan for the future.

Who is involved in program review?

The program review team (faculty and their dean), IR (for data purposes and data consultation), an Advisory Board member, one non-program faculty member chosen by the Office of Instruction, and the advisor from the program area.

Who does What?

The Dean schedules the yearly meetings, working well in advance to ensure full participation. See the Dean’s Checklist.

The program faculty reflect on all aspects of their program and work to find ways to improve retention and student success. They lead the review meetings.

IR (Institutional Research) members provide the most recent data for the program, as well as information on the changing job outlook in the area.

The Advisory Board member sheds light on the current needs of the job market, as well as their perspective on the effectiveness of the program.

The non-program faculty member looks at the data and offers an outside perspective, providing suggestions and ideas based on the data and the program’s needs and direction.

The program advisor provides a unique perspective based on their work with the program students, as well as with prospective students who may be interested in the program.

What documents are used in a program review?

  • Program Review Report: The official document where all data, comments, and recommendations are entered.
  • Program Review Action Plan: The addendum where the team lists needs or ideas to continue or improve the program. This can include recommendations that involve many departments on campus.
  • Supporting documents: These sources can help the program review participants prepare the official Program Review Form, or add context to information presented in the review. These sources can include:
    • Program Outlines
    • Program and Course Outcomes
    • Advisory Committee Meeting Minutes

Why do we need program reviews?

Program review serves many purposes. First, it is a requirement of the Northwest Commission on Colleges and Universities, the accrediting body governing two-year public institutions in Washington State, including RTC. RTC is required to evaluate programs on a regular cycle and to evaluate the effectiveness of program outcomes.

Program Reviews are a chance to see how well your students are learning, what they are learning, and what can be done to ensure that student learning is taking place. It is a chance to look at data with an eye for improvement in retention and completion. It is also a chance to update or adjust courses and programs to meet current industry needs.

Next, it helps the institution prioritize program needs by enrollments, curricular change, industry trends, and student satisfaction rates. This makes it possible to identify and advocate for resources. 

Finally, it is an evaluative process that ensures faculty, deans, and the college in general are making strategic decisions for a program. This is an opportunity to reflect on success and strive for improvement. In this sense, it is a form of professional development for all members involved.


PROGRAM REVIEW – YEAR ONE INSTRUCTIONS

What happens in Year One?

Summary: Year One is the primary review. It is a time when data is collected and discussed, and an initial plan for assessment and growth is developed.

Programs will be selected for review in a randomized process in fall/winter quarters.

The Office of Instruction will notify the deans of what programs are selected.

Institutional Research (IR) sends the dean and faculty a Program Review Template with most of the data already entered.

The dean will schedule time with faculty to add information to these sections:

  • Curriculum Development – list courses added, deleted, or revised in the past three years. This information can be retrieved from archived program outlines.
  • Program Enhancements – highlight technology implemented, partnerships added, and program-specific accreditation, if applicable.
  • Future Program Design Ideas – summarize ideas to keep pace with industry.
  • Professional Exams/Licenses – Pass Rates (if known)
  • Program Facilities and Equipment – highlight facility and equipment improvements or needs.
  • SWOT Analysis – the team will look at the strengths, weaknesses, opportunities, and threats for the program. This will include previous planning information, anecdotal evidence, industry trends, and so on. A short summary of this analysis is added to the document.
  • Additional comments or analysis – each data set gives the faculty and dean an opportunity to provide extra comments and analysis.

The dean and faculty will choose a program outcome that aligns with a college-wide outcome to measure in Year Two and add the following information to the form:

  • Choose one program outcome and identify the college-wide outcome with which it aligns.
  • Add that program outcome to the review form.
  • Select one to two relevant courses and course outcomes that align with the program outcomes. Work with your dean to ensure that a variety are used from your department.
  • Talk about how student learning is measured in the courses. Are these written or oral assessments, demonstrations, or a combination of measures?
  • Choose a variety of examples (3-5 assessments) in which the course outcomes are measured for the selected courses, and add them to the review form. These methods of student learning assessment will be evaluated in Year Two by the faculty/dean team, to make sure they accurately measure the course and program outcomes. Faculty, see Assessment 101: Who, What, When, Where, Why and How for guidance on aligning outcomes to your assessments.

A Review Team meeting will be scheduled in the first quarter. The team will be sent the report ahead of time for consideration. This team includes:

  • All full-time and I-BEST faculty; at least one part-time faculty member, if applicable
  • IR
  • Advisory or community member, if available (Selected by Dean)
  • One non-program faculty or staff selected by the VPI
  • Student Services Counselor currently connected to the program

Together, this team will look at the overall direction and performance of the program.

  • Their feedback will be compiled into the report.
  • Team members will fill out a program review survey within one week.
  • Based on this feedback, IR will compile the responses and issue one of these recommendations:
    • Move Forward: The program is on the right track to meet industry demand, growth projections, and use of existing resources. No major changes will be recommended.
    • Program Enhancement: The program needs some assistance to grow or evolve. This may include marketing, revision of curriculum, or equipment or facilities improvements. Enhancements will help programs maintain or expand offerings. These needs will be added to the comment section and listed in the Action Plan Addendum.
    • Intensive Review: More people and/or departments need to review the program’s data, and major changes may be needed for program health or its active status. If this recommendation is submitted to the Office of Instruction, it needs to be noted in the comments section.
  • The final report is submitted to the Office of Instruction, where it will be reviewed with the Dean if necessary and filed for future reference.
  • The Action Plan Addendum is monitored by the Dean and Faculty over the next year. This can be scheduled at quarterly program meetings.

PROGRAM REVIEW – YEAR TWO INSTRUCTIONS

What Happens in Year Two?

Summary: In this year, the faculty/dean team formally regroups to assess student learning through three potential steps:

Step 1:  Summarizing collected course assessment data

Step 2:  Analysis and evaluation of data

Step 3:  Implementation of change(s)

This may lead to a revision of a course outcome and/or assessment to see if outcomes improve as well as continued data collection into Year Three. Also, the faculty/dean team reviews the initial Action Plan to see if tasks identified in Year One have been completed.

If program enhancement or intensive review was recommended in Year One, the people and/or departments asked to review initial data and work together on program improvement continue to meet.  This helps the institution as a whole evaluate its progress in supporting program growth, restoration, or active status.

Throughout Year Two, faculty will collect data from the course outcome assessments they chose in Year One. Examples of the data collected by quarter or course offering can be:

  • numbers of students who pass with a specific GPA;
  • numbers of students who succeed at a specific assessment at a specific level;
  • numbers of students who earn a specific industry credential in the course; or
  • numbers of students who demonstrate a competency in a lab setting.
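As an illustrative sketch of the second example above, tallying one quarter's data might look like the following (the scores and the 70% threshold are hypothetical, not an RTC standard):

```python
# Hypothetical course-outcome tally for one section; the scores and the
# 70-point passing threshold are illustrative only.
scores = [88, 72, 95, 61, 78, 84]              # one quarter's assessment scores
passing = sum(1 for s in scores if s >= 70)    # students who met the threshold
pass_share = round(100.0 * passing / len(scores), 1)  # percent who succeeded
```

Repeating a tally like this each quarter gives the faculty/dean team comparable numbers to watch for the fluctuations or trends discussed below.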

If a fluctuation or negative trend is detected, the faculty will want to research why this is the case. The cause could be as narrow as pinpointing which questions are consistently missed on a test, or as broad as a lab outline that misses a sequence that would enable a student to perform a task. Faculty will document their research on the provided program review form.

At the end of Year Two, the faculty/dean team reviews the data to determine possible improvements.

Faculty will design the necessary assessment and/or outcome change(s), which will be documented in the Program Review Report.

The faculty/dean team will review and update items on the Action Plan Addendum.

PROGRAM REVIEW – YEAR THREE INSTRUCTIONS

What happens in Year Three?

In the final year of Program Review, the faculty/dean team closes the loop on assessing the course outcome measurements. They also review the work accomplished over the three years, and plan for the future.

In spring quarter, the team reviews another year’s worth of data related to the course and program outcome assessment measures identified in Year One.

  • If no changes were made in Year Two, the review will look for continued or positive trends.
  • If changes were made in Year Two, a review of the data related to the new changes will be done to see if results of student learning have improved from the previous year.

The Program Action Plan will be reviewed again and summarized. Questions to consider include:

  • Were the ideas or resources implemented? Did they prove effective?
  • Did we see improvements in student success?
  • If an idea or resource was not implemented, why not?

The faculty/dean team will begin planning for the next Year One cycle.



Data Glossary:

CIP Code: Classification of Instructional Programs. Each program at RTC is assigned a CIP code; similar programs share the same code, e.g., Accounting Clerk and Accounting Paraprofessional. These are 6-digit codes assigned by the US Department of Education to describe the subject area of courses and major areas of study. There is a difference between federal CIP codes and local (SBCTC) CIP codes. The IR office analyzes the majority of program-level data based on CIP codes.

Cohort: Group of students who fit specific criteria and are tracked for retention and graduation statistics.

Program Enhancement Plan (PEP) Cohort (Used for Program Review): Derived from the Student Achievement Initiative database. Students are included in this cohort if they: (1) Earned zero credits prior to that year, (2) Attempted at least 1 credit during that year, and (3) Had a degree-seeking intent code.

Completion Rate: The completion rate is calculated as the percentage of students in an identified cohort who received a certificate or degree within a specified amount of time (e.g., three years).
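As a sketch of this calculation (the cohort numbers are hypothetical, not actual RTC data):

```python
# Hypothetical sketch of the completion-rate calculation defined above;
# the cohort size and completer count are illustrative only.
def completion_rate(cohort_size, completers):
    """Percent of a cohort earning a certificate or degree within the window."""
    return 100.0 * completers / cohort_size

rate = completion_rate(120, 78)   # 120 entered the cohort, 78 completed -> 65.0
```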

Degree-Seeking: A student who indicates they are seeking a degree or a certificate.

DLOA (Data for Linking Outcomes Assessment): Database containing data compiled on an annual basis to meet college and SBCTC needs for outcomes data related to employment and further education of college students. Includes data for completers and leavers of vocational, academic, worker retraining or apprenticeship programs that left the system during the previous academic year. This database is used mainly to report estimated placement for the previous year and median wages for completers and/or early leavers of a program.

EPC Code: Educational Program Code. Each program at RTC is also assigned an EPC code, which is different for each level of a program. For example, the Accounting Clerk certificate has a different EPC code than the Accounting AAS degree.

FTE: “Full time equivalent” equals 15 credit hours per quarter.

FTE-F: Full time equivalency for faculty. Per the SBCTC: The percent of a full-time teaching load assigned to a course. The value is assigned in the SMIS database based on the FTEF distribution/calculation process which matches personnel and distributes FTEF based on the methodology a college chooses to use (i.e., full time faculty load per the RFT contract).

FTE-S: Full time equivalency for all students.

Headcount: Number of students enrolled in the program (averaged across all quarters).

Intent Codes: The student’s intent for enrolling, as coded by the colleges. This data element is collected each quarter that the student is enrolled and may change for each student over time.

Program Outcomes: Program outcomes describe how the student will integrate knowledge, skills, and abilities learned throughout the program in a complex role performance after completion.

Retention Rate: Retention rate is expressed as a percentage of the students who return each quarter or year. In a 2-year community or technical college, retention is usually assessed over shorter time periods such as Fall to Winter or Fall to Spring, as some students might graduate within a year.
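A sketch of a quarter-to-quarter rate as defined above (enrollment numbers are hypothetical):

```python
# Hypothetical sketch of a Fall-to-Winter retention rate; the enrollment
# and return counts are illustrative only.
def retention_rate(enrolled, returned):
    """Percent of students from one term who return the next term."""
    return 100.0 * returned / enrolled

rate = retention_rate(200, 170)   # 170 of 200 Fall students return in Winter -> 85.0
```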

SOC Code: The Standard Occupational Classification (SOC) system is used by Federal statistical agencies to classify workers into occupational categories for the purpose of collecting, calculating, or disseminating data. All workers are classified into one of 840 detailed occupations according to their occupational definition. 

Student Achievement Initiative (SAI) Database: The SAI Database helps colleges track their students for the incremental gains they make during the year in basic skills, pre-college (college ready) courses, earning their first 15 and first 30 college-level credits, earning their first 5 credits in college-level math or reasoning, and for certificate, degree, and apprenticeship awards. These gains are termed “momentum points” because they add momentum to a student’s longer-term success. The database is constructed so that a college can see the college-level momentum students have behind them to start and the momentum they build during the academic year, and can analyze students to develop achievement strategies. All state-supported, award-seeking, and Running Start students from SMS are included, except international students, Department of Corrections students, and non-state, non-award-seeking students.

Student/Faculty Ratio: The number of students per faculty member, based on student and faculty FTE.
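Tying together the FTE definitions above, the ratio can be sketched as follows (one student FTE equals 15 credit hours per quarter per the glossary; all numbers here are hypothetical):

```python
# Hypothetical sketch of the FTE and student/faculty ratio arithmetic
# defined in the glossary above; the credit-hour totals are illustrative.
CREDITS_PER_FTE = 15.0   # one FTE = 15 credit hours per quarter (per glossary)

def student_fte(total_credit_hours):
    """Convert total student credit hours for a quarter into FTE-S."""
    return total_credit_hours / CREDITS_PER_FTE

def student_faculty_ratio(fte_s, fte_f):
    """Students per faculty member, based on student and faculty FTE."""
    return fte_s / fte_f

fte_s = student_fte(4500)                    # 4,500 credit hours -> 300.0 FTE-S
ratio = student_faculty_ratio(fte_s, 15.0)   # 300.0 FTE-S / 15.0 FTE-F -> 20.0
```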

Student Management System (SMS): The SMS system is used to administer and manage all student-related business functions on campus and can only be accessed by designated employees on campus such as Enrollment Services and Institutional Research staff.