Higher education is in a unique spot today. According to recent findings from the National Student Clearinghouse Research Center, student persistence and retention rates have hit a decade high. Yet higher ed institutions know better than to become too comfortable — various social and economic factors could upend those successes in an instant. As more colleges and universities shut their doors due to economic pressures, it is becoming clearer than ever that higher ed institutions must distinguish themselves to stay competitive and meet the needs of current students. One of the best ways to do so is to focus on building high-quality academic programs.
A well-designed academic program review process can help your institution achieve substantial improvements in academic rigor and course quality — and provide the evidence you need to better engage both current and prospective students.
Persistence rate refers to the percentage of students who return to any institution for their second year of college. Retention rate is the percentage of students who return to the same institution.
An academic program review is a periodic evaluation process that assesses how well a given academic program accomplishes its stated purpose. It also identifies opportunities for improvement.
It involves phases in which stakeholders gather and examine hard data to determine whether the program:
The three phases of a program review are as follows.
A program review self-study is similar to an accreditation self-study in that the process involves a program evaluating itself and identifying potential areas for improvement.
An effective self-study focuses on more than making sure the program looks good for the review team. Its goal is to identify and address the program’s strengths and weaknesses in order to optimize student outcomes.
The core areas to address in your self-study include:
Program mission:
Does the program’s stated mission align with the institution’s overarching mission and strategic direction? How well does the program accomplish its mission?
Program quality:
Is the program relevant to students’ current needs? Is it rigorous enough to meet the institution’s academic standards?
Assessment standards:
Do you have evidence that the program and the division it belongs to regularly assess their curricula, instruction quality, and support services?
Future direction:
What resources and actions are required to ensure the program is sustainable moving forward? What data will inform the decisions made for future improvements?
Decision-making:
What data are program faculty using to make decisions regarding their teaching and assessment methodologies?
Upon completion of the self-study, the committee will create a report summarizing its findings and submit it to the external review board.
Typically, the administration will check in on the self-study committee at least once during the process to gauge their progress. Depending on how far they have come, the committee may have a report ready by this time.
Although institutions tend to vary in their individual approaches to self-study reporting, all rely on historical, current, and projected data to power their insights.
Gathering and analyzing all that data can take anywhere from six months to a full year, so it’s best to start the process early to ensure you meet your deadline.
After the self-study committee wraps up its report, a team of experts from outside the university will visit for an onsite evaluation. During this time, the external review team will consult with program faculty, staff, students, and other important stakeholders to get a comprehensive idea of the program’s performance.
They will also meet with department leadership, the deans of the corresponding college, and institutional administrators after completing their review to discuss their findings.
This phase typically lasts two to three days, though it may take longer to properly review larger, more complex programs.
Additionally, effective activity scheduling is essential for ensuring review committee members can access everything and everyone they need for a comprehensive evaluation.
The goal of this final phase is to determine the program’s future direction — specifically, which actions the institution should take to enhance the program.
Once the external review has been completed, the program leads will conduct a comprehensive evaluation of the information from both the self-study and the external review reports.
Using these findings, they will draft an action plan that includes the steps they plan to take to improve the program. They will also establish a timeline for implementing these changes, which may span several years.
Other key program review deliverables include:
These documents should be easily accessible to all administrators and program faculty, as they will include valuable information for implementing changes over the coming years.
Over a lifetime, college graduates earn roughly $1 million more than high school graduates. According to the National Center for Education Statistics, in 2022, 25- to 34-year-olds who worked full time, year round and held a bachelor’s degree had median earnings 59 percent higher than their peers who had only completed high school.
Despite such a clear long-term benefit, earning a degree often comes with a high up-front cost — and debt that many will struggle to pay off for years to come. These factors mean that many prospective students still question the value of higher education. That’s why regular program reviews are essential for sustained institutional growth and success. The benefits of academic program reviews include, but are not limited to:
More engaged students:
Finding ways to improve program relevance and rigor can help students stay engaged with their studies, leading to higher student retention rates.
Increased enrollment:
Continuously improving your programs can help you differentiate your offerings from those of competing institutions, drawing more students each year.
Better student outcomes:
Better programs mean students are better prepared to take on more advanced courses, pursue further education, and perform well at jobs in their chosen field.
Accreditation:
Regular program review helps institutions adapt to changing standards within specific disciplines, making it easier to maintain their accreditation status.
In 2022, median earnings for 25- to 34-year-olds working full time, year round were 59% higher with a bachelor’s degree than with only a high school diploma.
Most academic program reviews involve several groups, one for each phase of the process. Each group is composed of various members from different parts of the institution.
A typical self-study committee will include:
The committee may also include faculty from other programs to help guide the process along, hold program faculty accountable, and add fresh perspectives to the discussion.
In addition to completing the tasks involved in a typical self-study, this committee will be responsible for appointing and reaching out to external reviewers.
Similar to bringing in institutional faculty from outside the program, the purpose of the external review committee is to broaden your perspective and hold the program accountable for its work.
This committee’s key responsibilities include:
Members of the external review committee should be reputable peers with demonstrated experience and proficiency in the program’s areas of specialization. These individuals may be from your state, or they may come from other states and territories to prevent conflicts of interest from affecting the review process.
Additionally, the size of this committee will depend on the type of program you are reviewing. At a minimum, you should find two reviewers for non-degree programs and three for degree programs.
Including other institutional stakeholders in the review process helps your self-study committee build a more substantial report. For example, input from current students and alumni can reveal areas for improvement that the program faculty may not have considered otherwise.
Some important academic program review stakeholders to include are:
Online surveys distributed via email are an effective way to source feedback from stakeholders.
As with any assessment, academic program reviews have a set of key components that your team should address. Using the right approach can help you ensure you’re covering all your bases and conducting a complete evaluation.
The following are some of the most effective academic program review methodologies for institutions.
Mapping courses within the program to learning outcomes helps demonstrate relevance, effectiveness, and overall coherence.
Some of the program insights curriculum mapping can provide include:
Course sequence:
Curriculum mapping reveals whether the order of courses in a program makes sense for achieving the desired learning outcomes. For example, are students learning foundational concepts before the advanced courses that build on them?
Learning gaps:
There may be gaps in the course sequence that leave certain desired skills and concepts unaddressed. Visually mapping outcomes to courses helps identify where these gaps are and how to resolve them.
Mission alignment:
When you can connect each course to specific learning outcomes, you can determine whether the program achieves its mission.
A curriculum management software solution streamlines the curriculum mapping process by providing a centralized hub for materials like lesson plans and educational resources.
You should be able to easily pull these materials and evaluate them without having to switch back and forth between systems — this lack of friction will help you save time and prevent errors from making their way into your final review documents.
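To illustrate how such a mapping works in practice, here is a minimal Python sketch. The course codes, sequence positions, and learning outcomes are hypothetical; in a real review, this data would come from your curriculum management system. The sketch flags outcomes no course addresses (learning gaps) and shows where each outcome first appears in the course sequence.

```python
# Minimal curriculum-mapping sketch (hypothetical course codes and outcomes).
# Each course lists its position in the sequence and the program learning
# outcomes it addresses.

curriculum = {
    "BIO 101": (1, {"scientific reasoning", "lab safety"}),
    "BIO 210": (2, {"experimental design", "data analysis"}),
    "BIO 330": (3, {"data analysis", "scientific writing"}),
    "BIO 490": (4, {"independent research", "scientific writing"}),
}

program_outcomes = {
    "scientific reasoning", "lab safety", "experimental design",
    "data analysis", "scientific writing", "independent research",
    "ethical practice",  # intentionally uncovered to show a gap
}

# Learning gaps: outcomes that no course in the sequence addresses.
covered = set().union(*(outcomes for _, outcomes in curriculum.values()))
gaps = program_outcomes - covered
print("Learning gaps (outcomes no course addresses):", sorted(gaps))

# Course sequence: where is each outcome first introduced?
first_seen = {}
for course, (position, outcomes) in sorted(curriculum.items(), key=lambda kv: kv[1][0]):
    for outcome in outcomes:
        first_seen.setdefault(outcome, (position, course))

for outcome, (position, course) in sorted(first_seen.items(), key=lambda kv: kv[1][0]):
    print(f"{outcome!r} is first addressed in {course} (position {position} in the sequence)")
```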
One of the best sources of information regarding program quality is the people directly involved in it. Surveys are a simple yet incredibly valuable method for gathering data from all program stakeholders, such as:
Students:
End-of-course surveys provide valuable feedback on the effectiveness of each course currently required for program completion.
Alumni:
Surveying recent graduates about their experience reveals how well the program prepared them for their next steps, such as pursuing a higher credential or securing a well-paying job within their field.
Faculty:
Faculty surveys can provide valuable insider information about the program, such as resource availability and assessment effectiveness. Instructor input can also reveal whether there is a disconnect between faculty members and students regarding student engagement.
Employers:
You may also consider surveying local employers to understand whether the program’s reputation affects their hiring decisions. For example, if a company has hired program alumni, were they satisfied with their performance?
A digital survey tool that integrates with a holistic data collection and analytics solution helps you maximize your survey results by allowing you to run surveys, analyze your results, and generate effective reports all from one unified place. Plus, by consolidating your institution’s tech stack, you can maximize efficiency and save valuable time and money.
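As a simple illustration of what that analysis can look like, the sketch below summarizes hypothetical end-of-course survey responses on a 1-5 scale, both per course and per stakeholder group, so you can spot low-rated courses or a disconnect between student and faculty perceptions. The course names, groups, and ratings are invented for the example; a survey platform would export similar records.

```python
# Minimal sketch for summarizing end-of-course survey results.
from statistics import mean
from collections import defaultdict

# Each record: (course, stakeholder group, 1-5 rating for a prompt such as
# "This course prepared me for the next step in the program.")
responses = [
    ("BIO 210", "student", 4), ("BIO 210", "student", 5), ("BIO 210", "faculty", 5),
    ("BIO 330", "student", 2), ("BIO 330", "student", 3), ("BIO 330", "faculty", 4),
]

by_course = defaultdict(list)
by_course_group = defaultdict(list)
for course, group, rating in responses:
    by_course[course].append(rating)
    by_course_group[(course, group)].append(rating)

# Overall rating per course.
for course, ratings in sorted(by_course.items()):
    print(f"{course}: mean rating {mean(ratings):.2f} across {len(ratings)} responses")

# A large gap between groups can signal a student/faculty disconnect.
for (course, group), ratings in sorted(by_course_group.items()):
    print(f"  {course} ({group}): {mean(ratings):.2f}")
```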
One of the biggest challenges in academic program reviews — and higher ed in general — is balancing institutional costs with educational quality.
Many institutions have begun cutting less popular programs as a way to save funds, but taking this action often puts affected students in a tough position. Instead, it can help to approach the program review process with the intention of determining where and how to most effectively distribute fiscal resources among departments.
Some examples of strategic reallocation decisions that a program review could prompt include:
Approaching your program review with institutional finances in mind will help you come up with effective solutions for preserving underperforming programs.
A program’s instructors are just as important as the concepts each course covers. That’s why you should include faculty evaluations as part of your program review process.
Understanding how an instructor’s teaching methods and ability to build rapport with students impact learning outcomes adds another dimension to your program review that can help you determine the best course of action to take.
Items to assess in a faculty evaluation include:
All of this information will serve as important evidence in creating your action plan.
Landing a good career within their field is one of the top reasons students choose to pursue higher education.
Labor market information (LMI) is an excellent tool review committees can use to inform their evaluations, as it can reveal whether programs are meeting the needs of today’s students.
Here are some ways review teams can incorporate LMI into an academic program review:
Skills mapping:
Similar to curriculum mapping, skills mapping connects a program’s courses and experiential learning opportunities to the skills employers are looking for (a simple version of this mapping is sketched after this list). After the review period, program leaders can use this skills map to determine how best to close any skills gaps that may have appeared.
Strategic planning:
With access to the most current labor market trends, program leads can more accurately devise an action plan that addresses specific deficiencies and strengths within their programs.
Adjusting to meet demand:
LMI can reveal which career paths are growing fastest, which is a great indicator of which programs will need to grow or downsize moving forward. As mentioned previously, these insights are helpful for determining the best ways to allocate and distribute resources among academic units.
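Here is a minimal sketch of the skills-mapping idea described above. The courses, skills, and job-posting counts are hypothetical stand-ins for real labor market data; an actual review would pull demand figures from an LMI provider rather than a hard-coded dictionary. The sketch ranks in-demand skills that no course in the program currently covers.

```python
# Minimal skills-mapping sketch (hypothetical skills, courses, and demand counts).

course_skills = {
    "DATA 200": {"SQL", "data visualization"},
    "DATA 310": {"Python", "statistical modeling"},
    "DATA 420": {"machine learning", "statistical modeling"},
}

# Hypothetical LMI: regional job postings mentioning each skill in the past year.
skill_demand = {
    "SQL": 1200, "Python": 1500, "data visualization": 800,
    "statistical modeling": 600, "machine learning": 900,
    "cloud computing": 1100, "data ethics": 300,
}

# Skills gaps: in-demand skills not mapped to any course, ranked by demand.
covered = set().union(*course_skills.values())
uncovered = {skill: n for skill, n in skill_demand.items() if skill not in covered}

print("Skills gaps, ranked by employer demand:")
for skill, postings in sorted(uncovered.items(), key=lambda kv: kv[1], reverse=True):
    print(f"  {skill}: {postings} postings, not mapped to any course")
```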
After the assessment is complete and your administrators wrap up their final review, it’s time to implement your action plan for the program.
The goal here is to use evaluation data to close the assessment loop — to take the correct actions based on the insights you gained from your review. That’s where the concept of data-driven decision-making comes in. The more evidence you have to back up your actions, the more effective your decisions are likely to be. And that evidence can only come from a strong analysis.
Integrated planning and self-study software tools help by:
Ideally, the solution you use will be user-friendly and intuitive. Easily navigable software reduces the time it will take for your faculty to learn how to use it effectively — and the amount of time you’ll need to spend refreshing their memory during the next review cycle.
It’s important to remember that academic program review is a cyclical process. Most institutions will follow either a 5-year, 7-year, or 10-year cycle, but your institution may need to use different intervals based on your specific requirements.
It is also an agile process that allows for adjustments in real time. Each review is iterative, improving on the changes made in previous cycles; treating the process as a one-and-done endeavor defeats the purpose of academic program review.
Smaller program evaluations should take place each year until the next review cycle begins. That’s why monitoring key performance indicators (KPIs) between review cycles is so important.
Sustainability and agility are essential for navigating the current higher ed landscape. Tracking KPIs on a quarterly or yearly basis provides concrete evidence of your action plan’s effectiveness, so you can make adjustments that will enable sustained progress.
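As one concrete example, the sketch below computes the retention and persistence rates defined earlier in this article from hypothetical cohort counts. Most other KPIs can be tracked the same way, as a simple ratio monitored over time.

```python
# Minimal KPI sketch using the persistence and retention definitions above
# (hypothetical cohort numbers).

cohort_size = 500                  # first-time students who started last fall
returned_same_institution = 410    # enrolled at this institution again this fall
returned_any_institution = 445     # enrolled here or at another institution this fall

retention_rate = returned_same_institution / cohort_size    # same institution
persistence_rate = returned_any_institution / cohort_size   # any institution

print(f"Retention rate:   {retention_rate:.1%}")   # 82.0%
print(f"Persistence rate: {persistence_rate:.1%}") # 89.0%
```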
Some helpful metrics to monitor include:
Building a culture of continuous improvement can help make it easier to implement action plans and adapt to new challenges in the program when they arise.
Faculty and administrators should be open to the possibility of change and hold themselves accountable for their current performance. Additionally, everyone within the institution should be encouraged to contribute suggestions and honest feedback to help identify areas for improvement at the program level.
The proper tools help make program reviews a breeze. Leveraging emerging technologies like those in the Watermark Educational Impact Suite (EIS) simplifies academic program reviews by streamlining analytics and data storage.
Our award-winning Watermark Planning & Self-Study within the EIS enables you to complete your program reviews on time and easily share your findings with key stakeholders. Plus, complete integration with all other Watermark products means you can seamlessly pull data on demand from anywhere for powerful, actionable insights. Our recent strategic partnership with Lightcast, for example, gives users access to the most current labor market data available so departments can see how their courses connect to the demands of the modern workforce.
See how our tools are helping clients right now, get in-depth information on topics that matter, and stay up-to-date on trends in higher ed.