This article was published in the March 2017 edition of Career Education Review.
—
In my nearly 30 years in higher education, I have participated in the development and rollout of dozens of new academic programs at the associate, bachelor's, and graduate levels. As an accreditation evaluator, I have reviewed many dozens of other programs as well. One theme that has spanned that work is the frequent absence of a comprehensive review of a program's underlying viability before it is launched. In many cases there is anecdotal evidence of market demand for a given program (and that perceived demand often turns out to be real), but even then, little formal evaluation has been applied to the long list of other factors that comprise "viability." As a result, it is common to see programs hastily launched on the strength of perceived or real market demand, only to run into operational problems down the line.
That is unfortunate, because new programs are critical to the sustainability of private sector colleges and universities; generally speaking, new programs and new students account for about 30 percent of revenue in any given year. One would think that, given how critical new programs are, schools would approach new program development with the same focus and accountability they apply to admissions, retention, career services, clinical education, compliance, and so on. Ironically, that is usually not the case. New program development is often a remarkably haphazard process in which someone in Academics is assigned to assemble a new program accreditation application after someone else in the institution has already decided to launch the offering.
On the other hand, a great advantage of private sector schools is that they are generally quick to market, and their curricula are almost always application based, which serves graduates well in terms of employable skills. Sometimes, however, being quick to market comes at the expense of due diligence that, had it taken place, would have either strengthened the program or resulted in a decision not to launch it at all.
Because I had seen so many programs that lacked a solid vetting process, I eventually developed my own "New Program Viability Review," which I have used for nearly a decade. It is a comprehensive rubric that covers several basic areas and requires that the items within those areas be individually scored. Although the process requires a modest investment of time and labor, it is quite simple and ultimately yields a program viability score. An institution can set whatever minimum it wants as "minimum viability," but the higher the score, the more likely the program is to be successful. I typically set 50 as the minimum score for consideration. In my experience, the entire review can be completed in two to four weeks at a private sector school if someone is dedicated to the process.
The areas of review include:
- market and employer demand and competitive landscape
- availability of faculty and subject matter experts
- availability and applicability of existing physical plant and technology
- accreditation, licensure, and regulatory implications
- compatibility with existing programs
- incremental delivery costs
The model also includes an additional, separate rubric for online delivery.
As someone who began his career as a "pure" academic, I tend to see higher education operations through an academic lens, which certainly informs the viability review. But each area of the rubric was chosen based on my operational experience in higher education and the hard-won lessons that come from leaping into new program areas only to learn later that my institution was not fully prepared for effective implementation. In other words, developing a new program application is almost the easy part. Actually creating a new curriculum and then delivering it effectively to students is much more challenging, and that is why the viability rubric contains the items it does.
Most schools instinctively understand that market and employer demand are baseline requirements for new programs, and it is exceedingly rare for any institution to launch a new program without some sense of the demand. It is less common for schools to confirm that they have, or can find, adequate and affordable faculty for all new courses, subject matter experts for curriculum development, or industry experts for program advisory committees (PACs). Institutions eventually assemble PACs for new programs, but almost always after the program has been developed and launched. This should actually happen first, and the PAC should contribute to the development of the curriculum!
It is also important for colleges and universities to carefully assess the needs of new programs relative to physical plant and technology. What can be shared? What will be purely incremental? Moreover, what existing curriculum can be used or adapted for new programs? My personal goal is for any new program to use at least 30 percent existing curriculum, and the more the better.
As it relates to accreditation, licensure, and regulatory issues, most schools see this part of the new program process as limited to the required applications and approvals, but it is ideally much more than that. For example, if a program requires that students pass a licensure exam in order to be employed, it is critical that the process ensure that the curriculum is built from the ground up to support passing the exam, both through course content and through direct exam preparation. Similarly, while programmatic accreditation may be optional from a regulatory perspective, is it really optional from a competitive perspective?
Another key area of the viability review is compatibility with other existing programs, as well as with program "verticals" (the ability to stack vertical program levels in the same discipline). It is common for schools to launch new programs simply because there is a compelling market opportunity. However, this can be a big mistake even if a new program does provide access to new students. For example, if a new program requires separate, stand-alone labs, technology, or faculty that do not exist in any other programs, the cost alone could be a net negative. Moreover, institutions should generally only launch programs that fit their "identity" and mission. I have seen examples such as an allied health school that started a computer programming degree because of the perceived market opportunity. Of course schools can always expand their offerings into totally new subjects, but new programs in areas in which you already have expertise and a good reputation are almost always more likely to be successful, and of higher quality, than programs in areas that are foreign to the institution.
In short, the New Program Viability Review rubric is a straightforward process for ensuring that key areas of program viability have been evaluated and quantified. The process requires a "project manager" (usually a program director or other academic personnel) who completes the rubric with the assistance of others in the organization. The project manager generally conducts brief interviews or questionnaires with individuals or small groups that have the expertise to evaluate each item on the rubric. In some cases the score for a given item may have some subjectivity to it, but it is at least based on expert input. In addition to providing a viability score, the process also serves as a framework for the work that must be done to launch the program effectively if the decision is made to do so. It is important to remember that there is a difference between assessing the viability of a new program and actually developing and launching a new curriculum.
A sample rubric is included with this article. It can, of course, be modified for the needs of any particular institution or program. There may be items related to international students, clinical education, partnerships, etc., that are not in the sample rubric.
New Program Viability Review
Program Name:
Level (Certificate / AA/AS / BA/BS / Graduate):
Sponsoring School:
Reviewer:
Strength of Student Market
1 2 3 4 5
1 = Weak, 5 = Very Strong
Data Source:
Comments:
Market Competition
1 2 3 4 5
1 = Very Strong, 5 = Weak
Data Source:
Comments:
Availability and Cost of Leads
1 2 3 4 5
1 = Low Availability, 5 = High Availability
Data Source:
Comments:
1 2 3 4 5
1 = High Cost, 5 = Low Cost
Data Source:
Comments:
Employer Demand
1 2 3 4 5
1 = Weak, 5 = Very Strong
Data Source:
Comments:
Faculty Availability and Cost
1 2 3 4 5
1 = Low Availability, 5 = High Availability
Data Source:
Comments:
1 2 3 4 5
1 = High Cost, 5 = Low Cost
Data Source:
Comments:
SME Availability and Cost
1 2 3 4 5
1 = Low Availability, 5 = High Availability
Data Source:
Comments:
1 2 3 4 5
1 = High Cost, 5 = Low Cost
Data Source:
Comments:
Physical Plant Needs
1 2 3 4 5
1 = Extensive Needs, 5 = No New Needs
Data Source:
Comments:
Technology Needs
1 2 3 4 5
1 = Extensive Needs, 5 = No New Needs
Data Source:
Comments:
Accreditation Requirements
1 2 3 4 5
1 = Extensive Requirements, 5 = No New Requirements
Data Source:
Comments:
Licensure Requirements
1 2 3 4 5
1 = Extensive Requirements, 5 = No New Requirements
Data Source:
Comments:
Other Regulatory (State, Federal, Local) Requirements
1 2 3 4 5
1 = Extensive Requirements, 5 = No New Requirements
Data Source:
Comments:
Clinical Requirements
1 2 3 4 5
1 = Extensive Requirements, 5 = No New Requirements
Data Source:
Comments:
Compatibility with Existing Curricula
1 2 3 4 5
1 = Low Compatibility, 5 = High Compatibility
Percentage of shared courses:
Data Source:
Comments:
Opportunity for Program Verticals
1 2 3 4 5
1 = Zero Verticals, 5 = Multiple Verticals
List potential verticals:
Data Source:
Comments:
Incremental Delivery Costs
1 2 3 4 5
1 = High Cost, 5 = Low Cost
List sources of incremental costs:
Data Source:
Comments:
Additional Comments about Program Viability:
Viability Score:
16 30 45 60 80
16 = Unviable, 80 = Highly Viable
Only programs scoring in excess of 50 may be considered for development by the sponsoring school.
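The arithmetic behind the rubric is simple addition: each scale contributes 1 to 5 points, and the summed total is compared against the institution's chosen minimum. A minimal sketch in Python, using hypothetical item names and scores (only the threshold of 50 comes from the article):

```python
# Minimal sketch of the rubric's scoring arithmetic. The item names and
# the 1-5 scores below are hypothetical examples, not values prescribed
# by the article; each scale is scored from 1 (weakest) to 5 (strongest).

MIN_VIABLE_SCORE = 50  # minimum suggested in the article for consideration

scores = {
    "student_market": 4,
    "market_competition": 3,
    "lead_availability": 4,
    "lead_cost": 3,
    "employer_demand": 5,
    "faculty_availability": 3,
    "faculty_cost": 3,
    "sme_availability": 4,
    "sme_cost": 3,
    "physical_plant": 4,
    "technology": 4,
    "accreditation": 3,
    "licensure": 3,
    "other_regulatory": 4,
    "clinical": 5,
    "curriculum_compatibility": 4,
    "program_verticals": 3,
    "incremental_costs": 3,
}

# Sanity-check the inputs, then sum and compare against the threshold.
assert all(1 <= s <= 5 for s in scores.values())
total = sum(scores.values())
verdict = "consider for development" if total > MIN_VIABLE_SCORE else "do not develop"
print(f"Viability score: {total} -> {verdict}")  # Viability score: 65 -> consider for development
```

An institution adapting the rubric would simply add or remove keys (for example, items on international students or partnerships) and adjust the threshold to taste; the higher the total, the more likely the program is to succeed.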
—
New Program Viability Review – Online Addendum
Mass Market Appeal
1 2 3 4 5
1 = Narrow Market, 5 = Mass Market
Data Source:
Comments:
Identifiable Lead Source
1 2 3 4 5
1 = Unidentified, 5 = Identified
Data Source:
Comments:
Online Platform
1 2 3 4 5
1 = Poor Capability, 5 = No New Capability
Data Source:
Comments:
Acceleration of Curriculum
1 2 3 4 5
1 = Poor Adaptability, 5 = Excellent Adaptability
Data Source:
Comments:
Short Courses
1 2 3 4 5
1 = Poor Adaptability, 5 = Excellent Adaptability
Data Source:
Comments:
Course Sequencing
1 2 3 4 5
1 = Extensive Sequencing, 5 = Limited Sequencing
Data Source:
Comments:
Open Admissions
1 2 3 4 5
1 = Extensive Requirements, 5 = Open Admissions
Data Source:
Comments:
—
You may contact the author for more information about the Program Viability Review process or other higher education issues at:
719-247-0486