Data tied directly to budget reallocation targets for the 2019 fiscal year at the University of Idaho lacked reliability assessments necessary for publication in a reputable academic journal, according to one UI mathematics professor.
Associate professor Rob Ely was among faculty who raised methodological questions about the budget reallocation process at a public meeting in September. In response, Provost and Executive Vice President John Wiencek said the ranking of academic and non-academic programs to guide budget reallocation was about being inclusive, not publishable.
“By getting a significant number of votes, you’re expressing, as best as you possibly can, the relative objective opinion of the university community here at large. That was the intent,” Wiencek said.
He agreed with Ely and other critics that the lack of a reliability process for raters made the data subject to personal preference and bias.
“Inherently, you have to assume that people are giving everybody a fair review and being level-handed in what they do,” Wiencek said. “But, the reality is everybody brings their perspective and their life experience to their decisions and they don’t agree with each other.”
Each academic department’s alignment with UI’s strategic plan counted for 27.5 percent of the rankings. Evaluators could be anyone working in an academic department or program. Officials said this was meant to be inclusive, offering all UI employees a chance to participate in the process.
The methodology behind the program prioritization rankings presented many problems for professor of mathematics David Yopp, but he said the very basis of the process, peer review without expertise, made the data useful only for understanding the raters, not the programs.
Yopp said in the case of program prioritization, it is not clear whether raters had expertise on programs in the fields they rated or even in understanding UI’s mission and strategic plan.
“What you’re really getting is data about the raters, not about the narrative itself,” he said. “In their data, the only opportunities I see are how well do people that are not in mathematics, for example, understand the role of mathematics in the university.”
When it comes to knowledge about the strategic plan, a 20-page document that outlines UI’s plan for 2016-2025, UI Vice President of Finance Brian Foisy said there were no methods of assuring evaluators understood it.
“There was not any expectation that you’ve read it at a level of competency. Certainly, they were given a copy of it and you would expect any decent evaluator to educate themselves,” he said.
Wiencek said the process relied on individual accountability.
“It’s their responsibility to make an informed choice and be aware of the strategic plan. It’s akin to voting,” he said. “You walk into a booth, and (vote) on public issue number four and there’s going to be all this legal gobble-dee-goop, and hopefully you have read what that issue is about.”
The rankings, along with other data about programs, were tied directly to reallocation targets released Sept. 7. These targets indicate that the Provost’s executive level, which includes all academic programs, and an executive level labeled “other internal sources” will face the brunt of the $4 million internal reallocation.
After reviewing the data informing the targets against other enrollment data, Wiencek said in an Oct. 5 memo that the targets would remain the same. Vice presidents then created reallocation plans within their respective executive divisions — such as research and finance — and submitted them to President Chuck Staben, who said he expects to complete the plans by Jan. 1.
Staben acknowledged the criticism of the ranking process.
“We recognize the process was not perfect and we may modify the process in years to come,” he said in an interview last week. “But we anticipate sticking with this iteration of the process through our waypoint one (of the strategic plan,) which will be about another two years from now, and (we) don’t anticipate going through another rating process.”
Asked whether administrators would be moving away from the rating process, Staben said, “I didn’t say that. I said we may modify it.”
Staben later clarified his statement, saying administrators don’t anticipate re-rating programs until completion of Waypoint 1 of the strategic plan.
“As we head past that point, or into Waypoint 2, I think we would be wise to re-examine the prioritization method,” he said further in the email. “We would certainly accept input on revising that method; just as we accepted and used a good deal of input on devising this method.”
The program prioritization process and the problems with peer review
Last spring, participating UI faculty and staff were given a set of essays from programs. Each program’s chair or director wrote two essays: one explaining how the program fits into UI’s mission and one explaining how it fits into the strategic plan for 2016-2025.
The Institutional Planning and Effectiveness Committee (IPEC) presented the targets and rankings at two open forums Sept. 11.
Though administrators said the process involved more faculty and staff than it had in years past, some faculty said they felt the process was unfair. Several professors spoke up about what they saw as the university’s use of flawed data.
Despite facing sharp criticism over the methodology from Yopp and other professors at the town hall meetings, IPEC carried forward with the process and weighed the ranking data against other enrollment data. Wiencek said in an Oct. 5 memo the findings did not support a change in reallocation targets.
“In the absence of a clearly decisive mistake, I opted to just go with the – highly controversial, I’ll admit – cuts that were out there because there was really no fundamental rationale for changing those,” Wiencek said. “You have to understand that if I reduce the cut to a unit that has, let’s say, a large student credit hour load because of that correlation, that decreases the funds to a unit that was rated very highly, so I have to have very strong evidence for making a change.”
Ely said raters who evaluated academic and non-academic programs did not undergo a process of reliability evaluation to ensure they were evaluating programs based on objective criteria and not personal preferences. He said this process of reliability evaluation is necessary for publication in reputable academic journals related to humanities or social science.
“We probably ought to make sure we’re using the same kind of rigor in the studies we’re conducting on the campus as we do for the studies that we’re trying to publish as faculty members,” Ely said. “That said, we can work on this (for next time.)”
To verify raters are making judgments based on the same criteria, rather than personal preferences, Ely said researchers must create a rubric of qualifying criteria, discuss raters’ differing scores and repeat the process multiple times to assess how well reviewers agree with each other.
“If there’s no objective criteria, you can’t make it any more than a popularity contest,” he said.
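The reliability check Ely describes is typically quantified with an inter-rater agreement statistic such as Cohen’s kappa, which compares how often two raters actually agree against how often they would agree by chance. A minimal sketch, using made-up rubric scores rather than any actual UI data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same items on a shared rubric.

    Returns 1.0 for perfect agreement, ~0.0 for chance-level agreement.
    """
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items where both raters gave the same score.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if each rater scored independently at their observed rates.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical 1-5 rubric scores from two raters on ten program narratives.
a = [5, 4, 4, 3, 2, 5, 1, 3, 4, 2]
b = [5, 4, 3, 3, 2, 5, 2, 3, 4, 2]
kappa = cohens_kappa(a, b)
```

A kappa near zero would support Ely’s “popularity contest” worry; the rubric, discussion of disagreements and re-rating he describes are aimed at pushing the statistic toward 1.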
Evaluations of narratives for academic programs and for non-academic programs were “translated into a normalized score” and weighed at 47.5 percent against other department data, such as student credit hours and the number of students seeking terminal degrees.
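The article does not specify how UI normalized and combined the scores; a common approach is min-max normalization followed by a weighted sum. This sketch assumes that approach, and everything in it except the 47.5 percent narrative weight (the program names, metrics and raw numbers) is hypothetical:

```python
def normalize(values):
    """Min-max normalize a list of raw values to the 0-1 range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def composite(narrative_norm, metric_norm, narrative_weight=0.475):
    """Weighted composite: narrative score at 47.5%, other program data at the rest."""
    return narrative_weight * narrative_norm + (1 - narrative_weight) * metric_norm

# Hypothetical programs: raw narrative scores and student-credit-hour counts.
narratives = [72, 85, 60]
credit_hours = [12000, 4000, 9000]
n_norm, c_norm = normalize(narratives), normalize(credit_hours)
scores = [composite(n, c) for n, c in zip(n_norm, c_norm)]
```

Under this scheme a program with a middling narrative but a heavy credit-hour load can outrank one with the best narrative, which is the tension Wiencek describes between highly rated units and units carrying large enrollments.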
Provost Wiencek said narratives were chosen because they provided a better way to compare programs from different fields and prioritize them based on their relevance to the strategic plan. He said that whether the subject was outreach, the quality of teaching or research, narratives helped level the playing field between programs that serve different roles.
“That would allow each program to say, ‘within my field, this is how we measure the quality of research and this is how we are doing and this is how we self-assess ourselves in this area,’” Wiencek said. “The challenge now is that the reviewer is going to have to compare fine arts, perhaps, to chemistry. That’s a challenge, but that’s the reality we’re dealing with.”
IPEC, the committee responsible for much of the creation of the program prioritization process, is likely to discuss ways to move away from narrative evaluations for future program prioritizations in its next few meetings, Wiencek said.
“There’s going to be differences. Inherently there has to be. We’ve learned that perhaps that wasn’t as robust an evaluation tool as it should have been and therefore let’s look for something that might be better,” he said. “Now, we’re just going to be proposing things and it’s going to be pushed out to the university audience again, like we did the last time.”
From the Provost’s Point of View
The chance for UI employees to participate in the program prioritization process came even before narrative evaluations.
Faculty and staff were asked by Faculty Senate and Staff Council to join program prioritization working groups for their respective employee titles, Wiencek said.
These two working groups, which each consisted of 20 UI employees, created evaluation criteria and weights for program prioritization.
“There were feedback periods where we had people providing feedback through Slido or through email,” Wiencek said. “Really, all faculty and staff on campus, at various times, had opportunities to provide written feedback. That was evaluated and read by a variety of audiences.”
He said he had even tried to reach out to scientists and researchers for assistance.
“I am not an expert at developing surveys or polls. I never claimed to be,” Wiencek said. “I know faculty who are experts in this are upset that their expertise wasn’t brought into this. I wish they would have been involved as well, and so again, a learning lesson for us as well, to make more proactive outreach. That being said, perhaps they could have been paying attention as well and realized that these things were coming.”
A 2013 Idaho State Board of Education (SBOE) directive requires all Idaho four-year public universities to look to reallocate current funds before requesting more, focusing on the institution’s “Mission, Core Themes and Strategic Plans,” according to SBOE policies.
The SBOE mandate that required UI to do program prioritization made no stipulations for how programs were to be ranked or how criteria were weighed, SBOE spokesperson Blake Youde said. Foisy said UI received approval for a barebones version of the plan from the board, which did not yet include peer-review plans but met the board’s requirement of ranking programs by quintiles, in five evenly sized groups.
“What they wanted at the end of the day was a quintilized ranking of academic programs and non-academic programs,” Foisy said.
The future for program prioritization: to rate, or not to rate?
IPEC met Thursday and reviewed the program prioritization process, including evaluation criteria and the transparency of the process. Wiencek said the committee will likely spend the next few meetings discussing how to approach the state-mandated process the next time around.
“There’s already a good linkage to the strategic plan. We think it could be stronger and (we’re) trying to move away from narratives and polling to the cascaded plans, which have been developed, and having those cascaded plans evaluated in some kind of review process,” he said. “And that review process will have to be socialized and vetted with the faculty at large, like we did the last time in developing the criteria and to get people to agree that this is the way we want to move forward.”
The strategic plan outlines numeric goals for UI to meet but allows a lot of freedom for individual departments and programs, Wiencek said. Cascaded plans allow for individual emphasis in reaching different goals in the strategic plan, and allow programs to develop plans that prioritize the strengths they have in reaching those goals.
“These become localized operational plans, and a waypoint is just trying to take that nine-year (strategic) plan and break it into chunks,” Wiencek said.
Three waypoints divide the nine-year plan evenly into sections of three years, and each waypoint emphasizes a specific goal and numeric targets for reaching it. The first waypoint, which the current program prioritization contributes toward, focuses on student success and improving the workplace environment at UI, he said.
The cascaded plans were developed in early fall 2016, before UI began the program prioritization process. Administrators have yet to create evaluation methods for them.
“Where we are right now in IPEC, is … ‘Let’s figure out how we’re gonna evaluate the people’s success in completing their cascaded plans,’” Wiencek said. “What’s the rubric? Who’s gonna evaluate? How are we gonna break that up into groups that make sense?”
Kyle Pfannenstiel can be reached at firstname.lastname@example.org or on Twitter @pfannyyy
What does it mean?
Four million dollars will be cut from academic and non-academic programs at UI and will be reallocated towards market-based compensation for faculty and staff and competitive pay for teaching assistantships.
President Chuck Staben has received reallocation plans from all vice presidents and will review the plans with them before the finalized budget is released. He expects them to be complete by Jan. 1. The changes take effect July 1, 2018, for the duration of the 2019 fiscal year, which ends June 30, 2019.
UI compared the program prioritization data, which included rankings and other program data, against enrollment data. In an Oct. 5 update on program prioritization, Provost Wiencek said the data did not support modifying the targets.