This post is part of an occasional series reviewing state reports evaluating economic development programs. Here we look at the October 2013 audit report, “A Performance Audit of the Utah Science Technology and Research Initiative (USTAR),” from the Office of the Legislative Auditor General, State of Utah. The findings are not pretty.
This audit is of interest beyond Utah because the USTAR program is structured to pursue several popular approaches to technology-led economic development: funding research, supporting commercialization, and conducting outreach to transfer technologies from universities to companies. Its weaknesses provide valuable lessons for communities hoping to leverage these activities for economic growth.
Established in 2006, USTAR is intended to “enhance economic development in the state” and “enhance Utah’s research universities” in order to deliver technology-based start-up firms, high paying jobs, and tax-generating business activity in the state. It does this by funding research teams at the University of Utah and Utah State University, constructing research buildings at these universities, and establishing technology outreach programs throughout the state.
The USTAR program has received plenty of accolades and was recently recognized by the State Science and Technology Institute (SSTI) at its 2013 Excellence in Technology Based Economic Development award ceremony as a national model for state and regional investments “in science, technology, and innovation to grow their economies and create high-paying jobs.”
Unfortunately, USTAR has not generated the intended results and, in fact, has substantially overstated its economic impact. Further, the audit team takes the organization to task for poor management and oversight of its programs.
Among the major findings:
- USTAR’s reported return on investment (ROI) was flawed and commercialization success has been limited.
The auditor (and, I believe, the statute) defines ROI as “expansion of the state’s tax base driven by growth or formation of companies associated with new high-quality jobs.” USTAR, however, reported revenues such as sponsored research, engineering contracts, private investment and other program funds as the “return” on the program’s approximately $330 million investment, which included $134.2 million from general fund appropriations. This approach grossly overstated the state’s ROI.
USTAR has not tracked technology commercialization at all. Nor has it provided clear guidance on how to monitor performance among the outreach programs designed to connect with entrepreneurs around the state.
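The gap between the two ROI definitions is easy to see in a minimal sketch. The Python snippet below uses the audit's approximate $330 million investment figure, but every revenue and tax number is a hypothetical placeholder chosen purely for illustration, not data from the audit:

```python
# Contrast USTAR's reported "return" (program revenues) with the
# audit's statutory definition (expansion of the state's tax base).
# Only the investment total comes from the audit; all other figures
# are hypothetical placeholders.

investment = 330_000_000  # approximate total program investment (from the audit)

# What USTAR counted as "return": sponsored research, engineering
# contracts, private investment, etc. (hypothetical values).
program_revenues = {
    "sponsored_research": 60_000_000,
    "engineering_contracts": 25_000_000,
    "private_investment": 40_000_000,
}

# What the statute asks for: new state tax revenue driven by growth
# or formation of companies (hypothetical value).
new_state_tax_revenue = 8_000_000

def roi(returns: float, invested: float) -> float:
    """Simple ROI: return per dollar invested."""
    return returns / invested

reported_roi = roi(sum(program_revenues.values()), investment)
statutory_roi = roi(new_state_tax_revenue, investment)

print(f"Reported ROI (revenues counted as return): {reported_roi:.2%}")
print(f"Statutory ROI (tax-base expansion):        {statutory_roi:.2%}")
```

With these placeholder inputs, counting program revenues as the “return” makes the ratio look more than an order of magnitude better than the tax-base measure the statute describes, which is exactly the kind of overstatement the audit flags.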
- USTAR’s reported revenues and jobs are overstated and inaccurate.
Even using the “wrong” measure of return, USTAR overstated its impact. The audit states that reported revenues were “unrealized, invalid and overreported.” Funds that were not attributable to USTAR were counted, and company-specific funding was self-reported, was not verified, and often overlapped with revenue reported by universities.
The jobs data are worse. Jobs reported “included jobs that no longer exist, were based on projections instead of actuals, and included duplicate counts.” USTAR was unable to document the jobs data, and among the sample set of companies the audit team examined, 49% of jobs could not be validated. Reported jobs associated with research were based on an economic impact model, which overstated actual job creation by a factor of ten. And nobody ever bothered to define “high-quality jobs” for this program, so it is not possible to determine how well USTAR has met this core legislative objective.
Several other findings address the need for oversight, program guidance, and improved compliance with legislative intent and the statutory requirements associated with the USTAR program, as well as compliance with other laws, such as open meeting rules.
The audit team does a nice job recognizing and defining the difficulties associated with measuring, monitoring and managing state funding of university research programs for economic development and fiscal gain. Anyone who manages this type of program should read the USTAR audit report, not necessarily for answers, but for a good description of the governance challenges such programs face.
The USTAR board has taken the audit findings seriously. In its response, the Governing Authority agreed with and accepted all recommendations, and it appears to be preparing to “define and produce procedures needed to accurately capture our organization’s outcomes with increased accountability and transparency.”