The Business Incentives Initiative is coalescing around a set of core themes to improve data, accountability and transparency in economic development incentive use.

Launched in April 2014, the Business Incentives Initiative is designed to “improve decision-makers’ ability to craft policies that deliver the strongest results at the lowest possible cost” by identifying better ways to assess and report on incentive programs.

The Business Incentives Initiative is a joint project of the Center for Regional Economic Competitiveness (CREC) and the Pew Charitable Trusts. Smart Incentives is pleased to be part of the CREC team working on this valuable effort.

During a recent meeting of the six states participating in the Business Incentives Initiative, state decision-makers shared a promising set of policy approaches and technical fixes that will help economic development professionals better manage their programs and obtain valuable information on incentive program outcomes.

Defining goals and metrics

Effective evaluation begins with clear, measurable program goals. Whether created through legislation or administratively, incentive programs should have a defined purpose. Vague language related to jobs or economic development makes meaningful evaluation difficult.

Metrics to drive evaluation should reflect the goals of the incentive and:

  • consider available data
  • use clear and consistent definitions
  • allow for comparisons among programs with similar goals
  • connect to the economic well-being of residents

Overcoming data-sharing barriers

Economic developers need access to state data sources to determine costs and outcomes for individual incentive agreements in order to evaluate their effectiveness. The project team identified two resources as the most valuable for assessing programs: tax data, to determine the costs associated with tax incentives, and workforce data, to verify job creation and retention.

Steps to enable data-sharing for program evaluation include:

  • Prepare strong MOUs defining data terms, codifying responsibilities and explaining goals and benefits for both parties.
  • Focus data-sharing on aggregate rather than individual records.
  • Clarify statutory language on confidentiality to reduce the risk to the data “owner” of sharing data.

Data limitations and confidentiality rules may limit the value of data sharing for monitoring compliance for individual transactions.

Implementing an effective evaluation process

Conducting a quality evaluation is not a simple undertaking. Assessing whether a program is achieving its intended goal requires adequate time, resources and expertise.

  • Tax-focused incentive programs should be assessed every 3-5 years, while incentive programs with a wider scope such as community investment programs should be assessed every 6-10 years.
  • Evaluators need a strong analytical skill set and should be willing to make policy recommendations.
  • “Deal makers” should not conduct the evaluation. Third-party organizations such as legislative fiscal offices, audit offices, or universities may be better-suited for this task.
  • Evaluations should not strive to simplify all program outcomes into a single number, but should provide context and analysis.
  • Improving performance – not finding fault – should be the ethos.

Reporting for transparency and accountability

Annual omnibus reports don’t necessarily serve the economic development community or stakeholders well. These reports, which are often statutorily required, can be cumbersome to prepare and uninformative to readers trying to understand incentive programs in their communities.

Reporting should reflect different types of information and recognize the needs of different audiences and timeframes. A simple framework might contain:

  • Basic information on incentive agreements released quarterly or on a rolling basis and made available through a searchable online data portal.
  • Summaries of program activity, accomplishments and costs provided every year in an annual report format and/or reported on a web site.
  • Program evaluation reports released and designed to generate conversations with lawmakers and other stakeholders. Since outcomes may take several years to achieve, evaluations conducted every 3, 5, or even 10 years make sense for this purpose.

Creative presentation combined with stronger context and storytelling can make a tremendous difference in an audience’s ability to absorb complex information.

State efforts

The six participating states are developing plans for improving data collection, reporting and evaluation of business incentives. The specifics are still under development, but areas of focus include:

  • Indiana is focused on effectively implementing HB1020, a tax incentive evaluation law that was enacted earlier this year.
  • Maryland is working to streamline legislatively mandated reports and share data more effectively across agencies.
  • Michigan is developing new avenues for sharing data among state agencies involved in administering and evaluating business incentives.
  • Oklahoma is considering creating a process to evaluate the effectiveness of the state’s 70+ economic development incentives.
  • Tennessee is working to forge an interagency agreement that will allow the state’s economic development agency to access job creation and wage information from the state’s workforce agency.
  • Virginia is seeking to make public reporting about incentive programs more transparent, comprehensive and user-friendly.

“What policymakers in all these states understand is that reviewing tax incentives regularly and rigorously is both pro-business and pro-economic development. The question that states are trying to answer is not whether they should engage in economic development, but rather how they can ensure that their efforts are as effective as possible.” (Josh Goodman, Evidence Mounts, Site Selection Magazine, November 2014)