Embedded Research & Evaluation – The Process, The Story Continues Part 3


In my last few posts, I described our Embedded Research & Evaluation (ER&E) process.

Our ER&E Process: A Step-By-Step Guide, continued

Steps 1 and 2 describe how to identify an ER&E project opportunity and the importance of setting up the project team with the program staff leading the effort. Steps 3, 4, and 5 discuss how to ensure researchable issues and methods align with program theory – what the program is designed to do and how it intends to do it. The steps are the same for ER&E and traditional evaluation; the difference is in the timing of the evaluation effort.

ER&E occurs alongside, and in conjunction with, program delivery. In addition to helping program staff assess program operations in real time (the evaluation component), ER&E ensures that key data and information needed to inform future program direction (design, interventions, delivery channels, marketing, etc.) are collected throughout the program offering.

Steps 6 and 7 (discussed below) describe the importance of defining and tracking meaningful metrics, data, and information.

Step 6. Define metrics. Determine data and information needs.

Metrics, data, and information can support both the evaluation and research components. Metrics are fundamental for understanding program performance. Metrics also help to demonstrate, and ensure, the value of the ER&E effort by establishing the baseline performance against which to measure progress – or lack thereof.

The data collected are used to calculate metrics, while the information gathered is used to understand results – to provide the context and story behind the numbers. Data and information should also support the research objectives of the ER&E effort.

For example, let’s say there is an ER&E effort underway to understand the underperformance of a midstream residential HVAC program (units sold/installed are behind targets, and program cost per sold/installed unit is increasing). Also, to support a research objective, we want to gather data and information to understand the current market conditions for residential HVAC equipment and where the market is likely to head in relation to the demand for energy efficient equipment. This will provide program staff with the data and information needed to define future program targets and the budget required to achieve objectives should the program continue. Note that this is not a potential study effort; however, it is possible that through the ER&E effort we might determine that a potential study is better suited to provide the data and information required.

Examples of metrics we could define are:

  • Market penetration of program qualifying equipment
  • Market opportunity for program qualifying equipment
  • Number of program partners per number of possible program partners
  • Program cost per qualified unit sold/installed

Examples of data we could obtain and track are:

  • Number and type of program qualifying units incentivized for each program partner (where program partner is the participating distributor)
  • Number and type of program qualifying units sold/installed but not incentivized for each program partner
  • Number and type of program qualifying units sold/installed outside of the program (non-program partner sales/installations, where non-program partners are non-participating distributors)
  • Number and type of non-program qualifying units sold/installed for each program partner
  • Number of non-program qualifying units sold/installed outside of the program (non-program partner sales/installations)
  • Program participation trend with program changes (if pertinent) noted along timeline
  • Program cost trend with program changes (if pertinent) noted along timeline
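To make the relationship between tracked data and metrics concrete, the example metrics above can be sketched as simple calculations over the data a program would collect. This is a minimal, hypothetical illustration – the function names and all figures are invented for demonstration, not drawn from a real program:

```python
# Hypothetical sketch: computing the example metrics from tracked program
# data. All names and figures below are invented for illustration.

def market_penetration(qualifying_units_sold: int, total_units_sold: int) -> float:
    """Market penetration: share of all units sold/installed that are
    program-qualifying equipment."""
    return qualifying_units_sold / total_units_sold

def partner_coverage(active_partners: int, possible_partners: int) -> float:
    """Number of program partners per number of possible program partners."""
    return active_partners / possible_partners

def cost_per_unit(program_cost: float, qualified_units: int) -> float:
    """Program cost per qualified unit sold/installed."""
    return program_cost / qualified_units

# Example with invented figures:
print(f"Market penetration: {market_penetration(1200, 4800):.1%}")
print(f"Partner coverage:   {partner_coverage(18, 45):.1%}")
print(f"Cost per unit:      ${cost_per_unit(360_000, 1200):,.2f}")
```

Tracked over successive program periods, these same calculations produce the trend lines noted in the data examples above, with program changes annotated along the timeline.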

The information critical for understanding the results of the metrics and data can include (and is not limited to):

  • Changes, if any, to program design or delivery
  • Regional/territory differences (population, demographics, and economy for example)
  • Barriers to program partner participation
  • Availability of program qualifying equipment within program delivery territory
  • Availability of program partners within program delivery territory

Step 7. Determine how metrics, data, and information will be obtained and tracked.

Once metrics, data, and information are defined, ensuring they can be collected and tracked is essential. Considerations during this step include the cost of data collection and tracking, as well as the impact of the effort on program operations and participants. ER&E shouldn’t create a barrier for participants or cause undue burden for the program.

In the Next Issue

Next up is mapping methods, activities, metrics, and data to the researchable issues. This is a critical step meant to ensure that the ER&E effort can and will deliver the data and information needed to meet evaluation and research objectives.

Teresa Lutz

Earlier in my career, I worked for a utility supporting the design and delivery of energy conservation programs through evaluation and research. At that time, I did not love the evaluation process or the evaluation community. The value of evaluation was a tough sell to my coworkers, and I agreed the evaluation process and results could be better. We wanted more timely feedback, recommendations we could implement, and insight beyond what we already knew. As a consultant, I hold those experiences close. I avoid doing ‘evaluation for evaluation’s sake’. I am fixated on figuring out the Big WHY of what we do, what works and what doesn’t. It is through knowing this that we can improve and prosper in this industry.
