Monitoring production data is one of the most helpful ways to determine whether an application is successful. Depending on the data that is captured, one can learn a great deal about traditional IVR performance indicators, from recognition accuracy to the system's ease of use.

Production calls, either in whole or in part, are often used to assess application success. The most important thing to note about using any part of a recorded call (and this applies even if you are recording only utterances for tuning purposes) is that you are legally obligated to let the caller know you are doing so. This can be done with a simple "calls monitored for quality" message at the beginning of the call. Several lawsuits have successfully wrested significant monies from otherwise well-intentioned businesses because a call recording notification was not played. We on the AVIxD editorial committee are not attorneys and cannot provide legal advice, so if you have further questions, please consult with appropriate counsel.

Monitoring data is essential to improving recognition performance. Without knowing what callers are saying, there is no way to improve the performance of the grammars. Most of the tuning activities that are performed stem from monitoring production data. See Chapter 8 for further details on grammar optimization and Chapter 15 for more information about Tuning.

Determining trouble points in the call flow is also aided by monitoring production data. With such data, we can determine where callers are hanging up or erroring out, which gives us specific states to focus on for improvement. One caveat, however, is that production data is often open to multiple, equally reasonable interpretations. Without knowing users' motivations or understanding the details of the environments from which they are calling, it is often impossible to determine with absolute certainty what a specific item, or even pattern, in production data truly means.
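The state-by-state analysis described above can be sketched in a few lines. This is a minimal illustration, not tied to any particular IVR platform; the record fields and outcome labels ("hangup", "error_max_retries") are assumptions for the example.

```python
from collections import Counter

# Hypothetical per-call log records: (call_id, last_state, outcome).
# Field names and outcome labels are illustrative, not from any real platform.
call_logs = [
    ("c1", "GetAccountNumber", "hangup"),
    ("c2", "GetAccountNumber", "hangup"),
    ("c3", "MainMenu", "error_max_retries"),
    ("c4", "GetAccountNumber", "completed"),
    ("c5", "ConfirmPayment", "hangup"),
]

# Count hang-ups and error exits per dialog state to surface trouble points.
trouble = Counter(
    state
    for _, state, outcome in call_logs
    if outcome in ("hangup", "error_max_retries")
)

for state, count in trouble.most_common():
    print(f"{state}: {count} problem exits")
```

A ranking like this only points at *where* to look; as the caveat above notes, deciding *why* callers leave a given state still requires listening to the calls themselves.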

Another benefit of monitoring production data is that it enables reporting on specific performance metrics. This is especially critical when there are contractual implications (service level agreements) for measures such as containment, transfer rate, abandon rate, and caller satisfaction. Without data to build a report, there is no easy way to track such metrics. These reports should be specified during the design phase, so that all parties agree on how each metric is calculated.

Clients often ask what they can or should expect to achieve in terms of KPIs upon initial deployment. This is a challenging question to answer, because almost all IVRs are custom applications; very few are similar enough to one another to provide a solid comparison. What you can confidently answer is which KPIs you should be monitoring and reporting on. In general, look at containment (i.e., the proportion of callers who complete their calls in the IVR without transferring to an agent for further assistance -- but also see Logging Strategy and
Partial Automation vs. Full Automation) and any IVR-specific CSAT measurements. Recognition accuracy is also a good measure of success: the higher the accuracy, the higher containment and CSAT tend to be.
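To make the KPI definitions concrete, here is a minimal sketch of how containment, transfer rate, and abandon rate might be computed from per-call summary records. The record fields are assumptions for the example, and the choice to count abandoned calls against containment is exactly the kind of definitional decision all parties need to agree on up front.

```python
# Hypothetical per-call summary records; field names are illustrative.
calls = [
    {"id": "c1", "transferred": False, "abandoned": False},
    {"id": "c2", "transferred": True,  "abandoned": False},
    {"id": "c3", "transferred": False, "abandoned": True},
    {"id": "c4", "transferred": False, "abandoned": False},
]

total = len(calls)
transfers = sum(c["transferred"] for c in calls)
abandons = sum(c["abandoned"] for c in calls)

# Assumed definition: a call is "contained" only if it neither transferred
# nor abandoned -- i.e., the caller was served entirely within the IVR.
contained = total - transfers - abandons

print(f"transfer rate: {transfers / total:.1%}")
print(f"abandon rate:  {abandons / total:.1%}")
print(f"containment:   {contained / total:.1%}")
```

For these four sample calls the sketch reports a 25% transfer rate, a 25% abandon rate, and 50% containment; under a looser definition that ignores abandons, containment would read higher, which is why the calculation must be pinned down in the design phase.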