What should the goal be for document automation: data entry or validation? Should success be measured only by how much data can go straight through with no human validation? For many organizations looking to add automation or refresh their aging technology, this single metric is the overwhelming factor.
Optimal Accuracy with High STP Rate
Our efforts at Parascript are focused on the optimization of our technologies to achieve both a high rate of accuracy and a high rate of straight-through processing (STP). Whether this strategy works for your business depends upon two things:
- The accuracy requirements of your data; and
- The level of automation you can achieve given those accuracy requirements.
Using Data Entry to Meet Data Quality SLAs of 99%
Let’s take a business service provider (BSP) that has a data quality service-level agreement (SLA) of 99%. In this case, all the BSP’s data extraction is performed by human operators who key from the image. Human error rates are well known to range from approximately 2% to 5%, and can be higher for some data types, so single-pass manual data entry cannot meet the required SLA on its own. What can be done? The standard method of achieving high accuracy is a double-blind data entry workflow: two operators independently key the same set of documents. Where the entered values for a field match, the value is treated as accurate; where they differ, a third operator adjudicates. At minimum, two staff touch every document, with a third potentially involved. Given this workflow, it is easy to see that achieving high levels of accuracy with manual data entry is very expensive, even for offshore data entry.
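The double-blind comparison described above can be sketched in a few lines. This is a minimal illustration, not a real product API; the function name `resolve_field` and the adjudication callback are hypothetical.

```python
# Hypothetical sketch of a double-blind data entry workflow: two operators
# independently key the same field, and a third operator is consulted
# only when the two entries disagree.

def resolve_field(entry_a: str, entry_b: str, adjudicate) -> str:
    """Return the accepted value for one field.

    entry_a, entry_b: the two independent operator entries.
    adjudicate: callback invoked only on disagreement (the third operator).
    """
    if entry_a == entry_b:
        return entry_a  # matching entries are treated as accurate
    return adjudicate(entry_a, entry_b)

# Example: the two operators disagree on an amount, so a third decides.
accepted = resolve_field("1,250.00", "1,350.00", lambda a, b: a)
```

The cost implication is visible in the structure: every field costs two keying passes up front, and a third whenever the comparison fails.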
Leveraging Automation to Meet Data Quality SLAs of 99%
Most service providers seek automation to remove as much of their manual data entry as possible. That is to be expected. However, the manner of automation is where we should focus. One objective is to look at the efficiencies of completely removing data entry, in which case the question to ask a vendor is: “what amount of data can go straight-through at 99% accuracy?”
Another strategy is to address the need for using two or three staff on the same data. The question then is “on what amount of data can I remove the second or third data entry operator and still maintain 99% accuracy?” The difference may seem subtle, yet it can be significant.
All Automation Approaches Are Not Equal
The answer to the first question, “what amount of data can go straight-through at 99% accuracy?” is that straight-through processing may only be achieved about 50% of the time. With significant effort and tuning, it can reach higher percentages.
The answer to the second question, “on what amount of data can I remove the second or third data entry operator and still maintain 99% accuracy?” could be an immediate 80% to 90% elimination of double-blind data entry. This works by using the automation technology as the primary data entry “operator” and having a single human operator independently key the same data. Where the two answers match, the data goes straight through.
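In code, this strategy simply swaps the first human operator for the recognizer's output and routes only mismatches to a human reviewer. The following is an illustrative sketch; `route_fields` and its dictionary shapes are assumptions, not a vendor API.

```python
# Hypothetical sketch: automation output stands in for the first operator.
# Fields where the recognizer and the single human operator agree go
# straight through; only mismatches fall back to a second reviewer.

def route_fields(auto_values: dict, operator_values: dict):
    """Split fields into straight-through matches and exceptions.

    auto_values / operator_values: dicts mapping field name -> keyed value.
    Returns (straight_through, needs_review).
    """
    straight_through, needs_review = {}, {}
    for field, auto_val in auto_values.items():
        keyed = operator_values.get(field)
        if auto_val == keyed:
            straight_through[field] = auto_val  # agreement, accept as-is
        else:
            needs_review[field] = (auto_val, keyed)  # escalate mismatch
    return straight_through, needs_review
```

Because the comparison logic is identical to the double-blind workflow, the quality guarantee is preserved while one full keying pass is eliminated.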
To make this even more interesting, if the automation can be tuned to output only data that meets the 99% threshold, a good percentage of it can still go straight through, further reducing manual data entry. It is a best-of-both-worlds strategy that many don’t even consider, yet it can offer both significant immediate savings and longer-term cost reduction.