BPOs face many challenges when they process documents. Some of these problems stem from changes in the images or in the information itself, as discussed here; others result from operational or requirements issues.
Keeping Up with New Business & Regulatory Requirements
It is not uncommon for requirements to change based upon business or regulatory need. For example, in health claims processing, the industry recently underwent a major change in the number of codes used to report diagnoses and procedures. As a result, the data used to validate OCR output grew from several thousand codes to tens of thousands. Any BPO processing claims had to update its CPT validation databases with these new codes; those that did not saw a significant number of fields rejected. Other examples are not always as visible and can result from internal business needs of the client that are not communicated to the BPO. Often the result is the delivery of output that falls short of expectations.
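To make the failure mode concrete, here is a minimal sketch of checking OCR-extracted codes against a current code set. The code values and function name are hypothetical, not taken from any specific claims system:

```python
# Minimal sketch of code-set validation for OCR output.
# VALID_CODES stands in for the BPO's full validation database;
# the sample codes below are hypothetical.
VALID_CODES = {"99213", "99214", "J1100"}

def validate_field(extracted_code: str) -> bool:
    """Accept a field only if the OCR output matches a known code."""
    return extracted_code.strip() in VALID_CODES

# Fields whose codes are missing from a (possibly stale) database
# get rejected -- exactly the failure mode described above.
batch = ["99213", "9921X", "J1100"]
rejected = [code for code in batch if not validate_field(code)]
```

If the code set is not refreshed when requirements change, perfectly valid new codes land in `rejected` and inflate the manual-review queue.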
Prepping Adequately for Software Upgrades
Another challenge involves software components. If you have any knowledge of BPO operations or their predecessors, the service bureaus, you will know that once they adopt a technology, they usually stay on the same platform for a long time. The reason is the amount of customization, testing, and training required to avoid negatively affecting expenses or service agreements. And yet, plenty of great technology is out there that can make any given operation much more profitable. The danger is that, without accurate baseline measurements, any upgrade can directly or indirectly affect service quality and ultimately increase costs. The reasons are numerous: a change in the software's input specifications that results in rejection of non-conforming images, minor changes to image pre-processing capabilities that affect recognition, or changes to the recognition engine itself that increase error rates.
Any software change must be preceded by a benchmark: measure the existing system's performance on a representative sample set of real data, then measure the new system's performance against that same baseline. Any change, no matter how small, should be scrutinized to understand its operational impact.
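Such a before/after benchmark can be as simple as scoring both engines, field by field, against the same truth data. A minimal sketch, with field values invented purely for illustration:

```python
def field_accuracy(predicted, truth):
    """Fraction of fields where OCR output matches the truth data."""
    assert len(predicted) == len(truth)
    matches = sum(p == t for p, t in zip(predicted, truth))
    return matches / len(truth)

# Truth data for one sample document, plus output from the current
# engine and the candidate upgrade (hypothetical values).
truth    = ["123.45", "2023-01-05", "ACME", "784.0"]
baseline = ["123.45", "2023-01-05", "ACME", "784.0"]   # current engine
upgraded = ["123.45", "2023-01-08", "ACME", "784.0"]   # candidate engine

# Score both engines on the SAME representative sample so the
# comparison isolates the software change itself.
baseline_acc = field_accuracy(baseline, truth)
upgraded_acc = field_accuracy(upgraded, truth)
```

In practice the sample would be thousands of fields, not four, but the principle is the same: an upgrade that drops `upgraded_acc` below `baseline_acc` needs scrutiny before rollout.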
Managing the Impact of Staff Turnover
Another operational challenge involves the staff who perform the document processing. While it is possible to run an operation that enjoys low turnover, most document-processing jobs are lower-level and typically see higher turnover than jobs in other industries. The effect shows up not only in training costs, but in data quality itself. The best-performing BPOs spend a lot of time evaluating the performance of every person involved with document processing, whether it is the speed and accuracy of sorting batches or the efficiency of data entry operators. Having this data allows them to establish quality standards that support service level agreements. Any staff turnover can have an adverse effect on overall data quality. If you haven't implemented consistent measurements, then your data quality is probably not what you think it is.
Accurately Measuring OCR Performance
How a BPO measures its OCR performance is also critical. It all comes down to using sample sets and truth data. What happens when a BPO doesn't keep its sample sets up-to-date? Documents change, both in layout and in data. When the sample sets used to measure performance remain the same, the BPO becomes blind to its actual data quality. An account that once produced output with 99.5 percent accuracy might suddenly drop to 97 percent accuracy. While this degradation might seem slight, it translates to a six-fold increase in errors: the error rate grows from 0.5 percent to 3 percent.
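The arithmetic behind that six-fold figure: errors scale with (1 − accuracy), so a drop from 99.5 to 97 percent multiplies the error rate by 0.03 / 0.005 = 6. A one-liner (the function name is illustrative) to sanity-check such changes:

```python
def error_multiplier(old_accuracy, new_accuracy):
    """How many times more errors the new accuracy rate produces."""
    return (1 - new_accuracy) / (1 - old_accuracy)

# A seemingly small 2.5-point accuracy drop is a 6x jump in errors.
multiplier = error_multiplier(0.995, 0.97)
```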
Ensuring Sample Data Sets Represent All the Data
Another problem arises when sample sets are not truly representative of the production data stream. If a BPO processes invoices from the US, Canada, and Australia, but only has US invoices in its sample set, then the measured accuracy is not reliable. In this case, it is highly likely that actual accuracy and error rates would be worse than the measurements suggest.
BPOs must sample a random stream of production data and accumulate a statistically valid number of samples based upon overall production volumes. It helps to understand statistics and confidence intervals; think of the margin of error used in political polling. It's the same thing. If you have very stringent data quality requirements (e.g., 99.5 percent accuracy), you will need to select a confidence level of 99 percent with a margin of error of 1 percent. This means that if you have a production volume of 500 images per day, you need a sample size of about 485 images in order to accurately measure performance. For a volume of 100,000, the sample set needs to be just over 14,000 images. Daily volumes may differ from week to week, so it's better to draw samples over a longer period. Overall, your sample set needs to properly represent your overall volume of production documents.
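Those figures follow from the standard sample-size formula (Cochran's formula) with a finite-population correction. A sketch, assuming the conservative proportion p = 0.5, which maximizes the required sample; the function name is mine, not from any specific library:

```python
import math
from statistics import NormalDist

def sample_size(population, confidence=0.99, margin=0.01, p=0.5):
    """Required sample size via Cochran's formula with
    finite-population correction. p = 0.5 is the most
    conservative assumption about the true rate."""
    # z-score for the two-sided confidence level (~2.576 for 99%)
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    # Infinite-population sample size
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    # Correct for the finite production volume
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

small = sample_size(500)      # close to the ~485 figure quoted above
large = sample_size(100_000)  # just over 14,000 images
```

Note how little the required sample grows relative to volume: 500 images need almost all 500 sampled, while 100,000 images need only about 14 percent.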
Implementing 5 Key Strategies
Effectively implementing these five strategies may seem like a big challenge. And you’re right. It is. Many highly capable and well-known corporations are grappling with these types of issues and, even once the issues are identified, they may require additional staff expertise and a lot of additional time to address them. The result is costs that are higher than they should be and service levels that aren’t being met.