Data inaccuracy costs retail £1.4 bn
A new white paper, co-authored by Professor Alan Braithwaite, estimates that data inaccuracy among the top five retailers and their suppliers is costing as much as £1.4 billion per year – one per cent of total revenue, and a considerably higher figure than previously estimated.
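The two headline figures imply a combined revenue base for the top five retailers and their suppliers, which is not stated in the paper but follows directly from the arithmetic. A minimal sketch of that inference:

```python
# Back out the revenue base implied by the paper's two headline figures.
annual_cost = 1.4e9   # £1.4 billion estimated annual cost of data inaccuracy
cost_share = 0.01     # stated as one per cent of total revenue

# Implied combined revenue (an inference, not a figure from the paper)
implied_revenue = annual_cost / cost_share
print(f"Implied combined annual revenue: £{implied_revenue / 1e9:.0f} bn")  # £140 bn
```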
The new study was conducted by LCP Consulting and follows a recent report from GS1 UK and Cranfield School of Management which made a more cautious estimate of £200 million as the annual cost of inaccuracies made in retail.
By applying Six Sigma statistics to the results from GS1, LCP found that the retailer data published by GS1 showed an “extraordinary level of inaccuracy – 29,000 times worse than six sigma and suggesting that one in every 10 data elements is incorrect”.
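The “29,000 times worse” figure can be reproduced from standard Six Sigma conventions: a six sigma process (with the conventional 1.5-sigma shift) yields 3.4 defects per million opportunities (DPMO), while one error in every ten data elements is 100,000 DPMO. A short sketch of that calculation:

```python
# Six sigma benchmark: 3.4 defects per million opportunities
# (standard figure, assuming the conventional 1.5-sigma shift).
SIX_SIGMA_DPMO = 3.4

# Observed rate from the GS1 data: one in every ten data elements incorrect.
observed_error_rate = 1 / 10
observed_dpmo = observed_error_rate * 1_000_000  # 100,000 DPMO

times_worse = observed_dpmo / SIX_SIGMA_DPMO
print(f"Observed DPMO: {observed_dpmo:,.0f}")
print(f"Times worse than six sigma: {times_worse:,.0f}")  # ~29,412, i.e. roughly 29,000
```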
The scale of any errors, as well as their frequency, is a major driver of their cost. LCP’s calculation of the true cost of data inaccuracy identified in more detail the business performance levers affected by data errors and showed how they can impact individual company performance.
The LCP paper says that for any CEO, CFO or trading director, the suggestion that there is one per cent of margin to be won makes this an interesting prize and worth addressing. The paper sets out eight key steps to secure the benefits of high levels of accuracy:
1. Measure actual performance – continuously.
2. On physical dimensions, use equipment like CubiScan to capture missing data on goods receipt.
3. Monitor and analyse data adjustments.
4. Set up a perpetual audit process.
5. Apply the framework to identify value potential and focus on the big opportunities.
6. Systematically improve processes to manage input.
7. Build six sigma accuracy into cross-functional KPIs and make data quality everyone’s responsibility.
8. Automate data alignment where possible.
The paper has been prepared by LCP Consulting for joint publication with Zetes and was co-authored by Professor Alan Braithwaite, chairman of LCP Consulting and Visiting Professor at Cranfield School of Management and Professor Richard Wilding from the Centre for Logistics and Supply Chain Management at Cranfield School of Management.
Braithwaite said: “From our experience of working with many companies, data accuracy is poor, with errors in physical dimensions, pricing and operational parameters such as shelf fill, replenishment quantities and order quantities. As this report shows, there is a big opportunity cost hidden behind this problem. Companies need to take a fresh look at their master data management processes alongside their data identification and capture methods; the business cases from investing in both identification and processes may be bigger than they expect. This backroom stuff is crucial.”
Wilding added: “The reported levels of inaccuracy and their associated costs are worrying. This is especially the case in the context of the enormous investments that all the big retailers have made in product identification, data capture and supply chain integration, and the focus that many companies have put into lean and six sigma methods.”
The paper goes on to highlight another key challenge in the form of GTINs (Global Trade Item Numbers), which are forecast to increase to 250 with the addition of food safety data. As some of this data will bring liability implications for retailers and manufacturers, data accuracy may yet become an issue of corporate governance and social responsibility.