Yesterday I wrote about the de-professionalisation of records.
Ultimately, it comes down to trading accuracy for perceived efficiency.
The efficiency is only perceived because what we really do is shift the cost of records from a professional team to other teams, in a way we can’t measure.
That’s bad, but accuracy is far more troubling.
Accuracy is what you will build the entire future of your organisation on. Any process that relies on the records relies on their accuracy.
Unfortunately, what we never measure is the cost of the work that won’t get done: work that simply isn’t feasible, because our records aren’t accurate enough to produce reliable results and the cost of getting them there is too large.
There are going to be winners and losers in this.
Winners will have high quality records, and will be able to take advantage of machine learning and many, many more automation technologies as they become available. They’ll also pass audits – which is nice.
Losers will have to do one of two things –
- Start a records program to produce high quality records.
- Wait for strong AI that can do the work anyway (in 70 years’ time).
The way to win is to automate and architect. When information enters the organisation in pre-defined structures (or semi-structures) and is handled automatically, you can have both efficiency and accuracy.
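As a minimal sketch of what "pre-defined structures at the point of entry" can look like: validate inbound data against a schema before it ever becomes a record, so bad data is rejected automatically instead of being cleaned up later by whichever team inherits it. The record type, field names, and rules below are illustrative assumptions, not a real standard.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical intake schema -- the fields and rules are assumptions
# for illustration, not a prescribed records format.
@dataclass(frozen=True)
class InvoiceRecord:
    invoice_id: str
    supplier: str
    amount_cents: int
    issued: date

def parse_record(raw: dict) -> InvoiceRecord:
    """Validate a raw inbound dict at the point of entry.

    Rejecting malformed data here is what keeps downstream records
    accurate without a professional team re-checking every row.
    """
    if not raw.get("invoice_id"):
        raise ValueError("missing invoice_id")
    amount = int(raw["amount_cents"])
    if amount <= 0:
        raise ValueError("amount must be positive")
    return InvoiceRecord(
        invoice_id=str(raw["invoice_id"]),
        supplier=str(raw.get("supplier", "")).strip(),
        amount_cents=amount,
        issued=date.fromisoformat(raw["issued"]),
    )
```

The design choice that matters is where the check lives: at the boundary, once, rather than scattered across every process that later relies on the record.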