What Woolworths and project management can teach us about the iron rules of classification

Something that has made me alternately frustrated, depressed and extremely angry in the last few months is that there appears to be NO research about the impact of classification scheme design on records management system usage.

In the face of no research, I’ve had many arguments with long-standing practitioners who tell me that the practices we use are “well proven” – while conveniently ignoring rates of systematic control of records between 10% and 20%, and the FACT that when you put people on a Windows file server and let them organise their own information, they happily implement a classification scheme that makes sense to them and use it. Incidentally, the classification scheme they use is very different from the one we try to implement for them.

The most infuriating piece of practice I keep running into concerns variable terms and non-variable terms. No one in records management seems to like the idea that there’s a variable term above a non-variable term.

This means that we try to implement a project classification scheme that makes many people go somewhere else. The example that keeps coming up is project management.

Project managers universally (in my experience) want –

Project name –> Project stages

This is how they think about their work.

What we seem to want them to deal with is

Project Stage –> Project Name.

This is apparently considered “best practice” and “proven.”

It makes project managers go somewhere else – where they inevitably create a file structure that goes Project name –> Project Stage.

This structure is also copied in EVERY SINGLE PIECE OF PROJECT MANAGEMENT SOFTWARE that I’ve ever seen. None of them start at stages and move down to projects. NONE.
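
To make the contrast concrete, here is a minimal sketch of the two hierarchies side by side. The project names and stages are made up for illustration; the only thing that matters is the order of the levels.

```python
# A minimal sketch of the two competing hierarchies. The project names and
# stages are made up; only the order of the levels matters.

STAGES = ["Initiation", "Planning", "Execution", "Closure"]
PROJECTS = ["Website Refresh", "Office Relocation"]

# What project managers (and their software) build: project first, stage second.
pm_view = {project: {stage: [] for stage in STAGES} for project in PROJECTS}

# What the "best practice" scheme asks of them: stage first, project second.
rm_view = {stage: {project: [] for project in PROJECTS} for stage in STAGES}

def print_tree(tree, indent=0):
    """Print a nested dict as an indented folder tree."""
    for name, children in tree.items():
        print(" " * indent + name + "/")
        if isinstance(children, dict):
            print_tree(children, indent + 2)

print("Project managers' structure:")
print_tree(pm_view)
print("\nRecords scheme structure:")
print_tree(rm_view)
```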

To recap –

  • Project managers organise their information in a certain way.
  • Project management software vendors organise the information in the same way.
  • Records managers do it differently.
  • Records managers can’t get project managers to use their systems.
  • The way records managers do it is “proven” and “best practice.”

We could argue about the virtue of a static number of stage files above a potentially unlimited number of project files, but that is an academic argument that still leaves project managers not knowing how to deal with what we’ve created and going somewhere else.

We could also argue about the problems of variable terms above fixed ones – but putting fixed terms on top is something we do for our convenience and the convenience of system vendors, not for the people we actually need to use the system.

Let’s move on to Woolworths.

I am 100% comfortable saying that their classification scheme does not stand up to records best practice. Simply looking at their drinks shows you that they’re not interested in a “best practice” classification.

You’ve got Coke in the “drinks” section with 180 kJ/100 mL of energy, Monster in the “energy drinks” section with 9 kJ/100 mL, milk (a drink) in the dairy section with 247 kJ/100 mL, and orange juice (“drink” being a broader term than “juice”) in the “juice” section with 180 kJ/100 mL.

So why do they organise their stores like this?

Simple – they’re in it to make money, and they know that organising the stores like that is more effective.

And to me, that should be the iron rule of classification.

Do what is effective.

If you implement best practice and usage goes down, back it out. If you implement terrible classification practice and usage goes up – stick with it.

Records management is not struggling because we have a shortage of best practice.

Records management is struggling because we can’t get people to use our systems.

We should take a leaf out of Woolworths’ book and organise the information we have in the way that gets us what we want.

And what about classification to ensure we have good archives and can classify according to disposition needs?

The problem we have at the moment is getting people to put information in the systems. If it never gets into the systems, it’s never going to get to archive, and if we never get control of the information, both disposition and our profession become irrelevant. It has to get into the system first.

The only iron rule for classification scheme development should be “do what gets people using what you’ve built.”

Where I think the next round of major improvements in records management effectiveness is going to come from

User experience research.

I think it’s the core of what’s missing from records management.

We have a unique value proposition – the concept of business evidence. It is timeless and will be fit for purpose forever. But…

20 years ago a different group of people started doing records management.

Records management went from being done by professionals to being something that was done by people using what professionals had built.

Simply put, 20 years ago, we started to have to look after users.

But no one has ever trained us to look after users.

Universities don’t teach us how to look after users.

There’s no research into the link between how we build systems and the effectiveness of those systems.

And we can’t get them to use what we’ve built.

Because we have rules of thumb that worked when we didn’t have users but that I think now actively work against adoption by them.

Rules like “you can’t have a variable term above a fixed term” in a classification scheme – which are convenient for us, but which I think mean that people who would use our systems go somewhere else.

Don’t agree?

That’s the problem – there’s no agreement, no central pool of research that says “this works, this doesn’t.”

There’s no A/B testing.

No card sort.
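
For what it’s worth, the missing A/B test is not exotic. Here is a hedged sketch of what one could look like: give one group the stage-first scheme, another the project-first scheme, and compare how much of their work actually ends up filed. The counts are invented, and a plain two-proportion test stands in for whatever analysis a real study would use.

```python
# A sketch of the A/B test we never run: group A files into the stage-first
# scheme, group B into the project-first scheme, and we compare how many of
# their documents actually end up in the records system. Counts are invented.
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test (normal approximation); returns (z, two-sided p)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Scheme A: 120 of 1,000 documents filed; scheme B: 310 of 1,000 filed.
rate_a, rate_b = 120 / 1000, 310 / 1000
z, p = two_proportion_z(120, 1000, 310, 1000)
print(f"filing rate A: {rate_a:.1%}, B: {rate_b:.1%}, z = {z:.2f}, p = {p:.4g}")
```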

We need user experience focused research to tell us why we can’t get adoption up.

We need it to save us from the practices that used to work.

If you’re already doing this, know someone who is, or would be interested in doing some research in the area, I’d love to talk to you about it – please reach out https://www.linkedin.com/in/karlmelrose/

When does records practice stop being proven, and how do we know?

Recently I’ve been discussing records practice with people and looking for the evidence that it is succeeding.

The thing that keeps happening is that we reach a point in a discussion of a practice where whoever I’m discussing it with will say “it’s well proven” – or something to that effect.

What is interesting about this is that the anecdotal feedback I get from the organisations I talk to is that records practice is currently gaining systematic control of between 10% and 30% of records.

To me, this doesn’t say that practice in its current form is succeeding.

Logically, this means that the proven practices that we have need to change.

I think that the records regulatory authorities know this.

In the last couple of years we’ve seen frameworks go from completely prescriptive, with metadata standards that were very difficult to implement, to frameworks based on a handful of principles and 3–5 pieces of metadata.

I also had an opportunity recently to ask a RIMPA panel of two directors of state records and an assistant director from the National Archives whether the model of sentencing and disposition we have is fit for purpose. One of the directors said that he didn’t think it was fit for purpose – but also that he didn’t currently have an alternative.

What this all points to for me, is that we have to re-examine what “proven” means.

The core of our profession is high quality evidence of business practice.

The practices that we have are supposed to produce this.

If they aren’t, they need to change.

The question is, how do we know what needs changing?

I think complexity theory has the answers for us here. When dealing with complex problems, complexity theory tells us that we should run multiple “safe to fail” experiments and when we find evidence that one is more effective – we do more of that.
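
Complexity theory doesn’t hand us an algorithm, but one simple way to operationalise “do more of what works” is an epsilon-greedy allocation of effort across parallel pilots. The pilots and numbers below are invented; treat it as a sketch of the habit, not a prescription.

```python
# One way (an assumption, not something complexity theory prescribes) to turn
# "do more of what works" into a routine: epsilon-greedy allocation of effort
# across parallel safe-to-fail pilots. Pilots and results are invented.
import random

def next_pilot(pilots, results, epsilon=0.2):
    """Choose which pilot gets the next round of effort."""
    untried = [p for p in pilots if not results.get(p)]
    if untried:
        return random.choice(untried)    # try everything at least once
    if random.random() < epsilon:
        return random.choice(pilots)     # keep a little exploration going
    # otherwise amplify the pilot with the best average outcome so far
    return max(pilots, key=lambda p: sum(results[p]) / len(results[p]))

pilots = ["stage-first scheme", "project-first scheme", "auto-classification"]
results = {"stage-first scheme": [0.12],
           "project-first scheme": [0.31],
           "auto-classification": [0.22]}
print(next_pilot(pilots, results))
```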

Ultimately, the question that we need to answer for everyone is “how much of our practice is proven, and how much is effective?” The only way we’re going to know is by finding evidence that what we do is more effective than the alternatives, and the only way we can do that is to try the alternatives.

If you’re already doing this – I’d love to hear from you and I’d love to take an opportunity to share your story (good or bad) – so please leave a comment or find me on LinkedIn.

What it means to be a user in records management, and what it means for professionals to have users.

It means to use something built by a professional.

We have to remember this, because the vast majority of records best practice is developed by professionals for professionals.

We are a profession with 5000 years of being done by professionals, for professionals.

The problem though, is that the vast majority of records management is now done by users – and this has only been the case for about 20 years.

So we have a problem.

5000 years of professional records management.

20 years of users.

Our training and resources teach us how to think like professionals.

This means that if we build systems based on our training and resources, we’re going to build things that only professionals can easily use.

And I think that probably explains quite a lot of our problems.

The question we all need to ask every day is “have I built this system* to be easily understood by the users I have, at the maturity level they are at?”

*System in the general sense, including but not limited to the IT system sense.

Why we need to define records again (yes really, hear me out – it’s a coherent argument, I promise)

I’m going to say up front that it doesn’t matter what definition you use.

Just that you use one.

Because:

  1. “Records is just the paper.”
  2. “Emails aren’t records.”
  3. “Post it notes aren’t records.”
  4. “That’s not a record, it’s a database.”

These are all idiotic blanket statements made by records illiterate people who can influence your executives.

They are also idiotic blanket statements made by records illiterate people who use them as rules to decide whether they will keep a record.

When we let other people decide what records is, we end up with whatever illiterate thing people decide to give us.

A definition of records in our organisation is something that we should all be fighting for.

It’s foundational.

It’s easy.

It lets us decide what records is.

It tells the organisation what records are.

It’s a standard that we can hold them to.

There’s also a heavy responsibility.

A simplistic definition creates ambiguity, an overly complicated one creates a barrier.

The underlying point remains though – as long as we let non-records people decide what records is about, we’ll keep getting stuffed into the basement to deal with whatever garbage they want us to deal with.

If we fight for the definition we need in our organisation, define its value and prove that it has an impact on performance, we might just get the records we want.

How to set records strategy.

I think it starts with three questions –

  1. Where is the organisation underperforming because it does not have efficient access to reliable business evidence?
  2. Where is the organisation accumulating an excess of business risk that we can mitigate through better business evidence?
  3. Which strategic priorities of the executive team are going to fail without a strategy for business evidence?

I think that once you’ve answered those questions, you have everything you need to work on.

The first thing a records team should invest in

Is a service management platform.

The next thing the team should do is refuse to accept any work that isn’t directed via the service management platform.

The reason is simple – most records teams struggle to justify their value.

The simplest relationship in value management is “cost” to “how much stuff you do”.

If you can’t measure it, you have no basis for telling the organisation how much work you’re doing for them – and how much value they’re getting from you.

As the old saying goes, “if you can’t measure it” you can’t tell people what they’re paying you for.

If you can’t tell people what they’re paying you and your team for, they’re never going to give you more money to get more done.
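
Once the work flows through the platform, the arithmetic is trivial. A minimal sketch, assuming a hypothetical CSV export of completed requests and a made-up annual team cost:

```python
# A minimal sketch of "cost" to "how much stuff you do", assuming a hypothetical
# CSV export from the service management platform with one row per completed
# request and a "category" column. The annual team cost is made up.
import csv
from collections import Counter

ANNUAL_TEAM_COST = 1_000_000  # assumption for illustration

def summarise(export_path):
    with open(export_path, newline="") as f:
        categories = Counter(row["category"] for row in csv.DictReader(f))
    total = sum(categories.values())
    print(f"requests completed: {total}")
    print(f"cost per request: ${ANNUAL_TEAM_COST / total:,.2f}")
    for category, count in categories.most_common():
        print(f"  {category}: {count}")

# summarise("records_requests_2024.csv")  # hypothetical export file name
```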

Quantifying the value that records management adds to the organisation

The two big questions on my mind are:

  1. How do we do this?
  2. What happens if we don’t?

The majority of people that I talk to aren’t doing it.

They’re also struggling for funding because executives making capital allocation decisions are allocating their capital to everything else.

Why, though, would an executive allocate scarce capital resources to something with no quantified value? If no one is quantifying the value, all the executive is going to see is the cost. Left column: -$1,000,000; right column: +$0.

So I think what happens is what we’ve got now – records chronically under-funded and poorly understood.

I think though that the problem is even larger than that.

When we don’t quantify the value of what we do, we adopt practices based on ideas about value that may not exist.

Disposition, for instance, hasn’t changed in 25 years, yet the cost of storing information now is about 0.1% of what it was 25 years ago.

We also justify disposition by talking about things like FOI and how long people spend searching for information – but the numbers either don’t exist or aren’t real observations. No one believes them because they’re “IDC says” or “Gartner says”. They’re not “I sat with John in accounting for a day and he spent 3 hours trying to find accounting records”, or “I sat with Julie in our development team and she spent 3 hours trying to find old development records”.

We also end up with users trying to decipher disposition-focused classification schemes – because without a rigorous, value-focused practice, it’s acceptable to put an incomprehensible classification scheme in front of a user and consider our job done.

So how do we value records practice?

I want to hear from people who have achieved this, who have rigorous, real-world valuations of practice from their own business. We all need that evidence.

My view on how it should be done is simple.

  1. We need to evaluate each practice in terms of its options and their relative costs and values.
  2. We need a bit of information Taylorism – time and motion studies.

Option evaluation is simple – as an example, disposition has no inherent value. It is a solution to a problem. The problem used to be that in order to store more records, we had to buy more buildings, but in an electronic world, that’s not the case.

So what are the other solutions? We could simply not destroy records at all. This actually makes sense – the cost of storing a 300KB Word document for 237,000 years is 28c, so if we run a disposition process that costs us $1 per document, we’ve spent the equivalent of roughly 850,000 years of storage costs – so we should probably not talk about that practice being good value if we expect serious people making capital allocation decisions to take us seriously.
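
Working that through with the figures quoted above (the 28c for 237,000 years is this post’s storage figure; the $1 disposition cost is the example), a quick sketch of the arithmetic:

```python
# Working through the figures quoted above. The 28c / 237,000-year storage
# figure comes from this post; the $1 disposition cost is the post's example.
storage_cost = 0.28          # dollars to store one 300KB document
storage_years = 237_000      # years that 28c of storage buys
disposition_cost = 1.00      # dollars to sentence and dispose of one document

cost_per_year = storage_cost / storage_years
years_equivalent = disposition_cost / cost_per_year
print(f"storage cost per document-year: ${cost_per_year:.8f}")
print(f"$1 of disposition effort buys roughly {years_equivalent:,.0f} years of storage")
```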

But what about the cost of finding information? (It’s the logical question for anyone in records, right?)

This brings me neatly to point 2 – if we’re going to be taken seriously when we talk about productivity-based improvements, we need to measure them rigorously, which means time and motion studies of our own organisations (because no one believes numbers from a research vendor doing studies commissioned by search vendors are in any way relevant to their own organisations).

Time and motion is a simple idea. You sit with a stopwatch and record what people are doing and how long it takes them, so that you can find better ways of doing it.

For search, someone needs to sit with a stopwatch and see how long it takes people to find things. They also need to pay attention to the person’s behaviour while searching – how often do they open a document only to find that it’s the wrong one, and so on. This might seem a little detail-oriented, but when I worked for Dell, I had a research analyst from one of the Big Four accounting firms sit behind me for four hours timing how long I had to wait for an application to deliver the information I needed – it was lagging, and they wanted to make sure the upgrade didn’t cost more than the problem. This is how businesses and disciplines that are serious about organisational performance do things.
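
A sketch of what capturing those observations consistently could look like. The prompts and the example task are assumptions for illustration, not any standard instrument:

```python
# A sketch of the stopwatch exercise: time each "find this document" task and
# note how many wrong documents were opened along the way. The prompts and the
# example task are assumptions for illustration.
import time
from statistics import mean, median

observations = []  # one dict per completed find-task

def observe(task_name):
    """Time one find-task from the start of the search to the document being found."""
    input(f"[{task_name}] press Enter when the search starts")
    start = time.monotonic()
    input("press Enter when the document is found")
    elapsed = time.monotonic() - start
    wrong_opens = int(input("how many wrong documents were opened? "))
    observations.append({"task": task_name, "seconds": elapsed,
                         "wrong_opens": wrong_opens})

def report():
    times = [o["seconds"] for o in observations]
    print(f"tasks observed: {len(times)}")
    print(f"mean time to find: {mean(times):.0f}s, median: {median(times):.0f}s")
    print(f"wrong documents opened: {sum(o['wrong_opens'] for o in observations)}")

# observe("find the signed 2023 office lease")  # hypothetical task
# report()
```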

Once this is done, you can actually start to find out how performance can be improved, and I’m going to bet that 98 times out of 100 the answer is not going to be disposition: titling can have a bigger impact, metadata enrichment can have a bigger impact, designating files over a certain age as old and keeping them out of initial results can have a bigger impact – there are 90 other tools that can have a bigger impact and cost less money.

The point of all of this is to say that we need to quantify the value of what we do rigorously before we can expect serious people to give us real money.

The really good news is that nothing I mentioned here is outside of records management’s control or skill base. It’s all stuff that we can do, and I think that if we can bring the money element back into records management, we become much more likely to get executives making capital allocation decisions to take us seriously – and to get funded.

The best records systems are built by…

People who aren’t thinking about records.

This is something I’ve noticed consistently.

The best records systems are built by people who are thinking about the information that people need to get their job done, and how they deliver it to them when they need it.

No other focus gets people thinking as hard about classifications that match the way people think about their work.

No other focus does as well at making sure people understand what’s in it for them when it comes time to put a record in the system.

No other focus ties the records system so clearly to organisational performance – and that gets managers and executives focused meaningfully on records without actually knowing that’s what they’re doing.