Originally published on February 8, 2017

21st in a series of 50 Knowledge Management Components (Slide 29 in KM 102)

Metrics and reporting: capturing operational indicators and producing reports to communicate performance against goals, areas for improvement, and progress toward the desired state

There is a wide spectrum of opinion about the importance of measuring knowledge management activities. Some believe that it is essential, and want to collect data and create reports on a long list of detailed metrics. Others believe that this is a waste of time, and that effort is better spent on enabling knowledge flow.

Three different kinds of metrics are typically captured and reported.

  1. Goal-oriented measurements relate directly to employee goals and allow performance to be assessed against those goals.

Collecting and reporting on goal-oriented measurements ensures that the organization is aware of how it is performing and that individuals can be held accountable for achieving their goals. Reports should be produced and distributed every month to track progress, reinforce good performance, and encourage improvements where needed. Reporting metrics by group within the organization, for example by regions of the world or countries within a region, allows each group to compare its performance against other groups, creating friendly competition to excel. Reporting metrics by individual may be limited by data privacy laws; where it is allowed, individual metrics should be transmitted confidentially to each person's manager for use in performance coaching and appraisals.
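As an illustration of this kind of goal-oriented reporting by group, here is a minimal Python sketch; the regions, figures, and the 80% goal are made-up assumptions for the example, not data from any real program.

```python
# Minimal sketch of a monthly goal-oriented report by group. The regions,
# figures, and the 80% goal are illustrative assumptions.
GOAL = 0.80  # target rate, e.g., a document-capture goal

monthly_results = {                 # group -> (items achieved, items possible)
    "Americas":     (152, 180),
    "EMEA":         (98, 140),
    "Asia Pacific": (117, 130),
}

def goal_report(results, goal):
    """Show each group's attainment so groups can compare themselves against the goal and each other."""
    ranked = sorted(results.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
    for group, (achieved, possible) in ranked:
        rate = achieved / possible
        status = "met goal" if rate >= goal else "below goal"
        print(f"{group:<14} {rate:6.1%}  {status}")

goal_report(monthly_results, GOAL)
```

Ranking the groups in the report is what creates the friendly competition: each group can see at a glance where it stands relative to the goal and to its peers.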

  2. Operational metrics can be helpful in analyzing how an initiative's infrastructure is being used and by whom, and in identifying areas for improvement. However, there is only so much that can be inferred from data such as page hits, uploads, and downloads. These metrics don't indicate the value of any of these activities. If a user visits a web page, they may not find what they need there. If a document is uploaded, it may not be of any use to anyone. If a document is downloaded, it may not be reused. Follow these rules when deciding which operational metrics to collect and report:

  • Keep the time and effort required to a minimum, automating as much of the collection, extraction, and production as possible.
  • Ask your team which metrics will help them the most.
  • Focus on a few key metrics which relate to your overall objectives.
  • Use the metrics to improve the environment, and test for this in user surveys.
  • Communicate the metrics regularly so that they influence behavior.
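To make the first two rules concrete, here is a minimal sketch of automated collection, assuming a hypothetical activity_log.csv export with timestamp and event_type columns; the metric names are illustrative, not tied to any particular platform.

```python
import csv
from collections import Counter
from datetime import datetime

# Only a few key metrics tied to the program's objectives (illustrative names).
KEY_EVENTS = ("upload", "download", "reuse")

def monthly_operational_summary(log_path):
    """Roll raw activity events up into counts per month and event type.

    Assumes a hypothetical activity_log.csv export with at least
    'timestamp' (ISO 8601) and 'event_type' columns.
    """
    summary = Counter()
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["event_type"] not in KEY_EVENTS:
                continue  # skip metrics that would not drive any action or insight
            month = datetime.fromisoformat(row["timestamp"]).strftime("%Y-%m")
            summary[(month, row["event_type"])] += 1
    return summary

if __name__ == "__main__":
    for (month, event), count in sorted(monthly_operational_summary("activity_log.csv").items()):
        print(f"{month}  {event:<10} {count}")
```

The point of automating the roll-up is that the monthly report costs almost nothing to produce once the export is in place, keeping the effort proportional to the insight.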

  3. Business impact metrics are potentially useful in justifying the expense of a program, in garnering management support, and in communicating the value of spending time on recommended activities. Anecdotes and success stories can be collected and converted into numerical values. Data about the value of an initiative can be captured in incentive points systems. Processes can be created or modified to ask participants about the business impact of initiative tasks. But there are few definitive ways to prove that a particular business indicator was influenced solely by the initiative. There are usually multiple reasons for a specific business result, and the initiative in question may be only one of those reasons.

If there is a way for you to collect business impact metrics, then do so. They have more significance than operational metrics. But follow the same guidelines about limiting the effort involved to a reasonable amount.

Collecting and reporting on the measurements used in your KM program will help you to communicate progress, motivate people to improve their performance, and reassure management of the value of the initiative. Keep the effort required to do so in the right balance with other projects, look for ways to continue to streamline the process, and review the reporting process annually to keep it relevant to the current state.

Questions and Answers

Q: Why should metrics be collected and reported?

A: Here are three reasons:

  1. Take action based on what the numbers indicate. For example, if you are leading a communities initiative, report on the health of each community every month, and retire the inactive ones using a community health report.
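A community health report along these lines can be very simple. Here is a minimal sketch; the community names, statistics, and thresholds are illustrative assumptions, and in practice the figures would come from the community platform's own reporting tools.

```python
from datetime import date

# Hypothetical monthly community statistics; thresholds below are illustrative.
communities = [
    {"name": "Cloud Architecture", "members": 420, "posts_last_90_days": 35, "last_post": date(2017, 1, 30)},
    {"name": "Legacy Tools",       "members": 58,  "posts_last_90_days": 1,  "last_post": date(2016, 6, 2)},
]

def community_health_report(communities, as_of, min_posts=5, max_idle_days=180):
    """Flag communities with little recent activity as candidates for retirement."""
    for c in communities:
        idle_days = (as_of - c["last_post"]).days
        inactive = c["posts_last_90_days"] < min_posts or idle_days > max_idle_days
        status = "candidate for retirement" if inactive else "healthy"
        print(f'{c["name"]:<20} members={c["members"]:<4} posts(90d)={c["posts_last_90_days"]:<3} idle={idle_days:<4} {status}')

community_health_report(communities, as_of=date(2017, 2, 8))
```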

Q: What metrics should be captured and reported?

A: Collect metrics directly related to the objectives of your program. Report on the key activities of knowledge management: sharing, innovating, reusing, collaborating, and learning.

Q: What metrics and reports should be avoided?

A: Here are five guidelines:

1. Don’t capture metrics for the sake of metrics. Many people express a desire for data that doesn’t drive any action or insight: collecting data for data’s sake. For each metric captured and reported, there should be an associated action or insight it is expected to drive. Avoid collecting every random thing, sliced and diced every possible way, which someone might want to know once but has no intent to do anything with other than say, “Oh, that’s interesting.”

2. Don’t establish a long list of arcane metrics. The fewer the number of metrics, and the simpler the reports, the better. Instead of reporting uploads, downloads, site visits, and other similar numbers, report on how the organization is sharing, innovating, reusing, collaborating, and learning.

3. Don’t attempt to measure knowledge using the metrics of balance sheets. Conventional balance sheet metrics do not adequately measure knowledge.

4. Avoid chartjunk, infographics, and bad statistics. Nice-looking but worthless charts, infographics, and stats are a waste of effort.

5. Be wary of publicizing numbers which reflect actions you don’t want to encourage. For example, if you don’t want lots of groups being created in your ESN, don’t promote these metrics:

  • 30 New Groups Created

Q: How can you measure time saved through reuse? Once a document is in a KM database, how can you track the time it will save if it is reused?

A: I don’t believe it is worthwhile to spend a lot of effort capturing time saved.

If you wish to try to capture time saved at the document level, you will have to deal with these challenges:

  • When a user downloads the document, they will not yet be able to tell how much time they saved. They will have to return later to input this data, and it is unlikely that they will remember to do so.

To encourage users to input such data, you can use a KM recognition system that awards points for KM activities. Users can be motivated to claim points for reuse, and in claiming those points, they can be required to document the value of that reuse. To capture time saved for a particular document, include two fields on the recognition system’s input form: the URL of the document and the time saved by reusing it.

However, you may find that despite offering recognition and/or rewards, users may still be unwilling to enter this information. So it may be easier to give them a button they can easily click, similar to a Like button, which indicates that they reused the document productively, and not bother trying to capture the amount of time saved.
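If you do try to capture this, the underlying record-keeping can be as simple as the following sketch, which shows both the one-click reuse signal and the optional time-saved detail; the point values, field names, and URL are illustrative assumptions, not part of any specific recognition system.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ReuseClaim:
    user: str
    document_url: str
    hours_saved: Optional[float] = None  # optional detail most users will skip

@dataclass
class ReuseLog:
    claims: List[ReuseClaim] = field(default_factory=list)

    def record(self, user: str, document_url: str, hours_saved: Optional[float] = None) -> int:
        """A one-click 'I reused this' action; extra points if time saved is documented."""
        self.claims.append(ReuseClaim(user, document_url, hours_saved))
        return 25 if hours_saved is not None else 10  # recognition points awarded

log = ReuseLog()
log.record("jsmith", "https://intranet.example.com/docs/proposal-template")
log.record("akumar", "https://intranet.example.com/docs/proposal-template", hours_saved=4.0)
print(f"{len(log.claims)} reuse claims, "
      f"{sum(c.hours_saved or 0 for c in log.claims)} hours saved reported")
```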

Q: How should metrics reporting be managed?

A: Here are three ways:

  1. When defining how the KM program will be governed, include the process for reporting. Include the reporting schedule in the overall plan of record, maintained on an easily-accessible web site.

Q: How should metrics reports be communicated?

A: Here are five ways:

  1. Send monthly KM metrics reports to the senior leadership team. Ask them to publish their own variations for their organizations.

Q: How can people components be incorporated in KM metrics?

A: Here are three suggestions:

  1. The means of motivating employees include monitoring and reporting on progress against organizational goals. Track and communicate progress against the established KM goals for individuals.

Insights

1. Why measure the value of KM? by Chris Collison

Measuring value can often seem like a time-consuming effort for knowledge workers. Is it worth it? Is measuring the value of KM initiatives a worthwhile way to spend time? Don’t get me wrong; clearly any KM activity needs to be linked to the creation of business value, and we need to be able to illustrate that convincingly. But the concern is that to try and separate out the unique contribution that KM activities make can become something of a cottage industry and counterproductive to ‘getting on with the business of making a difference.’

2. Collaboration by Chris Crawford

Measuring the success of social media tools is not an enigma. We focused our measurement strategy on understanding what tools our people were using, and how and why they were using them. Our monthly Collaboration 2.0 scorecard and other reports help us continuously track usage, cost savings, employee satisfaction and, of course, the impact on the primary beneficiaries of Collaboration 2.0: our clients.

3. How do we measure experts? by Dave Snowden

What could we expect from expert KMs? Here are some potential metrics:

1. Sensemaking metrics such as the accuracy of the predictions they make about users’ preferences and practices, sophistication of their explanations for user behaviors, speed and accuracy of spotting anomalies.

2. Data gathering strategies — their power and efficiency.

3. Range and sophistication of mental models.

4. Ability to make important discriminations, see cause-effect links, discount spurious cause-effect links.

5. Declarative knowledge. Just the collection of facts they need to know.

4. Things a leader can do by Dave Snowden

Start thinking about shifting to VECTOR measures rather than outcome based targets. Vectors define direction and speed of travel from the present and allow novel discoveries that are often ignored in the focus on explicit targets (which we know destroy intrinsic motivation). Stop managing by spreadsheet!

5. Alice MacGillivray

Most metrics ignore collaboration, relationship building, capacity building, knowledge generation, and kindness.

6. Bloody good conversations by Euan Semple

A while back someone asked me to write something about metrics. I replied, “How do you measure bloody good conversations with interesting people?”

7. Vanity Metrics: Add Context to Add Meaning by Aurora Harley

If a metric doesn’t have any actionable outcome when it changes over a tracked time period, then it’s likely a vanity metric and not worth tracking.

Examples

1. HP

a. Metrics Definitions

  1. Capture: The number of new projects recorded in the PPR as a percentage of all new projects booked. Goal: 80%

b. Knowledge Management Executive Report Contents

  • Definitions and Year-End Goals
    1. Worldwide
  • Capture
    1. Worldwide
  • Reuse
    1. Worldwide
  • PPR Usage
    1. Worldwide
  • Portal Usage
    1. Worldwide

c. The measurements established for the HP Professions were:

  1. Number of people enrolled

d. HP measured threaded discussion forums as follows:

  1. Number of Forums

e. Designing a KM Program With Strong Adoption, Leadership Alignment and Metrics at Hewlett Packard Enterprise by Vijayanandam V M and Grey Cook

2. Deloitte

a. Community Metrics

b. ESN Metrics

c. ESN Dashboard

Resources

  1. Metrics & data & charts, oh my!

SIKM Leaders Community Discussion Threads

  1. KPIs in the different KM phases

Books

  1. Show Me the Numbers: Designing Tables and Graphs to Enlighten by Stephen Few
  2. The Visual Display of Quantitative Information by Edward R. Tufte

Knowledge Management Author and Speaker, Founder of SIKM Leaders Community, Community Evangelist, Knowledge Manager https://sites.google.com/site/stangarfield/
