KM Metrics and Reporting Process

Stan Garfield
13 min read · Jun 25, 2018

Originally published on February 8, 2017

21st in a series of 50 Knowledge Management Components (Slide 29 in KM 102)

Metrics and reporting: capturing operational indicators and producing reports to communicate performance against goals, areas for improvement, and progress toward the desired state

There is a wide spectrum of opinion about the importance of measuring knowledge management activities. Some believe that it is essential, and want to collect data and create reports on a long list of detailed metrics. Others believe that this is a waste of time, and that effort is better spent on enabling knowledge flow.

Three different kinds of metrics are typically captured and reported.

  1. Goal-oriented measurements directly relate to employee goals and allow assessment against those goals.
  2. Operational metrics are based on data captured by systems used by the initiative. For example, a knowledge sharing initiative would capture details such as web page hits, uploads, and downloads; threaded discussion subscribers, posts, and replies; and repository submissions, searches, and retrievals.
  3. Business impact metrics attempt to determine the return on investment (ROI) of initiatives, and include costs saved, costs avoided, incremental revenue, improved quality, increased customer satisfaction and retention, new business attracted, increased market share, revenue from innovation, etc.

Collecting and reporting on goal-oriented measurements ensures that the organization is aware of how it is performing and that individuals can be held accountable for achieving their goals. Reports should be produced and distributed every month to track progress, reinforce good performance, and encourage improvements where needed. Reporting metrics by group within the organization, for example, by region of the world or by country within a region, allows each group to compare its performance against other groups, creating a friendly competition to excel. Reporting metrics by individual may be limited by data privacy laws; where it is allowed, individual metrics should be transmitted confidentially to each person's manager for use in performance coaching and appraisals.

Operational metrics can be helpful in analyzing how an initiative's infrastructure is being used and who is using it, and in identifying areas for improvement. However, there is only so much that can be inferred from data such as page hits, uploads, and downloads. These metrics don't indicate the value of any of these activities. If a user visits a web page, they may not find what they need there. If a document is uploaded, it may not be of any use to anyone. If a document is downloaded, it may not be reused. Follow these rules when deciding which operational metrics to collect and report:

  1. Keep the time and effort required to a minimum, automating as much of the collection, extraction, and production as possible.
  2. Ask your team which metrics will help them the most.
  3. Focus on a few key metrics that relate to your overall objectives.
  4. Use the metrics to improve the environment, and test for this in user surveys.
  5. Communicate the metrics regularly so that they influence behavior.
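
To make the first rule (automation) concrete, here is a minimal Python sketch of a monthly operational report. The metric names, figures, and targets are hypothetical placeholders, not values from any actual KM system; in practice the usage data would be extracted automatically from the platform's reporting database or API.

```python
from datetime import date

# Hypothetical monthly extract of operational data from a KM platform.
# In practice this would be pulled automatically rather than entered by hand.
monthly_usage = {
    "page_hits": 48210,
    "documents_uploaded": 312,
    "documents_downloaded": 5240,
    "discussion_posts": 1105,
    "discussion_replies": 2930,
}

# A few key metrics tied to overall objectives, each with a target,
# rather than a long list of arcane numbers.
targets = {
    "documents_uploaded": 300,
    "discussion_posts": 1000,
}

def monthly_report(usage, goals):
    """Produce a short plain-text report comparing actuals to targets."""
    lines = [f"KM operational metrics for {date.today():%B %Y}"]
    for metric, target in goals.items():
        actual = usage.get(metric, 0)
        status = "met" if actual >= target else "below target"
        lines.append(f"- {metric}: {actual} (target {target}, {status})")
    return "\n".join(lines)

print(monthly_report(monthly_usage, targets))
```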

Business impact metrics are potentially useful in justifying the expense of a program, in garnering management support, and in communicating the value of spending time on recommended activities. Anecdotes and success stories can be collected and converted into numerical values. Incentive points systems can capture data about the value of an initiative. Processes can be created or modified to ask participants about the business impact of initiative tasks. But there are few definitive ways to prove that a particular business indicator was influenced solely by the initiative. There are usually multiple reasons for a specific business result, and the initiative in question may be only one of them.

If there is a way for you to collect business impact metrics, then do so. They have more significance than operational metrics. But follow the same guidelines about limiting the effort involved to a reasonable amount.

Collecting and reporting on the measurements used in your KM program will help you to communicate progress, motivate people to improve their performance, and reassure management of the value of the initiative. Keep the effort required to do so in the right balance with other projects, look for ways to continue to streamline the process, and review the reporting process annually to keep it relevant to the current state.

Questions and Answers

Q: Why should metrics be collected and reported?

A: Here are five reasons:

  1. Take action based on what the numbers indicate. For example, if you are leading a communities initiative, report on the health of each community every month, and retire the inactive ones using a community health report (a minimal health-rating sketch follows this list).
  2. Track and communicate progress against goals. For example, if you are leading a knowledge management initiative, identify the top 3 objectives, track and report on how the organization is doing in a monthly report, and inspect and discuss progress (or the lack thereof) in management team meetings.
  3. Persuade others.
  4. Answer typical questions.
  5. Refute baseless assertions. For example, I received comments such as “no one uses our enterprise social network (ESN).” I refuted these by pointing out that the ESN actually had:
  • 118,652 Total Members
  • 1,256,806 Total Messages
  • 144,432 Total Files
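
As an illustration of the community health report mentioned in item 1, here is a minimal Python sketch. The thresholds and community names are assumptions for illustration only; the health labels echo the healthy/in danger/dead ratings used in the HP forum example later in this article.

```python
from dataclasses import dataclass

@dataclass
class Community:
    name: str
    posts_last_90_days: int
    active_members: int

def health_rating(community: Community) -> str:
    """Classify a community as healthy, in danger, or dead.

    The thresholds are arbitrary placeholders; each organization
    would tune them to its own activity levels.
    """
    if community.posts_last_90_days == 0:
        return "dead"        # candidate for retirement
    if community.posts_last_90_days < 10 or community.active_members < 5:
        return "in danger"   # flag for leader attention
    return "healthy"

# Hypothetical communities for illustration only.
communities = [
    Community("Project Management", posts_last_90_days=42, active_members=120),
    Community("Legacy Tools", posts_last_90_days=0, active_members=3),
]

for c in communities:
    print(f"{c.name}: {health_rating(c)}")
```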

Q: What metrics should be captured and reported?

A: Collect metrics directly related to the objectives of your program. Report on the key activities of knowledge management: sharing, innovating, reusing, collaborating, and learning.

Q: What metrics and reports should be avoided?

A: Here are five guidelines:

1. Don't capture metrics for the sake of metrics. Many people express a desire for data that doesn't drive any action or insight, collecting data for data's sake. For each metric to be captured and reported, there should be an associated action or insight it is expected to drive. Avoid collecting every random data point, sliced and diced every possible way, that someone might want to know once but has no intention of acting on beyond saying, "Oh, that's interesting."

2. Don't establish a long list of arcane metrics. The fewer the metrics and the simpler the reports, the better. Instead of reporting uploads, downloads, site visits, and other similar numbers, report on how the organization is sharing, innovating, reusing, collaborating, and learning.

3. Don’t attempt to measure knowledge using the metrics of balance sheets. Conventional balance sheet metrics do not adequately measure knowledge.

4. Avoid chartjunk, infographics, and bad statistics. Nice-looking but worthless charts, infographics, and stats are a waste of effort.

5. Be wary of publicizing numbers which reflect actions you don’t want to encourage. For example, if you don’t want lots of groups being created in your ESN, don’t promote these metrics:

  • 30 New Groups Created
  • 1,148 Total Public Groups
  • 1,186 Total Private Groups
  • 2,334 Total Groups

Q: How can you measure time saved through reuse? Once a document is in a KM database, how can you track the time it will save if it is reused?

A: I don’t believe that it is worthwhile spending a lot of effort to capture time saved. Here are two articles on the subject:

If you wish to try to capture time saved at the document level, you will have to deal with these challenges:

  • When a user downloads the document, they will not yet be able to tell how much time they saved. They will have to return later to input this data, and it is unlikely that they will remember to do so.
  • Estimates of time saved are not very accurate.
  • The question raised in the article by Michael Koenig is: what did users do with the time saved? Was it used productively?

To encourage users to input such data, you can use a KM recognition system to award points for KM activities. Users can be motivated to claim points for reuse, and in claiming those points, they can be required to document the value of that reuse. You can capture the time saved for a particular document by asking for the URL of the document and the time saved as a result of reusing that document as part of the input form in the recognition system.

However, you may find that despite offering recognition and/or rewards, users may still be unwilling to enter this information. So it may be easier to give them a button they can easily click, similar to a Like button, which indicates that they reused the document productively, and not bother trying to capture the amount of time saved.
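
Here is a minimal Python sketch of such a one-click reuse capture, with time saved as an optional extra. The class names, fields, and URLs are hypothetical and not a description of any particular KM platform.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ReuseEvent:
    user: str
    document_url: str
    hours_saved: Optional[float] = None  # optional; many users will skip this

@dataclass
class ReuseTracker:
    events: List[ReuseEvent] = field(default_factory=list)

    def record(self, user, document_url, hours_saved=None):
        """One-click 'I reused this' capture; time saved is an optional extra."""
        self.events.append(ReuseEvent(user, document_url, hours_saved))

    def summary(self):
        """Roll up reuse counts and whatever time-saved data was volunteered."""
        reported = [e.hours_saved for e in self.events if e.hours_saved is not None]
        return {
            "total_reuses": len(self.events),
            "reuses_with_time_reported": len(reported),
            "total_hours_reported": sum(reported),
        }

tracker = ReuseTracker()
tracker.record("alice", "https://intranet.example.com/docs/proposal-template")
tracker.record("bob", "https://intranet.example.com/docs/proposal-template", hours_saved=4.0)
print(tracker.summary())
```

This keeps the barrier to reporting reuse as low as possible while still allowing the more motivated users to volunteer a time-saved estimate.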

Q: How should metrics reporting be managed?

A: Here are three ways:

  1. When defining how the KM program will be governed, include the process for reporting. Include the reporting schedule in the overall plan of record, maintained on an easily accessible web site.
  2. The KM core team should decide on the details for reporting: which metrics to report, the targets for each metric, the format and level of detail of reports, to whom reports will be distributed, where reports will be stored, how frequently reports will be produced, who will produce them, how and when to revise metrics and targets, and how to produce custom reports (a minimal configuration sketch follows this list).
  3. Serve the needs of your reporting providers and consumers. Ask those who create metrics reports for suggested improvements in the collection and reporting processes. Ask those who use reports how they use them and what changes they would like to see.
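
As a rough illustration of item 2, here is a minimal sketch of a reporting plan of record expressed as configuration. The structure, distribution lists, and URL are hypothetical; the sample targets are borrowed from the HP example later in this article.

```python
# A minimal sketch of a reporting plan of record as configuration data.
reporting_config = {
    "metrics": {
        "capture_rate": {"target": 0.80, "unit": "% of new projects booked"},
        "reuse_rate": {"target": 0.45, "unit": "% of project content"},
        "forum_participation": {"target": 0.50, "unit": "% of population"},
    },
    "frequency": "monthly",
    "detail_levels": ["worldwide", "region"],
    "distribution": ["senior leadership team", "KM core team", "regional KM leads"],
    "storage": "https://intranet.example.com/km/reports/",
    "review_cycle": "annual",  # revisit metrics and targets once a year
}

for name, spec in reporting_config["metrics"].items():
    print(f"{name}: target {spec['target']:.0%} ({spec['unit']})")
```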

Q: How should metrics reports be communicated?

A: Here are five ways:

  1. Send monthly KM metrics reports to the senior leadership team. Ask them to publish their own variations for their organizations.
  2. Report on how the organization is doing in a monthly report, and inspect and discuss progress (or the lack thereof) in management team meetings. Request that the organization's balanced scorecard or equivalent performance indicator reporting be updated to include compliance with KM goals. Review KM indicators along with the usual business metrics so that it is clear they are just as important.
  3. Incorporate the metrics into as many parts of the organization as possible. Use newsletters, web sites, calls, and meetings.
  4. On the intranet, include the latest key metrics, and links to more detailed reports. Compare actual results to targets to celebrate progress and remind users of goals still to be achieved.
  5. Implement a data warehouse for self-service KM indicator reporting (a minimal query sketch follows this list).
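
To illustrate item 5, here is a minimal sketch of self-service indicator reporting, using an in-memory SQLite table standing in for a data warehouse. The table name, columns, and figures are hypothetical.

```python
import sqlite3

# An in-memory SQLite table stands in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE km_indicators (month TEXT, region TEXT, metric TEXT, actual REAL, target REAL)"
)
conn.executemany(
    "INSERT INTO km_indicators VALUES (?, ?, ?, ?, ?)",
    [
        ("2018-06", "Americas", "reuse_rate", 0.41, 0.45),
        ("2018-06", "EMEA", "reuse_rate", 0.47, 0.45),
        ("2018-06", "APJ", "reuse_rate", 0.39, 0.45),
    ],
)

# Anyone with access can answer their own question, such as which regions
# are below target this month, without waiting for a custom report.
query = """
    SELECT region, actual, target
    FROM km_indicators
    WHERE month = '2018-06' AND metric = 'reuse_rate' AND actual < target
    ORDER BY actual
"""
for region, actual, target in conn.execute(query):
    print(f"{region}: {actual:.0%} vs. target {target:.0%}")
```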

Q: How can people components be incorporated in KM metrics?

A: Here are three suggestions:

  1. The means of motivating employees include monitoring and reporting on progress against organizational goals. Track and communicate progress against the established KM goals for individuals.
  2. Report on KM incentive programs. Include standings, winners, and related statistics.
  3. Include the results of employee satisfaction surveys in your regular program metrics reporting. Indicate what actions will be taken as a result of the feedback received.

Insights

1. Why measure the value of KM? by Chris Collison

Measuring value can often seem like a time-consuming effort for knowledge workers. Is it worth it? Is measuring the value of KM initiatives a worthwhile way to spend time? Don't get me wrong; clearly any KM activity needs to be linked to the creation of business value, and we need to be able to illustrate that convincingly. But the concern is that trying to separate out the unique contribution that KM activities make can become something of a cottage industry and counterproductive to 'getting on with the business of making a difference.'

2. Collaboration by Chris Crawford

Measuring the success of social media tools is not an enigma. We focused our measurement strategy on understanding what tools our people were using, and how and why they were using them. Our monthly Collaboration 2.0 scorecard and other reports help us continuously track usage, cost savings, employee satisfaction and, of course, the impact on the primary beneficiaries of Collaboration 2.0: our clients.

3. How do we measure experts? by Dave Snowden

What could we expect from expert KMs? Here are some potential metrics:

1. Sensemaking metrics such as the accuracy of the predictions they make about users’ preferences and practices, sophistication of their explanations for user behaviors, speed and accuracy of spotting anomalies.

2. Data gathering strategies — their power and efficiency.

3. Range and sophistication of mental models.

4. Ability to make important discriminations, see cause-effect links, discount spurious cause-effect links.

5. Declarative knowledge. Just the collection of facts they need to know.

4. Things a leader can do by Dave Snowden

Start thinking about shifting to VECTOR measures rather than outcome based targets. Vectors define direction and speed of travel from the present and allow novel discoveries that are often ignored in the focus on explicit targets (which we know destroy intrinsic motivation). Stop managing by spreadsheet!

5. Alice MacGillivray

Most metrics ignore collaboration, relationship building, capacity building, knowledge generation, and kindness.

6. Bloody good conversations by Euan Semple

A while back someone asked me to write something about metrics. I replied, "How do you measure bloody good conversations with interesting people?"

7. Vanity Metrics: Add Context to Add Meaning by Aurora Harley

If a metric doesn’t have any actionable outcome when it changes over a tracked time period, then it’s likely a vanity metric and not worth tracking.

Examples

1. HP

a. Metrics Definitions

  1. Capture: The number of new projects recorded in the Project Profile Repository (PPR) as a percentage of all new projects booked. Goal: 80%
  2. Reuse: The average amount of project content that was reused by new projects entered into the PPR this month. Goal: 45%
  3. PPR Usage: The number of employees who reviewed one or more project profiles from the PPR this month, as a percentage of total population. Goal: 20%
  4. Portal Usage: The number of employees who visited one or more practice portals looking for official content this month, as a percentage of total population. Goal: 40%
  5. Participation: The number of employees who participated in the forums (either online or as a subscriber) this month, as a percentage of total population. Goal: 50%

b. Knowledge Management Executive Report Contents

  • Definitions and Year-End Goals
  • KM Metrics Dashboard
  • KM Metrics Dashboard by Region
  • Worldwide Summary
  • Participation (Worldwide and By Region)
  • Capture (Worldwide and By Region)
  • Reuse (Worldwide and By Region)
  • PPR Usage (Worldwide and By Region)
  • Portal Usage (Worldwide and By Region)

c. The measurements established for the HP Professions were:

  1. Number of people enrolled
  2. Number of professional certifications
  3. Number of mentors and mentees
  4. Number of white papers published and read
  5. Number of training and community events
  6. Overall health rating (green, yellow, red)

d. HP measured threaded discussion forums as follows:

  1. Number of Forums
  2. Number of Subscriptions
  3. Number of New Threads
  4. Number of Replies
  5. Total Number of Posts
  6. Number of Participants
  7. % of Population Participating
  8. Overall health rating (healthy, in danger, dead)

e. Designing a KM Program With Strong Adoption, Leadership Alignment and Metrics at Hewlett Packard Enterprise by Vijayanandam V M and Grey Cook

2. Deloitte

a. Community Metrics

b. ESN Metrics

c. ESN Dashboard

Resources

  1. Metrics & data & charts, oh my!
  2. KM Goals and Measurements
  3. Community Goals and Measurements
  4. Analyze this: Useful ESN analytics
  5. Prove it: measuring the value of knowledge management by the UK National electronic Library for Health
  6. European Guide to good Practice in Knowledge Management — Part 4: Guidelines for Measuring KM
  7. Methodologies for identifying knowledge value by Paulo Petrucciani (SlideShare version and PDF version)
  8. Community of Practice Metrics and Membership by Lee Romero
  9. Community Metrics: The Novell Approach by Lee Romero
  10. Knowledge Management Measurement by APQC
  11. Measuring the Impact of Knowledge Management (Best Practices Report) by APQC
  12. Metrics Guide for Knowledge Management Initiatives by the US Department of the Navy
  13. Metrics for knowledge management and content management by James Robertson
  14. James Robertson
  15. How Do You Measure the Knowledge Management (KM) Maturity of Your Organization? Metrics That Assess an Organization’s KM State by Rob Hoss and Art Schlussel
  16. Measuring Knowledge Management Performance by Rifat Shannak
  17. Metrics in Knowledge Management by Nick Milton
  18. Knowledge Management Metrics by Nick Milton — PDF version
  19. Suitable KPIs for your KM team by Nick Milton
  20. Nick Milton
  21. Measures and Metrics in Knowledge Management by Charles Despres, Daniel Remenyi, and Danièle Chauvel
  22. Knowledge management metrics for Public Organizations: A literature review-based proposal by Héctor Pérez López-Portillo, Edgar René Vázquez González, and Jorge Alberto Romero Hidalgo
  23. Mary Abraham
  24. Measuring knowledge work — when measures become targets by Shawn Callahan
  25. Measuring Knowledge Effectiveness by Chris Collison
  26. Knowledge Management, Made to Measure: Using Metrics as a Roadmap to KM Success by Tim Hines
  27. Rethinking ROI: The Metrics of Intangible Assets by Art Murray
  28. Innovation: What are the real metrics? by Judith Lamont
  29. How To Use KPIs in Knowledge Management by Patrick Lambe
  30. Metrics, ROI, Monitoring and Evaluation Again by Patrick Lambe
  31. On measuring library value by Brad Hinton
  32. The banality of measurement by Dave Snowden
  33. If you try and set targets for knowledge sharing you have failed to understand the subject by Dave Snowden
  34. Guy St. Clair
  35. Measuring the Impact of Knowledge Management by Luis Suarez
  36. Knowledge Sharing Metrics for Large Organisations by Laurence Lock Lee
  37. Knowledge Management: Analytics by Luke Mortensen
  38. KM Metrics & Measurement Framework by Kevin O’Sullivan
  39. KM Performance Management, Metrics, and KPIs by Iknow LLC
  40. Knowledge Management — Metrics by Tutorials Point

SIKM Leaders Community Discussion Threads

  1. KPIs in the different KM phases
  2. KM Metrics Needed
  3. Metrics hashtag

Books

  1. Show Me the Numbers: Designing Tables and Graphs to Enlighten by Stephen Few
  2. Storytelling with Data: A Data Visualization Guide for Business Professionals by Cole Nussbaumer Knaflic
  3. Performance Dashboards: Measuring, Monitoring, and Managing Your Business by Wayne Eckerson
  4. Designing Metrics: Crafting Balanced Measures for Managing Performance by Bob Frost
  5. Edward R. Tufte
  • The Visual Display of Quantitative Information
  • Envisioning Information
  • Visual Explanations: Images and Quantities, Evidence and Narrative
  • Beautiful Evidence


Stan Garfield

Knowledge Management Author and Speaker, Founder of SIKM Leaders Community, Community Evangelist, Knowledge Manager https://sites.google.com/site/stangarfield/