How will we measure success of this knowledgebase?

There will probably be many criteria for measuring the success of this UCLA Knowledgebase experiment, but the first is how many people contribute to it and how often. Here are some numbers to start measuring that. For now the numbers will be updated manually.

Date                     5/1/2006   5/8/2006   5/15/2006  5/22/2006  5/29/2006  6/5/2006   6/19/2006
Articles with Answers    110        155        306        363        434        458        488
Posts since last date    —          45         151        57         71         24         30
Articles w/o Answers     3          4          2          2          2          2          2
Contributors             28         29         42         51         64         69         74
Contributed more than 5  4          5          9          11         13         15         16
Days since start         31         38         45         52         59         66         80
Articles/day             3.5        4          6.8        6.9        7.4        6.9        6.1
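The derived rows in the table can be sanity-checked: "Posts since last date" is the change in answered articles between snapshots, and "Articles/day" is answered articles divided by days since launch. A quick Python sketch, with values transcribed from the table (small discrepancies in the last row are rounding):

```python
# Values transcribed from the weekly snapshots above.
articles = [110, 155, 306, 363, 434, 458, 488]
days = [31, 38, 45, 52, 59, 66, 80]

# "Posts since last date": week-over-week change in answered articles.
posts_since_last = [b - a for a, b in zip(articles, articles[1:])]
print(posts_since_last)  # [45, 151, 57, 71, 24, 30], matching the table

# "Articles/day": cumulative answered articles over days since launch.
articles_per_day = [round(a / d, 1) for a, d in zip(articles, days)]
print(articles_per_day)  # agrees with the table to within rounding
```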

The database was announced to the Help Desk/CSC Meeting on April 12, 2006. At that time it had 23 answers and roughly 10 contributors.


Please suggest other measures of success. Remember that the first target audience is the staff of the 43 Help Desks at UCLA.


Possible Evaluation Criteria

  • relevancy of articles as judged by user ranking (if we add that feature)
  • number of queries per day
  • number of new articles per day
  • number of regular contributors
  • percentage of help desks that contribute regularly
  • percentage of help desks that use it for queries regularly
  • anecdotal evidence of knowledgebase success