Continuing from yesterday’s email

Step 3 – Define the scoring values

This is where things become interesting, because rather than using a typical 1-5 or 1-10 scale, the scores are deliberately spread out.

Imagine the following scoring:

  • 10 = exceptional
  • 7 = good
  • 4 = workable
  • 0 = poor

Leaving gaps between the scores forces more decisive responses. In other words, each step between scores is worth more, so the people scoring need to decide carefully.

Note that the exact wording of the scoring may be different for each category. For example, when reviewing support, the scores may represent the following.

  • 10 = World-class support with SLAs
  • 7 = Standard email/ticket/phone support
  • 4 = Only one support channel available
  • 0 = Support is very limited
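To make the idea concrete, the per-category score definitions could be captured as a small lookup. This is only a sketch: the "General" and "Support" keys and the validation helper are illustrative, not part of the method described above.

```python
# Per-category score definitions from Step 3. Only the four defined
# values are valid, which is what creates the gaps between scores.
SCORES = {
    "General": {10: "exceptional", 7: "good", 4: "workable", 0: "poor"},
    "Support": {
        10: "World-class support with SLAs",
        7: "Standard email/ticket/phone support",
        4: "Only one support channel available",
        0: "Support is very limited",
    },
}

def is_valid_score(category: str, score: int) -> bool:
    # A score counts only if it is one of the defined values
    # for that category (falling back to the general scale).
    return score in SCORES.get(category, SCORES["General"])
```

A reviewer entering 5, for example, would be rejected, which keeps everyone on the same four-step scale.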

Step 4 – Create an Excel or Google Sheet

Place all of this information into a single Excel file or Google Sheet. Ideally, each reviewer has their own dedicated tab.

For each application being considered, display the category, weight and score wording. This way the reviewer knows the importance of the category and the definition of each score.

Then allow the reviewer to enter their score per category, and some additional comments that support the score.

Step 5 – Calculate the weighted score for each category

This can be a hidden column that simply calculates category weight * score / 100.

For example,

  • “Overall functionality” is worth 20, so when a reviewer gives it a score of 7, then 20 * 7 / 100 = 1.4
  • “Technical & security” is worth 15, so a score of 7 here means 15 * 7 / 100 = 1.05
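The calculation above can be sketched as a tiny helper, assuming only that weights and scores are plain numbers (the category names and values are the examples from the text):

```python
def weighted_score(weight: float, score: float) -> float:
    # Weighted contribution of one category: weight * score / 100,
    # exactly the hidden-column formula from Step 5.
    return weight * score / 100

print(weighted_score(20, 7))  # "Overall functionality": 1.4
print(weighted_score(15, 7))  # "Technical & security": 1.05
```

In the spreadsheet itself this is just a one-cell formula per category, e.g. `=B2*C2/100`.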

Step 6 – Consolidate scores

Create a summary tab that consolidates the scores from every reviewer. Then average the scores per category and sum the averages to produce a single application score.
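The consolidation step can be sketched as follows. The reviewer names, the two categories, and their scores are made-up examples (and the weights cover only part of the category list, so the total here is well under 10):

```python
# Category weights, as defined in the earlier steps.
weights = {"Overall functionality": 20, "Technical & security": 15}

# One dict of category scores per reviewer (illustrative values).
reviews = {
    "Reviewer A": {"Overall functionality": 7, "Technical & security": 10},
    "Reviewer B": {"Overall functionality": 4, "Technical & security": 7},
}

def application_score(weights: dict, reviews: dict) -> float:
    # Average each category across reviewers, weight the average,
    # and sum the weighted averages into one application score.
    total = 0.0
    for category, weight in weights.items():
        avg = sum(r[category] for r in reviews.values()) / len(reviews)
        total += weight * avg / 100
    return total

print(round(application_score(weights, reviews), 2))
```

In the sheet, this is an `AVERAGE` per category on the summary tab, followed by a `SUM` of the weighted averages.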

Step 7 – Decide who the reviewers are

Ideally, this should be a combination of the business, the IT team, and the implementation team. Aim for at least 3 reviewers, but no more than 9.

Step 8 – Schedule demos

Organize a meeting with each application provider so they can demo the app. This also gives the reviewers a chance to ask questions so they can score properly. Schedule at least one hour per meeting, and let each provider know they will be scored so they can prepare accordingly.

Once all the scores have been submitted, the final result for an application looks like this:

Application score: 7.05

Step 9 – Select the winner

Compare the total score for each application and select the one with the highest score.

The takeaway
Have a pragmatic, standardized way of assessing applications. It makes it easy to explain why an app was chosen.
