Calibrations

A QA calibration is an organized meeting of reviewers to align on how they score interactions during reviews. The goal is to ensure that all reviewers understand the expectations and criteria for reviews and apply them consistently.

Calibration access and permissions

Any Team Leader or Admin with access to review an interaction can start and participate in a calibration. Team Members cannot participate in calibrations.

Anyone with permission to calibrate can view all completed and in-progress calibrations. However, calibration participants cannot view a calibration's results until they submit their own scores for that calibration.

Anyone with permission to calibrate can select the consensus responses and delete a calibration.

To work with calibrations, in the Medallia Agent Connect navigation bar select QA, and then click Calibrations.

Overview report

The Overview report shows trending data of calibration sessions, from the overall scorecard to each question. Use the report to identify whether any scorecard is trending up or down, then drill down to individual questions without the need to calibrate the entire scorecard. Additionally, you can begin a calibration session from the report.

Note: The report shows data for completed calibrations only.

At the top of the report, select the date range for the calibrations you want to view.

The Total calibrations tile shows how many calibrations were completed in the selected range.

The Alignment score tile indicates how aligned your reviewers were with the calibration owner when scoring interactions.

  • The higher the number, the more aligned your reviewers were.

  • A low alignment score warrants investigation and likely indicates that reviewers need coaching (see the sketch after this list).
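
Agent Connect does not publish the formula behind this tile. As an illustrative mental model only, the following sketch assumes the score is the percentage of answers on which a participant matched the calibration owner; the function and data shapes are hypothetical, not part of any Agent Connect API.

```python
def alignment_score(owner_answers: dict[str, str],
                    participant_answers: dict[str, str]) -> float:
    """Hypothetical model: the share of questions on which the
    participant's answer matches the calibration owner's answer.
    This is an assumption, not Agent Connect's documented formula."""
    if not owner_answers:
        return 0.0
    matches = sum(
        1 for question, answer in owner_answers.items()
        if participant_answers.get(question) == answer
    )
    return 100.0 * matches / len(owner_answers)


# A participant who matches the owner on 4 of 5 questions
# would score 80 under this model.
owner = {"Q1": "Yes", "Q2": "No", "Q3": "Yes", "Q4": "N/A", "Q5": "Yes"}
participant = {"Q1": "Yes", "Q2": "No", "Q3": "No", "Q4": "N/A", "Q5": "Yes"}
print(alignment_score(owner, participant))  # 80.0
```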

The Overview and Trends over time charts show total and trend data for each scorecard. Click a scorecard bar in the Overview chart to filter the report by that scorecard. Click any data point in the Trends chart to filter the report by that scorecard's data for that point in time. In a filtered report, the tiles update to show section-level data for the scorecard you clicked. For example, the following image shows a report filtered to show data for the Phone scorecard:

Overview charts filtered for the Phone scorecard

When you first open the report, the Overview table at the bottom of the report shows summary data for each scorecard. In this view, click See details to filter the report by a specific scorecard. Whether you click See details in the table or click a chart to filter the report, the table updates to show question-level data for the scorecard you clicked, as shown in the following image. Trend arrows in the Alignment score column indicate whether the score has increased or decreased in the selected time period compared to the previous time period.

Calibrations report table, filtered for the Phone scorecard
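
The period-over-period comparison behind those trend arrows is straightforward. Here is a minimal sketch, assuming each arrow simply compares a question's alignment score in the selected period with its score in the previous period (illustrative only, not an Agent Connect API):

```python
def trend_arrow(current_score: float, previous_score: float) -> str:
    """Illustrative only: the direction a trend arrow would show when
    comparing the selected period's alignment score with the prior period's."""
    if current_score > previous_score:
        return "up"
    if current_score < previous_score:
        return "down"
    return "no change"


# A question whose alignment rose from 75 to 82 would show an up arrow.
print(trend_arrow(82.0, 75.0))  # up
```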

Use the dropdown above the table to sort the questions by their order in the scorecard or by alignment score. Click See results for any question to review the actual calibration results for that question. This lets you see which reviewers scored differently from the calibration owner, and read reviewers' comments to better understand their reasoning. For more information, see Reviewing an interaction in a calibration and Calibrating review results, below.

To begin a calibration of specific questions in the table, select one or more questions, and then click Calibrate. For detailed information, see Creating calibrations, below.

All calibrations tab

The All calibrations tab shows the incomplete calibrations you own, the incomplete calibration sessions for which you are a non-owner participant, and the calibrations you have completed either as an owner or as a participant.

All calibrations table

Each table has the following columns:

  • Interaction ID — The ID of the interaction.
  • Started by — The owner of the calibration.
  • Participants — The people participating in the calibration.
  • Meeting date — The date the calibration owner intends to submit a consensus score for each question. All participants should complete their calibration reviews by this date.
  • Answered — The number of people (including the owner) who have completed the review.
  • Status — The status of the review. Click a status to open that calibration.
    • In progress indicates that the Meeting date has not been reached.
    • Awaiting consensus indicates that the Meeting date has been reached, and the owner has yet to provide consensus scores.
    • Review interaction appears only in the Calibration Sessions section.
    • View results appears only in the Completed table.

At the top of the screen, click New calibration to create a calibration. For more information, see Creating calibrations, below.

Creating calibrations

Creating a calibration involves selecting an interaction to review, the questions that need calibration, and the reviewers who will participate in the calibration.

  1. Choose one of the following methods to begin creating a new calibration:
    • From the Overview report, either click New calibration at the top of the report, or select specific questions and then click Calibrate.
    • From the All calibrations tab, click New calibration.

    The Select interaction screen appears.

  2. Specify the interaction details.

    If your company uses a CRM integration, Agent Connect automatically includes the transcript or audio file associated with the interaction. Otherwise, you can insert a transcript manually or upload a file.

    1. Enter the Interaction ID.
    2. If it is not populated automatically, select the Team Member for that interaction.

      Only the calibration owner can see the name of the Team Member. The name is hidden from all other calibration participants to eliminate bias from the calibration process.

    3. Select a Scorecard to use during the calibration.

      This selection determines which questions are available to be calibrated.

    4. Optionally, provide an Interaction link.

    Select an interaction for the calibration

  3. Click Next step.

    If you began this procedure without selecting questions in the Overview report, the Select questions screen appears. If you already selected questions in the Overview report, the Add participants screen appears.

  4. If you have not already done so in the Overview report, select the questions you want to review during the calibration.

    It might be unnecessary to calibrate on questions for which your team has consensus. Medallia recommends that you focus on a small subset of questions to make sure your calibration meetings are productive for the entire team, while still building confidence that reviewers are aligned with scoring criteria.

    Select questions

  5. Click Next step.

    The Add participants screen appears.

  6. In the Participants field, select the Team Leaders to participate in the calibration.
  7. In the Meeting date field, select a date for the calibration meeting.

    Select the Team Leaders to participate in the calibration

  8. Click Start calibration.

    The interaction appears, ready for a calibration review.

Reviewing an interaction in a calibration

The Calibration Sessions section of the All calibrations tab lists all interactions that you need to review as part of a calibration. While a calibration is in your Calibration Sessions section, you cannot see how other participants scored the interaction.

Click Review interaction to begin. You can grade the interaction and add relevant comments. Medallia recommends that you provide comments to help facilitate a better calibration meeting. For detailed information about conducting reviews, see Reviewing interactions.

A calibration review in progress

After you complete your calibration review, the interaction appears in the Completed / In progress list of the Calibrations screen. You can see reviews from other participants in completed calibrations.

Calibrating review results

To calibrate scores across reviewers, gather the calibration participants to discuss their feedback. Click View results on the Calibrations screen to open the interaction and see each calibration participant's responses.

Calibrating the results of participant reviews

From the answers provided, select the final consensus answer. Optionally, add meeting notes to document decisions or outcomes from the calibration. When you are done, click Complete.