Calibrations

Align QA reviewers to standardized criteria.

A QA calibration session is a meeting where QA reviewers align on how they score interactions during reviews. The goal is to ensure that all reviewers apply the same expectations and criteria consistently when conducting QA Reviews.

Required permissions

Any user with the "Perform QA Review" permission can participate in a calibration session. Users with the "Manage Calibrations" permission can start a new session, and those with the "Export Calibrations Data" permission can export data about completed calibrations.

image of required permissions

Anyone with permission to calibrate can view all completed or in-progress calibrations. Anyone with permission to manage calibrations can also select consensus responses and delete a calibration. Calibration participants cannot view others' input until they submit their own scores for the calibration.
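
These three permissions map to distinct capabilities. As an illustration only, the following sketch shows one way such gating could work; the permission names come from this article, but the function and data structure are hypothetical, not Agent Connect's actual API:

```python
# Hypothetical sketch: how the three calibration permissions described
# above might gate actions. Permission names come from this article;
# everything else is illustrative.

CALIBRATION_ACTIONS = {
    "participate": {"Perform QA Review"},
    "create_session": {"Manage Calibrations"},
    "set_consensus": {"Manage Calibrations"},
    "delete_session": {"Manage Calibrations"},
    "export_data": {"Export Calibrations Data"},
}

def can(user_permissions: set[str], action: str) -> bool:
    """Return True if the user holds every permission the action requires."""
    return CALIBRATION_ACTIONS[action] <= user_permissions

# Example: a reviewer who can score interactions but not manage or export.
reviewer = {"Perform QA Review"}
assert can(reviewer, "participate")
assert not can(reviewer, "create_session")
assert not can(reviewer, "export_data")
```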

Create calibrations

Everything related to calibration sessions happens on the Calibrations page (QA > Calibrations). Creating a calibration involves selecting an interaction to review, selecting the questions that need calibration, and selecting the reviewers who will participate in the session; a sketch of the resulting configuration follows the steps below.

  1. Choose one of the following methods to begin creating a new calibration:

    • On the Overview tab of the Calibrations page, click New Calibration.

    • Scroll to the bottom of the Overview tab and click See details for a particular scorecard. Select any number of questions in the scorecard, then click New Calibration.

  2. On the Select Interaction page, complete the required fields to identify an interaction to use.

    Select an interaction for the calibration

    • Enter the Interaction ID first.

    • If it is not populated automatically, select the Team Member for that interaction.

      Only the calibration owner can see the name of the team member. The name is hidden from all other calibration participants to eliminate bias from the process.

    • Select a Scorecard to use during the calibration.

      This selection determines which questions are available to be calibrated.

    • If it is not populated automatically, you can optionally provide an Interaction link.

    Note:

    If your company uses a CRM integration, Agent Connect automatically includes the transcript or audio file associated with the interaction. Otherwise, you can manually insert a transcript or upload an audio file (MP3, OGG, WAV).

  3. Click Next step.

    If you began this procedure without selecting questions on the Overview tab, the Select Questions page appears. If you already selected questions on the Overview tab, the Add Participants page appears.

  4. If you have not already done so, select the questions you want to review during the calibration.

    example of selecting questions

    Tip: You probably don't need to calibrate questions your team already scores consistently. We recommend focusing on a small subset of questions so that calibration meetings stay productive for the entire team while still building confidence that reviewers are aligned on the scoring criteria.
  5. Click Next step to move to the Add Participants page.
  6. On the Add Participants page, first select a meeting date, then select the users who will participate in the calibration.

    You can choose to add all eligible members of a team or group, or select employees individually.

    example of adding participants

    Important: Only users with the "Perform QA Review" permission are available to add to the session. This permission also determines which members of a selected team or group will be added.
  7. Click Create calibration to initiate the session.

    As the session owner, you can enter your own scores immediately, or click Save Draft to save the configuration for later.
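
Taken together, steps 1 through 7 define a session built from an interaction, a scorecard, a question subset, participants, and a meeting date. Here is a minimal sketch of that configuration; the field names mirror the UI labels from the steps above but are hypothetical, not Agent Connect's actual schema:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch of the data a new calibration session collects.
# Field names mirror the UI labels above; they are illustrative only.

@dataclass
class CalibrationSession:
    interaction_id: str            # step 2: identifies the interaction
    team_member: str               # visible only to the session owner
    scorecard: str                 # determines which questions can be calibrated
    question_ids: list[str]        # step 4: the subset of questions to calibrate
    participants: list[str]        # step 6: users with "Perform QA Review"
    meeting_date: date             # step 6: when the group will meet
    interaction_link: str | None = None  # optional if not auto-populated
    status: str = "In Progress"    # becomes "Awaiting Consensus" on meeting date

# Example configuration (all values are made up for illustration).
session = CalibrationSession(
    interaction_id="INT-10482",
    team_member="(hidden from participants)",
    scorecard="Phone",
    question_ids=["Q1", "Q3"],
    participants=["reviewer_a", "reviewer_b"],
    meeting_date=date(2024, 6, 1),
)
```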

Review an interaction

If you are included in a calibration, the Sessions to Complete table in the My Calibration Sessions tab lists all interactions you need to review, along with the meeting date of the upcoming session.

calibration sessions

You cannot see how other participants scored the interaction until you have provided your own scores, but you can see how many participants have completed their review.

Click the link in the Status column for a particular interaction to begin. When the interaction loads, you can grade it and add relevant comments just like a normal QA Review. For detailed information about conducting reviews, see Review interactions.

in progress calibration
Tip: We recommend that you always provide comments to help facilitate a better calibration meeting.

After you have completed your review, the interaction appears in the All Calibration Sessions tab.

Tips for session owners

If you are the owner of a calibration session, meaning that you created the session and invited participants, you can track the status of your sessions as they progress. On the My Calibration Sessions tab, you will see a Sessions in Progress section listing all sessions that you have created. The table in this section shows the following details:

  • ID of the interaction being reviewed

  • Session owner

  • Session participants

  • Number of participants who have completed their reviews

  • Meeting date

  • Current status (In Progress or Awaiting Consensus)

image of sessions in progress

To view which participants have begun or completed their reviews, click in the Participant Status column for any session:

image of participant status

Calibrate the results

After everyone has scored an interaction, the next step is to bring the participants together to discuss the scores and comments and address any discrepancies.

Follow the steps below to complete the session:

  1. Open the session by clicking the status in the My Calibration Sessions or All Calibration Sessions tabs.

    Until the meeting date is reached, the status shows "In Progress"; once the meeting date arrives, it shows "Awaiting Consensus" (a sketch of this lifecycle appears after these steps). The status is also shown in a progress bar near the top of the Calibration Session page.

  2. Click Set Consensus to begin recording the final results.

    image of the session page

    You can also save the session as a draft or delete it by expanding the dropdown menu next to the Set Consensus button.

  3. For each question, review the answers given by each participant, discuss the answers among the group, and select the final consensus answer.

    example of selecting a consensus answer
    Tip:

    To help with the discussion, you can view the underlying interaction via the link in the Interaction Details section or access any available transcripts and audio recordings directly on the Calibration Session page.

    example call timeline
  4. Use the Meeting Notes field to record any reasoning, action items, or follow-ups from the group discussion.

  5. When you have selected a consensus answer for all questions, click Submit to complete the session.

    The session will now be given an alignment score and will be visible in calibrations reporting.

    example alignment score
    Important: If you do not submit the session, it will remain in the Awaiting Consensus status indefinitely and will not show an alignment score.
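
The session status therefore follows a simple lifecycle: In Progress until the meeting date, Awaiting Consensus from the meeting date until consensus answers are submitted, then completed with an alignment score. The following sketch captures that logic; the helper function is hypothetical, though the status names come from this article:

```python
from datetime import date

# Hypothetical sketch of the session status lifecycle described above.
# Status names come from this article; the function itself is illustrative.

def session_status(meeting_date: date, consensus_submitted: bool,
                   today: date | None = None) -> str:
    today = today or date.today()
    if consensus_submitted:
        return "Completed"  # alignment score is assigned on submit
    if today >= meeting_date:
        # Without a submitted consensus, the session stays here
        # indefinitely and never receives an alignment score.
        return "Awaiting Consensus"
    return "In Progress"

assert session_status(date(2024, 6, 1), False, today=date(2024, 5, 20)) == "In Progress"
assert session_status(date(2024, 6, 1), False, today=date(2024, 6, 5)) == "Awaiting Consensus"
assert session_status(date(2024, 6, 1), True, today=date(2024, 6, 5)) == "Completed"
```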

View calibration trends

The Overview tab on the Calibrations page shows trend data for calibration sessions, from the overall scorecard down to each question. You can use this report to identify whether any scorecard is trending up or down, then drill down to individual questions without needing to calibrate the entire scorecard.

image of the overview tab
Note: The report shows data for completed calibrations only.

At the top of the report, select the date range for the calibrations you want to view.

The Total Calibrations tile shows how many calibrations were completed in the selected range.

The Alignment Score tile indicates how aligned your reviewers were when scoring interactions: the higher the number, the more aligned the reviewers. A low alignment score warrants investigation and likely indicates a need to coach reviewers.
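
This article does not define how the alignment score is computed. One common way to measure such a score, shown here purely as an illustrative assumption, is the percentage of participant answers that match the consensus answer across all calibrated questions:

```python
# Illustrative only: this article does not document how Agent Connect
# computes the alignment score. This sketch assumes a simple
# percent-agreement measure against the consensus answers.

def alignment_score(consensus: dict[str, str],
                    participant_answers: dict[str, dict[str, str]]) -> float:
    """Return percent agreement with the consensus, 0-100."""
    total = matches = 0
    for question_id, consensus_answer in consensus.items():
        for answers in participant_answers.values():
            total += 1
            matches += answers.get(question_id) == consensus_answer
    return 100 * matches / total if total else 0.0

consensus = {"Q1": "Yes", "Q3": "No"}
answers = {
    "reviewer_a": {"Q1": "Yes", "Q3": "No"},   # fully aligned
    "reviewer_b": {"Q1": "Yes", "Q3": "Yes"},  # disagrees on Q3
}
print(alignment_score(consensus, answers))  # 75.0
```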

The Overview and Trends Over Time charts show total and trend data for each scorecard.

Filter reports by scorecard

Click a scorecard bar in the Overview chart to filter the report by that scorecard. Click any data point in the Trends Over Time chart to filter the report by that scorecard's data for that point in time.

In a filtered report, the tiles update to show section-level data for the scorecard you clicked. For example, the following image shows a report filtered to show data for the Phone scorecard:

Overview charts filtered for the Phone scorecard

When you first open the tab, the Overview table near the bottom of the page shows summary data for each scorecard. In this view, click See details to filter the report by a specific scorecard. Whether you click See details in the table or click a chart to filter the report, the table updates to show question-level data for the scorecard you clicked, as shown in the following image.

Calibrations report table, filtered for the Phone scorecard

Trend arrows in the Alignment Score column indicate whether the score increased or decreased in the selected time period compared to the previous one.

View sessions containing a specific question

In the question breakdown table, click See results for any question to review that question's results across all calibration sessions in the selected date range. This lets you see which reviewers scored differently than the calibration owner and read reviewers' comments to better understand their reasoning. In this view, you can use the arrows near the top of the page to cycle between sessions:

image of individual sessions

View all calibrations

The All Calibration Sessions tab shows all sessions that fall within your data scope, regardless of whether you were an owner, participant, or not involved in the session.

All calibrations table

The Completed Calibrations table includes the following columns:

  • Interaction ID — The ID of the interaction

  • Started by — The owner of the calibration

  • Participants — The people participating in the calibration

  • Participant Status — The number of participants who completed their assignment

  • Alignment Score — The final alignment score of the session

  • Completed Date — The date that the consensus options were submitted for the session

  • Meeting date — The date the calibration owner intends to submit a consensus score for each question (all participants should complete their calibration reviews by this date)

  • Status — Click View results in this column to view the details of a completed session.

To download the details of all completed sessions within the selected timeframe in CSV format, click Export. (Exporting requires the "Export Calibrations Data" permission.)
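
The export contains the columns listed above. As a hypothetical example (the file name and exact header strings are assumptions, since the real export format is not documented here), you could summarize the export with Python's standard csv module:

```python
import csv

# Hypothetical sketch: summarizing an exported calibrations CSV.
# Column names mirror the table above; the file name and exact header
# strings in a real export may differ.

with open("calibrations_export.csv", newline="") as f:
    rows = list(csv.DictReader(f))

scores = [float(row["Alignment Score"]) for row in rows if row["Alignment Score"]]
print(f"{len(rows)} completed sessions")
if scores:
    print(f"Average alignment score: {sum(scores) / len(scores):.1f}")
    # 70 is an arbitrary example threshold, not a product recommendation.
    low = [r for r in rows if r["Alignment Score"] and float(r["Alignment Score"]) < 70]
    print(f"Sessions below 70 (candidates for reviewer coaching): {len(low)}")
```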