Calibrations
Align QA reviewers to standardized criteria.
A QA calibration session is a meeting in which QA reviewers align on how they score interactions during reviews. The goal is to ensure that all reviewers are clear and consistent, applying the same expectations and criteria when conducting QA reviews.
Required permissions
Any user with the "Perform QA Review" permission can participate in a calibration session. Users with the "Manage Calibrations" permission can start a new session, and those with the "Export Calibrations Data" permission can generate and download exported information about completed calibrations.
Anyone with permission to calibrate can view all completed or in-progress calibrations. Anyone with permission to manage calibrations can also select consensus responses and delete a calibration. Calibration participants cannot view others' input until they submit their own scores for the calibration.
Create calibrations
Everything related to calibration sessions can be done on the Calibrations page (QA > Calibrations). Creating a calibration includes selecting an interaction for review, selecting the questions in need of calibration, and selecting the reviewers to participate in the session.
Choose one of the following methods to begin creating a new calibration:
- On the Overview tab of the Calibrations page, click New Calibration.
- Scroll to the bottom of the Overview tab and click See details for a particular scorecard. Select any number of questions in the scorecard, then click New Calibration.
On the Select Interaction page, complete the required fields to identify an interaction to use.
- Enter the Interaction ID first.
- If it is not populated automatically, select the Team Member for that interaction.
Only the calibration owner can see the name of the team member. The name is hidden from all other calibration participants to eliminate bias from the process.
- Select a Scorecard to use during the calibration.
This selection determines which questions are available to be calibrated.
- If it is not populated automatically, you can optionally provide an Interaction link.
Note: If your company is using an integration with a CRM, Agent Connect automatically includes the transcript or audio file associated with the interaction. Otherwise, you can manually insert a transcript or upload a file (MP3, OGG, WAV).
- Click Next step.
If you began this procedure without selecting questions on the Overview tab, the Select Questions page appears. If you already selected questions on the Overview tab, the Add Participants page appears.
If you have not already done so, select the questions you want to review during the calibration.
Tip: It might be unnecessary to calibrate on questions for which your team already has consensus. We recommend focusing on a small subset of questions so that your calibration meetings are productive for the entire team while still building confidence that reviewers are aligned with the scoring criteria.
- Click Next step to move to the Add Participants page.
On the Add Participants page, first select a meeting date, then select the users who will participate in the calibration.
You can choose to add all eligible members of a team or group, or select employees individually.
Important: Only users with the "Perform QA Review" permission are available to add to the session. This permission also determines which members of a selected team or group are added.
- Click Create calibration to initiate the session.
As the session owner, you can enter your own scores immediately, or click Save Draft to save the configuration for later.
Review an interaction
If you are included in a calibration, the Sessions to Complete table in the My Calibration Sessions tab lists all interactions you need to review, along with the meeting date of the upcoming session.
You cannot see how other participants scored the interaction until you have provided your own scores, but you can see how many participants have completed their review.
Click the link in the status column for a particular interaction to begin. When the interaction loads, you can grade it and add relevant comments just like a normal QA Review. For detailed information about conducting reviews, see Review interactions.
After you have completed your review, the interaction appears in the All Calibration Sessions tab.
Tips for session owners
If you are the owner of a calibration session, meaning that you created the session and invited participants, you can track the status of your sessions as they progress. On the My Calibration Sessions tab, you will see a Sessions in Progress section listing all sessions that you have created. The table in this section shows the following details:
ID of the interaction being reviewed
Session owner
Session participants
Number of participants who have completed their reviews
Meeting date
Current status (In Progress or Awaiting Consensus)
To view which participants have begun or completed their reviews, click in the Participant Status column for any session.
Calibrate the results
After everyone has scored an interaction, the next step is to bring the participants together to discuss the scores and comments and address any discrepancies.
Follow the steps below to complete the session:
Open the session by clicking the status in the My Calibration Sessions or All Calibration Sessions tab.
Until the meeting date is reached, the status shows "In Progress"; after that, it shows "Awaiting Consensus". The status also appears in a progress bar near the top of the Calibration Session page.
Click Set Consensus to begin recording the final results.
You can also save the session as a draft or delete it by expanding the dropdown menu next to the Set Consensus button.
For each question, review the answers given by each participant, discuss the answers among the group, and select the final consensus answer.
Tip: To help with the discussion, you can view the underlying interaction via the link in the Interaction Details section or access any available transcripts and audio recordings directly on the Calibration Session page.
Use the Meeting Notes field to record any reasoning, action items, or follow-ups from the group discussion.
When you have selected a consensus answer for all questions, click Submit to complete the session.
The session will now be given an alignment score and will be visible in calibrations reporting.
Important: If you do not submit the session, it will remain in the Awaiting Consensus status indefinitely and will not show an alignment score.
View calibration trends
The Overview tab on the Calibrations page shows trending data of calibration sessions, from the overall scorecard to each question. You can use this report to identify whether any scorecard is trending up or down, then drill down to individual questions without the need to calibrate the entire scorecard.
At the top of the report, select the date range for the calibrations you want to view.
The Total Calibrations tile shows how many calibrations were completed in the selected range.
The Alignment Score tile indicates how aligned your reviewers were when scoring interactions. The higher the number, the more aligned your reviewers were. A low alignment score warrants investigation and likely indicates a need to coach reviewers.
The Overview and Trends Over Time charts show total and trend data for each scorecard.
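Agent Connect does not document the exact formula behind the alignment score, but a common way to think about it is percent agreement: the share of individual answers that match the final consensus answer. The sketch below is purely illustrative under that assumption; the names `alignment_score`, `responses`, and `consensus` are hypothetical, not part of the product.

```python
# Illustrative sketch only: the product's exact alignment formula is not
# documented here. This models alignment as simple percent agreement,
# i.e. the share of (participant, question) answers matching consensus.

def alignment_score(responses, consensus):
    """responses: {participant: {question: answer}};
    consensus: {question: final answer}.
    Returns percent agreement from 0 to 100."""
    matches = 0
    total = 0
    for answers in responses.values():
        for question, final in consensus.items():
            total += 1
            if answers.get(question) == final:
                matches += 1
    return round(100 * matches / total, 1) if total else 0.0

responses = {
    "reviewer_a": {"greeting": "Yes", "tone": "No"},
    "reviewer_b": {"greeting": "Yes", "tone": "Yes"},
}
consensus = {"greeting": "Yes", "tone": "Yes"}
print(alignment_score(responses, consensus))  # 3 of 4 answers match -> 75.0
```

Under this interpretation, a score near 100 means reviewers almost always chose the consensus answer, while a low score points to questions where scoring criteria need clarification.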
Filter reports by scorecard
Click a scorecard bar in the Overview chart to filter the report by that scorecard. Click any data point in the Trends chart to filter the report by that scorecard's data for that point in time.
In a filtered report, the tiles update to show section-level data for the scorecard you clicked. For example, the following image shows a report filtered to show data for the Phone scorecard:
When you first open the tab, the Overview table near the bottom of the page shows summary data for each scorecard. In this view, click See details to filter the report by a specific scorecard. Whether you click See details in the table or click a chart to filter the report, the table updates to show question-level data for the scorecard you clicked, as shown in the following image.
Trend arrows in the Alignment Score column indicate whether the score has increased or decreased in the selected time period compared to the previous time period.
View sessions containing a specific question
In the score question breakdown table, click See results for any question to review the results for that question across all calibration sessions in the specified date range. This lets you see which reviewers scored differently than the calibration owner, and read reviewers' comments to better understand their reasoning. In this view, you can use the arrows near the top of the page to cycle between sessions.
View all calibrations
The All Calibration Sessions tab shows all sessions that fall within your data scope, regardless of whether you were the owner, a participant, or not involved in the session.
The Completed Calibrations table includes the following columns:
Interaction ID — The ID of the interaction
Started by — The owner of the calibration
Participants — The people participating in the calibration
Participant Status — The number of participants who completed their assignment
Alignment Score — The final alignment score of the session
Completed Date — The date that the consensus options were submitted for the session
Meeting date — The date the calibration owner intends to submit a consensus score for each question (all participants should complete their calibration reviews by this date)
Status — Click View results in this column to view the details of a completed session.
To download details of all completed sessions within the selected timeframe in CSV format, click Export.
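Once downloaded, the export can be processed with any CSV tooling. The sketch below is a hypothetical example of flagging low-alignment sessions from such a file; the header names are assumed to mirror the table columns described above, and the actual export headers may differ.

```python
# Illustrative sketch: scan an exported calibrations CSV and flag
# sessions whose alignment score falls below a threshold. Column names
# ("Interaction ID", "Alignment Score") are assumptions based on the
# on-screen table; check the real export's headers before relying on them.
import csv

def low_alignment_sessions(path, threshold=70.0):
    """Return interaction IDs of sessions scoring below the threshold."""
    flagged = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if float(row["Alignment Score"]) < threshold:
                flagged.append(row["Interaction ID"])
    return flagged
```

A list like this could feed a coaching agenda, pointing the team at the specific interactions where reviewers disagreed most.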
