Calibrations
A QA calibration session is a meeting in which QA reviewers align on how they score interactions during reviews. The goal is to ensure that all reviewers are clear and consistent, applying the same expectations and criteria when conducting QA Reviews.
Required permissions
Any user with the Perform QA Review permission can participate in a calibration session. Users with the Manage Calibrations permission can start a new session, and those with the Export Calibrations Data permission can generate and download exported information about completed calibrations.
Anyone with permission to calibrate can view all completed or in-progress calibrations. They can also select the consensus responses and delete a calibration. Calibration participants cannot view calibration results until they submit their own scores for the calibration.
Create calibrations
Everything related to calibration sessions can be done on the Calibrations page (QA > Calibrations). Creating a calibration includes selecting an interaction for review, selecting the questions in need of calibration, and selecting the reviewers to participate in the session.
- Choose one of the following methods to begin creating a new calibration:
On the Overview tab or the All Calibrations tab, click New Calibration.
Scroll to the bottom of the Overview tab and click See details for a particular scorecard. Next, select any number of questions in the scorecard, then click Calibrate.
On the Select Interaction page, complete the required fields to identify an interaction to use.
Enter the Interaction ID first.
- If it is not populated automatically, select the Team Member for that interaction.
Only the calibration owner can see the name of the team member. The name is hidden from all other calibration participants to eliminate bias from the process.
- Select a Scorecard to use during the calibration.
This selection determines which questions are available to be calibrated.
If it is not populated automatically, you can optionally provide an Interaction link.
Note: If your company is using an integration with a CRM, Agent Connect automatically includes the transcript or audio file associated with the interaction. Otherwise, you can manually insert a transcript or upload a file (MP3, OGG, WAV).
- Click Next step.
If you began this procedure without selecting questions on the Overview tab, the Select questions page appears. If you already selected questions on the Overview tab, the Add participants page appears.
If you have not already done so, select the questions you want to review during the calibration.
Tip: It might be unnecessary to calibrate on questions for which your team already has consensus. We recommend focusing on a small subset of questions to keep calibration meetings productive for the entire team, while still building confidence that reviewers are aligned with the scoring criteria.
- Click Next step to move to the Add participants page.
On the Add participants page, select the users who will participate in the calibration, and set a meeting date.
Click Start calibration to initiate the session.
You can begin the session immediately, or click Save Draft to save the configuration for a later meeting time.
Review an interaction
The Calibration Sessions table in the All Calibrations tab lists all interactions assigned to you for review as part of a calibration.
While a calibration is in progress, you cannot see how other participants scored the interaction, but you can see how many participants have completed their review. Reviews from other participants become visible after everyone has provided their input.
Click the In progress link to begin. When the interaction loads, you can grade it and add relevant comments just like a normal QA Review. For detailed information about conducting reviews, see Interactions.
After you have completed your review, the interaction appears in the Completed or In Progress tables in the All Calibrations tab.
Calibrate the results
After everyone has scored an interaction, the next step is to bring the participants together to discuss the scores and comments in the calibration and address any discrepancies.
In the In Progress / Awaiting Consensus table in the All Calibrations tab, click the In progress link to open the interaction and see each participant's responses.
Of the answers provided, select the final, consensus answer. Optionally, add meeting notes to document decisions or outcomes from the calibration. When you are done, click Complete.
View calibration trends
The Overview tab on the Calibrations page shows trending data of calibration sessions, from the overall scorecard to each question. Use the report to identify whether any scorecard is trending up or down, then drill down to individual questions without the need to calibrate the entire scorecard. Additionally, you can begin a calibration session from the report.
At the top of the report, select the date range for the calibrations you want to view.
The Total Calibrations tile shows how many calibrations were completed in the selected range.
The Alignment Score tile indicates how aligned your reviewers were with the calibration owner when scoring interactions.
The higher the number, the more aligned your reviewers were.
A low alignment score warrants investigation and likely indicates a need to coach reviewers.
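The article does not document how the Alignment Score is actually calculated. One plausible interpretation, sketched below in Python, is the percentage of reviewer answers that match the calibration owner's answers across the calibrated questions. The function name and formula are illustrative assumptions, not the product's implementation.

```python
def alignment_score(owner_answers, reviewer_answers):
    """Hypothetical alignment metric: the percentage of reviewer answers
    that match the calibration owner's answer for the same question.

    owner_answers: dict mapping question -> owner's answer.
    reviewer_answers: list of dicts, one per reviewer, same shape.

    Illustrative sketch only; the product's documented formula may differ.
    """
    matches = total = 0
    for answers in reviewer_answers:
        for question, owner_answer in owner_answers.items():
            if question in answers:
                total += 1
                if answers[question] == owner_answer:
                    matches += 1
    # A higher percentage means reviewers agreed with the owner more often.
    return round(100 * matches / total, 1) if total else 0.0

# Example: two reviewers, two questions; one reviewer disagrees on Q2.
owner = {"Q1": "Yes", "Q2": "No"}
reviewers = [{"Q1": "Yes", "Q2": "No"}, {"Q1": "Yes", "Q2": "Yes"}]
score = alignment_score(owner, reviewers)  # 3 of 4 answers match -> 75.0
```

Under this reading, a score of 100 would mean every reviewer matched the owner on every calibrated question.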
The Overview and Trends Over Time charts show total and trend data for each scorecard.
Filter reports by scorecard
Click a scorecard bar in the Overview chart to filter the report by that scorecard. Click any data point in the Trends chart to filter the report by that scorecard's data for that point in time.
In a filtered report, the tiles update to show section-level data for the scorecard you clicked. For example, the following image shows a report filtered to show data for the Phone scorecard:
When you first open the tab, the Overview table near the bottom of the page shows summary data for each scorecard. In this view, click See details to filter the report by a specific scorecard. Whether you click See details in the table or click a chart to filter the report, the table updates to show question-level data for the scorecard you clicked, as shown in the following image.
Trend arrows in the Alignment score column indicate whether the score has increased or decreased in the selected time period compared to the previous time period.
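The period-over-period comparison behind those trend arrows can be expressed as a small helper. The sketch below is illustrative only and assumes each period's alignment score is already computed; the product's exact rounding and tie handling are not documented here.

```python
def trend_arrow(current_score, previous_score):
    """Return the direction a trend arrow would point when comparing the
    selected period's alignment score to the previous period's.

    Illustrative sketch; how the product treats ties is an assumption.
    """
    if current_score > previous_score:
        return "up"
    if current_score < previous_score:
        return "down"
    return "flat"

direction = trend_arrow(82.0, 75.0)  # "up": alignment improved
```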
Use the dropdown above the table to sort the questions by their order in the scorecard or by alignment score. Click See results for any question to review the actual calibration results for that question. This lets you see which reviewers scored differently from the calibration owner, and read reviewers' comments to better understand their reasoning. For more information, see Review an interaction and Calibrate the results, above.
To begin a calibration of specific questions in the table, select one or more questions, and then click Calibrate. For detailed information, see Create calibrations, above.
View all calibrations
The All Calibrations tab shows the incomplete calibrations you own, the incomplete calibrations for which you are a non-owner participant, and the calibrations you have completed either as an owner or as a participant.
Each table has the following columns:
Interaction ID — The ID of the interaction.
Started by — The owner of the calibration.
Participants — The people participating in the calibration.
Meeting date — The date the calibration owner intends to submit a consensus score for each question. All participants should complete their calibration reviews by this date.
Answered — The number of people (including the owner) who have completed the review.
Status — The status of the review. Click a status to open that calibration.
In progress indicates that the Meeting date has not been reached.
Awaiting consensus indicates that the Meeting date has been reached, and the owner has yet to provide consensus scores.
Review interaction appears only in the Calibration Sessions section.
View results appears only in the Completed table.
At the top of the page, click New calibration to create a calibration. For more information, see Create calibrations, above.