Summer 2023
Release starting 25 July 2023
All features announced in this release will begin to roll out on Tuesday, 25 July 2023. If you have any questions, reach out to our Support Team via the in-app Contact Us menu option (? icon in navbar > Contact Us) or your Account Manager.
Enhanced QA Assignments Reporting
A new, enhanced reporting view allows Managers and Admins to see which Team Members are performing well and which ones may need additional coaching. Additionally, Managers and Admins now have the capability to get a high-level overview of which interactions were skipped and why.
Why It Matters
Managers and Admins need an at-a-glance view of the state of their QA Assignments to understand their teams' performance for any given Assignment cadence, including past periods, not just the current one. The in-depth view of a QA Assignment for the current period also makes it easier for Admins to see where they may need to catch up for that period, and the Skip Reason overview summarizes why interactions were not QA'd so QA Managers and Admins can determine whether they need to adjust the QA Assignment setup criteria.
How do I use it?
When a Team Leader or a Company Admin clicks into the Details of an Assignment, they now see the new, enhanced view of QA Assignments. The default view of the report is always based on the current period of that Assignment. Users can also easily check the breakdown of interactions that were skipped and the overall results over time. To see high-level metrics and details for a past period, scroll down to the Periods table.
Track Skip Reasons in QA Assignments
When a QA Reviewer conducts their reviews via a QA Assignment, the Skip button prompts the Reviewer to provide a reason for why that interaction isn't a good candidate to be QA'd. Once a reason has been provided, Managers and Admins can see an overview of the number of interactions that were skipped, why they were skipped, and the details of the interaction in the QA Assignment Details page.
Why It Matters
Too often, interactions are skipped from a QA Assignment because they were junk records (e.g., a spam interaction), too short to provide substance for a full review, or too long and complex for a QA Reviewer to go through in the limited amount of time they have to achieve their QA Assignment quota. The ability to capture and report on these reasons provides a feedback loop for the Admin who creates QA Assignments and for Managers to work with their QA Review team to improve overall efficiency and save reviewers time.
How do I use it?
When a QA Reviewer starts going through the Assignment and comes across an interaction that's not eligible for QA, they can click on the Skip Interaction button and select a reason. The Reviewer can select an Other option and provide a reason via the open text field or search for an Other reason that may already exist.
When a Company Admin or Team Leader navigates to the QA Assignment Details page, they see a Reasons for Skipping module. From there, they can see an overview of all the skip reasons, with counts, for that Assignment period, and click into each reason to see a breakdown of the skipped interactions. They can also navigate to the Skipped Interactions page and drill down deeper to better understand the interactions that were skipped.
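Conceptually, the Reasons for Skipping overview is a per-reason count across the period's skipped interactions. The sketch below is illustrative only (the records and field names are hypothetical, not Agent Connect's data model):

```python
from collections import Counter

# Hypothetical skipped-interaction records for one Assignment period.
skipped = [
    {"interaction_id": "INT-101", "reason": "Spam"},
    {"interaction_id": "INT-102", "reason": "Too short"},
    {"interaction_id": "INT-103", "reason": "Spam"},
    {"interaction_id": "INT-104", "reason": "Other: wrong language"},
]

# The overview shown in the module is essentially a count per skip reason,
# ordered so the most common reason surfaces first.
overview = Counter(record["reason"] for record in skipped)
for reason, count in overview.most_common():
    print(f"{reason}: {count}")
```

A reason that dominates the counts (here, "Spam") is a signal that the Assignment's setup criteria may need tightening.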
QA Review History
QA Review History captures events in a QA Review so you can gain enhanced visibility into a completed QA Review and understand any changes made to a QA scorecard after the QA Review is completed. The history includes QA score changes, changes to answer selections, edits to comments, the timestamp of each change, and the user who made it.
Why It Matters
Gaining visibility into changes made to a completed QA Review helps ensure reviews adhere to an organization's quality assurance standards and guidelines and holds QA Reviewers accountable for the QA Reviews they complete. These insights increase transparency into the QA Review process, drive program efficiency improvements, and make coaching QA Reviewers more effective.
How do I use it?
When a completed QA Review has been revisited and answers, comments and the QA score have been updated, that Completed Review is marked with an * on the Completed Reviews table. When Team Leaders and Company Admins open that QA Review, they see a View History link at the top of the QA scorecard. The link opens the history of all changes made to that QA Review.
QA Assignments Out of Interactions Preference
Users who have the capability to set up QA Assignments now have the flexibility to turn on a setting that allows for "relaxation" of the original QA Assignment criteria in order to bring more interactions in for QA. This setting can be configured on an Assignment-by-Assignment basis.
Why It Matters
Agent Connect can unblock QA Reviewers dynamically and in real time when the automated criteria don't allow more interactions to be brought in, ensuring they're able to complete their work. This setting lets QA Reviewers complete their Assignments in a timely manner and improves efficiency by eliminating the need for Reviewers to manually check back on the QA Assignment to see whether an eligible interaction is ready for their review.
How do I use it?
To apply this setting globally on your instance, let your Account Manager know. When creating a new QA Assignment, Team Leaders or Company Admins must ensure the Allow Partial Match checkbox is checked under the Out of Interactions section. When a QA Assignment has run out of interactions, a page will pop up, indicating to the QA Reviewer that no more interactions are eligible for QA based on the original QA Assignment criteria and the QA Reviewer can either go back to the Assignment or choose Review Similar.
Calibrations Enhancements
An Alignment Score for each individual Calibration is now available. Additionally, Company Admins have the capability to export Calibration results for any given timeframe.
Why It Matters
Exposing individual Calibration session Alignment Scores allows the Calibration initiator to understand how aligned the Participants are for any given Calibration as opposed to relying on the comprehensive Alignment Score, which aggregates the Alignment Score from all Calibration sessions. Additionally, providing an export functionality allows Company Admins to aggregate Calibration data in their preferred view, outside of Agent Connect.
How do I use it?
An Alignment Score column has been added to the All Calibrations and All Sessions tables. Additionally, when a Company Admin accesses the All Sessions tab, they can select the time range on the Completed table for which they want to export Calibrations and click the download button to begin a download. When the download completes, they can open the .csv file.
Team Metric Goals
Set metric-based goals for a Team and track progress over time. Team Leaders and Company Admins can now set metric-based goals for any standard metric on the Performance dashboard and can track the entire Team's progress against that goal, over time.
Why It Matters
This feature helps Team Leaders understand which Team Members are performing above, at or below goal and provides insights into which Team Members may need additional coaching or recognition for a job well done! The Team Metric Goal feature is a way to measure baseline performance for any given Team and enables a Team Leader to keep a close eye on the Team's overall performance for a given metric.
How do I use it?
When a Company Admin or Team Leader is on a Team's Performance page, they can click Create New Goal, which brings up the goal creation workflow. A goal can be open-ended in terms of timeline or the creator can provide an optional due date. Once a goal has been created, a goal card is visible on the Performance page. Please note that goal cards and goals on the Performance page data table are not affected by the date range filter and will only display the most current and active goals, regardless of what date range has been selected.
Click on Past Goals to see completed and archived goals and additional information including who set the goal, when it was set, due date (if applicable) and when it was completed, canceled or archived.
Feedback Reporting by Response Date
All areas of reporting now let you toggle the report view between the responded_at and created_at timestamps.
Why It Matters
This feature reduces ambiguity and confusion around Performance metrics for Team Members, Managers, and executives by allowing viewers to choose which timestamp the data is viewed by. When filtering Feedback data by Created At date (previously the default and only option), you might check and communicate certain scores to leadership, only to find that when you look back at the same time period a week later, the score has changed because new responses have since arrived for surveys created during that period. By filtering by Responded At, you can report data to your organization with confidence that the shared scores won't change, because late survey responses are counted toward the period in which they were received.
How do I use it?
An Admin can set the default query method (Created At or Responded At) for Feedback reporting by heading to Settings > Metric Configuration and scrolling down to the bottom to find the Report Feedback Metrics By field. Once your Admin has selected whether Created At or Responded At should be the default, we'll use that automatically when users land on a Feedback-centric Reporting page (Trends, Performance, Stream), but users with access to these pages can always toggle between Created At and Responded At when filtering by Date/Time Period, based on their current use case when viewing or sharing data.
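The difference between the two query methods comes down to which date a response is bucketed by. The sketch below is illustrative only (the records, field names, and scores are hypothetical, not Agent Connect's internals), but it shows why a Created At score for a past week can drift while a Responded At score stays put:

```python
from datetime import date

# Hypothetical survey records: the date each survey was created
# and the date its response actually arrived.
surveys = [
    {"created_at": date(2023, 7, 3), "responded_at": date(2023, 7, 4), "score": 5},
    {"created_at": date(2023, 7, 5), "responded_at": date(2023, 7, 6), "score": 4},
    # Created during the reporting week but answered two weeks later, so a
    # Created At report for that week changes once this response lands.
    {"created_at": date(2023, 7, 7), "responded_at": date(2023, 7, 20), "score": 1},
]

def avg_score(records, field, start, end):
    """Average score over responses whose `field` date falls in [start, end]."""
    in_window = [r["score"] for r in records if start <= r[field] <= end]
    return sum(in_window) / len(in_window) if in_window else None

week = (date(2023, 7, 3), date(2023, 7, 9))
print(avg_score(surveys, "created_at", *week))    # about 3.33, pulled down by the late response
print(avg_score(surveys, "responded_at", *week))  # 4.5, stable once the week has passed
```

Under Created At, the late response retroactively lowers the week's score; under Responded At, it counts toward the later week instead.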
Feedback personal data erasure policy change
Customer personal data (such as names, email addresses, and credit card numbers, sometimes called PII) will be erased from Survey Response Feedback records after 90 days, or sooner as set by the Feedback expungement setting. Agent Connect instances that previously did not have this setting defined will now automatically be set to 90 days. The maximum is now 90 days and cannot be extended.
The Feedback expunge threshold in days setting (Settings > Company Info) controls the number of days after which Agent Connect removes personal data from Survey Response Feedback records, counted from when the response was collected and displayed in Agent Connect. For example, when set to 30, Agent Connect removes personal data after 30 days.
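The date arithmetic behind the policy is straightforward. This sketch is illustrative only (the function name and clamping behavior are assumptions for the example, not Agent Connect code); it shows the erasure date implied by a given threshold, with the new 90-day ceiling applied:

```python
from datetime import date, timedelta

MAX_THRESHOLD_DAYS = 90  # new platform maximum; cannot be extended

def expunge_due_date(collected_on, threshold_days):
    """Illustrative date math: the day personal data becomes eligible for
    erasure, given the Feedback expunge threshold in days."""
    # Thresholds above the 90-day maximum are capped at 90 (assumed behavior).
    effective = min(threshold_days, MAX_THRESHOLD_DAYS)
    return collected_on + timedelta(days=effective)

# A response collected 1 June with a 30-day threshold is erased on/after 1 July.
print(expunge_due_date(date(2023, 6, 1), 30))   # 2023-07-01
# An oversized threshold falls back to the 90-day maximum.
print(expunge_due_date(date(2023, 6, 1), 365))  # 2023-08-30
```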
QA Customer personal data redaction policy
The new Customer PII Redaction process redacts personal customer data from QA Transcripts 30 days after the QA Review was completed. Transcripts are still kept for the number of days set on the Expungement Threshold on the QA Settings page, but the personal customer data will be redacted after 30 days. This threshold cannot be edited and ensures compliance with GDPR and CCPA.
Additionally, the QA PII Expungement Policy is now set to a value of 30 (days) for all new company instances. For existing companies, the setting will not be changed — including for blank or very large values.
Admins can check the setting by navigating to Settings and clicking on the QA tile. Scroll down to find the Expungement Threshold field.
Set a Start Time on Custom Cadence QA Assignments
When building a QA Assignment and selecting a Custom Cadence for your QA Assignment automation to repeat, you can now set a custom Start Time! Traditional Weekly and Monthly QA Assignment cadences reset at 12 am UTC at the start of a new week or month, respectively; for organizations with global teams, 12 am UTC may mean your assignment resets at an odd time of day.
Now, you can simply select Custom Cadence when creating your QA Assignment and set the specific Start Time (in your local timezone) that you'd like us to use when resetting your Assignment to begin a new Assignment period. We'll show help text that explicitly states when your QA Assignment begins, when it will reset next, and how it will repeat moving forward.