Welcome to Agent Connect!

Leverage surveys, QA reviews, and 1:1 meetings to enhance agent performance.

Medallia Agent Connect enhances visibility into team performance through customer feedback, quality assurance, and coaching.

The Agent Connect Stream

Agent Connect collects agent-level feedback from customers after every service interaction. Customers can rate agents, leave comments about their experiences, suggest rewards, and make recommendations for areas of improvement. Feedback is shared directly with agents, managers, and QA teams in real time.

Integrated workflows in Agent Connect link customer feedback with QA reviews, providing more context around customer interactions and surfacing coaching opportunities as they happen. This keeps team members productive and connected to their customers. It also creates an efficient, real-time loop in which team members get feedback from a customer, take action, and then quickly close the loop with that customer.

The Medallia Agent Connect experience includes the following components:

Surveys

Agent Connect surveys help you understand how customers feel about their interactions with your team. Surveys gather feedback from customers in a consistent way and display real-time metrics in reports for each team role. For more information about viewing customer feedback data, see Dashboards and reports.

Agent Connect surveys can also include Areas of Excellence and Areas of Improvement, where customers identify aspects of the interaction that were delightful or disappointing. You can also set up a rewards program, which allows customers to recommend rewards for Team Members, as described in Rewards.

When a customer has a negative experience, you can use the service recovery function to respond to that customer and improve the relationship. For more information, see Customer Feedback.

Customer feedback in the Stream

You can use Marketing features within the Customer Feedback component to drive sales, improve customer loyalty, and increase awareness of your company. For example, you might have a referral or loyalty program you want highlighted in your survey. You might also want to encourage customers to share their experiences on social networking sites. For more information, see Marketing features.

Quality assurance (QA)

Agent Connect makes it easy to launch QA reviews based on customer feedback and investigate the driving factors behind positive and negative interactions. Agent Connect assigns representative samples of customer feedback to QA reviewers for analysis and coaching.

Scorecards provide a framework for multiple QA reviewers to approach their reviews in a consistent manner. For example, a scorecard might ask reviewers to identify whether the team member used the customer's name during the interaction, or whether the team member showed empathy for the customer's situation.

A QA dashboard

QA reviewers can message team members directly during reviews to ask questions and to learn more about a customer interaction.

QA reviewers can leave comments as annotations to interaction transcripts. For example, the reviewer might have identified a perfect interaction from a team member and would like to call attention to that interaction in the review. The reviewer might also have identified process issues that need to be corrected. Annotations are visible in QA interaction reports.

For more information about the Quality Assurance component, see Quality Assurance (QA).

1:1s

The primary tool for coaching in Agent Connect is the 1:1 meeting. Team leaders coach team members on how best to ensure customers are happy and satisfied after each interaction. This includes identifying things team members already do exceptionally well and areas where they could improve.

Note: Your company can use Surveys and Quality Assurance (QA) either individually or together. However, at least one of the two is required for the 1:1s component.

1:1s for a Team Member

Everyone with access to Agent Connect can save specific pieces of customer feedback, scores, or QA reviews for later discussion during a 1:1 meeting.

For example, a QA reviewer might want to highlight remarks from happy customers so the responsible team members can be recognized in upcoming 1:1s. A team leader might see a low score from a customer who is known to be difficult and want to discuss the best ways to interact with that specific customer. A team member with questions about a specific customer interaction might save that interaction for discussion during the next 1:1.