

Before MaestroQA, the support team at Illuminate Education was doing what many companies in the early phases of a quality assurance program do – they were spot checking tickets reactively, particularly when they were made aware of an issue (a bad CSAT score or a customer complaint). And, like many companies in this phase, they were managing the process in spreadsheets. Matt Dale, VP of Support, and Kallen Bakas, Director of Support, didn’t feel like they fully knew what was going on in agent-customer interactions, and they didn’t have a strategy or process for team improvement unless an issue was escalated (in which case they’d give the agent feedback).

Additionally, one of Illuminate Education’s core values is continuous improvement, and they knew that they could improve the program. They felt like giving feedback was a cultural value of their company – but the feedback was often vaguely positive and non-actionable, and wasn’t actually helping anyone improve their skills. They just needed a more robust framework and process to create the feedback culture that they wanted to have – a culture in which feedback moves both up and down the hierarchical ladder, and in which feedback is constructive, actionable, and ultimately improves the customer experience. This is their story:

The “Why” Of Peer Review

The last thing we wanted was a top-down system for quality management – we wanted the team to continue learning from one another and helping each other both up and down the hierarchical structure. To do this, we developed a peer review program using MaestroQA. While team leads do grade occasionally, the majority of the grading comes from agents looking over each other’s work.

Aligning On Quality, And Staying Consistent

One challenge associated with peer review (which is less of an issue on teams that have just a few people grading) is keeping every grader aligned on their standards for quality. Illuminate handles this through extensive training.

We’ve had three major trainings in the past 7 months, where we go over the basics with agents (new and old), the best ways to use MaestroQA, and how to give constructive feedback to teammates.

Team leads have additional, separate trainings from agents. These trainings focus more on how to use the tools in MaestroQA, and how those tools can be used to identify what agents need help with. We also create a training video every time a new feature in MaestroQA is rolled out; it covers how the team can use the new feature to help drive the QA process and provide better constructive feedback.

Calibrations are used to coach agents on how to grade tickets, and how to give more constructive feedback. Sometimes team leads work with individual agents – they’ll regrade tickets that a peer agent has already graded, to show the agent where their grading (and, inherently, their understanding of quality and feedback generally) can improve.

In each 1:1 that agents have with team leads, agents bring an example of a graded ticket of theirs, and the two talk about it together. Sometimes agents come in with a disagreement about how a ticket of theirs was graded by a peer, which creates healthy debate. In this case, the team lead will regrade the ticket, and then there’ll be a dialogue around how the team lead thinks about the score and the interaction, resulting in a better understanding of both how tickets should be handled and how they should be graded. The calibration process allows us to continue to nurture our team and help agents grow in both areas.

In busy times, agents often deprioritize their QA assignments to keep up with incoming customer requests. To account for this, we created a policy where agents can grade fewer tickets during these periods, and we have automations set up so that people get regular, small grading assignments in their inboxes – none of their work ever piles up, and they’re able to grade regularly even when they’re busy.

The Benefits Of Peer Review Over A Traditional Set-Up

Sometimes junior agents grade the work of tenured agents, which might seem odd (someone who’s new to the company giving feedback to someone who really knows their stuff). What we’ve found, though, is that it’s a really incredible learning experience – the junior agent gets to see how a skilled colleague handles situations that they’ll eventually be exposed to.

With this strategy, we’re using MaestroQA to help agents get better both at supporting our product and at coaching their peers.
