Mongeral Aegon tears up the customer service rule book

March 21, 2016

Mongeral Aegon’s call center recently ripped up the quality-monitoring rulebook and took inspiration from a FIFA video game to get closer to customers. The result? Better customer connections and motivated employees.

The Mongeral Aegon Customer Service team

"We had two issues," says Claudia Barbosa, customer service supervisor, Mongeral Aegon. "The first was: How to create a positive and memorable experience for our customers. The second was: Our monitoring quality process was an obstacle, not a help".

According to Claudia, Mongeral Aegon relationship agents were evaluated according to technical procedures, which meant agents tended to handle calls 'mechanically'. "Their primary concern was to follow the 'script' in order to avoid a low monitoring score or even a penalty," she says. But following procedure wasn't improving the customer experience. The solution was to shift from the 'company view' to the client's point of view.

Total immersion

To fully understand the relationship between customer and agent, Mongeral Aegon asked more than 60 employees to immerse themselves in the relationship agents' daily routine and give feedback on the monitoring and quality process, the challenges, and how clients reacted to the script.

I had a flash of inspiration while playing a FIFA video game at home

Gustavo Monteiro, Control & Quality Supervisor

The team was searching for a way to evaluate without inhibiting the agents when Gustavo Monteiro, Control and Quality Supervisor, had a flash of inspiration while playing a FIFA video game at home. "I thought of using a radar graph, which is used to show the qualities of each character in the game, instead of a grade," says Gustavo.

Throwing out the script

As a result, Mongeral Aegon threw out the script and started evaluating the relationship agents with a new monitoring tool. With the graph idea in mind, Quality Analyst Nathalia Morales set out the qualities that needed to be evaluated, while Gustavo, Claudia and Quality Analyst Luciana Louzada modelled the concepts.

Monitoring was divided into 10 items, five technical and five relational. The evaluation range for each item was broadened from "Achieved" and "Did not achieve" to "Surpasses," "Achieves," "Partially achieves," "Does not achieve," and "Neutral." The idea was to value not only the situations in which agents deliver competence, but also their attempts and efforts to improve.

In addition to the conceptual change in the evaluation tool and method, the Quality Monitoring team's approach also changed. Instead of focusing on pointing out and correcting mistakes, the team educated employees and provided them with follow-up and the means for professional development.

Agents' attitudes have completely changed. They are free to carry on a conversation with our customer, but it's the customer that determines the rules

Claudia Barbosa, Customer Service Supervisor

A tutoring plan for the team was also established, whereby three quality analysts were each made responsible for developing one third of the team for three months on rotation, ensuring new approaches were shared with the whole team. During this tutoring period, each analyst was responsible for issuing a monthly individual assessment of the monitored calls and for suggesting ways to approach each service, develop skills, and recognize strong points.

Customers rule

Today, agents' attitudes have completely changed. "They are free to carry on a conversation with our customer, but it's the customer that determines the rules. If the customer is more outgoing, the conversation can be more in depth," says Claudia.

The change has been marked. In some cases, customers have invited agents to become 'friends' on Facebook, while one customer called just to say 'happy birthday' to the agent who had solved their problem. "This change of approach has worked wonders," says Claudia. "We've seen no negatives. It hasn't even increased the duration of calls, but it certainly has changed the quality."