
Manual corrections, torturous extra data entry tasks, constant glitches – it turns out that AI assistants forced on customer service workers aren’t really that helpful.
The aim of the study, conducted by researchers affiliated with a Chinese power utility, Guangxi Power Grid, and several Chinese universities, was simple enough: find out how customer service representatives (CSRs) use AI assistance during their interactions with customers.
The researchers spoke to 13 CSRs at the power utility’s call center, including team leaders and shift supervisors who are responsible for handling phone inquiries.
That alone signifies a different approach. Rather than evaluating only the AI assistant's performance, the study also focuses on the workers who are pushed to use it during customer calls.
“Through a field visit and semi-structured interviews with 13 CSRs, we found that AI can alleviate some traditional burdens during the call (e.g., typing and memorizing) but also introduces new burdens (e.g., learning, compliance, psychological burdens),” the study explains more formally.
Then come the quotes, detailing how the AI inaccurately transcribes customer call audio into text or garbles sequences of numbers, such as phone numbers.
“The AI assistant isn’t that smart in reality,” one survey respondent said. “It gives phone numbers in bits and pieces, so I have to manually enter them.”
Most respondents pointed out that the assistant could be useful, especially for taking notes. Still, many complained that they have to manually verify the AI's output most of the time due to frequent mismatches between it and organizational standards.
“If we don’t manually verify and adjust it, and just go with the system’s default suggestion, there will be many errors,” said another respondent.
Additionally, although the AI assistant offers emotion recognition features, CSRs described their shortcomings, including “misclassifying normal speech intensity as negative emotions and lacking sufficient tags in emotion categories.”
The AI often gave false alerts about customers’ attitudes, so workers largely disregarded emotion analysis features, the study said.
“While the AI enhances work efficiency, it simultaneously increases CSRs’ learning burdens due to the need for extra adaptation and correction,” concludes the report.
“The mismatch between technological expectations and actual implementation reflects a common oversight among technology designers, who overestimate efficiency gains while underestimating the implicit learning burdens of adapting to new systems.”