75% of work conversations could be recorded and assessed by 2025, according to Gartner

By using models to analyze recorded employee communications, companies could add value to their business; however, the capability comes with myriad risks and data privacy concerns.


In recent months, organizations have introduced a vast suite of surveillance technologies to mitigate the spread of COVID-19 in-house. These range from facial recognition capabilities that ensure employees are wearing masks to thermal imaging cameras that detect potential fevers. The future workplace could include other panoptic systems to monitor internal conversations, according to a recent Gartner report. In the years ahead, Gartner predicts that companies will begin to use algorithms to analyze recorded workplace communications to identify areas for organizational improvement, monitor compliance, streamline workflows, and more. However, the potential value add also comes with risks and data privacy concerns to consider.

SEE: TechRepublic Premium editorial calendar: IT policies, checklists, toolkits, and research for download (TechRepublic Premium)

Recording and assessing employee conversations

On Nov. 20, Gartner published its “Top Strategic Predictions for 2021 and Beyond: Resetting Everything” detailing the potential impact of various technologies on organizations in the years ahead. In the report, Gartner analysts predict that three-quarters of “conversations at work will be recorded and analyzed” by 2025, and these capabilities will enable “the discovery of added organizational value or risk.”

“A lot of the conversations we have at work, and a lot of the things that are happening in meetings, and one-to-one conversations, and in customer service and things like that, a lot of that [has] intangible value for companies,” said Magnus Revang, research vice president at Gartner.

By analyzing these communications, organizations could identify sources of innovation and coaching throughout a company, Revang explained. To assist with implementation, the report details a series of recommendations, such as creating an ethics board to determine fair use of this communication data. The report also recommends deleting or anonymizing conversations once they are analyzed and advises organizations to “avoid focusing on individual employees.”

“You don’t need to pinpoint the individual conversations. It’s more the indications of where does information flow, and who does it flow through, and stuff like that to be able to get a pinpoint of what is it, and who’s important, who’s contributing maybe much more than your business metrics would say,” Revang said.

The report also details a series of “near-term flags”: predictions about events that could arise in the years ahead as more companies begin to analyze recorded work conversations. In 2022, the analysts expect recorded conversation analysis to be used as the “primary data source” during a “major corporate acquisition” to determine which employees will be retained. In 2023, the analysts predict that a multibillion-dollar US corporation will determine compensation using algorithms “with analysis of recorded conversations as a major contributor.”

SEE: Balancing public safety and privacy during COVID-19: The rise of mass surveillance (TechRepublic)

Algorithm bias and altered behavior

In recent months, artificial intelligence and facial recognition capabilities have drawn scrutiny due to underlying biases and inherent flaws in these models. By tapping algorithms to gauge employee innovation and potentially determine compensation, organizations risk introducing bias into their datasets.

“There’s danger in [the models] being skewed or biased based on the data you already had on how people behave, and what makes good performance and bad performance,” Revang said.

Companies will also need to consider the potential negative impact associated with introducing a panoptic monitoring apparatus throughout their organizations. Employees could alter their typical behavior and interactions with coworkers if they know they are under an all-seeing eye.

“We have already seen stories—from schools as well as employees—about how being watched every second takes a toll on their mental health as well as bothersome enough that it makes it harder [for people] to do their work,” said Rebecca Jeschke, a digital rights analyst at the Electronic Frontier Foundation, via email.

Revang noted research detailing the psychological stress associated with being constantly monitored. However, he also suggested that, if the technology is implemented appropriately, employees may become less cognizant of these monitoring systems over time.

“People change their behavior when they know they’re being monitored. Now what we look at as well is after a while, if you [use the technology] correctly and in a manner which is not invasive for the individual employee, you might [start] to forget that everything is being recorded, right?” Revang said.

SEE: UofL using IT know-how and lots of tech to prep campus for the fall amid pandemic (TechRepublic)

Cost-benefit analysis

In general, the ability to analyze recorded conversations introduces both potential value and risks for organizations. Revang said he believes technology is inherently neutral; how an organization chooses to deploy and use a technology is another consideration.

“I definitely think there [are] companies that are going to use technology like this and misuse it, and step over the line of what you would call ethical or moral. That’s going to happen, no doubt about it,” Revang said.

However, when looked at comprehensively, Revang sees the potential gains outweighing the potential risks if the technology is used correctly.

“If it should happen, we don’t make an ethics judgment on that. We’re not saying if it’s positive or negative, right? I think that’s going to be a large part of the public discussion of it once it happens,” Revang said.
