“F*$# off with your surveillance software,” said a comment on our digital ad. (Pardon his French!) It’s the most hostile reaction to our people analytics solution so far, and quite a shock coming from someone who just saw an advertisement in passing.

But we get it.

As workplaces moved online, managers feared a drop in productivity, and a slew of solutions to monitor and track employee performance made their way to market. Old-school punch clocks took digital steroids and now measure not just your time in and out, but also the apps you have open and how often you click or type. Some even take random screenshots of the employee and their screen to make sure they’re present.

But that’s pre-pandemic thinking (or at least we think it should be). The data shows that most employees working from home during quarantine put in extra hours and effort, adding to the many causes of a burnout epidemic and ‘The Great Resignation’.

One report found that 52% of Americans experienced burnout in 2021; another puts the figure at 77% globally. Voluntary turnover reached record-breaking numbers, with 47.4 million Americans quitting their jobs last year.

With burnout and turnover costing companies over a trillion dollars every year, employers are taking notice. There is now a wave of interest in ensuring employees are healthy physically, mentally, and emotionally. Some organizations are providing mental health breaks, access to therapy, more paid time off (PTO), flexible work hours, and other benefits that encourage life-work balance, prioritizing the employee in the hopes of long-term productivity.

Thankfully, more and more leaders believe that investing in the wellbeing of the workforce is also an investment in the success of the entire organization.

Still, the reality is that surveillance software exists and can be used against employees if top management allows it. This is our attempt at learning more about—and directly addressing—the elephant in the room.

Artificial Intelligence, Algorithms, and Optimization

We have an obsession with efficiency, with making the most of our resources. After all, we only have 24 hours in a day and about 28,835 days in an average lifetime. We need to make the most of them!

For organizations that can drill down into their expenses per day, hour, department, employee, category, product, etc., the pressure to optimize is real, and the failure to do so has measurable consequences for revenue and success. Hence the even greater obsession with technology and the convenience and efficiency it provides.

Today, algorithms and artificial intelligence (AI) help spot inefficiencies even faster in order to correct them before costs accumulate. Just as doctors now use AI to check whether their diagnoses and treatment plans are accurate, organizations use AI technology to quickly analyze performance data and pass them through standardized rules to reveal where they can optimize further.

Can it go too far? Absolutely.

We already mentioned a few examples above, like the use of desktop screenshots, which pressure employees to constantly have something new on display for the all-seeing eye to chew on every few minutes. Worse, this information is used to penalize employees for taking quick breaks, going as far as withholding wages for time spent off-screen, as if grabbing a glass of water doesn’t empower us to do our jobs better.

Looking through the research, we find that algorithmic management has negative effects, such as increasing discriminatory practices and reducing leadership’s accountability.

What is algorithmic management? The use of AI techniques and algorithms in people management.

Surveillance tools can even worsen employees’ condition, causing more accidents and reducing happiness at work. They play on employees’ fear of termination or wage cuts, prompting them to work reactively with the goal of satisfying the AI’s criteria rather than being empowered to do a good job.

Does this mean that AI and monitoring tools are bad for business? Not necessarily. In an article from as early as 2020, a warehouse worker described a colleague who had left their surveillance-heavy workplace: “He had scanners and metrics there [in his new workplace], too, but they only measured whether his team was on track for the day, leaving the workers to figure out their roles and pace. ‘This is like heaven,’ he told his co-workers.”

Perhaps it’s not just the technology developed, but how it’s used.

It’s not the what, but the how.

As the age-old adage goes, “With great power comes great responsibility.” It applies to Spider-Man’s superpowers as much as it does to AI and every new technology.

AI can work wonders for humanity. We’re seeing AI tools that analyze mammograms and screen for breast cancer faster and more efficiently, aiding doctors so they can serve more patients. We now have machine learning technology making drug development faster, even helping in the research on COVID antiviral treatments.

Engineers have developed AI-powered apps that translate text into sign language, empowering people with hearing disabilities. Our day-to-day travels are also aided by the AI that recommends the best routes on Google Maps.

Then there are the bad apples and the faulty AIs that still have a ways to go, like Amazon’s recruiting tool that had to be scrapped due to its bias against women, or the healthcare algorithm found to discriminate against African American patients. Not to mention Uber’s autonomous vehicle that ran red lights during testing.

There is clearly much to improve and develop in the field of AI, and in maintaining the clean data its analytics depend on. There is still a need for human supervision to check that everything is working properly and that the algorithms used don’t produce biased analytics. What happens when the humans involved don’t have the best intentions?

AI, technology, superpowers, and weapons can be as good, or as evil, as the people who use them.

The same is true for employee monitoring. But having interviewed over 50 U.S. executives, people managers, and HR professionals, we find that most of them sincerely care about the wellbeing of their people.

“I don't want to micromanage my teams. I don't want them all to communicate in the same way or the same style. But I want to know how to connect with them the best,” says a high-level executive when asked about how he might use our platform, revealing his empathy for employees, appreciation of their uniqueness, as well as the struggle to make authentic bonds with each one—a more daunting task for leaders managing multiple teams.

“I like it because it is one of the few tools I have seen that has the employee in mind, instead of the organization,” said another executive when shown the Erudit platform for the first time. He liked that it prioritizes employees; a far cry from the manager who penalizes an employee for leaving his desk.

For many people leaders, and especially for HR professionals, there is a struggle to balance the needs of the workforce and the success of the company, particularly when only business success has solid metrics. People metrics that have to do with wellbeing, such as employee engagement or stress levels, are derived from employee surveys, which can be biased and outdated.

With that in mind, it’s no wonder that well-intentioned people leaders want to try and test employee monitoring tools that give them a better, unbiased, and data-driven view of how their employees are doing. It boils down to intentions: the hows and the whys. This is true for end users as well as for the developers of AI solutions like Erudit. That’s why we put in safeguards to ensure that only the AI sifts through messages, and that all analytics are anonymous and shown as group data, never per individual.
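To make the idea of group-only analytics concrete, here is a minimal, hypothetical sketch (not Erudit’s actual implementation): individual signals are aggregated per team, and any team smaller than an assumed minimum size is suppressed entirely, so no metric can be traced back to one person. The names `team_metrics`, `MIN_GROUP_SIZE`, and the 0–100 score scale are all illustrative assumptions.

```python
from statistics import mean

# Assumed anonymity threshold: teams smaller than this are never reported.
MIN_GROUP_SIZE = 5

def team_metrics(scores_by_employee, teams):
    """Return an average wellbeing score per team, never per individual.

    scores_by_employee: dict of employee_id -> score (0..100)
    teams: dict of team_name -> list of employee_ids
    """
    report = {}
    for team, members in teams.items():
        scores = [scores_by_employee[m] for m in members if m in scores_by_employee]
        if len(scores) < MIN_GROUP_SIZE:
            # Suppress the metric: the group is too small to stay anonymous.
            report[team] = None
        else:
            report[team] = round(mean(scores), 1)
    return report
```

With this design, a leader viewing the dashboard would see “engineering: 52.0” but nothing at all for a two-person team, which is one common way group-level privacy thresholds are enforced.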

"I don't want to micromanage my teams. I don't want them all to communicate in the same way or the same style. But I want to know how to connect with them the best."

Building trust

Employee surveillance may increase productivity in the short term, but it takes a toll on the workforce and can be a liability in the long run. It leaves a company with disengaged employees who leave and need to be replaced regularly.

One study states: “Empirical evidence from the field shows that even a modest increase in group-level privacy sustainably and significantly improves line performance, while qualitative evidence suggests that privacy is important in supporting productive deviance, localized experimentation, distraction avoidance, and continuous improvement.”

The study calls this the transparency paradox. In this sense, monitoring employees at work is counterproductive for any company.

In an HR Brew article featuring our technology, Sam Blum includes a very valid concern from an HR manager: “If I thought my employer was scanning my messages and conversation to figure out if I was unhappy, that is very unnerving to me. They’re doing it today for [burnout], what are they going to do tomorrow? What are they going to do the next day?”

We welcome the feedback and we share the concern. You can bet your bottom dollar that everyone on our team would be up in arms if messages were read and scanned by managers! That’s why we go the extra mile to ensure anonymity.

That said, AI technology can be used to monitor wellbeing anonymously. With great respect for privacy, and with awareness of the biases that may arise from seeing messages or even employee survey results, AI can be trained to anonymize data and analytics so that it reveals only team metrics that help leaders include workforce wellbeing in their decision making.

Anonymity is key; it has been used since time immemorial to oppose power and even topple regimes. Could employees use anonymity to their advantage as well? Anonymous metrics, and even conversation topics, inform leaders while protecting the identity of workers. They can reduce bias while enriching management decisions.

At the end of the day, it’s about building and earning trust. When a manager trusts their team to do their best at work, they focus on empowering rather than spying on them. When an employee trusts that their HR department and leaders care about their wellbeing, they may be more open to trying out new solutions for workforce analytics.

Technology, in the wrong hands, is a weapon. In the hands of compassionate leaders and teams that have earned each other’s trust, it’s an empowering tool.

What's your take on employee monitoring? Share your thoughts with us on our LinkedIn post!
