Workplace AI Wants to Help You Belong

Imagine this: On a Thursday at 11 a.m., you receive a personalized Slack notification nudging you to reconnect with a coworker you haven’t seen in a while. During a lunchtime team meeting over Zoom, you are told who isn’t speaking up as much, so that you can invite them to participate. Later in the day, as you write, an AI-driven plugin suggests that you use “chairperson” rather than “chairman.” The following day, as you prepare for a quarterly check-in with a supervisee, you review a dashboard that summarizes how your team is doing. Data gathered from pulse surveys and “listening tools” such as text analysis, video, and always-on surveys indicates that, although your team may be experiencing burnout, they feel deeply connected to you and their teammates through one-on-ones.

Welcome to the emerging era of AI and digital surveillance in the workplace. Are you ready to belong?

When the pandemic emptied physical offices, meetings and communications suddenly moved to digital channels. This created new opportunities to collect, analyze, and use enormous volumes of workplace data. At the same time, a wave of new digital tools for performance management and employee engagement has emerged to take advantage of that data.

Organizations have also been responding to growing demands for DEIB (diversity, equity, inclusion, and belonging) at work: persistent gaps in representation within organizations, especially in leadership positions, illustrate and reinforce long-standing systemic injustices around race, gender, sexual orientation, socioeconomic status, and more. It should come as no surprise, then, that tech companies have begun exploring how technology and these newly available troves of data could be used to measure and improve organizational DEIB initiatives, monitoring employees in order to foster a sense of belonging.

Belonging goes beyond inclusion: it is a deep sense of connection to and integration into an organization. And belonging matters. Humans have an inherent need to belong; historically, surviving challenges and threats depended on forging ties with others. Isolation and a lack of belonging have been major contributors to the recent rise in mental health issues, and the “great resignation” has been attributed in large part to a lack of belonging.

Are AI-powered workplace surveillance tools the answer?

What are these tools, and what benefits do they offer? What unintended consequences might they carry? To what degree do these “good” tools also legitimize excessive employee surveillance? And can we ensure that DEIB tools genuinely promote fair and equitable outcomes?

The Expanding Range of AI Tools for Workplace Belonging

Although digital workplace surveillance has long been commonplace for warehouse and logistics workers, such as UPS drivers, employee engagement and productivity tools are now spreading quickly among knowledge workers. The New York Times, for instance, found that eight of the ten largest private US firms monitor employee productivity. Several of these tools are adding features aimed at furthering internal DEIB, while newer tools focus specifically on DEIB goals.

We analyzed workplace technology tools, particularly AI-based ones, focusing on those that claim to promote a sense of “belonging,” given the importance of this concept to improving workplace equity. The 34 tools we mapped differ in size and scope, but all have goals explicitly related to belonging, and all are currently used by workers and workplaces around the world. Their customers span a wide range of industries, from startups with fewer than 1,000 employees (like Axios) to companies with 5,000–10,000 employees worldwide (like Spotify, Twilio, and Virgin Atlantic) to large enterprises with more than 100,000 employees (like Microsoft, Unilever, and Vodafone).

Three kinds of tools emerge:

  • Data analytics tools that attempt to quantify or evaluate belonging (32.3 percent)
  • Behavior-change tools that aim to enhance belonging (26.5 percent)
  • Tools that combine both (41.2 percent)

Data analytics tools that quantify or evaluate belonging gather data in real time so that employers can learn whom workers are connected to and speaking with, their degrees of inclusion, their engagement, and their emotional states. They do this through a variety of methods, such as surveys and response evaluation, frequent pulse checks, and/or meeting-data tracking. More technically complex offerings include tracking and analyzing communication metadata (from internal emails and messages to external reviews on sites like Glassdoor), using sentiment analysis to evaluate the emotions in qualitative survey data, and mapping employee networks to determine who is connected to whom. Although only a small number of these tools currently use AI, many are exploring how to incorporate AI into their products.
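To make the network-mapping idea concrete, here is a minimal sketch, in Python with the networkx library, of how such a tool might build an employee graph from message metadata and flag weakly connected employees. The sample messages, the centrality measure, and the “weakly connected” threshold are our own illustrative assumptions, not any vendor’s actual method.

```python
# Sketch: mapping an employee network from message metadata alone.
# The sample data and the flagging threshold are illustrative
# assumptions, not any vendor's actual method.
import networkx as nx

# Each record is (sender, recipient) drawn from message metadata;
# no message content is needed for this kind of analysis.
messages = [
    ("ana", "ben"), ("ana", "ben"), ("ben", "chris"),
    ("ana", "dee"), ("chris", "dee"), ("ana", "chris"),
    ("dee", "eve"),
]

graph = nx.Graph()
for sender, recipient in messages:
    if graph.has_edge(sender, recipient):
        graph[sender][recipient]["weight"] += 1  # repeated contact
    else:
        graph.add_edge(sender, recipient, weight=1)

# Degree centrality approximates how connected each employee is.
centrality = nx.degree_centrality(graph)
for person, score in sorted(centrality.items(), key=lambda kv: kv[1]):
    flag = "  <- weakly connected" if score < 0.5 else ""
    print(f"{person}: {score:.2f}{flag}")
```

Even this toy version shows why privacy questions follow immediately: the graph is built entirely from metadata that employees may not know is being collected.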

Behavior-change tools promote organizational belonging, frequently through digital “nudges.” Nudge theory, a concept from behavioral economics, holds that subtle prompts and positive reinforcement can influence people’s thinking and behavior. These context-based, customizable nudges can be delivered by text, email, Slack, and other channels. Most digital nudging technologies use machine learning to tailor nudges based on meeting minutes, individual communications, and other internal data. Nudges can offer advice on a variety of DEIB- and wellbeing-related subjects, encourage inclusive workplace behavior or learning linked to DEIB topics, and promote inclusive language and work practices specific to particular roles or functions. In addition to nudging, some solutions give managers and staff a forum to exchange compliments, acknowledgments, and other positive reinforcement for a job well done.
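As a rough illustration of how such a nudge might be generated, the sketch below flags meeting attendees who spoke very little and drafts a message for their manager. Real products reportedly use machine learning over communications data; this rule-based version, with invented names and an invented speaking-share threshold, only shows the shape of the idea.

```python
# Sketch: a rule-based "inclusion nudge" generator. Real tools
# reportedly use ML over communications data; the names and the
# speaking-share threshold here are invented for illustration.
from dataclasses import dataclass

@dataclass
class MeetingRecord:
    attendee: str
    minutes_spoken: float
    meeting_length: float

def nudge_for(record: MeetingRecord, min_share: float = 0.05) -> str | None:
    """Return a nudge message if the attendee barely spoke, else None."""
    share = record.minutes_spoken / record.meeting_length
    if share < min_share:
        return (f"{record.attendee} spoke for only {share:.0%} of the last "
                f"meeting. Consider inviting their input next time.")
    return None

records = [
    MeetingRecord("ana", minutes_spoken=12.0, meeting_length=30.0),
    MeetingRecord("ben", minutes_spoken=0.5, meeting_length=30.0),
]
for record in records:
    message = nudge_for(record)
    if message:
        print(message)  # in practice, delivered via Slack, email, etc.
```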

Promising, but Worries Linger

These technologies have the potential to make DEIB initiatives more effective, affordable, and scalable, and to improve how organizations understand and advance them. But tools that use personal data to generate insights and drive tailored behavior change raise serious concerns.

Data privacy | The mapped tools illustrate a variety of data privacy strategies. Employees cannot always choose what data is gathered, and the protection offered to their personal information, and to the insights gleaned from it, varies considerably. Individual users of Cultivate, for instance, must voluntarily grant the platform access to every form of data it gathers, including the content of their emails, calendars, and conversations. Medallia gives employees the choice to share certain information, such as call transcripts, but also gathers signals automatically from email and calendar metadata without giving them a chance to refuse. Employees frequently have no idea what information is being collected about them, much less a say in it. And even with protections such as anonymization in place, managers or bad actors may still be able to access personal information, such as email content.

Transparency | To what extent are workers aware of how their data is used? Tools that deliver nudges to promote behavior change vary in how transparent they are about how those nudges are generated. Microsoft VIVA, for example, lets individual employees see the source of the data behind their nudges. Humu takes a similar approach, using hyperlinks to show which data points prompted each nudge and why it was sent to that employee. But while some tools draw on HR and demographic data to generate broader insights, most nudge-delivery tools do not surface this to employees, and it is unclear whether employees know their data is being used in this way.

Bias | Bias can enter AI tools at multiple stages. In particular, AI systems make decisions based on the data they were trained on, and that data may contain bias. Research indicates, for instance, that women frequently network with peers or lower-level employees and may miss networking opportunities because of caregiving duties; women’s networks within organizations are also known to be weaker than men’s. Tools that suggest connections based on pre-existing networks risk amplifying these disparities and sustaining gendered networking patterns. Microsoft VIVA, for example, encourages employees to connect with one another based on signals like who is praising and recognizing whom, which could unintentionally strengthen existing networks. Other tools aim instead to diversify the networks being formed within organizations: the Slack app Donut, for example, can randomly connect people across departments, locations, and leadership levels, introducing people who might not otherwise interact.
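To illustrate the design difference, here is a minimal sketch of randomized cross-department pairing in the spirit of Donut. The roster and the pairing rule are hypothetical; Donut’s actual matching logic is not public, so this only shows the general approach of injecting randomness rather than reinforcing existing ties.

```python
# Sketch: randomized cross-department pairing, in the spirit of
# tools like Donut. The roster and pairing rule are hypothetical,
# not Donut's actual algorithm.
import random

roster = {
    "engineering": ["ana", "ben", "chris"],
    "marketing": ["dee", "eve", "frank"],
    "finance": ["gia", "hal"],
}

def cross_department_pairs(roster: dict[str, list[str]]) -> list[tuple[str, str]]:
    """Randomly pair people, preferring partners from other departments."""
    pool = [(person, dept) for dept, people in roster.items() for person in people]
    random.shuffle(pool)
    pairs = []
    while len(pool) >= 2:
        person, dept = pool.pop()
        # Prefer a partner from a different department; fall back to anyone.
        idx = next((i for i, (_, d) in enumerate(pool) if d != dept), 0)
        partner, _ = pool.pop(idx)
        pairs.append((person, partner))
    return pairs

for a, b in cross_department_pairs(roster):
    print(f"Coffee chat: {a} + {b}")
```

The key design choice is that connections come from randomness rather than from existing communication patterns, so the tool cannot simply replay whatever network inequities already exist.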

Incoherence | Tools often lack a clear, evidence-based understanding of what constitutes “belonging” and what signals should stand in for it. For example, one factor Medallia considers when evaluating “belonging” is whether workers use their paid time off as soon as they earn it rather than banking it. Beyond this signal’s shaky connection to belonging, parents may save up time off for caregiving needs, so the metric could penalize parents, especially mothers, who often shoulder the majority of caregiving responsibilities. In a similar vein, Cultivate analyzes communication metadata to measure how frequently managers “express doubt, request feedback, and share opinions” as a gauge of whether they foster psychological safety. But while these signals track leader behaviors associated with fostering psychological safety, they may not capture whether employees actually experience it.

Slippery slope | Since tools like these already constitute a sizable and quickly expanding market, further developments, such as AI systems that track and infer the emotional and cognitive states of remote workers, are likely. Consider the new virtual classroom software that Classroom Technologies and Intel are developing: it can be added to Zoom and is designed to analyze students’ behaviors and facial expressions to determine whether they are engaged, bored, or confused. Similar technologies are being tested and deployed in the virtual workplace through digital and video communication platforms. Although these technologies appear to be built with good intentions, the capture and assessment of emotions and facial expressions is fraught with controversy and lacks solid scientific backing.

More broadly, we worry about excessive monitoring carried out in the name of DEIB. Even when their data collection advances DEIB, these tools remain instruments of monitoring in private spaces. Monitoring established with “good” intentions can still violate privacy and, over time, strengthen control in the workplace. Moreover, marginalized communities, especially Black and Brown communities in the US, have long been the primary targets of surveillance, so surveillance in the name of equity risks paradoxically concentrating bias on the very people it is meant to serve.

To be clear, these issues do not apply to every tool we mapped. Tools such as Donut, which randomly assigns employee connections, and Everyday Inclusion, which offers science-based, non-customized “inclusion nudges” to employees, do not raise the issues we discuss here. But we encourage leaders to weigh the potential drawbacks, as well as the benefits, of tools that use personal data to generate insights and customize behavior.

How Can Social Change Leaders Help?

Social change leaders need to be thoughtful about the technologies they use, promote, fund, or invest in under the DEIB banner. To produce more just and equitable outcomes, tools that promote belonging must be designed and managed with the utmost care and respect. Social change leaders should ask themselves:

  • Which biases and power dynamics might the tool unintentionally reinforce? How likely is it to maintain disparities in who is networked and connected to whom? In what ways could it help some employees gain recognition and appreciation while excluding others?
  • How are the tool’s development teams ensuring that every employee has an equal chance to be seen and heard within the company, while accounting for differences in how individuals are perceived and heard? Have all necessary steps been taken to guarantee fair outcomes for every employee?
  • Is the tool built on solid scientific evidence that holds across a wide range of identities, groups, and cultures? Or does it rest on assumptions that may not play out as intended?
  • Is the team developing and managing the tool diverse in terms of both demographics and disciplines? Is the team prepared to anticipate how different people might use and perceive the tool?
  • Are strong privacy safeguards built in? Has thought been given to how managers or bad actors might use the tool in ways that could, intentionally or not, perpetuate bias and discrimination?
  • Is it clear to employees how their personal data is collected and used? Can they easily consent to, or refuse, data collection?

It is tempting to believe that technology can resolve intractable problems like a lack of belonging and unfairness at work. We must be cautious about the promises made on behalf of technology and AI. Tools like these can be very beneficial, but as social change leaders we need to demand more, asking hard questions to better understand the potential ramifications of these tools and how power is reproduced within and through them. We can also support organizations and innovations that prioritize and uphold justice, from management through design.

Ultimately, scaling up AI and surveillance in the name of DEIB is a risky move. Building more just and equitable workplaces will require advancing tools for social change with thoughtful, questioning, and intentional leadership and investment. In many situations, however, the most important question is: Should this tool even be built?
