What’s Your Bias?
Diversity of talent and an inclusive work environment are becoming more than just socially responsible business practices.
They are increasingly seen as a crucial part of evolving and digitizing the insurance industry. For those leaders who want diversity and inclusion but aren’t sure how to get there, technology can help.
Computer software can do everything from broad sweeps of communications to very targeted checks for problematic activities. Some programs are built to make sure job application forms and other hiring outreach efforts aren’t skewed toward certain groups and against others. Those programs identify and eliminate “barrier language,” which is application information that allows HR staff to unconsciously cull prospects based on ethnicity, gender and other characteristics that have little or nothing to do with skills and potential success in the position advertised.
Mercer and RedThread Research released a study in February 2019, Diversity & Inclusion Technology: The Rise of a Transformative Market, which states, “While there are many potential benefits of D&I technology, the most apparent one is the opportunity to create consistent, scalable practices that can identify or mitigate biases across organizations, often in real time.”
Ironically, the high capability of these D&I programs may be an obstacle to their adoption. Though a company’s leadership may desire a culture of diversity and inclusion, getting granular on those topics can be a dicey proposition. Even with transparency as the word of the day, some members of the C-suite are apprehensive about “the full Monty.” In fact, among senior leaders interviewed, the level of comfort in reviewing potential determinants of bias, discrimination, and exclusion covers a broad spectrum. Some executives are leery of the picture a straight numerical depiction of internal communications might paint. Others believe monitoring employee communications sounds invasive.
The D&I tech industry understands that, and it is willing and, to a large degree, able to customize data collection and analysis based on the level of incursion an organization thinks is reasonable. The presentation of that data can also be controlled. It can be displayed as a dashboard of graphics that help an organization’s leaders measure conduct by type, frequency and severity—for example, patterns in emails or other communications as well as callbacks on résumés. The software can also issue findings for quarterly reports or even be set up to send real-time alerts based on language, images or other red-flag content.
Preempting Risky Behavior
Communications among members of an organization are data trove numero uno. And the ability to scour communications at an anonymized macro level is starting to make such oversight appealing for employment practices purposes.
“It was probably in the offing even absent #MeToo,” says Kelly Thoerig, senior vice president and EPL coverage leader at Marsh. “But now there is a push to put a finer point on what a company’s exposure is to EPL complaints.” That includes applying data collection and analysis programs that can identify problems and alert the right people.
“I’m aware of a number of data-mining platform organizations that provide these services, and more of our clients are starting to look at these tools to better track what is going on within the company and ideally help address the problem at the point of action,” Thoerig says.
Indeed, a niche industry is developing to serve companies seeking to get a handle on their overall D&I culture.
“We measure human relationships in the workplace using a digital trace of communications over email and other digital systems to identify relationships that are an anomaly,” says Greg Newman, people analytics product manager at TrustSphere, an international relationship analytics firm that primarily focuses on the financial services industry. “We don’t touch the content or substance of the communications. We look at the metrics of interactions: how long have they been interacting, the volume of communications, the parity of communications (is one person sending a lot but not getting a response), the date and time of communications (which shows the priority the parties assign to the communications), and if there are attachments to the communications (which indicates how strong the communication is).”
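The metadata-only approach Newman describes can be illustrated with a short sketch. This is not TrustSphere's product; it is a hypothetical example showing how volume, parity, timing and attachments can be computed from message metadata alone, with the field names (`sender`, `sent_hour`, `has_attachment`) invented for the illustration.

```python
from collections import defaultdict

def interaction_metrics(messages):
    """Aggregate per-pair interaction metadata; never touches content.

    `messages` is an iterable of dicts with hypothetical fields:
    sender, recipient, sent_hour (0-23), has_attachment.
    """
    pairs = defaultdict(lambda: {"sent": 0, "received": 0,
                                 "after_hours": 0, "attachments": 0})
    for m in messages:
        key = tuple(sorted((m["sender"], m["recipient"])))
        stats = pairs[key]
        # Volume and parity: count each direction of the pair separately.
        if m["sender"] == key[0]:
            stats["sent"] += 1
        else:
            stats["received"] += 1
        # Timing as a rough proxy for the priority the parties assign.
        if m["sent_hour"] < 8 or m["sent_hour"] >= 18:
            stats["after_hours"] += 1
        if m["has_attachment"]:
            stats["attachments"] += 1

    results = {}
    for key, s in pairs.items():
        # Parity near 0 means one-sided traffic; near 1 means balanced.
        parity = min(s["sent"], s["received"]) / max(s["sent"], s["received"], 1)
        results[key] = {"volume": s["sent"] + s["received"],
                        "parity": round(parity, 2),
                        "after_hours": s["after_hours"],
                        "attachments": s["attachments"]}
    return results
```

A one-sided pair (high volume, low parity) is the kind of anomaly such a system would surface for review.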
The program doesn’t predict if there will be internal misdeeds, but if there is a suspicion of wrongdoing, TrustSphere can “target the person in question,” Newman says. The product can be integrated into Salesforce and other platforms to mine corporate communications.
Relativity Trace is another new product, a compliance program that helps large firms identify internal exposures based on the firms’ customized needs. The software ingests various forms of content—audio, text, images, etc.—then continuously and automatically indexes that content to identify yellow flags. It then sends reports or alerts to designated team members at the client company so they can evaluate the data for further action. The organization is alerted based on rules it sets for the exposures it wants to monitor in the particular way it defines each exposure.
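The rule-driven flagging pattern described above can be sketched in a few lines. This is a toy stand-in, not Relativity Trace's actual engine: the `Rule` structure and regex matching are assumptions, and a real pipeline would transcribe audio and caption images upstream before text matching.

```python
import re
from dataclasses import dataclass

@dataclass
class Rule:
    """A hypothetical client-defined exposure rule."""
    name: str
    pattern: str   # regex matched against ingested text
    severity: str  # e.g. "yellow" or "red"

def scan(documents, rules):
    """Check each document against every rule; return alerts for review.

    `documents` maps a document id to extracted text.
    """
    alerts = []
    for doc_id, text in documents.items():
        for rule in rules:
            if re.search(rule.pattern, text, re.IGNORECASE):
                alerts.append({"doc": doc_id, "rule": rule.name,
                               "severity": rule.severity})
    return alerts
```

The client, not the vendor, defines each rule, which mirrors the article's point that the organization is alerted "based on rules it sets for the exposures it wants to monitor."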
Morgan, Lewis & Bockius, a global law firm that includes defense of insurers as part of its practice, built its own #MeToo bot to identify behaviors that could indicate a pattern of sexual harassment, a form of workplace exclusion. “The algorithm can analyze virtually any kind of data (structured or unstructured) to which we can get access,” says Tess Blair, a Morgan Lewis partner who founded and leads the firm’s eData practice. Data can be pulled from instant messaging, VoIP, text messages, WhatsApp, social media, the internet of things, location data, and images—“virtually anything on a smart phone, drones, etc. For example, the bot can be used to analyze years of company emails and find evidence of a specific person’s questionable communication,” Blair says.
A program like this can be of assistance to firms that take a proactive approach to diversity and inclusion and can mitigate #MeToo behaviors that drive lawsuits.
“The tool analyzes great volumes of data and then very quickly surfaces markers that require further analysis,” says Blair.
A More Nuanced Approach
Some HR officials issue a cautionary note on trawling through companywide data for yellow flags. They prefer to use a more nuanced approach.
“It depends on where you are in your D&I journey,” says Kim Davis, executive vice president and chief diversity and inclusion officer at NFP. “For example, Textio can go in and look at job descriptions you are posting, emails before you send them, memos that you type and check for language you might want to change. If you are doing that in conjunction with a D&I culture, it could make sense. I’m not sure the constant sweep of all communications would be manageable. It feels like doing it at the individual level would be better.”
Marty Guastella, chief human resources officer at Oswald Companies, has a similar take.
While the technology can facilitate a broad search of data, Guastella says, “I would not be a proponent of something that looks to reveal how many times this guy talks to that guy. It’s much bigger than that. I don’t plan to implement any type of software that would scrape communications to see if person x or person y is not practicing D&I.”
Instead, Oswald has formed a firmwide business resource council that will design, implement and support a consistent program of diversity and inclusion across its seven offices. To help ensure the D&I initiative is baked into operations, the firm is using Chronus, a software-as-a-service platform that helps firms implement mentoring programs involving many participants across many offices, even worldwide. It allows members of Oswald’s business resource council to communicate online more fluidly than they could using group emailing.
TrustSphere also just introduced a new tool for chief human resource officers and HR managers that helps assess a firm’s overall health of diversity and inclusion. The software uses organization network analytics to detect bias at the corporate and team levels (though not at the individual level). It is initially focused on revealing the tendency for people to build networks with others like them in gender, which the company terms “homophily,” but may be expanded to other traits in the future. The goal is to provide empirical data to company leadership that enables organizational change.
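The homophily measurement described above can be approximated with a simple network calculation. This is an illustrative sketch, not TrustSphere's method: it computes, for each person, the share of communication ties to same-gender colleagues, then reports only team-level averages, reflecting the article's point that detection happens at the corporate and team levels, not the individual level.

```python
from collections import defaultdict

def homophily_by_team(edges, gender, team):
    """Average share of same-gender ties, reported per team.

    `edges`: list of (a, b) communication ties; `gender` and `team`
    are hypothetical HR attribute lookups keyed by person.
    """
    same = defaultdict(int)
    total = defaultdict(int)
    for a, b in edges:
        for person, other in ((a, b), (b, a)):
            total[person] += 1
            if gender[person] == gender[other]:
                same[person] += 1
    # Roll individual ratios up to team level before reporting,
    # so no single person's network is exposed.
    team_scores = defaultdict(list)
    for person in total:
        team_scores[team[person]].append(same[person] / total[person])
    return {t: round(sum(v) / len(v), 2) for t, v in team_scores.items()}
```

A team score near 1.0 would indicate networks built almost entirely with similar colleagues, the empirical signal leadership could act on.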
“Everybody wants talent from diverse levels,” says Bob Harrison, chairman and president at Daniel & Henry. “Broadening your base to have a more diverse network will help in that. Knowing that you are communicating exclusively with people who look like me is the first step. The behavioral change is the next step. Senior leaders and HR really need to push it through. It won’t happen overnight.”
Changing the Heart
Unconscious bias is, by definition, formed outside of our own conscious awareness. So just knowing what our own unconscious biases are can be difficult, let alone learning how to preempt them.
“It’s changing the heart that’s hard,” Guastella says. “If you could use a D&I product to expose unconscious bias, that would be very valuable. How we temper our behavior matters on who is in the room. How you behave with your boss is different from how you behave with your best buddy.”
That concept of “changing the heart” has driven the creation of new technology-based products that appeal to deep-seated emotions. Human resources officers can now find numerous virtual-reality programs designed for D&I and employment practices that play upon a person’s emotional response to stimuli.
The philosophy behind VR training is that people gain a better understanding of bias, exclusion, harassment and other workplace mistreatment when they experience it than when they simply read or hear about it. Though lessons differ from company to company, a standard model is to place the trainee in the shoes of an employee who suffers workplace marginalization or abuse. Effectively, the trainee “becomes” that employee and interacts virtually with others who run the gamut—from insensitive to abusive. Key behaviors typically include age, race and gender bias, exclusionary behavior, patronizing or vulgar language, assumption of your role, crediting your role to another person, and making unwanted sexual advances.
The programs go beyond visual representations and scripts; they aim for immersion, which includes sensory experiences and mobility. Trainees can try to move away from people or, conversely, try to be seen or heard by positioning themselves nearer to the virtual staff they work with.
Because of the user’s near-total immersion in the virtual world—stereo speakers on the headset complement the visuals—it becomes easy to forget you are in a conference room. Though the training may lack actual tactile experiences, the brain compensates based on the other sensory inputs and stimulates an emotional response that is very nearly like reality.
“Any time you can take someone and put them into somebody else’s experience, what they go through on a process basis, daily and weekly, it can really have an impact,” Davis says. “People don’t really understand how things affect others until they experience how it feels from that person’s perspective. The systems that I’ve seen so far tend to be really expensive, and I haven’t seen the results. But it seems like it could be good.”
Key to success is follow-up from the firm’s HR trainer. Questions—such as what did you want to say to him, why didn’t you leave, and do you think so-and-so should have helped—can move the trainee from the experience of the problem to the exercise of a solution. Some VR providers integrate response training and bystander intervention into their modules, but follow-up by HR personnel is a more complete method.
The D&I tech industry is only a few years old—still in start-up mode overall. According to Mercer/RedThread Research, it is primarily being used for talent acquisition (43% of all solutions on the market), analytics (26%), advancement/development (19%) and engagement/retention (12%).
And though there are about 100 vendors and 40% are achieving a year-on-year revenue growth of more than 100%, it may be difficult to find someone in a specific corporate peer group who has implemented either training or data analytics. That can make it hard to tell which products are the most effective. Insurance defense attorneys also caution companies to evaluate legal considerations before implementing any D&I tech protocol.
“The main takeaway is there needs to be a plan and process for how you will use that data, because otherwise you generate new problems of employment practices liability,” says Marsh’s Thoerig, who is also a lawyer.
Dove Burns, an equity partner at law firm Obermayer Rebmann who serves as defense counsel for management liability insurance claims, agrees.
“Use of proactive diversity and inclusion analytics is inherently helpful,” Burns says. “However, the practical ramifications of such a collection can cut both ways. Proactive data collection demonstrates a focus on a diverse, multifaceted workforce, which is helpful for public perception, branding and potentially as part of a defense when a discrimination claim is brought. However, it’s far more likely that the hard data will be discoverable and used against an employer if the actual analysis isn’t performed in such a way that privilege can be maintained.
“Once the data is collected, what happens next is also quite important and potentially problematic. Regardless of whether action is taken or not, the employer’s next steps after data is collected will be scrutinized. If a policy is enacted negatively impacting (even if only in perception) one protected group, the plaintiff can allege that the employer is looking to target that group. If the employer does nothing after data collection, that could be equally as damning.”
In the end, D&I technology may provide a new avenue to reveal unconscious corporate bias and pave the way to removing it. The challenge for HR is to develop policies for implementation that improve diversity and inclusion without exposing the company to EPL complaints. It’s definitely an area where human resources leaders can make a major impact on corporate culture.
Virtual reality technology can offer a more effective way to improve workplace behaviors. It is designed to put employees in situations where they not only can develop empathy for co-workers but also can learn and practice responses in situations where they feel uncomfortable.
While role-playing with colleagues has traditionally been used to leverage the power of human interaction, this method has shortcomings. Many people feel awkward in public settings and often deliver contrived responses to prompts. To overcome those flaws, VR providers help companies build programs that allow trainees to interact virtually in a highly personalized, individual setting, though group training or follow-up can be integrated depending on the need.
In a VR session, employees don a headset that contains stereo speakers and goggles. They are also typically equipped with some kind of handset providing haptics: a joystick, dual controllers or gloves, all of which vibrate, rumble or create pressure—increasing in price as they become more complex. Those tools allow the trainee to participate fairly functionally in the scenarios fed into the system. The trainee can move about the virtual environment, manipulate objects and converse.
“A lot of the tools are based on conversational interactions,” says Michael Casale, chief scientist at VR provider Strivr. “The program automatically translates words to the analytic tools that smartly parse out what you said and your tone of voice, even your sentiment. The most sophisticated approach uses some sort of AI so the virtual staff can provide an appropriate response. That works well for simple types of conversations—where you already know the topic and the information that will be provided.”
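The "simple types of conversations" Casale mentions can be suggested with a deliberately crude sketch. This is not Strivr's system: real products use speech-to-text plus trained sentiment and intent models, whereas this hypothetical example uses keyword lists just to show how a trainee's tone can steer a scripted virtual-staff reply.

```python
def respond(utterance):
    """Pick a scripted virtual-staff reply from a trainee's utterance.

    Toy keyword matching stands in for real sentiment/intent analysis.
    """
    text = utterance.lower()
    # Hostile wording steers the scenario one way...
    if any(w in text for w in ("calm down", "stupid", "ridiculous")):
        return "I don't appreciate being spoken to that way."
    # ...while empathetic wording steers it another.
    if any(w in text for w in ("sorry", "understand", "help")):
        return "Thank you. So what can we do to fix this?"
    return "I'm not sure you're hearing me. Let me explain again."
```

Because the topic and likely responses are known in advance, even simple classification can keep the exchange feeling natural, which is why this approach works for call-center-style scenarios.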
These conversational tools are often used in call-center and CSR training, where many incoming communications deal with the same issues.
“Some of our interactions are very specific: call center training, how do you deal with an unhappy customer, how do you provide bad news as a doctor,” Casale says. “In Strivr’s product, you are provoked to give a response to input so you can practice in a natural environment. You feel like you’re really talking to someone but not in the artificial environment of a group role play. So you’re not fumbling for words.”
Conversational training for phone interaction can also help employees deal positively with clients of different backgrounds and cultures whose communication styles evoke various emotions, such as frustration, impatience or condescension.
More realistic training scenarios, such as those involving uncomfortable touches or more complex conversations, are not yet on the market.
“People have even made full-body suits—motion-capture suits for video games where you would be getting input instead of giving input,” Casale says. “Those things are not ready for prime time, but that technology is coming along.”