
Decoding the psychology of big data and smart decisions

September 8, 2021

Contributed by: Sonia Verma, McMaster University


Algorithms and data drive organizational decisions on hiring, staffing, priorities, production, expansion, marketing, partnerships and many other matters. If an algorithm is skewed, or if the organization doesn’t use it correctly, the result can be poor decision-making with far-reaching financial and social implications.

Maryam Ghasemaghaei is working to make sure that doesn’t happen.

The Chair of Information Systems at the DeGroote School of Business is studying how organizations can apply advanced technologies — like big data analysis and artificial intelligence — to make better decisions.

The big deal about big data

Big data refers to more than just a large amount of data. It has to satisfy three criteria: a large volume of information, of course, which arrives at high velocity and with a great deal of variety or complexity. Firms can analyze that data descriptively, to understand something from the past (like revenue for a previous quarter); predictively, to forecast a likely future outcome (like projecting profits); or prescriptively, to inform a course of action (like where to open a franchise outlet or whom to hire).
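
To make this concrete, here is a minimal, hypothetical Python sketch of the three modes of analysis. The revenue figures, candidate locations and naive trend model are invented for illustration; they are not drawn from Ghasemaghaei’s research.

```python
# A hypothetical illustration of descriptive, predictive and prescriptive analysis.
# All numbers below are made up for the example.

quarterly_revenue = [1.20, 1.35, 1.28, 1.50]  # past four quarters, in $M

# Descriptive: summarize what already happened.
last_quarter = quarterly_revenue[-1]
average = sum(quarterly_revenue) / len(quarterly_revenue)
print(f"Descriptive: last quarter ${last_quarter:.2f}M, average ${average:.2f}M")

# Predictive: project a likely future outcome (here, a naive linear trend).
growth = (quarterly_revenue[-1] - quarterly_revenue[0]) / (len(quarterly_revenue) - 1)
forecast = quarterly_revenue[-1] + growth
print(f"Predictive: next quarter forecast ${forecast:.2f}M")

# Prescriptive: recommend a course of action based on projected outcomes.
candidate_locations = {"Hamilton": 0.40, "Burlington": 0.25}  # projected uplift, $M
best = max(candidate_locations, key=candidate_locations.get)
print(f"Prescriptive: open the next franchise outlet in {best}")
```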

We all want to make decisions based on data and informed by evidence, Ghasemaghaei says.

“But there’s actually a very high failure rate when it comes to correctly using big data, and part of my research focuses on helping organizations use it correctly.”

If a company or organization invests in big data, it also needs to invest in the resources and staff to correctly analyze the massive volume of complicated data coming in very, very fast, she explains. Firms need to make sure they not only have enough staff, but that those staff are trained to use the data effectively and efficiently.

The dark side of big data

You can use rich, complex data to inform all kinds of organizational or business decisions, Ghasemaghaei says.

In fact, many companies have automated the initial steps of hiring new employees, and part of Ghasemaghaei’s research focuses on making that process fairer and less prone to bias and discrimination. She says: “Firms may make discriminatory decisions when they use biased algorithms. This can cause certain groups in society to constantly receive unfair decisions.”

For instance, if an organization has historically hired a certain kind of candidate, a hiring algorithm is likely to skew toward continuing to recommend that kind of person. So if most past hires are men, the algorithm is likely to recommend a male candidate over an equally or even more qualified female candidate.
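
That mechanism can be shown with a small, hypothetical Python sketch (this is not Ghasemaghaei’s actual model; the hiring records and scoring rule are invented). A recommender that weights candidates by each group’s historical hire rate simply reproduces the skew already present in the data it learned from.

```python
# A hypothetical sketch of how historical bias propagates into a hiring algorithm:
# a model that learns from skewed past decisions keeps recommending the same skew.
from collections import defaultdict

# Made-up past hiring records: (gender, qualification score 0-1, was_hired)
history = [
    ("M", 0.70, True), ("M", 0.60, True), ("M", 0.65, True),
    ("F", 0.80, False), ("F", 0.75, False), ("M", 0.55, True),
]

hire_stats = defaultdict(lambda: [0, 0])  # group -> [hires, applicants]
for gender, _, hired in history:
    hire_stats[gender][0] += int(hired)
    hire_stats[gender][1] += 1

def recommend_score(gender, qualification):
    """Naive ranking: qualification weighted by the group's historical hire rate."""
    hires, applicants = hire_stats[gender]
    return qualification * (hires / applicants)

# An equally or even more qualified female candidate still ranks lower.
print("Male candidate, qualification 0.70:  ", round(recommend_score("M", 0.70), 2))
print("Female candidate, qualification 0.80:", round(recommend_score("F", 0.80), 2))
```

In this toy example, the female candidate’s higher qualification score is cancelled out by her group’s historical hire rate of zero, which is exactly the kind of skew the research examines.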

“I’m trying to understand whether and why algorithmic bias may lead to making discriminatory decisions and how biases in algorithms affect accepting recommendations that analysts offer,” she says. “If managers know an algorithm is biased, would they be willing to accept its recommendations, or would they disregard them?”

That depends on many things, it turns out, some of them very hard to quantify: moral identity, for instance, which includes the manager’s belief in stereotypes. Managers who believed in stereotypes accepted biases against women, regardless of their own age and gender, Ghasemaghaei found.

“Those with ‘high’ moral identity were less likely to accept an algorithm with high bias,” she notes. “But those who trusted the algorithm accepted its recommendations and didn’t even question whether it was biased.”

Because this field is evolving rapidly, a great deal of research — including Ghasemaghaei’s own earlier work — has focused on the positive aspects of using advanced technologies. But increasingly, she says, there’s a need to consider their negative effects as well.

One of her research projects examines whether big data is a factor in workplace knowledge hiding, where people conceal information or share incomplete information.

“I found that the volume and velocity of data increased knowledge hiding,” she says. “One main reason could be that data analysts put a great deal of effort into studying such large volumes of data in a short time. They worked very hard to arrive at their insights, and are reluctant to share them with their colleagues, perhaps for fear that they are giving up power.”

Psychology or tech?

She’s also working with researchers from other disciplines to study the key factors that shape how CEOs successfully pursue new strategic initiatives, like introducing a big data-oriented strategy, and whether data analysts under immense pressure to analyze big data are prone to cut corners or cheat on their analyses.

For Ghasemaghaei, exploring the links between unexpected factors is one of the things that makes her work exciting. “You have a problem that nobody has figured out, and what really excites me is the outcome that I get after studying it for a few years.”

The more Ghasemaghaei explains, the more her work sounds like psychology rather than technology.

That’s exactly right, she says. “Information systems is all about the use of technology. If we find there are complicated technologies that people can’t process or use, they may come under a lot of stress and it’ll affect their performance.”

“When my findings impact people’s lives and I can help organizations and people succeed, it’s very exciting.”

