By Dustin Moores
“We are experiencing a higher than normal volume of calls at this time. Please stay on the line to maintain your priority. We value your call. Your current wait time is approximately….”
For many of us, messages like these are all too familiar. In my case, it seems that whenever I call my bank, my wireless provider, or my insurance company, I am greeted with some derivative of the above. Even when I call at a “quieter” time, I can’t seem to avoid that dreaded message. How is it that my mobile provider experiences higher-than-normal call volumes at 2:17 a.m. on a Wednesday? If they really valued my business so much, wouldn’t they just pick up the phone?
Recently, I discovered there might be method to this madness: chances are, I am not the victim of some cruel “forever-on-hold” curse. And as it turns out, data brokers might have something to do with it.
My hunch is that instead of a curse, I am the victim of something perhaps equally sinister: consumer ranking products. Consumer ranking helps companies determine our potential value to them. In turn, they can make better decisions about what products and services to offer us. One such product allows businesses to improve call-routing decisions by “identify[ing] the inbound callers most likely to convert, those with the highest lifetime value and those most receptive to cross-sell offers.” Armed with consumer ranking, today’s call centres know who you are before your call is even answered. As the New York Times notes, these products “can determine whether a customer is routed promptly to an attentive service agent or relegated to an overflow call center.” Suddenly, all that time I’ve spent on hold makes sense. I must not possess a high lifetime value. Ouch.
As a student saddled with significant debt and a limited budget, a company that employs consumer ranking would likely predict I won’t be shopping for big-ticket items anytime soon. Debt and other indicators suggest I’m at greater risk of defaulting on loans or other payments. Accordingly, I am sent down to the bottom of the customer value heap. I can wait while “high value” customers are bumped to the front of the line.
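The routing logic described above is, at its core, a priority queue ordered by a predicted score. The following is a minimal sketch of how such value-based call routing might work; the class name, caller labels, and scores are all invented for illustration, not taken from any real product.

```python
import heapq

class CallQueue:
    """Hypothetical call queue that answers callers with the highest
    predicted lifetime value first, regardless of arrival order."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker that preserves arrival order

    def add_caller(self, caller_id, lifetime_value_score):
        # heapq is a min-heap, so negate the score to pop high scores first
        heapq.heappush(self._heap, (-lifetime_value_score, self._counter, caller_id))
        self._counter += 1

    def next_caller(self):
        return heapq.heappop(self._heap)[2]

queue = CallQueue()
queue.add_caller("student_with_debt", lifetime_value_score=12)
queue.add_caller("high_value_customer", lifetime_value_score=87)
queue.add_caller("mid_value_customer", lifetime_value_score=45)

print(queue.next_caller())  # high_value_customer
```

Even in this toy version, the student who called first is answered last: the queue position is decided entirely by the score, which is exactly the “invisible prioritization” at issue.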
The rise of consumer ranking underscores serious questions about fairness brought on by new, predictive technologies. Brokers and their clients, armed with data and algorithms, make assumptions about us and decisions based on those assumptions, all without our knowledge. They make inferences regarding our race, age, sexual orientation, socioeconomic or health status. Their decisions can affect the products, services, and prices offered to us, or even whether our calls are sent to an overflow centre. Some thinkers, like Ian Kerr, Canada Research Chair in Ethics, Law, and Technology, argue that when it comes to predictive technologies, there “is wisdom in setting boundaries around the kinds of assumptions that can and cannot be made about people.” And where assumptions are made, individuals should have the opportunity to “observe, understand, participate in, or respond to” them.
While consumer ranking products raise a host of concerns when they operate as planned, the consequences can be devastating when they get it wrong. You may not qualify for a mortgage or, depending on where you live, you may not be accepted into a university or admitted into a hospital based on faulty assumptions about your ability to pay.
When ranking products get it right, it can seem as though the cards are stacked against us. Brokers and their clients potentially know more about our consumption habits than we do ourselves. As a result, they can manipulate us by displaying ads at the optimal time and place to elicit a purchase, or they can offer personalized upgrades to squeeze that one last dollar from our hands. They can also hide certain products and services from our view, limiting our available choices.
Moving forward, we must ask ourselves what limits, if any, should be imposed on the types of data companies collect about us and the assumptions they are allowed to make. How do we maintain a level playing field for consumers when businesses are armed with powerful predictive technologies? How do we let consumers review and correct errors in the information held about them by hundreds, if not thousands, of independent brokers? How do we ensure brokers and their clients use our information responsibly? And finally, what role might data brokers play in widening the gap between the most and least fortunate members of our society? Consumer ranking can create two-tiered systems that “invisibly prioritize” some consumers over others. Are we willing to leave the less fortunate among us perpetually on hold?