Why People Matter More than Algorithms
There’s a lot to love about an algorithm.
For one thing, it’s orderly. An algorithm is a step-by-step procedure: you follow the steps, often repeating the same calculations, and the repetition starts to feel familiar, even friendly. When you arrive at the algorithm’s solution, you also feel a sense of accomplishment.
I don’t know if computer software is capable of developing feelings about the algorithms it solves. I don’t think it cares, at least not yet. I do know, however, that computers excel at deep learning, and deep learning is built by stacking layers upon layers of calculations, each one solved in fractions of a second.
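To put a rough picture on “layers upon layers,” here is a minimal, made-up sketch in Python of what a deep network’s stacked layers amount to: each layer is a quick calculation whose output feeds the next. The sizes and values are arbitrary and purely illustrative, not any particular system’s architecture.

```python
import numpy as np

# Toy illustration: a "deep" model is just many simple layers stacked up.
rng = np.random.default_rng(0)
x = rng.random(8)                                # made-up input features
layers = [rng.random((8, 8)) for _ in range(5)]  # five stacked layers of weights

for weights in layers:
    # One layer: a matrix multiply followed by a simple nonlinearity (ReLU).
    x = np.maximum(0, weights @ x)

print(x)  # the input after being pushed through every layer in turn
```

Each pass through the loop is trivial on its own; the power (and the speed) comes from doing it over and over, millions of times, faster than any person could.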
As much as I like a good algorithm, I can’t do that.
Like you, I bring something else to the table, and it can’t be replicated or replaced by machine learning.
Humans interpret contextually
People are far better at dealing with ambiguity. Algorithms don’t do well with the gray areas between black and white; they operate in a world of yes or no, 1 or 0. They can’t evaluate or judge. Computers are incapable of true synthesis, and feeling has no influence on them. Every solution is a series of cause-and-effect steps, much like a flowchart, with no wiggle room and no grace.
Stanford recently ran an experiment pitting high school students against computer algorithms. The goal was to compress images without compromising clarity or quality. The high schoolers won, hands down. It was all a matter of perspective; as it turns out, people are the sharper judges of image quality.
Algorithms make recommendations, but not for the better
Some people harbor concern that complex math formulas are trying to influence our decision-making. Algorithms, they say, are telling us what to do. Algorithms are getting smart, but they haven’t outsmarted us.
Wharton professor Kartik Hosanagar has another take on how algorithms influence us: he thinks that without our supervision and monitoring, they make poor decisions for us. They lack subjective subtlety. Hosanagar recommends that the data algorithms generate be transparent so that we can control the decisions being made for us. It’s up to us to act on or ignore the advice of algorithms.
Every time you make a purchase on Amazon, for example, an algorithm recommends another purchase with the words, “Customers who bought this item also bought . . . .” The online retailer’s algorithms know how to pair the items you might consider for your next purchase. Netflix and Pandora do something similar, recommending and curating entertainment according to your interests.
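For a sense of what that pairing can look like under the hood, here is a minimal sketch of a co-purchase recommender in Python. The baskets and item names are invented, and this shows only the general idea of counting what gets bought together, not how Amazon’s own system actually works.

```python
from collections import Counter, defaultdict

# Hypothetical purchase history: each basket is one customer's order.
baskets = [
    {"coffee maker", "coffee filters", "mug"},
    {"coffee maker", "coffee beans"},
    {"coffee filters", "coffee beans", "mug"},
    {"mug", "tea sampler"},
]

# Count how often every pair of items shows up in the same basket.
co_occurrence = defaultdict(Counter)
for basket in baskets:
    for item in basket:
        for other in basket:
            if other != item:
                co_occurrence[item][other] += 1

def recommend(item, k=3):
    """Return the k items most often bought alongside `item`."""
    return [name for name, _ in co_occurrence[item].most_common(k)]

print(recommend("coffee maker"))
# e.g. ['coffee filters', 'coffee beans', 'mug'] -- "customers who bought
# this item also bought . . ."
```

Notice what the sketch does not do: it has no idea why anyone bought anything, only that they did. The judgment about whether the suggestion actually fits your life stays with you.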
It’s up to you whether or not to take your computer up on its suggestion.
Letting a computer choose for you might not seem significant now, but eventually it could be algorithms that determine whether you get a promotion or a bonus this year. A computer may decide which medical treatment is best for you, both physically and financially.
Most of us want humans making those decisions, not an inanimate device that relies entirely on objective data.
In the end, people will always matter more than algorithms.