To Adapt to Tech, We’re Heading Into the Shadows

Almost all work settings that deal with intelligent technologies have one overarching goal: Figure out how to get value out of the damn thing. For technologists this is mostly about how to design and build. For marketers and business development professionals, how to pitch, and to whom. For managers, when to buy and how to implement. For users, how to build and master new techniques. Over 80 years of social science tells us very clearly that if approved means won’t allow these interdependent professionals to innovate and adapt their way forward into this exponential technostorm, some percentage of them are going to turn to inappropriate means to do so.

We’ve wired up the globe with an interconnected system of cheap sensors—keyboards, touchscreens, cameras, GPS chips, fingerprint scanners—along with networks to transmit and store the data, and now, crucially, machine learning-style algorithms to analyze and make predictions based on this data. Each year that we build out this infrastructure, it gets radically easier to observe, analyze, judge, and control the behavior of each of us as workers—let alone as citizens. And work has gotten a lot more complex. Just a decade or two ago, the only authority that had any sway in complex work was the expert on the scene. Now we’ve got a host of professionals and paraprofessionals with distinct expertise who get a say in how the work is going and who should be rewarded and punished. This comes via formal mechanisms like 360-degree performance reviews but also informally: Who gets to decide whether a professor is pacing her lectures appropriately, or whether a beat cop is taking too long to report back as they reach their patrol destinations? Or whether any of us is adapting or innovating appropriately? Ten years ago, the answer was basically one person. Now it can be many, including those who have access offsite and after the fact. Anyone can call foul, and all of them are empowered with massive new sources of rich data and predictive analytics.

All this means that the grey area is shrinking. Few people prefer to innovate and adapt in ways that risk catastrophe or punishment—but some will turn in this direction when they know that approved means will fail. Like it or not, more and more critical innovation and adaptation will be happening in areas of social life previously reserved for “capital D” deviants, criminals, and ne’er-do-wells. Leaders, organizations, groups, and individuals that get wise to this new reality will get ahead.

But how? How can we look into the shadows to find these sketchy entrepreneurs, understand their practices, and capitalize on them while maintaining trust in our critical values?

Here are some questions to ask yourself, drawn from early indicators I’ve seen on the front lines of work involving intelligent machines.

Can you exercise surveillance restraint? Sometimes your organization, team, or even a single coworker will adapt more productively if you leave stones unturned and cameras off. In robotic surgery, even a tiny step in this direction might mean turning off the TVs while a resident is operating. You might want to do this kind of thing earlier in residents’ training, to give them space to make minor mistakes and to struggle without the entire room coming to a snap judgment about their capability. It’s that kind of early judgment that leads residents to conclude they have to learn away from prying eyes.

The broader point is that surveillance, analysis, prediction, and control eventually stop yielding returns: not because the data or predictions are wrong, but because you are destroying the under-observed spaces where people feel free to experiment, fail, and think through a problem. Moreover, excessive surveillance, quantification, and predictive analytics can drive the work experience down the toilet. Rolling this back will be exceptionally difficult in cultures or organizations that prize technical progress and data-based decision making.
