Trust requires consistency and transparency. It must be understandable. And we need regulations that protect the public.
By Milton Friesen
Social Cities program director
The big success stories of our time have scaled at exponential rates – Facebook, Amazon, Apple and the other ‘super bigs’ relentlessly enlarge themselves. When something scales, however, the mix of characteristics, benefits, and costs of a business or organization may scale with it.
In contemporary life, the tools that business, research and social interaction depend on carry such mixes of benefits and costs with them. Their complex and intricate nature means the functioning of algorithms, calculations and processing devices is opaque to nearly all of us.
We also depend on them in hundreds of ways, as Pedro Domingos (The Master Algorithm) so eloquently points out. That combination of dependence and invisibility requires a high degree of trust – directly, personally, by each of us.
Consider recent news about Charlsie Agro and her sister Carly, identical twins who submitted their DNA to five testing businesses and received five different profile answers. The magic of sending in a bit of your DNA and having a scientific wizard read your crystal ball has been jostled by this smart and practical experiment.
While the results don’t undermine the science of DNA testing, the conflicting results suggest we should temper our confidence in consumer-level testing.
In that case, the explanation is that science requires interpretation. Algorithms are tuned uniquely to the companies that own and maintain them, so variability enters the equation. That alone isn’t disqualifying – we don’t trust only what’s perfect, or we’d trust nothing.
However, trust requires consistency and transparency. By extension, it must also be something we understand.
A Time article by Roger McNamee, a Facebook whistleblower and former mentor of Mark Zuckerberg, complements these considerations. Where we can’t all understand or engage with the details of a function, some form of regulation that serves the public interest is needed.
We can’t all test the food that enters our home for heavy metals or poisons, so we have a centralized food regulation system. The same goes for the water that comes out of our faucets. I took a drink this morning without a second thought because I trust my city’s water supply regulations.
McNamee calls for, among other things, regulation of algorithms with significant public impact. We need to know which applications of scientific and computational capability are toxic to us personally or collectively, and which ones are good.
For now, our trust in machines is premised on the traditional regulation of the organizations that develop, own and control them. They’ve provided so much usefulness that their bigger systemic effects get less attention.
We’ll doubtless see more cases of abuses and distortions that are costly to real people. These stories are essential. There are important benefits we all realize through various tools like DNA testing and optimizing algorithms to navigate around traffic jams.
But the potential of the tools to truly solve problems can be tainted when they’re applied to less noble ends: helping powerful people shrewdly take more from the weak, harvesting our family history curiosity by coaxing DNA from us that in turn becomes a data asset for a corporation, and using our family-and-friends networks like a massive Amway promotional channel where we don’t even get the lousy cleaning spray.
The unwitting lesson we’re learning is that scaling through the powerful tools of science gives the perennial dilemma of sorting the beneficial from the harmful new potency. It’s as if we have, through these powerful tools, added an exponent to our organizing – helping² or harming².
We will need to deliberate together what that potency means socially and even how it may change our conception of what it means to be human. Evaluating harm and benefit will hinge on our collective sense of human value, debates that have been sparked by people like Gifford lecturer J. Wentzel van Huyssteen and carried on in places like Wycliffe College.
It will be up to us to be alert, creative and inventive in curling that power back on itself to see what happens – conjuring up new versions of the identical twin DNA test just to keep the game honest.
Milton Friesen is Social Cities program director at the think-tank Cardus.
© Troy Media
The views, opinions and positions expressed by all columnists and contributors are the author’s alone. They do not inherently or expressly reflect the views, opinions and/or positions of NetNewsLedger.com.