Key Learnings of the NetForum on Algorithm Regulation featuring Esther Dyson

I am pleased to share the Key Learnings of the NetForum on Algorithm Regulation featuring Esther Dyson held on February 28, 2017.

This is a broad subject, and an explosive one too. I am telling you right now that this will become a very hot potato for the tech industry. Actually, it probably already is one, though largely hidden from view and handled discreetly in the halls of government in the US, China, Russia, India, and the EU.

Fundamentally, the issue I dared raise is as follows: it is a given that our lives are migrating to digital environments owned and controlled by private entities such as Facebook, Google, Snapchat, Twitter, WeChat, and many smaller networks. How these digital environments allow us to interact with each other, what tools they give us to protect ourselves from each other and from commercial and governmental entities, and how they craft and enforce their terms of service may be starting to raise public policy issues that call for some form of regulation.

I am going to try to distill the conversation we had at the NetForum. As always, I do not disclose who said what, so as to protect people’s private views.

A. You Opted-In, Deal With It.

To be fair, a couple of participants did point out that I opted into Facebook. No one forced me. So it is only “fair” that I submit to its rules; I can always leave. I get that, and the “personal freedom and responsibility” folks would say it is insulting and demeaning to assume that I am infantile and incapable of making an informed choice to join Facebook and of protecting myself, and that I need to be protected from my own choices. I get that argument, but I think it may apply to the old virtual communities where consenting adults joined small groups that engaged in, say, sado-masochism. It could apply to Reddit, and perhaps to Tumblr. But is the argument overwhelmed by the “public interest” when you are talking about something that has become perhaps one of the primary ways people communicate, as Facebook has?

Support for the idea that Facebook and its kind may have tipped into something more than a private club came from the Supreme Court Justices considering whether the State of North Carolina could restrict a registered sex offender from having access to Social Media platforms (Packingham v. North Carolina, as reported in the NYT; I did not read the transcript myself). [I always get help from the NYT around the NetForum events, as they always choose to write something relevant to the NetForum topic.] The Justices likened Social Media sites to the proverbial Public Square (Justice Kennedy) and suggested that access to them may be a constitutional right implicit in the First Amendment: that they are “embedded in our culture as ways to communicate and ways to exercise our constitutional rights.” (Justice Kagan.)

One to follow. Nice to know that the Supremes are keeping up and may in fact be ahead of the public in recognizing the structural social impact of a world where we are all connected.

B. We Are Being Manipulated All the Time Already, Nothing New Under the Sun.

Another preliminary argument we had to dispose of was perhaps the “Naive Card.” Pretty much everybody in the room (40 people) expressed the view (or nodded in acquiescence) that manipulation is everywhere anyway. So, being concerned that Facebook and other Social Networks may be controlling us via their Algorithms is basically Old News (though certainly not Fake News). Whether by the media, government, or advertisers, the view that we are constantly being manipulated in one way or another went unchallenged in the room. Much to my alarm, actually. I did express the view that I was disappointed by the cynicism in the room and argued that we could still channel our energies to make a better world.

C. Algorithm Manipulation Is Real but the Blame Is on the Humans.

On the question of whether we are being manipulated through Algorithms, there was little pushback. But a significant debate emerged on whether it was appropriate to talk about “manipulation by Algorithms.” Several participants forcefully made the point that Algorithms, like all computer programs, are neither good nor bad. They are just computer instructions, programmed to achieve an end. Hence the focus should be on the Algorithm writers and their goals, with some goals being permissible and others not, for social policy reasons or otherwise.

Others, myself included, pointed out that even well-intentioned Algorithm writers and their programs produce unintended consequences. If one can observe that users of a massively successful and profitable Social Network exhibit increased symptoms of loneliness and depression and that bullying is rampant (abstracting for a moment from the serious question of causation), can we legitimately ask whether the goals of the private network (for example, profit maximization through engagement with advertising cleverly generated by highly perceptive Algorithms) could have been achieved with greater attention to unintended secondary effects? At the same time, how can one predict the secondary effects of an engineered virtual construct on its human population? (We did spend some time discussing a new concept called Virtual Distance, which measures the isolation of highly connected people, among many other things.)

D. Regulation Is Probably Needed, but by Whom and How?

I would say that the NetForum participants largely favored some type of regulation of Social Networks such as Facebook, or perhaps of Facebook in particular. I am using the word regulation broadly, not necessarily as a synonym for government regulation. In fact, in light of the distrust and cynicism I discussed earlier, most were skeptical that the government could or should do the regulating. The mood was more one of “Yes, it ought to be regulated, but by whom or what?”

Although we did not have sufficient time to get into this aspect of the topic, there were germs of ideas on regulation. One concept that came up repeatedly was Transparency: more transparency about what data is collected and how it is used, along with more user control over that data, was brought up forcefully. Another was better training of the engineers who write the Algorithms: sensitizing them to other values so that a certain type of person does not (perhaps unintentionally) perpetuate their own values and essentially impose them on a global scale; think increased humanities training for engineers. Yet another concept, briefly discussed, was credentialing; after all, the designers of physical spaces inhabited by humans (aka architects) need to be credentialed by the state and their professional organizations. Finally, we were reminded that there are already plenty of laws on the books that could be applied, with the coercive force of the state, to protect citizens from the often self-serving claims of “do-goodism” by aggressive commercial enterprises.

In closing, it was an exciting event, kindly hosted by my friends at Willkie Farr. I am grateful to Esther Dyson for having contributed so much to the debate, and to all the attendees who brought their unique perspectives. We are at the dawn of a new age in social relations, and I reject the notion that we are powerless. I believe that the members of our Digital community need to be conscious of the unintended consequences of our work. Future NetForum events will continue to explore this subject. (Needless to say, all opinions expressed herein are my own.)

In fact, the topic of the next NetForum, hosted by Margaret Isa Butler of Greenberg Traurig, is “Should VCs Consider the Consequences of their Investments?” The invite will go out very shortly.

I look forward to seeing you at the NetForum!

Laurent
