Is It More Than Just A Social Dilemma?

The docudrama The Social Dilemma on Netflix should be mandatory viewing for anyone concerned with societal changes and especially for parents of teens and pre-teens.

Wikipedia provides this synopsis:

The documentary examines the effect that a handful of companies, including but not limited to Google, Facebook, Instagram, and Twitter, have over the public; it is emphasized that a relatively small number of engineers make decisions that impact billions of people. The documentary examines the current state of social media platforms, focusing more specifically on problems in the industry. Jeff Orlowski, director of other documentaries such as Chasing Coral and Chasing Ice, designed the film to include conversations that tackle concepts in technology such as data mining, technology addiction, machine learning, artificial intelligence, and surveillance capitalism. The film follows a cast of interviewees, most of whom have left their respective companies over ethical concerns, sharing the view that the industry as a whole has lost its way.
The documentary begins with an introduction to the array of interviewees, each listing the companies they had previously worked for and their role within each respective company. The cast of actors is then presented with news coverage of social media’s adverse effects playing in the background. Each interviewee then goes over their grievances with social media. Between interview commentary, the dramatization side of the documentary provides insight into the inner workings of the technology that powers social media. […]
In the documentary, it is stated that social media is a “useful service that does lots of good with a parallel money machine”. Social media has many beneficial qualities: a few that are mentioned in the film include the facilitation of interpersonal connection across long distances, acquiring knowledge, and even finding organ donors. However, former employees of social media companies explain how user data can be used to build models to predict user actions and how companies keep user attention to maximize the profit from advertisements. The film then dives into the manipulation techniques used by social media companies to addict their users and the psychology that is leveraged to achieve this end. […] The documentary also touches upon how user actions on online platforms are watched, tracked, measured, monitored, and recorded; companies then use this human-generated capital to increase engagement, growth, and advertising revenue. […]
The final point the film touches on is fake news. Tristan Harris refers to it as a “disinformation-for-profit business model”, saying that companies make more money by allowing “unregulated messages to reach anyone for the best price”. The film discusses the dangerous nature of the flow of fake news regarding COVID-19 and propaganda that can be used to influence political campaigns. […] The interviewees unanimously conclude that something must change for society to prosper, and claim that social media companies have no fiscal reason to change. […]

Perhaps there is also another scenario in play?

We can accept as basically fact that machine learning algorithms are giving people more and more of the content they already agree with, effectively leading them into an echo chamber that feels good and generates maximum viewing time (and therefore revenue).

We can also agree that the old adage “if it bleeds, it leads” still holds, and that conflict and tribalism are strong emotive drivers that keep people clicking on links.

The last two US presidential elections have seen an almost 50:50 split in votes between the two major parties.

In a perfect machine-learning world, the best way to maximise the time each individual spends on the site is to maximise the time they spend riled up against people outside their tribe.

If, for example, the two tribes split 75:25, a huge amount of potential conflict would be lost, as too many people would be agreeing with one another. Conflict is maximised when both armies are of equal size, so that each person has the greatest chance of meeting an opposing “enemy”.
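The “equal armies” intuition can be checked with a back-of-envelope calculation (my own sketch, not something from the film): if a fraction p of users belong to one tribe, the chance that two randomly paired users disagree is 2p(1 − p), which peaks when p = 0.5.

```python
def cross_tribe_chance(p: float) -> float:
    """Probability that two randomly paired users come from opposing tribes,
    given a fraction p of users in one tribe and 1 - p in the other."""
    return 2 * p * (1 - p)

# Compare the 50:50, 75:25 and 90:10 splits mentioned in the argument above.
for p in (0.50, 0.75, 0.90):
    chance = cross_tribe_chance(p)
    print(f"{p:.0%} / {1 - p:.0%} split -> {chance:.1%} chance of a cross-tribe encounter")
```

A 50:50 split gives a 50% chance of a cross-tribe pairing, a 75:25 split drops that to 37.5%, and 90:10 to just 18%, which is the arithmetic behind the claim that an even split maximises opportunities for conflict.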

Therefore, one could postulate that machine learning algorithms are inadvertently (or perhaps purposely) steering the USA towards a perfect 50:50 split between two factions, since an even split generates the maximum amount of conflict, and therefore the longest engagement time … and the most revenue.

After all, they are in it for the money.

If you enjoyed this BFD article please share it.