
FOSS Academic

New Book, Old Book

I’ve not written much on this blog about my most recent book, Social Engineering (MIT Press, 2022). It didn’t seem like a good topic for this FOSS-centric blog. When I did write about it, I focused on the fact that my co-author, Sean Lawson, and I were able to use FOSS (particularly Nextcloud) to write it. In the parlance of this blog, that’s a decidedly Goal 1 topic – about using FOSS tools to do academic work.

So, I’ve focused instead on my next project, the Goal 2 project of writing a book about FOSS alternative social media. My idea is to use this blog to publicly write that book. Such public writing is new to me – it’s even a bit intimidating – but seems appropriate for this topic.

But the more I’m digging into what it means to envision, make, and realize ethical, FOSS, alternative social media, the more I realize my latest book, Social Engineering, is directly relevant to the project. So, this post is a way for me to make those connections.

Cover of book Social Engineering by Gehl and Lawson

The book was prompted by the explosion of disinformation and manipulation we’ve observed over the past 10 years. The big events here, of course, were the Russian interference campaign and Cambridge Analytica, both of which reached prominence during the 2016 US presidential election. (We also write about more recent events, but these are two very well-documented instances of attempts at mass manipulation.)

To get a better understanding of Russian disinformation or Cambridge Analytica “micro-targeting,” Sean and I turned to the concept of social engineering. We note that this term has been used in two distinct communicative contexts.

First, there were the early 20th-century propagandists, who drew on the language of social reform and social science of their time. They proposed to adjust society using scientific techniques and mass persuasion, a process glossed as the “engineering of consent” (to use Edward Bernays’s famous phrase).

Second, there were the mid-20th-century phone phreaks and hackers, who used the term “social engineering” to describe their practice of conning people out of passwords. Theirs is an interpersonal sort of manipulation.

We argue that both the mass social engineering of propaganda and the interpersonal social engineering of hacking merge in contemporary corporate social media in a form we call “masspersonal social engineering.” We define this concept in the book as

an emerging form of manipulative communication enabled by the unique affordances of the internet and social media platforms. It brings together the respective tools and techniques of hackers and propagandists, interpersonal and mass communication, in an attempt to shape the perceptions and actions of audiences. To manipulate, masspersonal social engineers gather data on their targets; create fake personas to share messages; mix deception, accuracy, and friendliness as they engage with targets; and penetrate communication systems. Manipulation, in this case, can involve a range of goals, which might include attempts to change actions and beliefs. But it could also include discouraging action (e.g., voting) and amplifying or intensifying preexisting beliefs (e.g., racism, sexism, or other social divisions) when doing so is in the perceived interests of the masspersonal social engineers or their clients.

The connection to my current project on ethical alternative social media is now very obvious to me. Part of what we have to do when making a better suite of social media technologies is to remove the technologies, policies, and practices that aid and abet masspersonal social engineering.

There are obvious things, such as eliminating behavioral advertising, which encourages the use of reductive consumer profiling and links those profiles to viral messaging – a perfect breeding ground for the spread of misinformation.

But there are also less obvious things that have to be done – such as bringing media control closer to the end users. I wonder how much disinformation and misinformation spread is aided by a lack of a sense of shared responsibility for the health of the “community” in corporate social media. Contrast this with smaller, federated social media systems, where users may feel that they have more ownership over the health of the instance.

So, this post is merely my flagging the fact that my old book may end up being more important to the new one than I had anticipated. I may refer to the ideas from Social Engineering more often here. Rest assured, this is not an attempt to sell my book; rather, it’s a way to think about how ethical alternatives may help alleviate the scourge of manipulative communication (or, let’s be honest, how they might also become vectors for manipulation in their own right).

In light of that, if you’re curious about the connections between the history of manipulative communication and what alternative social media might do to undermine such manipulation, keep watching this blog, and perhaps you should check out Social Engineering – it’s in finer bookstores now, or you can get it for free from MIT.

Use your Mastodon account to comment on this post. Please note that these posts are public -- the only exception is if you change your privacy settings to followers-only or DM. Content warnings will work. You can delete your comment by deleting it through Mastodon.

Don't have a Mastodon account and you want one? Ask me how! robertwgehl AT protonmail . com