Architecting Software for Freedom in Networked Services
Who Does This Free Server Software Really Serve?

Alexandre Oliva

Networked applications often adopt a client/server architecture, in which client software takes care of user interaction, while servers hold data and application logic. Maintaining and securing server software are challenges for home and small-business users. Third parties do so with economies of scale that make their services nearly irresistible to users. Adopting a different networked application architecture could protect users from such threats as censorship, Service as a Software Substitute (SaaSS), and loss of autonomy, and avoid the risks of (re)centralization.

“SaaSS is equivalent to using a nonfree program with surveillance features and a universal back door” -- Richard Stallman

Free Software history is full of examples of server software that users could install and run autonomously on their own computers, developed to promote server-side user autonomy and decentralization, but that third parties install and run for multiple users, defeating these motivations.

It has happened to such widely-used communication and publishing services as instant messaging, email hosting, blogging, social media, and source code hosting, and to such domain-specific software as that for managing cities, schools, libraries, shops, restaurants, etc.

An important observation is that it has often happened even when software developers and server maintainers embraced decentralized (federated) architectures, and actively promoted decentralization by encouraging users to install their own servers.

When users' own computing is performed as a service for the users on a server controlled by a third party, the users relinquish control over their computing and their data. That's SaaSS, and that's why it's freedom-denying. If users ran Free Software on a server under their own control instead, they'd retain freedom and privacy. Alas, when they compare the costs of maintaining their own servers and IT staff with outsourcing the server to a service provider that runs the same software for multiple clients, the economies of scale are irresistible for all but the most freedom-concerned users.

These economies of scale tend to lead all server software to outsourcing and (re)centralization, and thus all server-side computing software to SaaSS. Even server software that is Free Software! It doesn't follow that it's unethical to develop Free Software for server-side computing, but even if it's developed with the intent of promoting users' freedoms, the economies of scale it enables play against this goal, driving most users to SaaSS instead. It's a poor strategy for liberating users.

Even when it comes to communication and publishing services, which are not SaaSS even when outsourced, the centralization that follows from the economies of scale, and the power imbalance that follows from centralization, are often undesirable if not outright dangerous: they enable censorship, surveillance, oligopolization, and mandatory adoption of proprietary protocols. It is thus desirable to try to avoid these consequences of server software. Here is a plan.

Users carry portable computers that are far more powerful and better connected than servers and workstations of a few years ago. Users often find it easy to install apps on them, so they could conceivably install and run server software on them, if network, application and hardware architectures didn't discourage such uses. With all this computing power and connectivity in our pockets and on our desks, we could do a lot better:

“we can create a peer-to-peer program through which collaborators can share data encrypted. The free software community should develop distributed peer-to-peer replacements for important “web applications”” -- Richard Stallman

We could have apps that perform the computing next to the user it pertains to, on their own devices, communicating through reliable, networked, possibly peer-to-peer, dumb storage infrastructure. Confining all networking concerns to a dumb storage layer enables simpler app design and logic, and there are designs that enable some tasks to proceed even while offline.
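One well-known family of designs that supports such offline operation, not named in the text but fitting the description, is conflict-free replicated data types (CRDTs). A minimal sketch of the simplest one, a grow-only set, in Python (all names here are hypothetical, for illustration only):

```python
# A grow-only set (G-Set), one of the simplest CRDTs: each device adds
# elements locally, even while offline, and any two replicas converge
# by merging with set union -- no coordination or server required.

def merge(a: set, b: set) -> set:
    """Merge two replicas. Union is commutative, associative, and
    idempotent, so replicas converge regardless of sync order."""
    return a | b

# Two devices edit independently while offline...
device_a = {"note:groceries", "note:draft-post"}
device_b = {"note:groceries", "note:travel-plan"}

# ...and later sync (e.g. through the dumb storage layer); the result
# is the same no matter which replica merges first.
assert merge(device_a, device_b) == merge(device_b, device_a)
```

Richer CRDTs (observed-remove sets, sequence types) support deletion and editing along the same lines, at the cost of extra metadata.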

With strong crypto(graphy), even if data is dispersed over third-party devices or outsourced servers with backdoors for their operators, access and modification are limited to authorized user devices holding users' keys, with guarantees based on solid math, not on server security. Users' devices, holding their access keys, would still need to be secured, just as they would if used to access a server with similar privileges.
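A minimal sketch of this split in Python, with an in-memory dict standing in for the dumb storage layer. All names are hypothetical, and the SHA-256 counter-mode cipher is a toy for illustration only; a real app would use a vetted authenticated cipher such as AES-GCM. The point is that the storage layer only ever sees opaque blobs, while confidentiality and integrity rest on the keys held by the user's device:

```python
import hashlib
import hmac
import os

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """Toy stream cipher (SHA-256 in counter mode), used both to
    encrypt and to decrypt. Illustration only -- not for real use."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

class DumbStore:
    """Content-addressed 'dumb' storage: it holds opaque blobs under
    their hashes and never sees plaintext or keys. It could be a local
    server, an outsourced one, or a P2P swarm -- the app doesn't care."""
    def __init__(self):
        self.blobs = {}
    def put(self, blob: bytes) -> str:
        addr = hashlib.sha256(blob).hexdigest()
        self.blobs[addr] = blob
        return addr
    def get(self, addr: str) -> bytes:
        return self.blobs[addr]

class App:
    """User-side logic: encrypts before storing, verifies and decrypts
    after fetching, so tampering by the storage operator is detected."""
    def __init__(self, store: DumbStore, key: bytes):
        self.store, self.key = store, key
    def save(self, plaintext: bytes) -> str:
        nonce = os.urandom(16)
        ct = keystream_xor(self.key, nonce, plaintext)
        tag = hmac.new(self.key, nonce + ct, hashlib.sha256).digest()
        return self.store.put(nonce + tag + ct)
    def load(self, addr: str) -> bytes:
        blob = self.store.get(addr)
        nonce, tag, ct = blob[:16], blob[16:48], blob[48:]
        expected = hmac.new(self.key, nonce + ct, hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            raise ValueError("blob was tampered with")
        return keystream_xor(self.key, nonce, ct)
```

Note the asymmetry: the store's interface is just `put`/`get` of opaque bytes, so all application logic, and all trust, lives on the user's device.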

The freedom and design advantages of this architecture described above could be realized even with centralized dumb storage servers, internal or outsourced. What full-fledged P2P reliable dumb storage adds over that is resilience in the face of various server and network failures.

Embracing P2P does not rule out configuring one's own storage servers as peers in the system: that's actually a good way to keep the P2P infrastructure running even with few connected users. I expect that a relevant collection of killer apps, yet to be figured out and developed, may be enough of a motivator to attract users to install and keep running a shared global P2P storage layer, enabling a growing set of apps to be developed, tried out and used at large.

Free Software communities still devote plenty of resources to server software. As more of us realize that improvements to Free Software for servers end up helping SaaSS providers enslave more users through it, I expect us to switch our focus to P2P storage, strong crypto and user-side computing, so as to encourage cooperation and to avoid (re)centralization, thus promoting freedom for all users like any decent peer should. May it become the GNU normal.

Copyright 2020, 2021 Alexandre Oliva

This is a DRAFT, so permission is NOT YET granted to make and distribute verbatim copies of this entire document worldwide without royalty, provided the copyright notice, the document's official URL, and this permission notice are preserved.