The Blob Fallacy

Alexandre Oliva

Some people claim that a blob installed and loaded by users is no different from a blob preloaded and embedded by suppliers into a hardware black box.

I say preloaded blobs are not ideal, but they are strictly better than accepting suppliers' unilateral control over upgradable device smarts.

We are not the same.

(In case you're not familiar with the Internet meme, this means I'm pointing out a moral difference in spite of a superficial similarity, and I'm calling out differences in underlying motivations.)

The fallacy that makes user-loaded and supplier-preloaded blobs seem the same rests on the observation that the user has just as little control over the device in either case, because blobs are nonfree: they serve their supplier, not the user.

It's a false equivalence: that similarity exists, but so does a key difference in the power the supplier holds over the user, through exclusive control over the software that runs the device.

To see the difference between these two cases more clearly, let's resort to a third one: a device that is connected to the Internet, that phones home to the mothership regularly for instructions, and onto which the supplier can force updates, or which the supplier can even remotely kill.

Think home automation (smartIoT), telescreening (smartphones), and voice assistant (smart listener) devices, smart automobiles, smart tractors... See the pattern?

The operative term is smart, which in the context of digital devices generally means that the supplier can remotely make the device smarter at controlling its users, at serving its true masters, and at thwarting users' attempts to resist that control.

The user is just as deprived of control as in both previous cases, so the device sits at the same spot on the "user control" axis; but on the "supplier control" axis, the device can have its behavior remotely modified, through its programming. It takes no more than a supplier's whim to get the behavior of users' devices modified.

A little further along the "supplier control" axis are web blobs, the pieces of JavaScript chosen by suppliers and delivered to run on users' devices at every request. Most web blobs ship as object code, whether obfuscated or merely minified JavaScript, but even when they ship as source code under freedom-respecting licenses, "user control" is hardly ever available: suppliers usually presume it absent, and sometimes enforce its absence. That makes that environment the WWWorst App Store, and Free Software in it Tivoized.

Further along the "supplier control" axis is the notion of a Service as a Software Substitute, SaaSS, in which you don't even get a copy of the software, and must turn in your data to have your computing done for you under the supplier's control.

Users are also equally deprived of control in these two cases, but there is a very notable difference in how much control suppliers get.

Moving in the user-favorable direction along the "supplier control" axis, we get to blobs that the user (more often, the operating system) installs and loads onto the device, one of the cases mentioned in the first paragraph.

The supplier can't push them directly onto users' devices, but it's not too hard to persuade operating system distributors to push them onto their users. Security fear mongering (security for whom, from what?) can mislead users into blob "dupegrades", in which the new version brings new bugs, misfeatures, disabled features, and other user-hostile changes, some of which may be irreversible, such as plugging the jailbreaking avenues that users relied on to gain control over their own devices.

Such a blob, which the supplier cannot replace unilaterally on a whim, but can get replaced by convincing users or software suppliers to install a new version, sits at the same spot on the "user control" axis, but not as far along the "supplier control" axis as SaaSS, the web blobs, or even the smart devices.

A hardware black-box circuit, and the equivalent opaque software preloaded into hardware, are not only technically but also ethically equivalent. Either could be replaced if the supplier convinced users to bring the device to a repair shop, but that would be more costly, thus less likely for suppliers to pursue; and even if they did, users could refuse.

Such hardware circuits, and software blobs preloaded into hardware, sit at about the same spot on the "user control" axis as the other cases; a little more favorable to the user, actually, if you count the freedom to refuse changes. On the "supplier control" axis, they are less subservient to the supplier than all the earlier cases.

This analysis suggests that increasing flexibility serves whoever controls the software that runs the device. If the software respected users' freedoms, the added flexibility would serve users, but a piece of nonfree software serves the supplier, and the flexibility it brings is thus likely to be abused against users.

Software that deprives users of essential freedoms subjugates them and is thus unjust, unethical, intolerable. Analogous reasoning applied to hardware might lead to similar conclusions, but with current technology, some hardware analogues to essential software freedoms would be merely theoretical, so it is still generally reasonable to hold the position that no unacceptable harm is done if they are missing for now. The GNU Project holds this position.

Others already argue otherwise, and that may also be reasonable for some narrow but growing classes of hardware. As hardware technology evolves, freedoms that used to be merely theoretical may become practically viable at various layers of hardware designs. As theoretical freedoms transition to viable and valuable, constraining them becomes real harm, rendering unethical and thus intolerable some practices that may be currently found acceptable or tolerable.

Anyway, however (in)tolerable you find opaque hardware, a device with preloaded software equivalent to a hardware circuit should be equivalently (in)tolerable: there are no ethical grounds to accept one while rejecting the other when they're effectively indistinguishable black boxes.

But however (in)tolerable you find preloaded blobs and their hardware circuit equivalents, granting suppliers more control over devices, whether through user- or remotely-installed blobs, makes room for more user subjugation by suppliers, which is surely less tolerable.

They are not the same.

Some take the false equivalence disputed above further, arguing that greater supplier control is advantageous (to the user?!?), because suppliers can fix bugs and add features. That is a nice theory indeed. Quite naïve, too. Whatever power a supplier retains over the user, through the software, can and most likely will be used to exert control over the user and further exploit the user over time:

Designing devices to take updated commands from the supplier has proven, time and again, to be advantageous to suppliers, to the detriment of users, and too prone to abuse. Consumer protection laws recognize and strive to correct such imbalances, and they may serve as a useful tool for us customers to recover our freedoms after being misled, even though these laws are often violated, and have loopholes that businesses constantly strive to open and exploit.

When we select a product that behaves in a certain desirable way, and the supplier modifies it, through a software change, to no longer behave that way, that ought to be widely recognized as a bait-and-switch swindle. We should be able to get a court of law to penalize such misbehaviors, and to order a supplier not to mess with the device we selected. That could enable us to turn devices into the equivalent of a hardware circuit, and though that's not ideal, it's progress for freedom, compared with supplier-controlled devices.

Once we hold suppliers accountable for downgrades, bricking and other such abuses imposed through exclusive control over the software, and courts order them to bring back to a functioning state, at their own expense, devices that they degraded or caused to stop working, whether by a software upgrade or by the deactivation of a server the product was designed to depend on, they probably won't engage in such malpractices so lightly. Perhaps they won't even design room for such abuses into their products quite as often.

Defrauding consumers, and remedying the damages one causes, have long been addressed by law. Strengthening consumer and human rights where they are lacking wouldn't put an end to abuses or abusers, but it would discourage them, and it might further aid user freedom, leading suppliers towards making devices equivalent to a hardware circuit, or even enabling consumers to take control.

Although useful, such laws are neither essential nor sufficient to defend user freedom. To control our digital lives and to be safe from device suppliers' whims, what we need is a piece of pragmatic wisdom: we have to reject devices in which suppliers can modify what we can't.

They say you're better off accepting software updates for your devices without question, and not minding when suppliers exert control over you through them.

I say they appear to be supporting the power that suppliers wish to hold over you, rather than standing for your freedom, and I encourage you to question why they'd choose this alignment.

We are not the same.

Copyright 2022-2023 Alexandre Oliva

Permission is NOT YET granted to make and distribute verbatim copies of this entire document worldwide without royalty, provided the copyright notice, the document's official URL, and this permission notice are preserved.