May 8, 2021

AI and AM versus FM technology

Tumblr @maxofs2d
I always think of this tweet when I see people praising machine learning and the new kinds of AI as a panacea.

It is really important to be able to spot the difference between AM and FM technology, particularly when it comes to their promises.

Actual Machine (AM) technology involves a lot of well-understood costs and challenges. It’s usually not very exciting, and the solutions it offers are quite obvious.

Example: How do we lower traffic fatalities and provide more mobility for people of all classes? We close more lanes of roads and make them bus-only; we pay for more regular bus service and more routes. We build all future intersections as roundabouts and retrofit all the existing ones we can when they come up for maintenance. We make breathalyzer ignition interlocks part of starting a car (without any of the associated privatized fees).

But this is very boring.

On the other hand, “self-driving” cars and hyperloops feel exciting. None of the costs or downsides are talked about or well understood in the public imagination, so it all sounds great. Yet we already know those autonomous vehicles will probably find it cheaper to just drive around all day rather than pay for parking anywhere, and when they make decisions about whom to protect in a crash, it’ll probably be to favor the ultrarich owners who can adopt the technology, not the unhoused person in a tent beside the road.

Fucking Magic (FM) technology isn’t going to solve anything we’re not prepared to solve right now, because that’s not how it works. If we don’t have the will to fix policing as it exists today, automating the bias just increases the level of harm a white supremacist surveillance state can visit upon the population while blaming “objective” machines.

There are some sorts of magic that are helpful or at least not harmful. But FM engineering solutions are not among them.

