Felix Stalder via nettime-l on Sun, 27 Jul 2025 11:23:39 +0200 (CEST)



Re: <nettime> Computational Culture issue ten. Special Issue: Situated Bayes


Hi Matthew,

Congratulations! A great issue, a really timely and urgent extension of the line of thinking that I first encountered in Joque's book: that the use of Bayesian statistics might create an opening towards very different political ends than those for which it is currently used, and that exploring this opening might be more productive than simply "resisting (AI)". We talked a bit about that over dinner recently.

In much of the philosophy/epistemology concerning Bayesian statistics, the issue of the "prior" is absolutely central, and your intention to turn it from a problem for objectivity into the foundation for situatedness is absolutely correct.

What is usually less discussed, perhaps because the issue is not unique to Bayesianism, is the question of the threshold: when is the likelihood of a hypothesis being true strong enough to act as if it were true?

In ML, they try, as you write, to minimize the situatedness by using "noninformative priors", despite the extra compute this requires, so that they can at least claim to be non-subjective. In many ways, the prior is subjective only in a context where computation is scarce. In a context where computation is treated as abundant, it becomes meaningless, a random starting point in a very long line of iterations. It's not subjective, but brute force ;)
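
Just to make that concrete, a toy sketch of my own (nothing from the issue, all numbers invented): with enough data and compute, radically different priors end up in practically the same place.

# Toy illustration (my own sketch, not from the issue): with enough data,
# the choice of prior barely matters -- the posterior is dominated by the data.
# Beta-Bernoulli model: estimating the rate of some binary event.

priors = {
    "flat / 'noninformative'": (1, 1),   # Beta(1, 1), uniform
    "strongly optimistic": (50, 2),      # believes the rate is high
    "strongly pessimistic": (2, 50),     # believes the rate is low
}

successes, failures = 7000, 3000  # abundant data: observed 70% rate

for name, (a, b) in priors.items():
    post_a, post_b = a + successes, b + failures  # conjugate update
    mean = post_a / (post_a + post_b)             # posterior mean
    print(f"{name:28s} -> posterior mean ~ {mean:.3f}")

# All three land within a fraction of a percent of 0.70: the prior is just
# a starting point that the sheer volume of data/computation overwrites.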

But the situatedness creeps back in through the threshold: what degree of error is acceptable? That is always also a question of who has to cover the costs of these errors. In this way, Bayesianism creates a new type of externality.

I think this question of the threshold, while not unique, is particularly urgent in Bayesian systems because they are less about generating knowledge (in a conventional scientific way, where the threshold is a stable p-value) than about enabling agency, on the spot, under a subjective risk/reward ratio. In certain systems, say the placement of advertisements, a 20% likelihood might be sufficient; in others, say systems in HR departments, one would hope for a much higher threshold. The point being, the threshold is entirely subjective.
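
To spell that out with a back-of-the-envelope sketch (again my own, and the numbers are purely illustrative): in textbook Bayesian decision theory the threshold for acting falls straight out of the assumed costs and gains, so whoever gets to define the costs defines the threshold.

# Back-of-the-envelope sketch (my numbers, purely illustrative): the threshold
# for acting on a hypothesis is not a fixed constant but follows from the
# cost/benefit structure -- i.e. from who pays for which kind of error.

def act_threshold(cost_false_positive: float, gain_true_positive: float) -> float:
    """Act on the hypothesis when P(hypothesis) exceeds this value.

    Derived from expected utility: act iff
        p * gain_true_positive > (1 - p) * cost_false_positive
    """
    return cost_false_positive / (cost_false_positive + gain_true_positive)

# Ad placement: a wasted impression is cheap relative to the value of a hit.
print("ads:", act_threshold(cost_false_positive=1, gain_true_positive=4))   # 0.20

# HR screening: wrongly flagging a candidate is very costly relative to the gain.
print("HR: ", act_threshold(cost_false_positive=19, gain_true_positive=1))  # 0.95

# Same formalism, completely different thresholds -- the "acceptable error"
# is set by whoever defines the costs, and borne by whoever doesn't.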

Considering the subjective/situated/political nature of the threshold might open up less towards the issues you are concerned with here and more towards questions of social justice (how to distribute risks/rewards), but as a source of subjectivity it's a bit underrated.

Anyway, great issue!


all the best, Felix




On 7/25/25 09:28, Matthew Fuller via nettime-l wrote:
Computational Culture, a journal of software studies
Issue Ten, July 2025
Special Issue: Situated Bayes
Edited by Juni Schindler, Goda Klumbytė and Matthew Fuller
Special Issue Introduction
Juni Schindler, Goda Klumbytė, Matthew Fuller, [Situated Bayes – Feminist and pluriversal perspectives on Bayesian knowledge](http://computationalculture.net/situated-bayes/)


--
| |||||||||||||||| http://felix.openflows.com |
| |||||||||| https://tldr.nettime.org/@festal |
| for secure communication, please use signal |

--
# distributed via <nettime>: no commercial use without permission
# <nettime> is a moderated mailing list for net criticism,
# collaborative text filtering and cultural politics of the nets
# more info: https://www.nettime.org
# contact: nettime-l-owner@lists.nettime.org