Claim:
Algorithmic rationality — the systematic substitution of algorithmic decision-making for human judgment — erodes epistemic autonomy in ways that are structurally invisible and therefore more dangerous than overt forms of epistemic coercion.
Argument Style:
Causal / diagnostic. This argument establishes a causal mechanism and argues for the distinctive danger of its invisibility.
Argument:
1. Epistemic autonomy is the capacity to form beliefs through one's own reasoning processes, evaluated against evidence one has independently assessed.
2. Algorithmic rationality operates by substituting algorithmic outputs for human judgment: recommendation algorithms determine what to read, scoring systems determine creditworthiness, feed curation determines what is visible as "news."
3. Unlike overt epistemic coercion (censorship, propaganda), algorithmic substitution presents itself as personalization, convenience, or neutral information delivery. The user appears to be choosing freely.
4. Foucault: truth regimes are most effective when their operation is invisible — when what is excluded from discourse is not felt as exclusion but as irrelevance. Algorithmic truth regimes achieve this: what the algorithm does not show does not feel censored; it simply does not appear.
5. Zuboff: the behavioral data extracted by surveillance capitalism is used not only to predict behavior but to modify it — through choice architecture, notification timing, content sequencing. This is not persuasion (which engages reasoning) but behavioral modification (which bypasses it).
6. Han: the performance society's tyranny of positivity has an algorithmic complement. Algorithms optimize for positive engagement signals (likes, shares, dwell time). Content that produces discomfort, contradiction, or sustained effort is systematically deprioritized.
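The mechanism in point 6 can be made concrete with a toy ranking sketch. Everything here is invented for illustration (the items, the weights, the `engagement_score` function) and is not any real platform's objective; the point is only that an objective built from positive engagement signals alone will, by construction, sink content whose value shows up as effort rather than as immediate approval.

```python
# Toy sketch: rank a feed purely by predicted positive engagement.
# Items, weights, and scoring are illustrative assumptions, not a
# description of any actual recommender system.

items = [
    {"title": "long investigative essay", "challenging": True,  "likes": 12,  "dwell_sec": 480},
    {"title": "outrage clip",             "challenging": False, "likes": 950, "dwell_sec": 45},
    {"title": "affirming meme",           "challenging": False, "likes": 700, "dwell_sec": 8},
]

def engagement_score(item):
    # Only positive signals enter the objective: likes and dwell time.
    # Nothing rewards epistemic quality or tolerated discomfort.
    return item["likes"] + 0.1 * item["dwell_sec"]

feed = sorted(items, key=engagement_score, reverse=True)
print([i["title"] for i in feed])
# → ['outrage clip', 'affirming meme', 'long investigative essay']
```

The "challenging" essay ranks last not because anyone censored it, but because the objective never measured what it is good at — which is exactly the deprioritization-by-design that point 6 describes.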
7. The combined effect: users in algorithmically mediated information environments are systematically exposed to a narrowed range of perspectives, optimized for emotional activation over epistemic quality, and behaviorally modified toward continued engagement — all while experiencing this as personal choice.
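The narrowing described in point 7 is a feedback loop, and even a minimal simulation shows how fast it closes. This sketch is an assumption-laden toy (the perspectives, engagement rates, and reinforcement factor are all made up): five perspectives start with equal visibility, the system always shows the highest-weight one, and engagement reinforces whatever was shown.

```python
import random
from collections import Counter

random.seed(0)

# Toy feedback loop, not a real recommender: pure exploitation of the
# current weights, with engagement reinforcing visibility.
perspectives = ["A", "B", "C", "D", "E"]
weights = {p: 1.0 for p in perspectives}
engagement_rate = {"A": 0.9, "B": 0.5, "C": 0.5, "D": 0.4, "E": 0.3}

shown = []
for _ in range(50):
    choice = max(weights, key=weights.get)   # no exploration at all
    shown.append(choice)
    if random.random() < engagement_rate[choice]:
        weights[choice] *= 1.1               # engagement boosts future visibility

print(Counter(shown))  # exposure collapses onto a single perspective
```

After fifty rounds the user has seen exactly one perspective, while at every step they "freely chose" to engage — the formal freedom / substantive narrowing gap the argument turns on.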
8. The invisibility of this mechanism is its structural danger: overt propaganda can be identified and resisted. Algorithmic epistemic shaping is experienced as preference, not as constraint.
9. Therefore: algorithmic rationality erodes epistemic autonomy through structural invisibility, producing populations that are formally free (no censorship) but substantively heteronomous (their beliefs are shaped by systems they neither understand nor control).
10. Countermeasures must address the structural condition, not individual instances: epistemic sovereignty infrastructure (VA-00001), digital autonomy (VA-00005), and the ability to opt out of algorithmic mediation entirely.