The platform has turned our yearning for real‑world struggle into a black‑box feed that fuels outrage and keeps us forever on the hook.

Facebook is not a neutral social network; it is the most concrete embodiment of Ted Kaczynski’s warnings about oversocialization and surrogate activity. By swapping genuine autonomy for an endless scroll of engineered outrage, the platform has built a psychological prison in which users are no longer agents but biological components being tuned to keep the technological system humming. The evidence is stark: the algorithmic loop that once promised empowerment now consumes the very people it pretends to serve, turning anger into the most efficient fuel for a self‑reinforcing machine.


Does Facebook’s feed replace the human “Power Process” with a hollow surrogate?

[ ALERT: SURROGATE ACTIVITY DETECTED ] ID: KACZYNSKI_BLOCK_01

The Power Process requires three stages: Goal, Effort, and Attainment. The “Black Box” algorithm bypasses Stages 1 and 2, delivering a synthetic Stage 3 straight to your dopamine receptors.

| Process Component | The Algorithmic Mimic |
| --- | --- |
| 1. Autonomous Goal | Infinite Scroll (Induced) |
| 2. Real Effort | The Micro-Swipe |
| 3. Genuine Attainment | The Notification “Like” |

> VERDICT: You aren’t achieving; you’re being “mechanized.”

Initiate Counter-Process: DIY & Self-Hosting →

Kaczynski argued that modern technology strips individuals of the three‑stage “Power Process”—seeking a goal, finding a means, and achieving it—and replaces it with surrogate activities that mimic accomplishment without delivering real autonomy. On Facebook, scrolling, “liking,” and sharing serve precisely that function. Each tap feels like a tiny victory, yet the reward is a fleeting dopamine spike that never satisfies the deeper need for purposeful action.

Kindness itself has become mechanized. A recent Kindalame essay notes that attempts at genuine niceness now feel “awkward” because they clash with a hyper‑curated environment where every interaction is already algorithm‑optimized. The act of being kind is turned into a performance evaluated by invisible metrics, reinforcing the sense that our social gestures are nothing more than data points.

When the feed constantly offers new “wins”—a notification, a like, a comment—the brain registers a success, but the underlying goal (meaningful connection, personal growth) remains unfulfilled. The result is a perpetual hunger for the next hit, exactly the surrogate activity trap Kaczynski warned would erode authentic agency.

How does Facebook’s A/B testing engineer our emotions to sustain the system?

Facebook’s internal labs run relentless A/B tests on millions of users, tweaking timing, tone, and emotional valence to maximize “time on site.” This is not a benign optimization; it is a large‑scale psychological engineering project. By systematically exposing users to content that provokes anger or outrage, the platform learns how to amplify those emotions, because outrage correlates with longer engagement and higher ad revenue.
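The optimization loop described above can be reduced to a familiar pattern: a multi-armed bandit that keeps serving whichever content variant maximizes dwell time. The sketch below is a toy simulation under invented assumptions (the variant names, dwell-time numbers, and exploration rate are all illustrative, not anything drawn from Meta's actual systems), but it shows why such a loop inevitably gravitates toward the most provocative material.

```python
import random

# Toy A/B optimizer: an epsilon-greedy bandit that learns which content
# variant keeps simulated users "on site" longest. All numbers here are
# illustrative assumptions, not real Facebook data.
random.seed(42)

# Hypothetical mean dwell times (seconds) per content variant.
TRUE_DWELL = {"neutral": 12.0, "positive": 15.0, "outrage": 30.0}

def simulate_user(variant):
    """Return a noisy dwell time for one impression of a variant."""
    return max(0.0, random.gauss(TRUE_DWELL[variant], 5.0))

def run_ab_loop(rounds=5000, epsilon=0.1):
    """Epsilon-greedy loop: usually exploit the best-known variant,
    occasionally explore the others."""
    totals = {v: 0.0 for v in TRUE_DWELL}
    counts = {v: 0 for v in TRUE_DWELL}
    for _ in range(rounds):
        if random.random() < epsilon or not all(counts.values()):
            variant = random.choice(list(TRUE_DWELL))   # explore
        else:
            variant = max(totals, key=lambda v: totals[v] / counts[v])  # exploit
        totals[variant] += simulate_user(variant)
        counts[variant] += 1
    return counts

counts = run_ab_loop()
print(max(counts, key=counts.get))  # the loop converges on "outrage"
```

No human decides that outrage wins; the loop simply measures what holds attention and feeds back more of it, which is the whole point.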

The self‑reinforcing loop described in another Kindalame piece captures this dynamic: the system consumes the very humans it pretends to empower. The algorithm learns to serve content that keeps users hooked, then uses the resulting data to refine its predictions, tightening the feedback cycle. Over time, users become more predictable, their emotional responses more exploitable, and their capacity for independent thought more attenuated.

From a Kaczynski perspective, this is the engineering of the human: the system reshapes us to fit its needs, eroding the curiosity, dissent, and self‑determination that once made us resilient. The platform’s black‑box ranking models, neural networks so large and entangled that no single engineer at Meta fully understands their behavior, operate as a sovereign entity, echoing Kaczynski’s fear that technology eventually outgrows human control.

Why does outrage become the most efficient fuel for the “Technological System”?

Outrage is a high‑energy emotion that spikes attention and provokes rapid sharing. Facebook’s algorithm has learned that posts triggering anger generate more clicks, comments, and dwell time than neutral or positive content. By surfacing inflammatory material, the platform creates a psychological furnace where users are constantly stoked on anger, keeping the machine running at full throttle.
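Stripped of its machine-learning machinery, a ranker that privileges inflammatory material is just a scoring function with an outsized weight on anger signals. The sketch below makes that concrete; the feature names and weights are invented for illustration and are not Facebook's actual coefficients.

```python
# Toy feed ranker: scores posts by "predicted engagement," with a
# deliberately outsized weight on angry reactions. The weights are
# illustrative assumptions, not Facebook's real model.

WEIGHTS = {"likes": 1.0, "comments": 4.0, "shares": 8.0, "angry": 15.0}

def score(post):
    """Predicted-engagement score: a weighted sum of reaction counts."""
    return sum(WEIGHTS[k] * post.get(k, 0) for k in WEIGHTS)

def rank_feed(posts):
    """Order candidate posts so the highest-scoring appear first."""
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "cat_photo",    "likes": 900, "comments": 40,  "shares": 10, "angry": 2},
    {"id": "outrage_bait", "likes": 150, "comments": 220, "shares": 90, "angry": 300},
]
ranked = rank_feed(posts)
print([p["id"] for p in ranked])  # outrage_bait outranks the far more "liked" cat photo
```

Note that the quietly popular post loses to the divisive one despite having six times the likes: once anger is weighted as a proxy for engagement, the ordering follows automatically.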

Academic work on the algorithmic society highlights how algorithmic logic can shape belief systems and emotional landscapes, turning collective attention into a commodity that can be bought, sold, and amplified. In this view, the feed is not a neutral conduit of information but an active participant in shaping cultural discourse, privileging the emotions that best serve its profit motives.

The result is a culture where personal grievances are amplified into political flashpoints, and where the line between genuine outrage and algorithmically manufactured fury blurs. Users, in turn, become biological components whose emotional output is harvested and monetized, exactly as Kaczynski predicted for a post‑industrial technological order.

What does it mean to hand the steering wheel of culture to a black‑box model?

No engineer at Meta can claim full comprehension of the deep‑learning models that decide what each user sees. The black‑box nature of these systems means that the algorithmic “governor” operates beyond human oversight, optimizing for metrics that are opaque to the very people whose lives it shapes.

When a platform that once marketed itself as “connecting the world” becomes the sovereign of cultural flow, the democratic premise of shared discourse collapses. The feed no longer reflects a pluralistic conversation; it reflects the output of a model trained to maximize engagement, regardless of truth, nuance, or societal well‑being. This is the social domestication Kaczynski warned about: citizens are domesticated into obedient participants who perform the system’s rituals (scrolling, reacting) without questioning the underlying authority.

The Kindalame analysis of short‑form video illustrates this point: creators who once wielded agency over their content now find themselves consumed by the algorithm, forced to adapt their creative instincts to invisible rules. The same dynamic applies to Facebook’s broader ecosystem, where every post, comment, and reaction is filtered through a decision engine that prioritizes profit over humanity.

Even open‑source tools reveal how pervasive this engineering has become. A Kindalame tutorial on building a Node‑RED flow for the Mastodon API shows how developers embed algorithmic ranking into seemingly neutral services, further blurring the line between user‑generated content and platform‑driven curation: see “how to toot trending topics with Node‑RED and the Mastodon API.”
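The Mastodon side of that tutorial is easy to approximate outside Node‑RED: Mastodon’s public `GET /api/v1/trends/tags` endpoint returns trending hashtags, each with a per-day usage history. The sketch below ranks a sample payload offline; the sample data is invented for illustration, and a real flow would first fetch the JSON from an instance of your choice.

```python
# Rank trending Mastodon hashtags by recent use. The payload shape
# mirrors Mastodon's GET /api/v1/trends/tags response, where each tag
# carries a per-day "history" of usage counts (returned as strings).
# The sample data below is invented for illustration.

def recent_uses(tag, days=2):
    """Sum the 'uses' counts over the most recent `days` history entries."""
    return sum(int(day["uses"]) for day in tag["history"][:days])

def top_trending(trends, n=3):
    """Return the n hashtag names with the most recent uses."""
    ranked = sorted(trends, key=recent_uses, reverse=True)
    return [tag["name"] for tag in ranked[:n]]

sample = [
    {"name": "selfhosting", "history": [{"uses": "120"}, {"uses": "90"}]},
    {"name": "nodered",     "history": [{"uses": "40"},  {"uses": "35"}]},
    {"name": "fediverse",   "history": [{"uses": "300"}, {"uses": "280"}]},
]
print(top_trending(sample, n=2))  # ['fediverse', 'selfhosting']
```

Even in this tiny sketch, the act of sorting by a popularity signal is itself a ranking decision, which is exactly the point the paragraph makes: curation logic creeps in the moment you choose what to surface first.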

How can we reclaim authentic agency in a world dominated by the algorithmic “Power Process”?

Recognizing the trap is the first step toward escape. Users must treat the feed as a surrogate activity rather than a genuine source of achievement, deliberately inserting moments of offline struggle and autonomy into daily routines. Digital minimalism—setting strict time limits, disabling algorithmic notifications, and curating a feed of deliberately low‑engagement content—can break the feedback loop that fuels outrage.

Collective pressure for algorithmic transparency can force platforms to reveal the criteria that drive content ranking, allowing users to audit and contest manipulative practices. While full openness may be unrealistic, incremental reforms such as “explainable AI” dashboards and independent audits could restore a measure of human oversight.

Finally, fostering offline communities that value real‑world collaboration over digital validation can rebuild the authentic “Power Process.” When people pursue tangible goals—building, creating, debating—in physical spaces, the lure of the black‑box feed loses its potency.

STOP

⚠️ Critical System Check: The Feed

Every infinite scroll is a micro-dose of domestication. If the algorithm knows what will make you angry, it knows how to keep you stationary.

“The power process isn’t just about control; it’s about the replacement of autonomous action with ‘surrogate activities’ prescribed by a server farm in Menlo Park.”

Reclaim Your Attention Architecture:

  • Nuke the Notifications: If it’s not a human calling you, it doesn’t get to vibrate.
  • RSS Sovereignty: Stop letting an AI curate your reality. Use a local reader.
  • The 24-Hour Dark-Out: Delete the app for one Sunday. See what IT looks like.

[ Browse the Not-Lame Archives ]


What do you see as the most urgent step to dismantle the algorithmic “Power Process” that now governs our social lives?

Share your thoughts, experiences, or dissenting views in the comments below.