
Affective Inertia: Teaching LLM Agents to Remember Who They Are

Opening — Why this matters now: LLM agents are getting longer memories, better tools, and more elaborate planning stacks—yet they still suffer from a strangely human flaw: emotional whiplash. An agent that sounds empathetic at turn 5 can become oddly cold at turn 7, then conciliatory again by turn 9. For applications that rely on trust, continuity, or persuasion—mental health tools, tutors, social robots—this instability is not a cosmetic issue. It’s a structural one. ...

January 23, 2026 · 3 min · Zelina
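The teaser above only states the problem, but one way to picture what “affective inertia” could mean in practice is an affect state that is smoothed across turns rather than re-inferred from scratch. The sketch below is purely illustrative and not taken from the article; `AffectState`, `update_affect`, and the `beta` smoothing factor are assumptions made for this example.

```python
from dataclasses import dataclass


@dataclass
class AffectState:
    warmth: float   # -1 (cold) .. +1 (warm)
    arousal: float  # 0 (calm) .. 1 (agitated)


def update_affect(prev: AffectState, observed: AffectState, beta: float = 0.8) -> AffectState:
    """Exponential smoothing: a high beta keeps the agent close to who it was last turn."""
    return AffectState(
        warmth=beta * prev.warmth + (1 - beta) * observed.warmth,
        arousal=beta * prev.arousal + (1 - beta) * observed.arousal,
    )


state = AffectState(warmth=0.6, arousal=0.2)   # empathetic tone at turn 5
cold = AffectState(warmth=-0.4, arousal=0.1)   # raw per-turn inference flips cold at turn 7
state = update_affect(state, cold)
print(state)  # warmth stays positive (~0.4) instead of flipping negative
```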

Feeling Without Feeling: How Emotive Machines Learn to Care (Functionally)

When we think of emotions, we often imagine something deeply human—joy, fear, frustration, and love, entangled with memory and meaning. But what if machines could feel too—at least functionally? A recent speculative research report by Hermann Borotschnig titled “Emotions in Artificial Intelligence” dives into this very question, offering a thought-provoking framework for how synthetic emotions might operate, and where their ethical boundaries lie. Emotions as Heuristic Shortcuts: At its core, the paper proposes that emotions—rather than being mystical experiences—can be understood as heuristic regulators. In biology, emotions evolved not for introspective poetry but for speedy and effective action. Emotions are shortcuts, helping organisms react to threats, rewards, or uncertainties without deep calculation. ...

May 7, 2025 · 4 min
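As a rough illustration of the “heuristic regulator” idea described in the excerpt, the sketch below maps a coarse appraisal of a situation to a fast action tendency without any deliberative planning. It is a minimal, assumption-laden example; `Appraisal`, `emotional_shortcut`, and the thresholds are invented for illustration and do not come from Borotschnig’s report.

```python
from dataclasses import dataclass


@dataclass
class Appraisal:
    threat: float       # 0..1 perceived danger
    reward: float       # 0..1 perceived opportunity
    uncertainty: float  # 0..1 how poorly understood the situation is


def emotional_shortcut(a: Appraisal) -> str:
    """Map a coarse appraisal to a fast action tendency, with no deliberative planning."""
    if a.threat > 0.7:
        return "withdraw"   # fear-like: prioritize safety immediately
    if a.uncertainty > 0.6:
        return "explore"    # curiosity-like: gather information first
    if a.reward > 0.5:
        return "approach"   # desire-like: pursue the opportunity
    return "continue"       # neutral: stay on the current course


print(emotional_shortcut(Appraisal(threat=0.9, reward=0.2, uncertainty=0.1)))  # -> withdraw
```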