
Merge Without Mayhem: How Orthogonal Deltas Could Revolutionize Model Composition

In the era of foundation models, one challenge looms increasingly large: how to safely, scalably, and reversibly compose AI systems from multiple task-specific fine-tunings. Traditional solutions, from naïve weight averaging to adapter stacking, often create interference, catastrophic forgetting, and compliance nightmares. A recent paper introduces a promising new direction: Modular Delta Merging with Orthogonal Constraints (MDM-OC).

Rather than combining entire model weights, MDM-OC treats each task-specific fine-tuned model as a delta from a shared base: a compact, focused perturbation that encodes only what changed to solve a given task. The twist? Before merging, each delta is orthogonalized, i.e., projected into a subspace that doesn't overlap with the others. The result is a modular, mathematically principled structure for interference-free integration. ...
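To make the idea concrete, here is a minimal sketch of the core mechanism: treating fine-tuned weights as deltas from a base, orthogonalizing those deltas via sequential Gram-Schmidt projection, and merging the non-overlapping components back into the base. This is an illustrative toy with NumPy arrays standing in for weight tensors, not the paper's exact MDM-OC procedure; the function names and the rescaling choices are assumptions for demonstration.

```python
import numpy as np

def orthogonalize_deltas(deltas, eps=1e-12):
    """Sequential Gram-Schmidt: each delta keeps only the component
    orthogonal to all previously processed deltas, so no two deltas
    overlap in the same direction of weight space.
    (Illustrative sketch; not the paper's exact algorithm.)"""
    ortho = []
    for d in deltas:
        v = d.astype(float).ravel().copy()
        for u in ortho:
            denom = u @ u
            if denom > eps:
                # Remove the component of v that lies along u.
                v -= (v @ u) / denom * u
        ortho.append(v)
    return ortho

def merge_orthogonal(base, deltas):
    """Merge = base + sum of mutually orthogonal delta components.
    Because the components don't overlap, each task's contribution
    can later be subtracted out again, which is what makes the
    merge reversible."""
    merged = base.astype(float).ravel().copy()
    for v in orthogonalize_deltas(deltas):
        merged += v
    return merged.reshape(base.shape)

# Toy usage: two task deltas that partially overlap.
base = np.zeros((2, 2))
delta_task_a = np.array([[1.0, 0.0], [0.0, 0.0]])
delta_task_b = np.array([[1.0, 1.0], [0.0, 0.0]])  # overlaps with task A

merged = merge_orthogonal(base, [delta_task_a, delta_task_b])
```

In this toy, task B's delta shares a direction with task A's; orthogonalization strips that shared component before merging, so task A's contribution is preserved unchanged, which is the interference-free property the method is after.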

August 2, 2025 · 3 min · Zelina