Weight = 1.0?
I don't know how Iso_C handles it, but generally when we have two models (A and B) and set the weight of model B to 1, it will just return model B, no?
In task arithmetic, yes - but with how Iso_C handles it, there's a transformation on the deltas that flattens the magnitudes; the result will be similar to Model B directionally, but it draws out "long tail" behaviors of the instruct tune while tamping down highly confident priors. Definitely different; I've seen these merges glitch on benchmarks compared to the instruction-tuned model (it amplifies noise too) :)
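For reference, the plain task-arithmetic case is easy to check: at weight 1.0 the merge collapses to model B exactly. A toy sketch (function and variable names are mine, not from any merging library):

```python
import numpy as np

rng = np.random.default_rng(0)
base = rng.normal(size=(4, 4))     # pretrained weights (toy stand-in)
model_b = rng.normal(size=(4, 4))  # instruct-tuned weights (toy stand-in)

def task_arithmetic(base, tuned, w):
    # merged = base + w * (tuned - base); the delta is the tuning direction
    return base + w * (tuned - base)

merged = task_arithmetic(base, model_b, w=1.0)
assert np.allclose(merged, model_b)  # w = 1.0 recovers model B exactly
```

Iso_C differs precisely because it transforms the delta before adding it back, so this identity no longer holds.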
Hmm. That's kind of unexpected default behavior, no?
Nah, that's more or less the expected default behavior that Gemini 3 and Claude Sonnet 4.5 (if I recall correctly which Claude I asked) predicted, somewhat triangulated; Claude had a worse idea of what to expect overall but was more pessimistic about glitchy behavior. It's a lot less glitchy on the benchmark than the pretrained model, but the sharpest parts of the instruct training do serve a function when it comes to template obedience (if you think about what gets trained hardest in instruct training, it's probably the chat template, aka syntax parsing).
deltas that flattens the magnitudes
Shouldn't this be customizable?
You could edit the code to customize that flattening with an interpolation factor, yeah (0-1 range); the implementation I used didn't have that parameter yet (and I was interested in the full flattening).
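A rough sketch of what that interpolation factor could look like, assuming the flattening works by pushing the delta's singular-value spectrum toward uniform (the `alpha` parameter and function name here are hypothetical, not from the actual Iso_C implementation):

```python
import numpy as np

def iso_flatten(delta, alpha=1.0):
    """Interpolate a delta's singular-value spectrum toward uniform.

    alpha=0.0 -> original delta untouched;
    alpha=1.0 -> fully flattened: every singular value replaced by the
    spectrum's mean, keeping directions (U, Vt) but equalizing magnitudes.
    `alpha` is the hypothetical customization knob discussed above.
    """
    U, S, Vt = np.linalg.svd(delta, full_matrices=False)
    S_flat = np.full_like(S, S.mean())            # uniform (isotropic) spectrum
    S_mixed = (1.0 - alpha) * S + alpha * S_flat  # blend original and flat spectra
    return U @ np.diag(S_mixed) @ Vt

# Hypothetical usage: apply the flattened instruct delta at weight 1.0
# merged = base + iso_flatten(instruct - base, alpha=1.0)
```

At `alpha=1.0` the merge keeps model B's directions but discards the magnitude ranking, which is one way to read the "flattens the magnitudes" behavior described earlier; intermediate values would trade that off against the original spectrum.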