jamesdumay committed
Commit 563e75b · verified · 1 parent: dae6638

Publish Qwen3.6-35B-A3B-UD-Q4_K_XL full-v1 package (batch 1/30)


Publish full-v1 Mesh package artifacts for Qwen3.6-35B-A3B-UD-Q4_K_XL (unsloth/Qwen3.6-35B-A3B-GGUF:UD-Q4_K_XL)

Upload batch 1/30.

.gitattributes CHANGED
@@ -58,3 +58,5 @@ variants/Q4_K_XL/experts/expert-020.gguf filter=lfs diff=lfs merge=lfs -text
 variants/Q4_K_XL/experts/expert-021.gguf filter=lfs diff=lfs merge=lfs -text
 variants/Q4_K_XL/experts/expert-022.gguf filter=lfs diff=lfs merge=lfs -text
 variants/Q4_K_XL/experts/expert-023.gguf filter=lfs diff=lfs merge=lfs -text
+variants/Q4_K_XL/experts/expert-024.gguf filter=lfs diff=lfs merge=lfs -text
+variants/Q4_K_XL/experts/expert-025.gguf filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,51 @@
+---
+license: apache-2.0
+base_model:
+- unsloth/Qwen3.6-35B-A3B-GGUF
+pipeline_tag: image-text-to-text
+library_name: mesh-llm
+tags:
+- distributed-inference
+- gguf
+- mesh-llm
+- moe
+- qwen
+- qwen3_5_moe
+- topology-independent
+- unsloth
+---
+
+# Mesh-LLM MoE Package
+
+This repository stores Mesh-LLM topology-independent MoE package artifacts derived from `unsloth/Qwen3.6-35B-A3B-GGUF`.
+It is published as `meshllm/qwen3.6-35b-a3b-gguf-moe` and is meant to be consumed by `mesh-llm serve`.
+
+## Source
+
+- repo: `unsloth/Qwen3.6-35B-A3B-GGUF`
+- revision: `9280dd353ab587157920d5bd391ada414d84e552`
+
+## What This Repository Contains
+
+- `meshllm.json` describes the upstream source repo and all published variants in this package repository.
+- `variants/<variant>/manifest.json` is the runtime entrypoint used to materialize MoE shards.
+- `variants/<variant>/ranking.csv` and `variants/<variant>/analysis.json` contain the analyzer output for that variant.
+- `variants/<variant>/trunk.gguf` plus `variants/<variant>/experts/` hold the topology-independent component artifacts.
+
+## Available Variants
+
+### `Q4_K_XL`
+
+- Mesh model ref: `unsloth/Qwen3.6-35B-A3B-GGUF:Q4_K_XL`
+- Distribution id: `Qwen3.6-35B-A3B-UD-Q4_K_XL`
+- Manifest: `variants/Q4_K_XL/manifest.json`
+- Serve with:
+
+```bash
+mesh-llm serve 'unsloth/Qwen3.6-35B-A3B-GGUF:Q4_K_XL'
+```
+
+## Notes
+
+- This is a derived Mesh-LLM package repository, not the original upstream model repository.
+- `mesh-llm` will prefer published package artifacts from this repository when a matching catalog entry exists.
meshllm.json ADDED
@@ -0,0 +1,13 @@
+{
+  "schema_version": 1,
+  "source": {
+    "repo": "unsloth/Qwen3.6-35B-A3B-GGUF",
+    "revision": "9280dd353ab587157920d5bd391ada414d84e552"
+  },
+  "variants": {
+    "Q4_K_XL": {
+      "distribution_id": "Qwen3.6-35B-A3B-UD-Q4_K_XL",
+      "manifest": "variants/Q4_K_XL/manifest.json"
+    }
+  }
+}
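For illustration, the variant lookup this schema enables can be sketched in a few lines of Python. This is a hypothetical helper, not `mesh-llm`'s actual resolver:

```python
import json

# Hypothetical helper: resolve a variant's manifest path from a meshllm.json
# document. Illustrates the schema above; not part of mesh-llm itself.
def resolve_manifest(meshllm_doc: str, variant: str) -> str:
    doc = json.loads(meshllm_doc)
    if doc.get("schema_version") != 1:
        raise ValueError("unsupported meshllm.json schema_version")
    try:
        return doc["variants"][variant]["manifest"]
    except KeyError:
        raise KeyError(f"variant {variant!r} is not published in this package")

# The meshllm.json published in this commit, inlined for the example.
EXAMPLE = """
{
  "schema_version": 1,
  "source": {
    "repo": "unsloth/Qwen3.6-35B-A3B-GGUF",
    "revision": "9280dd353ab587157920d5bd391ada414d84e552"
  },
  "variants": {
    "Q4_K_XL": {
      "distribution_id": "Qwen3.6-35B-A3B-UD-Q4_K_XL",
      "manifest": "variants/Q4_K_XL/manifest.json"
    }
  }
}
"""

print(resolve_manifest(EXAMPLE, "Q4_K_XL"))  # variants/Q4_K_XL/manifest.json
```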
variants/Q4_K_XL/analysis.json ADDED
@@ -0,0 +1,614 @@
+{
+  "analyzer_id": "full-v1",
+  "artifacts": {
+    "manifest": "manifest.json",
+    "ranking": "ranking.csv"
+  },
+  "benchmark": {
+    "baseline": "full-model",
+    "candidates": [
+      {
+        "corpora": [
+          {
+            "dataset": "google/IFEval",
+            "mean_score": 0.9893861447206662,
+            "node_scores": [
+              0.9893861447206662
+            ],
+            "prompt_count": 128,
+            "source": "ifeval",
+            "worst_node_score": 0.9893861447206662
+          },
+          {
+            "dataset": "openai/gsm8k",
+            "mean_score": 0.9925805577451402,
+            "node_scores": [
+              0.9925805577451402
+            ],
+            "prompt_count": 128,
+            "source": "gsm8k",
+            "worst_node_score": 0.9925805577451402
+          },
+          {
+            "dataset": "HuggingFaceH4/mt_bench_prompts",
+            "mean_score": 0.9905676761581104,
+            "node_scores": [
+              0.9905676761581104
+            ],
+            "prompt_count": 80,
+            "source": "mt_bench",
+            "worst_node_score": 0.9905676761581104
+          },
+          {
+            "dataset": "openai/openai_humaneval",
+            "mean_score": 0.9781133040508384,
+            "node_scores": [
+              0.9781133040508384
+            ],
+            "prompt_count": 164,
+            "source": "humaneval",
+            "worst_node_score": 0.9781133040508384
+          }
+        ],
+        "mean_score": 0.9866954677452192,
+        "min_experts_per_node": 256,
+        "node_count": 1,
+        "node_scores": [
+          0.9866954677452192
+        ],
+        "worst_node_score": 0.9866954677452192
+      },
+      {
+        "corpora": [
+          {
+            "dataset": "google/IFEval",
+            "mean_score": 0.017926327323496017,
+            "node_scores": [
+              0.0,
+              0.035852654646992034
+            ],
+            "prompt_count": 128,
+            "source": "ifeval",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "openai/gsm8k",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 128,
+            "source": "gsm8k",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "HuggingFaceH4/mt_bench_prompts",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 80,
+            "source": "mt_bench",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "openai/openai_humaneval",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 164,
+            "source": "humaneval",
+            "worst_node_score": 0.0
+          }
+        ],
+        "mean_score": 0.017926327323496017,
+        "min_experts_per_node": 128,
+        "node_count": 2,
+        "node_scores": [
+          0.0,
+          0.035852654646992034
+        ],
+        "worst_node_score": 0.0
+      },
+      {
+        "corpora": [
+          {
+            "dataset": "google/IFEval",
+            "mean_score": 0.05182848054795661,
+            "node_scores": [
+              0.10365696109591321,
+              0.0
+            ],
+            "prompt_count": 128,
+            "source": "ifeval",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "openai/gsm8k",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 128,
+            "source": "gsm8k",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "HuggingFaceH4/mt_bench_prompts",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 80,
+            "source": "mt_bench",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "openai/openai_humaneval",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 164,
+            "source": "humaneval",
+            "worst_node_score": 0.0
+          }
+        ],
+        "mean_score": 0.05182848054795661,
+        "min_experts_per_node": 192,
+        "node_count": 2,
+        "node_scores": [
+          0.10365696109591321,
+          0.0
+        ],
+        "worst_node_score": 0.0
+      },
+      {
+        "corpora": [
+          {
+            "dataset": "google/IFEval",
+            "mean_score": 0.08158752476527527,
+            "node_scores": [
+              0.0,
+              0.16317504953055054
+            ],
+            "prompt_count": 128,
+            "source": "ifeval",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "openai/gsm8k",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 128,
+            "source": "gsm8k",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "HuggingFaceH4/mt_bench_prompts",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 80,
+            "source": "mt_bench",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "openai/openai_humaneval",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 164,
+            "source": "humaneval",
+            "worst_node_score": 0.0
+          }
+        ],
+        "mean_score": 0.08158752476527527,
+        "min_experts_per_node": 224,
+        "node_count": 2,
+        "node_scores": [
+          0.0,
+          0.16317504953055054
+        ],
+        "worst_node_score": 0.0
+      },
+      {
+        "corpora": [
+          {
+            "dataset": "google/IFEval",
+            "mean_score": 0.14643480934750933,
+            "node_scores": [
+              0.29286961869501865,
+              0.0
+            ],
+            "prompt_count": 128,
+            "source": "ifeval",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "openai/gsm8k",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 128,
+            "source": "gsm8k",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "HuggingFaceH4/mt_bench_prompts",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 80,
+            "source": "mt_bench",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "openai/openai_humaneval",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 164,
+            "source": "humaneval",
+            "worst_node_score": 0.0
+          }
+        ],
+        "mean_score": 0.14643480934750933,
+        "min_experts_per_node": 240,
+        "node_count": 2,
+        "node_scores": [
+          0.29286961869501865,
+          0.0
+        ],
+        "worst_node_score": 0.0
+      },
+      {
+        "corpora": [
+          {
+            "dataset": "google/IFEval",
+            "mean_score": 0.09131748418873686,
+            "node_scores": [
+              0.0,
+              0.18263496837747373
+            ],
+            "prompt_count": 128,
+            "source": "ifeval",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "openai/gsm8k",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 128,
+            "source": "gsm8k",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "HuggingFaceH4/mt_bench_prompts",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 80,
+            "source": "mt_bench",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "openai/openai_humaneval",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 164,
+            "source": "humaneval",
+            "worst_node_score": 0.0
+          }
+        ],
+        "mean_score": 0.09131748418873686,
+        "min_experts_per_node": 248,
+        "node_count": 2,
+        "node_scores": [
+          0.0,
+          0.18263496837747373
+        ],
+        "worst_node_score": 0.0
+      },
+      {
+        "corpora": [
+          {
+            "dataset": "google/IFEval",
+            "mean_score": 0.20935244544841133,
+            "node_scores": [
+              0.41870489089682267,
+              0.0
+            ],
+            "prompt_count": 128,
+            "source": "ifeval",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "openai/gsm8k",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 128,
+            "source": "gsm8k",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "HuggingFaceH4/mt_bench_prompts",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 80,
+            "source": "mt_bench",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "openai/openai_humaneval",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 164,
+            "source": "humaneval",
+            "worst_node_score": 0.0
+          }
+        ],
+        "mean_score": 0.20935244544841133,
+        "min_experts_per_node": 252,
+        "node_count": 2,
+        "node_scores": [
+          0.41870489089682267,
+          0.0
+        ],
+        "worst_node_score": 0.0
+      },
+      {
+        "corpora": [
+          {
+            "dataset": "google/IFEval",
+            "mean_score": 0.21552568633647212,
+            "node_scores": [
+              0.43105137267294424,
+              0.0
+            ],
+            "prompt_count": 128,
+            "source": "ifeval",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "openai/gsm8k",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 128,
+            "source": "gsm8k",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "HuggingFaceH4/mt_bench_prompts",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 80,
+            "source": "mt_bench",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "openai/openai_humaneval",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 164,
+            "source": "humaneval",
+            "worst_node_score": 0.0
+          }
+        ],
+        "mean_score": 0.21552568633647212,
+        "min_experts_per_node": 254,
+        "node_count": 2,
+        "node_scores": [
+          0.43105137267294424,
+          0.0
+        ],
+        "worst_node_score": 0.0
+      },
+      {
+        "corpora": [
+          {
+            "dataset": "google/IFEval",
+            "mean_score": 0.21552568633647212,
+            "node_scores": [
+              0.0,
+              0.43105137267294424
+            ],
+            "prompt_count": 128,
+            "source": "ifeval",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "openai/gsm8k",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 128,
+            "source": "gsm8k",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "HuggingFaceH4/mt_bench_prompts",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 80,
+            "source": "mt_bench",
+            "worst_node_score": 0.0
+          },
+          {
+            "dataset": "openai/openai_humaneval",
+            "mean_score": 0.0,
+            "node_scores": [
+              0.0,
+              0.0
+            ],
+            "prompt_count": 164,
+            "source": "humaneval",
+            "worst_node_score": 0.0
+          }
+        ],
+        "mean_score": 0.21552568633647212,
+        "min_experts_per_node": 255,
+        "node_count": 2,
+        "node_scores": [
+          0.0,
+          0.43105137267294424
+        ],
+        "worst_node_score": 0.0
+      }
+    ],
+    "corpora": [
+      {
+        "dataset": "google/IFEval",
+        "max_tokens": 128,
+        "prompt_count": 128,
+        "source": "ifeval"
+      },
+      {
+        "dataset": "openai/gsm8k",
+        "max_tokens": 128,
+        "prompt_count": 128,
+        "source": "gsm8k"
+      },
+      {
+        "dataset": "HuggingFaceH4/mt_bench_prompts",
+        "max_tokens": 128,
+        "prompt_count": 80,
+        "source": "mt_bench"
+      },
+      {
+        "dataset": "openai/openai_humaneval",
+        "max_tokens": 128,
+        "prompt_count": 164,
+        "source": "humaneval"
+      }
+    ],
+    "metric": "token_dice_similarity",
+    "quality_floor": 0.95,
+    "recommended_min_experts_per_node": 256,
+    "version": 3
+  },
+  "created_at": "2026-04-19T09:08:29+00:00",
+  "memory": {
+    "base_resident_bytes": 2678180352,
+    "expert_bytes": {
+      "bytes_per_expert": 76840960,
+      "kind": "uniform"
+    },
+    "expert_tensor_bytes_total": 19671285760,
+    "full_model_bytes": 22360456160,
+    "shard_file_overhead_bytes": 10990048
+  },
+  "parameters": {
+    "all_layers": true,
+    "context_size": 4096,
+    "prompt_count": null,
+    "prompt_set": null,
+    "token_count": 32
+  },
+  "planner": {
+    "recommended_overlap": 1
+  },
+  "ranking": {
+    "mass_checkpoints": [
+      {
+        "mass_fraction": 0.320624258670914,
+        "top_n": 1
+      },
+      {
+        "mass_fraction": 0.324429432772452,
+        "top_n": 2
+      },
+      {
+        "mass_fraction": 0.3318762258353961,
+        "top_n": 4
+      },
+      {
+        "mass_fraction": 0.3460204514826405,
+        "top_n": 8
+      },
+      {
+        "mass_fraction": 0.3726827485247638,
+        "top_n": 16
+      },
+      {
+        "mass_fraction": 0.4232942266007568,
+        "top_n": 32
+      },
+      {
+        "mass_fraction": 0.51767078795539,
+        "top_n": 64
+      },
+      {
+        "mass_fraction": 0.6919257754605005,
+        "top_n": 128
+      },
+      {
+        "mass_fraction": 1.0,
+        "top_n": 256
+      }
+    ],
+    "rows": 256,
+    "sha256": "sha256:5f514ec9a2a890798f6692b2606005fb84175092ef25555f8b5e1d0c625349b9"
+  },
+  "schema_version": 1,
+  "summary": {
+    "min_experts_per_node": 128,
+    "n_expert": 256,
+    "n_expert_used": 8
+  },
+  "tool": {
+    "name": "llama-moe-analyze",
+    "version": "mesh-llm-fork"
+  }
+}
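The `memory` block in this analysis uses a uniform per-expert size model, and its fields are internally consistent: `base_resident_bytes + 256 * bytes_per_expert + shard_file_overhead_bytes` reproduces `full_model_bytes`. A quick sketch of a per-node memory estimate under that model; the formula is inferred from the reported fields, not taken from mesh-llm's actual planner:

```python
# Constants copied from the analysis.json memory block in this commit.
BASE_RESIDENT_BYTES = 2_678_180_352   # base_resident_bytes (trunk etc.)
BYTES_PER_EXPERT = 76_840_960         # expert_bytes.bytes_per_expert ("uniform")
N_EXPERT = 256                        # summary.n_expert
SHARD_FILE_OVERHEAD = 10_990_048      # shard_file_overhead_bytes
FULL_MODEL_BYTES = 22_360_456_160     # full_model_bytes

# Assumed estimate: a node holding k experts keeps the trunk plus k expert
# shards resident. This is an inference from the reported fields only.
def node_resident_bytes(k: int) -> int:
    return BASE_RESIDENT_BYTES + k * BYTES_PER_EXPERT

# Sanity check: all 256 experts on one node, plus shard file overhead,
# reproduces the reported full-model size exactly.
assert node_resident_bytes(N_EXPERT) + SHARD_FILE_OVERHEAD == FULL_MODEL_BYTES

print(node_resident_bytes(128))  # 12513823232 bytes for a 128-expert node
```

Note that the benchmark above recommends `min_experts_per_node: 256` at a 0.95 quality floor: the two-node splits save memory per node but scored near zero on every corpus.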
variants/Q4_K_XL/experts/expert-024.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:38c1fd53f06bd4b10668f6546ef2f62bcb19bf066ae9e850a978b2244459fd38
+size 88122688
variants/Q4_K_XL/experts/expert-025.gguf ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f0f952eff9ea1e91fa934caf2b99a4a50b7d4acdca7e10746906f5df442398b6
+size 88122688
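The `oid sha256:` in each LFS pointer matches the `sha256` recorded for the same expert in `variants/Q4_K_XL/manifest.json`, so downloaded shards can be verified against either record. A minimal verification sketch, using a hypothetical helper and toy bytes rather than a real shard:

```python
import hashlib

# Hypothetical helper: check shard bytes against a manifest-style
# "sha256:<hex>" digest, as recorded per expert in manifest.json.
def verify_shard(data: bytes, recorded: str) -> bool:
    return recorded == "sha256:" + hashlib.sha256(data).hexdigest()

# Toy example (not a real expert shard):
payload = b"demo shard bytes"
recorded = "sha256:" + hashlib.sha256(payload).hexdigest()
print(verify_shard(payload, recorded))       # True
print(verify_shard(b"corrupted", recorded))  # False
```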
variants/Q4_K_XL/manifest.json ADDED
@@ -0,0 +1,1294 @@
+{
+  "schema_version": 1,
+  "format": "meshllm-moe-components",
+  "ranking_sha256": "sha256:5f514ec9a2a890798f6692b2606005fb84175092ef25555f8b5e1d0c625349b9",
+  "n_expert": 256,
+  "n_expert_used": 8,
+  "min_experts_per_node": 128,
+  "trunk": {
+    "path": "trunk.gguf",
+    "sha256": "sha256:c3096537447855d2412e79d408d89755858ec86449b7f9dd48df20485d860c36"
+  },
+  "experts": [
+    {
+      "path": "experts/expert-000.gguf",
+      "sha256": "sha256:06b68b6dbad301ccb0ecbe14c82073aaf0452ddb00d6a3dbdeeb9cbb94511b4d",
+      "expert_id": 0
+    },
+    {
+      "path": "experts/expert-001.gguf",
+      "sha256": "sha256:b42d77f6ad37bd0571d8a553d34b3d9036caeeffa5f20003a4b425c2ffe898bf",
+      "expert_id": 1
+    },
+    {
+      "path": "experts/expert-002.gguf",
+      "sha256": "sha256:f0ab0424bb9ece398c0c0e485e45283d213bae94c63aa56ccf4e903bcd35948e",
+      "expert_id": 2
+    },
+    {
+      "path": "experts/expert-003.gguf",
+      "sha256": "sha256:9dcfc52f74e39cd819a590ec70f8ba397103186adf16f9f8aeabb90295a9bdae",
+      "expert_id": 3
+    },
+    {
+      "path": "experts/expert-004.gguf",
+      "sha256": "sha256:0f8b3fb14a06ba01806424657d09c11aa30d626811971b24440486ecf418a4db",
+      "expert_id": 4
+    },
+    {
+      "path": "experts/expert-005.gguf",
+      "sha256": "sha256:65db0f1d97e052afac00adae4377a002f960c13daad8afe6c9113c6e0dcf337c",
+      "expert_id": 5
+    },
+    {
+      "path": "experts/expert-006.gguf",
+      "sha256": "sha256:ac609a7636700591edd3f1c368bab2739733695fb453bc13a4f6ad5a8ab7d631",
+      "expert_id": 6
+    },
+    {
+      "path": "experts/expert-007.gguf",
+      "sha256": "sha256:44d3a772fa1e3aa1a08fce756164b326e977f1e3a015e9621eb552ecdc44d641",
+      "expert_id": 7
+    },
+    {
+      "path": "experts/expert-008.gguf",
+      "sha256": "sha256:03a97aa6fdcd7a356d1f8d844944298fb5aa4852de7afe083632733c418b5368",
+      "expert_id": 8
+    },
+    {
+      "path": "experts/expert-009.gguf",
+      "sha256": "sha256:0a7fbcf6bb11b168e70cc2365b9739cf93de8a2b9482bf394da7d231f5c6c65c",
+      "expert_id": 9
+    },
+    {
+      "path": "experts/expert-010.gguf",
+      "sha256": "sha256:623dc808f8314a6c480449557048a2b53daa896495bb2cd20207256e37620bd5",
+      "expert_id": 10
+    },
+    {
+      "path": "experts/expert-011.gguf",
+      "sha256": "sha256:8c21cf8b361312a80b014894011899f64e10e68742c6a17b55ce160391707237",
+      "expert_id": 11
+    },
+    {
+      "path": "experts/expert-012.gguf",
+      "sha256": "sha256:b2e4a490b876d1b657b4739503dc882a7ead0ea2cc3240e587ec4c483b8d32de",
+      "expert_id": 12
+    },
+    {
+      "path": "experts/expert-013.gguf",
+      "sha256": "sha256:5471a63cfc37f6f48fa524288757870faf14cff04241521ca5f6a64ca9b7f179",
+      "expert_id": 13
+    },
+    {
+      "path": "experts/expert-014.gguf",
+      "sha256": "sha256:2ea28803371c1f68bb237766fb8ec7d5dbafffbb7918943eb3faa9f1d4e0e7e4",
+      "expert_id": 14
+    },
+    {
+      "path": "experts/expert-015.gguf",
+      "sha256": "sha256:2dcb96bddff4d9441c5be8014ebae6d116d231835ba7a1ac7c3813ebb00c70e1",
+      "expert_id": 15
+    },
+    {
+      "path": "experts/expert-016.gguf",
+      "sha256": "sha256:f6959cdcc8fd1774f37df021b4a61a52c8195005f55cf0481fb7a06f534431b5",
+      "expert_id": 16
+    },
+    {
+      "path": "experts/expert-017.gguf",
+      "sha256": "sha256:8ec6301f87f7a86ee30874c4c27258a83d6ec33c23f802220fe16ee501457d65",
+      "expert_id": 17
+    },
+    {
+      "path": "experts/expert-018.gguf",
+      "sha256": "sha256:854d4a225a506237eaab4055e461ce9f717eb556a87ff4096daf900c09a5059b",
+      "expert_id": 18
+    },
+    {
+      "path": "experts/expert-019.gguf",
+      "sha256": "sha256:22174a0dbd8f40d6a4c2b8ce8882a73481881f3df4bd58c55e9a18282682b62d",
+      "expert_id": 19
+    },
+    {
+      "path": "experts/expert-020.gguf",
+      "sha256": "sha256:ebdd62bd566e95da9ecebb9e8999fffce4fc936c29bfe5b7977438884ccc91d2",
+      "expert_id": 20
+    },
+    {
+      "path": "experts/expert-021.gguf",
+      "sha256": "sha256:d851fbba946976ed3bd06d55343def5ddba1f04e7bcf7b88f4be3ef94ed0e195",
+      "expert_id": 21
+    },
+    {
+      "path": "experts/expert-022.gguf",
+      "sha256": "sha256:179a5936092b352710bf8dfe4da04ab8beeb7b5ec8d6e2453c5df8c1056f57ab",
+      "expert_id": 22
+    },
+    {
+      "path": "experts/expert-023.gguf",
+      "sha256": "sha256:241aa67474bdf5f7b644d84e624d54d3eceb1b4b4cc8a2f8a6af5ba304936f16",
+      "expert_id": 23
+    },
+    {
+      "path": "experts/expert-024.gguf",
+      "sha256": "sha256:38c1fd53f06bd4b10668f6546ef2f62bcb19bf066ae9e850a978b2244459fd38",
+      "expert_id": 24
+    },
+    {
+      "path": "experts/expert-025.gguf",
+      "sha256": "sha256:f0f952eff9ea1e91fa934caf2b99a4a50b7d4acdca7e10746906f5df442398b6",
+      "expert_id": 25
+    },
+    {
+      "path": "experts/expert-026.gguf",
+      "sha256": "sha256:37e178beddca8cfe845ff59a75250430c11afbe6b48b08551756e6588357e5ef",
+      "expert_id": 26
+    },
+    {
+      "path": "experts/expert-027.gguf",
+      "sha256": "sha256:f130ad48e3956912bdc3b3f51cad4fbcf00b7facd06905746275095dcc44a609",
+      "expert_id": 27
+    },
+    {
+      "path": "experts/expert-028.gguf",
+      "sha256": "sha256:0caec2beccb567a822cd751c6c785867b33dff775125a8eb5a5eff3dcca0dcb8",
+      "expert_id": 28
+    },
+    {
+      "path": "experts/expert-029.gguf",
+      "sha256": "sha256:07d1a157f74fce0c19e5babb628a19dea99a2103b2a76fd259400ff19e5b4bf9",
+      "expert_id": 29
+    },
+    {
+      "path": "experts/expert-030.gguf",
+      "sha256": "sha256:6c937c9e0569b46692f7949a6c6d27523d956c0c78415b7ed1adedb7307b4321",
+      "expert_id": 30
+    },
+    {
+      "path": "experts/expert-031.gguf",
+      "sha256": "sha256:8416e3f289c3201d9187438d3c6cc92a94257170b35b12193b34ad89ed00092b",
+      "expert_id": 31
+    },
+    {
+      "path": "experts/expert-032.gguf",
+      "sha256": "sha256:77f0c44a0bbf0a61c412878759bdf31994296d6dceb76eb1bf207ee1d1072054",
+      "expert_id": 32
+    },
+    {
+      "path": "experts/expert-033.gguf",
+      "sha256": "sha256:366b170c824bfe86bf1725ff4922ae5525f517c4a28dffafe8e0c0e1b6f455f1",
+      "expert_id": 33
+    },
+    {
+      "path": "experts/expert-034.gguf",
+      "sha256": "sha256:58082aa3b10d399d8ca1f1b0c0ea1b87c0fd28e78db0beed613fa9bfee013007",
+      "expert_id": 34
+    },
+    {
+      "path": "experts/expert-035.gguf",
+      "sha256": "sha256:ec3cec46f96b9e7732a2a671de7e706aa43e2196be1f71901edd028222e075f1",
+      "expert_id": 35
+    },
+    {
+      "path": "experts/expert-036.gguf",
+      "sha256": "sha256:a1abef7d7dbcf8ae67e3137c1c261c9457e1a83c9c7a865e2558f5780ed22029",
+      "expert_id": 36
+    },
+    {
+      "path": "experts/expert-037.gguf",
+      "sha256": "sha256:92d97f2ad418803b0ab31d3e0fb8f701a137a9546540612f6cbea9412d6a864d",
+      "expert_id": 37
+    },
+    {
+      "path": "experts/expert-038.gguf",
+      "sha256": "sha256:c82e85942d589eef62face5c7392e953dd957fd82809d53fd65ac8ff55b747c9",
+      "expert_id": 38
+    },
+    {
+      "path": "experts/expert-039.gguf",
+      "sha256": "sha256:60b63f0a4697a3ab13ad5a629c3ee08f575ed34c0a6ba1eddef4379264b3669f",
+      "expert_id": 39
+    },
+    {
+      "path": "experts/expert-040.gguf",
+      "sha256": "sha256:e0a2991222efcc534f224d28e9d056f9ca37032b1cd7594adfd38bef4bd0ddf9",
+      "expert_id": 40
+    },
+    {
+      "path": "experts/expert-041.gguf",
+      "sha256": "sha256:b330ca5c5cf270872e77419fe489a40b59e4996e9a66a62648d6ebfc308d520e",
+      "expert_id": 41
+    },
+    {
+      "path": "experts/expert-042.gguf",
+      "sha256": "sha256:4ea5caa8cf30cc99073118f3524dfb30d40825e556f0d3f91abe4fe7b451a439",
+      "expert_id": 42
+    },
+    {
+      "path": "experts/expert-043.gguf",
+      "sha256": "sha256:9f5c610777b5eb6e46c6fb403a11ca359e2fe3c7c8b981e25f3a3271cc171e19",
+      "expert_id": 43
+    },
+    {
+      "path": "experts/expert-044.gguf",
+      "sha256": "sha256:8bf7ece9ba49f25de14f2aad3be9e637a5461578c7782325f66a4a6a7e33e2d2",
+      "expert_id": 44
+    },
+    {
+      "path": "experts/expert-045.gguf",
+      "sha256": "sha256:207fed18829c7680a1b048f4486b085787009d599fde31cf53a0e4862cd4b788",
+      "expert_id": 45
+    },
+    {
+      "path": "experts/expert-046.gguf",
+      "sha256": "sha256:a09f6fd94c7f0f8604765b4fc4d6718b4df839a1aab240783c2cac648ce56006",
+      "expert_id": 46
+    },
+    {
+      "path": "experts/expert-047.gguf",
+      "sha256": "sha256:2dffed31c80bf5fe67bba3d700fd494ce4918b8b637fa42aefad303f2d3eda98",
+      "expert_id": 47
+    },
+    {
+      "path": "experts/expert-048.gguf",
+      "sha256": "sha256:5ba8bb3fc9aefd461d0f30c751eac677916fc1a7e15f42a12a626ee366eda350",
+      "expert_id": 48
+    },
+    {
+      "path": "experts/expert-049.gguf",
+      "sha256": "sha256:a5c958087daeae43a0604a5bced583d899e67c05c3cfb842349a441cf991f528",
+      "expert_id": 49
+    },
+    {
+      "path": "experts/expert-050.gguf",
+      "sha256": "sha256:f4eba1dd0017c7f2bad540698448b006843605793b9dc9e2b794911012710ec2",
+      "expert_id": 50
+    },
+    {
+      "path": "experts/expert-051.gguf",
+      "sha256": "sha256:0cf629336d050ba73f1083a097523cafde1204b426faa2c39193e2709e7ab8a5",
+      "expert_id": 51
+    },
+    {
+      "path": "experts/expert-052.gguf",
+      "sha256": "sha256:a415374881ac0c6f6b8a1bcfcc36ad1ff48c39c3ea38b9e9b77d11d7ff6930da",
+      "expert_id": 52
+    },
+    {
+      "path": "experts/expert-053.gguf",
+      "sha256": "sha256:3d129cb185ca39210cad6c55cd8065b1918cbadde681b28eff959648fc71f7cc",
+      "expert_id": 53
+    },
+    {
+      "path": "experts/expert-054.gguf",
+      "sha256": "sha256:cfb204968f43a9397c642bc4962490b745e83a0cf1845fefccdb6e7665110172",
+      "expert_id": 54
+    },
+    {
+      "path": "experts/expert-055.gguf",
+      "sha256": "sha256:504cadf2556a8b5e81355312fb1b1277a60d8908a07dd1ef1ae5f4fb5b746634",
+      "expert_id": 55
+    },
+    {
+      "path": "experts/expert-056.gguf",
+      "sha256": "sha256:140efdb8e67283d082305bb047b062d87da8167219c4fe4761a51b3e84f75165",
+      "expert_id": 56
+    },
+    {
+      "path": "experts/expert-057.gguf",
+      "sha256": "sha256:12499a15287e7d8510e897c6047eaad732d4f07bf2259c581a3ae079d080dee2",
+      "expert_id": 57
+    },
+    {
+      "path": "experts/expert-058.gguf",
+      "sha256": "sha256:e337f02dd07a00d8205bf9838c11e7242833dea462d439b30f52801c163cdea5",
+      "expert_id": 58
+    },
+    {
+      "path": "experts/expert-059.gguf",
+      "sha256": "sha256:5924f8c45cd97535b5dfd3ac7db65a1ffdf8f143a948692b20cb893a4700a734",
+      "expert_id": 59
+    },
+    {
+      "path": "experts/expert-060.gguf",
+      "sha256": "sha256:b0b42b3bb26016f7f118ebe815fc272c9520060279d452db8b81f0cff74eaf38",
+      "expert_id": 60
+    },
+    {
+      "path": "experts/expert-061.gguf",
+      "sha256": "sha256:9abd4c4bac2be45a50d026b45ba0fab71d3408790c9ca3d114f5a617f48e5591",
+      "expert_id": 61
+    },
+    {
+      "path": "experts/expert-062.gguf",
+      "sha256": "sha256:311778ecf2752711caf96c1c0e886fb3c57592e9453a47b22d74774d1fe1d297",
+      "expert_id": 62
+    },
+    {
+      "path": "experts/expert-063.gguf",
+      "sha256": "sha256:3249acf4c7b4ba065e113bca8128e6dc901bd17a0e95812c84a820c7b97d83c3",
+      "expert_id": 63
+    },
+    {
+      "path": "experts/expert-064.gguf",
+      "sha256": "sha256:c6ccf9675f35f1cbb43b0c2121d6afaa7cb6eec159707cc7d7ff6d2e9f36e575",
+      "expert_id": 64
+    },
+    {
+      "path": "experts/expert-065.gguf",
+      "sha256": "sha256:428742adeb997223f32e50c07073b49373e76ec4e8ee2651e0525ae6a66bb6e0",
+      "expert_id": 65
+    },
+    {
+      "path": "experts/expert-066.gguf",
+      "sha256": "sha256:67c71d93e6b7ff95e9804542360a1bd52df1fae90dcee4c27e149e9bfd7201ed",
+      "expert_id": 66
+    },
+    {
+      "path": "experts/expert-067.gguf",
+      "sha256": "sha256:902a51287947b2ea608f3a6c8700a792e7de7c02281c744c532674fcb41b02f6",
+      "expert_id": 67
+    },
+    {
+      "path": "experts/expert-068.gguf",
+      "sha256": "sha256:742559846598aa924677730b6b50dfe9bd73521899846eaf94e351565ae6a745",
+      "expert_id": 68
+    },
+    {
+      "path": "experts/expert-069.gguf",
+      "sha256": "sha256:7f2ecbbe198136e7b56790538f409874e8c5e02f0722f26106d6c8842b4257f3",
+      "expert_id": 69
+    },
+    {
+      "path": "experts/expert-070.gguf",
+      "sha256": "sha256:afcd6a45e7373b0d41f36bd071cc74b78950217ea8ef42dc761ab11a3be27d54",
+      "expert_id": 70
+    },
+    {
+      "path": "experts/expert-071.gguf",
+      "sha256": "sha256:bcbf97f9744a38f009c2604d2122d11bdef5aa0a060f5d3e85d5fa4efa9cf851",
+      "expert_id": 71
+    },
+    {
+      "path": "experts/expert-072.gguf",
+      "sha256": "sha256:9d81115fae2998b6c9833c913e9b49d6e75950ffe132ffa5fd2c278aea5060ef",
+      "expert_id": 72
+    },
+    {
+      "path": "experts/expert-073.gguf",
+      "sha256": "sha256:08b6693b721b5cb3ceb794d2e9956f220b05aa08e97236d6ee38b442016df790",
+      "expert_id": 73
+    },
+    {
+      "path": "experts/expert-074.gguf",
+      "sha256": "sha256:967d6367d02901a0c5e3fc662ce24366673f91d16c329ef1353f5d828f6b4274",
+      "expert_id": 74
+    },
+    {
+      "path": "experts/expert-075.gguf",
+      "sha256": "sha256:d7f199d53c89e24edb9c5a56f9159edafd50b1bea3f5fa4e69b46e4ea58a79da",
+      "expert_id": 75
+    },
+    {
+      "path": "experts/expert-076.gguf",
+      "sha256": "sha256:d7d9b17ca851e89dab6025d48935b2f10dbf689093a8e5728a42179e3398c1d3"
396
+ "expert_id": 76
397
+ },
398
+ {
399
+ "path": "experts/expert-077.gguf",
400
+ "sha256": "sha256:e224a853056ad1463c36e0bf89847c7fbbb06a75439f98d77a99880775126b08",
401
+ "expert_id": 77
402
+ },
403
+ {
404
+ "path": "experts/expert-078.gguf",
405
+ "sha256": "sha256:964830a8d3d52a50a0e5638d8ea1e84930852a599fd2ffe0775f79896f296ba8",
406
+ "expert_id": 78
407
+ },
408
+ {
409
+ "path": "experts/expert-079.gguf",
410
+ "sha256": "sha256:feaafde8f1cce40b7a6789bd58534079581b7bd10d6a1c6136b7a8b95dcfef57",
411
+ "expert_id": 79
412
+ },
413
+ {
414
+ "path": "experts/expert-080.gguf",
415
+ "sha256": "sha256:7e236c793bbf4133545313d7eaeb5d184220a425d8cec8256794e07c00a8a406",
416
+ "expert_id": 80
417
+ },
418
+ {
419
+ "path": "experts/expert-081.gguf",
420
+ "sha256": "sha256:6139e375b2937fd83520328fbba4b7a12b3842e449ea36ea43746d12955b123b",
421
+ "expert_id": 81
422
+ },
423
+ {
424
+ "path": "experts/expert-082.gguf",
425
+ "sha256": "sha256:443b8a1985cb2f20b0fa636a8a033b2daaf99fac763a44608749e67274fdc4f7",
426
+ "expert_id": 82
427
+ },
428
+ {
429
+ "path": "experts/expert-083.gguf",
430
+ "sha256": "sha256:281da3a550e8de85167be21b333a7357213c68802194aa423937abeac529f806",
431
+ "expert_id": 83
432
+ },
433
+ {
434
+ "path": "experts/expert-084.gguf",
435
+ "sha256": "sha256:d944f398e0cff8e9c81cd1af578d6243015d243ce8ab07e030da041f7649b9ad",
436
+ "expert_id": 84
437
+ },
438
+ {
439
+ "path": "experts/expert-085.gguf",
440
+ "sha256": "sha256:53ecf7f904febdb072c69e99c53a32d56eb1a921c2f7cfbf8d90b1cf8a2eecd8",
441
+ "expert_id": 85
442
+ },
443
+ {
444
+ "path": "experts/expert-086.gguf",
445
+ "sha256": "sha256:5c25f522179dbad0bc4d13356ab67a81093792165eabaf3aae7eab8809b33c31",
446
+ "expert_id": 86
447
+ },
448
+ {
449
+ "path": "experts/expert-087.gguf",
450
+ "sha256": "sha256:e52d4719b523289319bdc69d54e0a941e90a9854e648e3f2650b089dd6759e66",
451
+ "expert_id": 87
452
+ },
453
+ {
454
+ "path": "experts/expert-088.gguf",
455
+ "sha256": "sha256:4737392f25a9a89d1aa03933c4fee511f0787ff9b857892c7fbc7bf392a1f7f7",
456
+ "expert_id": 88
457
+ },
458
+ {
459
+ "path": "experts/expert-089.gguf",
460
+ "sha256": "sha256:3189f82de59835763b5971eb3a064b5e2c52fd84698c4d5cb528d85e8159c65f",
461
+ "expert_id": 89
462
+ },
463
+ {
464
+ "path": "experts/expert-090.gguf",
465
+ "sha256": "sha256:865ca0521c06a587685139d1039ffb58be4281c3ac2d901f8d02c772ee65af6b",
466
+ "expert_id": 90
467
+ },
468
+ {
469
+ "path": "experts/expert-091.gguf",
470
+ "sha256": "sha256:a3a98ad2eb34a9338e1ec638f5dda6bc941da4b147e692e4ecdbf75e4411225a",
471
+ "expert_id": 91
472
+ },
473
+ {
474
+ "path": "experts/expert-092.gguf",
475
+ "sha256": "sha256:dea97b8adb7b09cd79ee2914e0652cda3eb2f2ec2445d6c06a727c304f7a3193",
476
+ "expert_id": 92
477
+ },
478
+ {
479
+ "path": "experts/expert-093.gguf",
480
+ "sha256": "sha256:9ce2320402b4adc8d250eadfe82d196eda45f7f038ad3e8e725d87b37f244f8e",
481
+ "expert_id": 93
482
+ },
483
+ {
484
+ "path": "experts/expert-094.gguf",
485
+ "sha256": "sha256:956ee4221f50d96607e4c715d9359a4d4c7e3644d929e768d7ac7f17f9a23cd6",
486
+ "expert_id": 94
487
+ },
488
+ {
489
+ "path": "experts/expert-095.gguf",
490
+ "sha256": "sha256:0c626074274eb03b4a0cb519e47f184f50346a8cbed26ba32bb110885ba53281",
491
+ "expert_id": 95
492
+ },
493
+ {
494
+ "path": "experts/expert-096.gguf",
495
+ "sha256": "sha256:5af56e7e07fa85ccc27e3397eb85cfb86e96f79611fff5d85423b0882af66cd8",
496
+ "expert_id": 96
497
+ },
498
+ {
499
+ "path": "experts/expert-097.gguf",
500
+ "sha256": "sha256:6afd53270269177cbf2e2d2ef6611e4fdac6b5025a672effaf97df1e5a80c416",
501
+ "expert_id": 97
502
+ },
503
+ {
504
+ "path": "experts/expert-098.gguf",
505
+ "sha256": "sha256:6083905713d8de2e7f8c58e1802803c1337f4f288f11c160d523992de8168d08",
506
+ "expert_id": 98
507
+ },
508
+ {
509
+ "path": "experts/expert-099.gguf",
510
+ "sha256": "sha256:b93b0488b5e76c2e24f81c3b5dbbe7df2802ae134a95bdeb170c93d118d97974",
511
+ "expert_id": 99
512
+ },
513
+ {
514
+ "path": "experts/expert-100.gguf",
515
+ "sha256": "sha256:279e04cefbc8c8f5b28e1d1cb522af24c5019444c29e5682743300b4c68427ae",
516
+ "expert_id": 100
517
+ },
518
+ {
519
+ "path": "experts/expert-101.gguf",
520
+ "sha256": "sha256:7c7303091966bd4ab2b460265d138e27fecd242622450c2e194fd1914e06da59",
521
+ "expert_id": 101
522
+ },
523
+ {
524
+ "path": "experts/expert-102.gguf",
525
+ "sha256": "sha256:dcd55a89f43a4c2707b26367da08c80b874037b518535231b19c4d2f72e00d43",
526
+ "expert_id": 102
527
+ },
528
+ {
529
+ "path": "experts/expert-103.gguf",
530
+ "sha256": "sha256:a602bd2820ebbf181ddbb39488d82f9ddd05aab4e8ef51a8060448b6d27c39f1",
531
+ "expert_id": 103
532
+ },
533
+ {
534
+ "path": "experts/expert-104.gguf",
535
+ "sha256": "sha256:2205168a3bec7dfa55a440f19d2d2eba097a5198d8b84300a740cc1b5f0c1fff",
536
+ "expert_id": 104
537
+ },
538
+ {
539
+ "path": "experts/expert-105.gguf",
540
+ "sha256": "sha256:5f50dffcad8553050d27d7432044ab0f0b2c3603844bb9f25769303579d74a8f",
541
+ "expert_id": 105
542
+ },
543
+ {
544
+ "path": "experts/expert-106.gguf",
545
+ "sha256": "sha256:968b2a0a521b6942d3c4ced818dff830db3a6cdb3a84bdf7612104840415548b",
546
+ "expert_id": 106
547
+ },
548
+ {
549
+ "path": "experts/expert-107.gguf",
550
+ "sha256": "sha256:1726d78ee71ac45bcb69103e1d2fd3d138739b965633fdbb5bdeeaa53f66c857",
551
+ "expert_id": 107
552
+ },
553
+ {
554
+ "path": "experts/expert-108.gguf",
555
+ "sha256": "sha256:30f4a1c24c1b35b9d62b544756c7a12249bc03f0c4924b05a98c964bb3696d88",
556
+ "expert_id": 108
557
+ },
558
+ {
559
+ "path": "experts/expert-109.gguf",
560
+ "sha256": "sha256:bdf8eec8b6467740f81d4a3e493aed94b127b4e8ec9cd033513612baa6bde5e8",
561
+ "expert_id": 109
562
+ },
563
+ {
564
+ "path": "experts/expert-110.gguf",
565
+ "sha256": "sha256:a05c3b3b574c51950095433af8d03f2e94119d2a83adec9335ee76a6ed4e1ef6",
566
+ "expert_id": 110
567
+ },
568
+ {
569
+ "path": "experts/expert-111.gguf",
570
+ "sha256": "sha256:f22e3880056183320247c1e1349edec98cd446503f08dbb88a2c010a536fcb29",
571
+ "expert_id": 111
572
+ },
573
+ {
574
+ "path": "experts/expert-112.gguf",
575
+ "sha256": "sha256:7eabcf811ae0252b051291591e039b44c7f505f2520189bf76e9e5bbddef03f5",
576
+ "expert_id": 112
577
+ },
578
+ {
579
+ "path": "experts/expert-113.gguf",
580
+ "sha256": "sha256:fdb6cd42cdb3d7107edfac6a2fcfb6fa91b677bf920a9a7d1e67c9b1ed03a64f",
581
+ "expert_id": 113
582
+ },
583
+ {
584
+ "path": "experts/expert-114.gguf",
585
+ "sha256": "sha256:71cd841aa49ee5e79ca1eebb38c610766f495fd3a5ef3733eae4e6a5019287de",
586
+ "expert_id": 114
587
+ },
588
+ {
589
+ "path": "experts/expert-115.gguf",
590
+ "sha256": "sha256:e57167ccdd116086f1f1c69233e628ba74ae2bff6a0b52b45672d6d407f2ff76",
591
+ "expert_id": 115
592
+ },
593
+ {
594
+ "path": "experts/expert-116.gguf",
595
+ "sha256": "sha256:aee9af5fd2946d802d3d0f2ba4eb04dbf75745b17f3e942a9d020297b68df015",
596
+ "expert_id": 116
597
+ },
598
+ {
599
+ "path": "experts/expert-117.gguf",
600
+ "sha256": "sha256:71aea1b398b3fd7a4a8f32fe691c501b846d1745c1341ac1dff7ba193bbf6755",
601
+ "expert_id": 117
602
+ },
603
+ {
604
+ "path": "experts/expert-118.gguf",
605
+ "sha256": "sha256:e2e28ec1f8e172c5cbda69cba3dd4b29fa54384b6b6ae26e81d02035cbf9b2b3",
606
+ "expert_id": 118
607
+ },
608
+ {
609
+ "path": "experts/expert-119.gguf",
610
+ "sha256": "sha256:5d716cf66f61a9521f26c5f5724424adabf2a38e2979c50f8084124aa8d82408",
611
+ "expert_id": 119
612
+ },
613
+ {
614
+ "path": "experts/expert-120.gguf",
615
+ "sha256": "sha256:56206271be2f95f7821bccea9d9faf08dfc57d83ff9e4cf3e8ff98c98b2a914f",
616
+ "expert_id": 120
617
+ },
618
+ {
619
+ "path": "experts/expert-121.gguf",
620
+ "sha256": "sha256:21c89b88a3603fafb651d1425d512ec63efcdc3f53ef177e0182a5e04c913be3",
621
+ "expert_id": 121
622
+ },
623
+ {
624
+ "path": "experts/expert-122.gguf",
625
+ "sha256": "sha256:9a4d743fad720e21129b7892d31659110a7aed4416e7e5d253498aa9cf6b5ec2",
626
+ "expert_id": 122
627
+ },
628
+ {
629
+ "path": "experts/expert-123.gguf",
630
+ "sha256": "sha256:60574211c91c36c4fed014db27943a0b10b48293c009a05aa88b31e774ca8662",
631
+ "expert_id": 123
632
+ },
633
+ {
634
+ "path": "experts/expert-124.gguf",
635
+ "sha256": "sha256:7be3b4f1afc84053bde14872d9cb20fb8322cdc0ab335e0cf7461a178d2da121",
636
+ "expert_id": 124
637
+ },
638
+ {
639
+ "path": "experts/expert-125.gguf",
640
+ "sha256": "sha256:d6cadf26b56f1b1976239cca04a9abe390b866999a212412b64a8fd6a167acbc",
641
+ "expert_id": 125
642
+ },
643
+ {
644
+ "path": "experts/expert-126.gguf",
645
+ "sha256": "sha256:974a203e23763f898707c30c5bd91d25dcc0f9324b124283c2d4ebddfd49555c",
646
+ "expert_id": 126
647
+ },
648
+ {
649
+ "path": "experts/expert-127.gguf",
650
+ "sha256": "sha256:5a83b79062370ba4d25ff431a92dbb7bdb0690bae86e617430a8af9ed3b78765",
651
+ "expert_id": 127
652
+ },
653
+ {
654
+ "path": "experts/expert-128.gguf",
655
+ "sha256": "sha256:129d7806948e1b8ff342f096a4b51effe25a0edae8a08669c0730263e0eb4b7d",
656
+ "expert_id": 128
657
+ },
658
+ {
659
+ "path": "experts/expert-129.gguf",
660
+ "sha256": "sha256:3cf20bc839bbd3c5da6afdecf26cb093acf989d3436d3c42f8d1f4a26774d7b2",
661
+ "expert_id": 129
662
+ },
663
+ {
664
+ "path": "experts/expert-130.gguf",
665
+ "sha256": "sha256:1db6e4dc6706f40a24560cf3a9f2991bfd4cd6ee8f9fef2155533e6a0be2055f",
666
+ "expert_id": 130
667
+ },
668
+ {
669
+ "path": "experts/expert-131.gguf",
670
+ "sha256": "sha256:5c949b1181ff678492cd9d4f81b7a9ecb853d6182f9beb5af58bc85dd026eea2",
671
+ "expert_id": 131
672
+ },
673
+ {
674
+ "path": "experts/expert-132.gguf",
675
+ "sha256": "sha256:c555a6e5c97fe11f7442d85cbbbe74a755a314d9a55dddc83bc4271a5fd7b4e6",
676
+ "expert_id": 132
677
+ },
678
+ {
679
+ "path": "experts/expert-133.gguf",
680
+ "sha256": "sha256:8ff60458aeabbe180f18bc7111a90fb8c9b3de97ae411264148c4cbb596ddab8",
681
+ "expert_id": 133
682
+ },
683
+ {
684
+ "path": "experts/expert-134.gguf",
685
+ "sha256": "sha256:e0e8e3cc5844593dea4f84d05cf515a90e0efdf4d15d0de38f035d522ee06bd2",
686
+ "expert_id": 134
687
+ },
688
+ {
689
+ "path": "experts/expert-135.gguf",
690
+ "sha256": "sha256:a3df25fce3cb5b650401e71e7d17d83a820a0228a0a07680efd47b947c09134e",
691
+ "expert_id": 135
692
+ },
693
+ {
694
+ "path": "experts/expert-136.gguf",
695
+ "sha256": "sha256:57734c22ed37b5609368d0d09a18217d104e19477539d1b4e83280eaa60a883e",
696
+ "expert_id": 136
697
+ },
698
+ {
699
+ "path": "experts/expert-137.gguf",
700
+ "sha256": "sha256:5e7a6e6fb35d4454c424f45acdaa2f665c1e6923d99b782666efd7416d69a033",
701
+ "expert_id": 137
702
+ },
703
+ {
704
+ "path": "experts/expert-138.gguf",
705
+ "sha256": "sha256:1f9e67e0653d93c66a530461c201f030b275fe55fd9800ce8b99dd6dcaa05f72",
706
+ "expert_id": 138
707
+ },
708
+ {
709
+ "path": "experts/expert-139.gguf",
710
+ "sha256": "sha256:5ee4f84ee54601d58d1582a2977f377cd50aad9b85d138ebeab0630673e02d84",
711
+ "expert_id": 139
712
+ },
713
+ {
714
+ "path": "experts/expert-140.gguf",
715
+ "sha256": "sha256:a1c1498a1005c0f3b7112ee01e20caf2c9bf786c0bebcd96e6a8273dcaaa90c9",
716
+ "expert_id": 140
717
+ },
718
+ {
719
+ "path": "experts/expert-141.gguf",
720
+ "sha256": "sha256:fb50a5678170acf2f855b8ab7549143ee92b364a142d43314b59dd901accefe5",
721
+ "expert_id": 141
722
+ },
723
+ {
724
+ "path": "experts/expert-142.gguf",
725
+ "sha256": "sha256:d1ce6fe2cbe61909525dec8b130796931a31c682deca194d0f5c94e497145961",
726
+ "expert_id": 142
727
+ },
728
+ {
729
+ "path": "experts/expert-143.gguf",
730
+ "sha256": "sha256:3af728eb2692849ca7adebd375d2fa9846fead365df07f1504b62cf3dd567835",
731
+ "expert_id": 143
732
+ },
733
+ {
734
+ "path": "experts/expert-144.gguf",
735
+ "sha256": "sha256:a8ff3125ac2a97312e274d12cdcc69acf7f66823e3d8a2510de39e20897b9d25",
736
+ "expert_id": 144
737
+ },
738
+ {
739
+ "path": "experts/expert-145.gguf",
740
+ "sha256": "sha256:79848ffa4f52f9dea687ab5c7edba2d5720a0a91ee51b081336a78806bf00951",
741
+ "expert_id": 145
742
+ },
743
+ {
744
+ "path": "experts/expert-146.gguf",
745
+ "sha256": "sha256:a3a793972a471364f2bc6dfbf70a03a181503169b896d132a1d6decb21751b7d",
746
+ "expert_id": 146
747
+ },
748
+ {
749
+ "path": "experts/expert-147.gguf",
750
+ "sha256": "sha256:102b417509db7cebd7869a8c69eb28a8b95009b7678b8d367aa402c381be4041",
751
+ "expert_id": 147
752
+ },
753
+ {
754
+ "path": "experts/expert-148.gguf",
755
+ "sha256": "sha256:6048e6fbc3ed27735653811fd34fbca4828890115a5332aa590c8095ee1d6f8b",
756
+ "expert_id": 148
757
+ },
758
+ {
759
+ "path": "experts/expert-149.gguf",
760
+ "sha256": "sha256:8e68773c4a282dcca9a072346feccc87a879e5a8ef770f1de699331851e1e1d8",
761
+ "expert_id": 149
762
+ },
763
+ {
764
+ "path": "experts/expert-150.gguf",
765
+ "sha256": "sha256:86cea9a46129f76f2f16263ef3ef3bd44b70bbc593f10829b423126617bc45db",
766
+ "expert_id": 150
767
+ },
768
+ {
769
+ "path": "experts/expert-151.gguf",
770
+ "sha256": "sha256:15bf53049eade9bb1ef4654fca843d2b3ec94af9994c5f9a273f2a5c047c0a7a",
771
+ "expert_id": 151
772
+ },
773
+ {
774
+ "path": "experts/expert-152.gguf",
775
+ "sha256": "sha256:c7f6f3cf81a2ffcf603ee688bacb36316ce5d025d0e4a5b2ef78b424ff541505",
776
+ "expert_id": 152
777
+ },
778
+ {
779
+ "path": "experts/expert-153.gguf",
780
+ "sha256": "sha256:edc3694c004f4c4ab0137855e8cf7be8df78f7537998d7a6f3b972bd8903af67",
781
+ "expert_id": 153
782
+ },
783
+ {
784
+ "path": "experts/expert-154.gguf",
785
+ "sha256": "sha256:6c78670a78797f982484f083f1a8dd66c6060378580f5342f9cd0a74bb03ba73",
786
+ "expert_id": 154
787
+ },
788
+ {
789
+ "path": "experts/expert-155.gguf",
790
+ "sha256": "sha256:ee8fca508474984403cf066eb94bf1c4d4243c7f111ee9ce67d5f67c0508ee28",
791
+ "expert_id": 155
792
+ },
793
+ {
794
+ "path": "experts/expert-156.gguf",
795
+ "sha256": "sha256:d9cd0b3778d42032cef7bed7f94082408fa77b59ee20e9de10e1ecd5849dd1a5",
796
+ "expert_id": 156
797
+ },
798
+ {
799
+ "path": "experts/expert-157.gguf",
800
+ "sha256": "sha256:3867379123052b1b2712172d9626ef12c3aa0c15abdb1f10aec3e743577b374a",
801
+ "expert_id": 157
802
+ },
803
+ {
804
+ "path": "experts/expert-158.gguf",
805
+ "sha256": "sha256:57876ae4e1ae4abc5485c9da1876d2e9370af97e353f6462109a737b92c9ff70",
806
+ "expert_id": 158
807
+ },
808
+ {
809
+ "path": "experts/expert-159.gguf",
810
+ "sha256": "sha256:c99808d03a74a59d83596820906422d0f9276456cd1ddb175e7ed3520700b013",
811
+ "expert_id": 159
812
+ },
813
+ {
814
+ "path": "experts/expert-160.gguf",
815
+ "sha256": "sha256:2aed3db35f4dfa7059d20afa3c5f79cbc881d7016979b11e8b7818c7350b70e1",
816
+ "expert_id": 160
817
+ },
818
+ {
819
+ "path": "experts/expert-161.gguf",
820
+ "sha256": "sha256:0dd4a00a2824c6a380161287b96d58e727fbae288cf2448ce26467f9a34076b2",
821
+ "expert_id": 161
822
+ },
823
+ {
824
+ "path": "experts/expert-162.gguf",
825
+ "sha256": "sha256:bbbc94ee3d68e0e0367828443a91e10fbf3a71ab2ecbe6a1e0d80196de5c4db8",
826
+ "expert_id": 162
827
+ },
828
+ {
829
+ "path": "experts/expert-163.gguf",
830
+ "sha256": "sha256:5eb7de300a8973f5227a944d63c5287246855dd70684bfdb05b3ef548e97bd95",
831
+ "expert_id": 163
832
+ },
833
+ {
834
+ "path": "experts/expert-164.gguf",
835
+ "sha256": "sha256:9b51862b2e12f79bca20c52f094c6443b47ec6590dc2bec37acacdcf38943202",
836
+ "expert_id": 164
837
+ },
838
+ {
839
+ "path": "experts/expert-165.gguf",
840
+ "sha256": "sha256:81b2391b04e841128538b7216532830dfd867f524f04982dbfb0242b455271c1",
841
+ "expert_id": 165
842
+ },
843
+ {
844
+ "path": "experts/expert-166.gguf",
845
+ "sha256": "sha256:e763dede99d642c5caad20c29ccbfdcae955be00508defcd49e909c6db03dfca",
846
+ "expert_id": 166
847
+ },
848
+ {
849
+ "path": "experts/expert-167.gguf",
850
+ "sha256": "sha256:9b65891c9ac02d4d02a49f69112874a22d9a472c2a28ca2a96670d1b063bf53d",
851
+ "expert_id": 167
852
+ },
853
+ {
854
+ "path": "experts/expert-168.gguf",
855
+ "sha256": "sha256:a9b8c70f366e2551c90ad81ef4327e2ef566d28a70e5f593a63d6ac3714745f6",
856
+ "expert_id": 168
857
+ },
858
+ {
859
+ "path": "experts/expert-169.gguf",
860
+ "sha256": "sha256:f3d3cba5323b594b9d2033edc3da6044c28492cad34eaa4b28582e169a6fa6c9",
861
+ "expert_id": 169
862
+ },
863
+ {
864
+ "path": "experts/expert-170.gguf",
865
+ "sha256": "sha256:7bc99e216926fc84d1f79fb0ad7d5bba60ce73b6256e41b7c8e6e97d7967b996",
866
+ "expert_id": 170
867
+ },
868
+ {
869
+ "path": "experts/expert-171.gguf",
870
+ "sha256": "sha256:f1af4b83fa113267be1698e662cdf307214ab2a34ba2b0f9b16acc8ad0ba5921",
871
+ "expert_id": 171
872
+ },
873
+ {
874
+ "path": "experts/expert-172.gguf",
875
+ "sha256": "sha256:71d035c8bec880ce71559cb32f867ab42aa8470094ad91221c1c4cdc69914176",
876
+ "expert_id": 172
877
+ },
878
+ {
879
+ "path": "experts/expert-173.gguf",
880
+ "sha256": "sha256:63bb7fdf4f0cc13a6e7be37bd96882a1c1602011df5a87d186c3a2b99e5426fa",
881
+ "expert_id": 173
882
+ },
883
+ {
884
+ "path": "experts/expert-174.gguf",
885
+ "sha256": "sha256:ee6e3ae5d5af28346bb6f344d24200d8b10d1a39eeb22e27e944b67997835783",
886
+ "expert_id": 174
887
+ },
888
+ {
889
+ "path": "experts/expert-175.gguf",
890
+ "sha256": "sha256:8c73f0b74b9e8d66b1ab59df76a25bc649b481930d96e095cb8383739ca3173f",
891
+ "expert_id": 175
892
+ },
893
+ {
894
+ "path": "experts/expert-176.gguf",
895
+ "sha256": "sha256:8920f7549b6b0dd619c67e7883261cef91fadc05376a198fde4da6a9ef9dfc99",
896
+ "expert_id": 176
897
+ },
898
+ {
899
+ "path": "experts/expert-177.gguf",
900
+ "sha256": "sha256:3dc514f7b91eb5646f71f22693b0be5f997160b35ea3842a50c08ed4e4997625",
901
+ "expert_id": 177
902
+ },
903
+ {
904
+ "path": "experts/expert-178.gguf",
905
+ "sha256": "sha256:d3c39d04ba15adef1829d1a6c4c716536aa90c03f5b051ff98cf9a6246b6bc1e",
906
+ "expert_id": 178
907
+ },
908
+ {
909
+ "path": "experts/expert-179.gguf",
910
+ "sha256": "sha256:0f8bc529a90ed5ee5cb87f8bf51bf7db6718c4620dd00f523734a411c446ffc9",
911
+ "expert_id": 179
912
+ },
913
+ {
914
+ "path": "experts/expert-180.gguf",
915
+ "sha256": "sha256:a9421305dac8130720560da4829295a6fbf9f9bdb52fef82e26a2618f9397cfd",
916
+ "expert_id": 180
917
+ },
918
+ {
919
+ "path": "experts/expert-181.gguf",
920
+ "sha256": "sha256:c85f56ac4aca0c0977783b95a48fc35d72224152c5e88c89dea905190db4fa5e",
921
+ "expert_id": 181
922
+ },
923
+ {
924
+ "path": "experts/expert-182.gguf",
925
+ "sha256": "sha256:b8806428e5ad3490fe1a7c0e57130a2bcc9262cd3a3e29d6a57da27139ccebf0",
926
+ "expert_id": 182
927
+ },
928
+ {
929
+ "path": "experts/expert-183.gguf",
930
+ "sha256": "sha256:e86f46152b6c03b8c70fef3b7c2e83dbff22e1174fb81b04c4845c218f7e0f82",
931
+ "expert_id": 183
932
+ },
933
+ {
934
+ "path": "experts/expert-184.gguf",
935
+ "sha256": "sha256:17ec33fab87c34446511f2b55dae96643f39b96fe68efcf8b8679da5a9af3a45",
936
+ "expert_id": 184
937
+ },
938
+ {
939
+ "path": "experts/expert-185.gguf",
940
+ "sha256": "sha256:e7e40d1cf490eee7583d7ace7818af2621f3a098393e7211d02c5cc608c979c7",
941
+ "expert_id": 185
942
+ },
943
+ {
944
+ "path": "experts/expert-186.gguf",
945
+ "sha256": "sha256:1751f4fc8c84b39cbd41c6f5ec8069b1cc229869e83de8948e6ff592224627b2",
946
+ "expert_id": 186
947
+ },
948
+ {
949
+ "path": "experts/expert-187.gguf",
950
+ "sha256": "sha256:dd9dd0d34a5b040b7dec5a4b3ed454246c6cb9ec124afcd8777d363f7bb0f8d6",
951
+ "expert_id": 187
952
+ },
953
+ {
954
+ "path": "experts/expert-188.gguf",
955
+ "sha256": "sha256:fa8c276abaa1f7e304a0bf63f7e2ecb630a0b4f262ea3b0a4949292b7432f2ca",
956
+ "expert_id": 188
957
+ },
958
+ {
959
+ "path": "experts/expert-189.gguf",
960
+ "sha256": "sha256:029b737214dac83984a4553ecc777322e39c7919418a5f08be496352a3d01b1a",
961
+ "expert_id": 189
962
+ },
963
+ {
964
+ "path": "experts/expert-190.gguf",
965
+ "sha256": "sha256:466ed3623e0f4357a16011aca395657ab1705ce367991d32c26a94cb78d1ae58",
966
+ "expert_id": 190
967
+ },
968
+ {
969
+ "path": "experts/expert-191.gguf",
970
+ "sha256": "sha256:4fb08f003d43e901c40143c3e4f84cade08b4ee2e5fabbc279d84693b48c9b72",
971
+ "expert_id": 191
972
+ },
973
+ {
974
+ "path": "experts/expert-192.gguf",
975
+ "sha256": "sha256:ecffaab499ae06f298a18276b32e0697e7a8d5c8249c538485e46eb057a03284",
976
+ "expert_id": 192
977
+ },
978
+ {
979
+ "path": "experts/expert-193.gguf",
980
+ "sha256": "sha256:925488a17285c7de486ef1da886159b6bf3d6e0ba12d8bacce60033cca397f20",
981
+ "expert_id": 193
982
+ },
983
+ {
984
+ "path": "experts/expert-194.gguf",
985
+ "sha256": "sha256:b8a95364bc2b4bf8a89b0a197d3da0dcd4925b59915a0c8a23e4c2cd5da45d65",
986
+ "expert_id": 194
987
+ },
988
+ {
989
+ "path": "experts/expert-195.gguf",
990
+ "sha256": "sha256:11e3f796f563d404cea36fe98a14d41f96075d153bb3008d95b2c88922173c75",
991
+ "expert_id": 195
992
+ },
993
+ {
994
+ "path": "experts/expert-196.gguf",
995
+ "sha256": "sha256:4ad19364aaed4fbadac678d853b18b9e1bf5744cac998ce3579890fded7ffc9c",
996
+ "expert_id": 196
997
+ },
998
+ {
999
+ "path": "experts/expert-197.gguf",
1000
+ "sha256": "sha256:dc5343d82a35cca10f94bf6d07e797bf34b06079f5bc27b07d3316c31c38da1d",
1001
+ "expert_id": 197
1002
+ },
1003
+ {
1004
+ "path": "experts/expert-198.gguf",
1005
+ "sha256": "sha256:53ef817177e50ce3d0e282a119bd2dd1b985cae92cb9477ee7a55ad93ca4f90f",
1006
+ "expert_id": 198
1007
+ },
1008
+ {
1009
+ "path": "experts/expert-199.gguf",
1010
+ "sha256": "sha256:7d307b464186b30c6fcbc1c141b723b2ce0fbddc4b5826b98d713dc44e4d63cf",
1011
+ "expert_id": 199
1012
+ },
1013
+ {
1014
+ "path": "experts/expert-200.gguf",
1015
+ "sha256": "sha256:4a9a706f922166312d37b6726e9ea5f79b698b2f05e8d1da0906640f2931be9b",
1016
+ "expert_id": 200
1017
+ },
1018
+ {
1019
+ "path": "experts/expert-201.gguf",
1020
+ "sha256": "sha256:ecc095389dc0832c0d8b7610af81cee8e1beca8e1fb36c2bd9ab4cb5fdeeba30",
1021
+ "expert_id": 201
1022
+ },
1023
+ {
1024
+ "path": "experts/expert-202.gguf",
1025
+ "sha256": "sha256:989ddee2d92f2f73e677dc7b5c95388914c9cf8ee7dcc9dce1020184d8450f47",
1026
+ "expert_id": 202
1027
+ },
1028
+ {
1029
+ "path": "experts/expert-203.gguf",
1030
+ "sha256": "sha256:ba0b0ca412516471ae594c47f718f19a53e3889e969ec311f0e60ae21324ab8c",
1031
+ "expert_id": 203
1032
+ },
1033
+ {
1034
+ "path": "experts/expert-204.gguf",
1035
+ "sha256": "sha256:f932f7ef294b022c9e96c170606f7ee91e5047f6b50f049a8c6569cb7bbb0341",
1036
+ "expert_id": 204
1037
+ },
1038
+ {
1039
+ "path": "experts/expert-205.gguf",
1040
+ "sha256": "sha256:9a8f37b3f01e5ad2ed6ebeb21a4bab25789c1f384e1a59de830dc35557d6d9dc",
1041
+ "expert_id": 205
1042
+ },
1043
+ {
1044
+ "path": "experts/expert-206.gguf",
1045
+ "sha256": "sha256:6365be61811a3d0745a920f10123d45b7b7669df0cdffa0e4421d2422119b444",
1046
+ "expert_id": 206
1047
+ },
1048
+ {
1049
+ "path": "experts/expert-207.gguf",
1050
+ "sha256": "sha256:6e1474bae08e54982c5e7430fb60c5535dd2948d8fde9666464547e07cc976ec",
1051
+ "expert_id": 207
1052
+ },
1053
+ {
1054
+ "path": "experts/expert-208.gguf",
1055
+ "sha256": "sha256:653b0c73a8e15c63d680b18efcf290c2c02c60344c426d791b9390d9943b2b65",
1056
+ "expert_id": 208
1057
+ },
1058
+ {
1059
+ "path": "experts/expert-209.gguf",
1060
+ "sha256": "sha256:b326eec3eede1c4d0707fbe5ed9b1b6b5a5e09278d3a2e81674b61822f7d49d7",
1061
+ "expert_id": 209
1062
+ },
1063
+ {
1064
+ "path": "experts/expert-210.gguf",
1065
+ "sha256": "sha256:6c98564616e0859756d62f0c931521f239862ca8ad32a48a3186c4c138adc1ef",
1066
+ "expert_id": 210
1067
+ },
1068
+ {
1069
+ "path": "experts/expert-211.gguf",
1070
+ "sha256": "sha256:57211650bc0a3cb96ea615868ba770af20033a15d416720421f2ba4311c61d40",
1071
+ "expert_id": 211
1072
+ },
1073
+ {
1074
+ "path": "experts/expert-212.gguf",
1075
+ "sha256": "sha256:b512271789d82a0b18c0bb328a87b03263389b2219ca9f4ce37ca566ca48f257",
1076
+ "expert_id": 212
1077
+ },
1078
+ {
1079
+ "path": "experts/expert-213.gguf",
1080
+ "sha256": "sha256:25d99b540532b92a58d48511a3ac6cb23ce606b6c311bd3c0a722f33c6931996",
1081
+ "expert_id": 213
1082
+ },
1083
+ {
1084
+ "path": "experts/expert-214.gguf",
1085
+ "sha256": "sha256:035a673993c9ccfe32e8ee123e3545a67e7b6883e0668590b910cd76cc00a854",
1086
+ "expert_id": 214
1087
+ },
1088
+ {
1089
+ "path": "experts/expert-215.gguf",
1090
+ "sha256": "sha256:5d26ed14b1358724c2c0420becc196cf95dcc935718cab29990e5994e9fed17c",
1091
+ "expert_id": 215
1092
+ },
1093
+ {
1094
+ "path": "experts/expert-216.gguf",
1095
+ "sha256": "sha256:edaef55a7863d88e80e9e6e4e10141f46c0b9c29b88c33f5f386412806c94af9",
1096
+ "expert_id": 216
1097
+ },
1098
+ {
1099
+ "path": "experts/expert-217.gguf",
1100
+ "sha256": "sha256:350bbe67a1955b5e244d0aa868470c37f249453911540560183fd2a12da35c80",
1101
+ "expert_id": 217
1102
+ },
1103
+ {
1104
+ "path": "experts/expert-218.gguf",
1105
+ "sha256": "sha256:50cc6aa307a52ba7d9e157425407ca97e0582d48b69e0fd6b93ad2ceb187ad20",
1106
+ "expert_id": 218
1107
+ },
1108
+ {
1109
+ "path": "experts/expert-219.gguf",
1110
+ "sha256": "sha256:f79945124a75e465974e1606baff6c1d9a877a735ea064a431ef64869d228c39",
1111
+ "expert_id": 219
1112
+ },
1113
+ {
1114
+ "path": "experts/expert-220.gguf",
1115
+ "sha256": "sha256:cd285f1854e93a6eff7c2ebf894ba2c407ac4ac6fa5efce3ee8e2de2d0fa5f89",
1116
+ "expert_id": 220
1117
+ },
1118
+ {
1119
+ "path": "experts/expert-221.gguf",
1120
+ "sha256": "sha256:09296acfdafd1c7d254aa3d9ef6448545665b2283aaadc021561f40a207c82ac",
1121
+ "expert_id": 221
1122
+ },
1123
+ {
1124
+ "path": "experts/expert-222.gguf",
1125
+ "sha256": "sha256:64f3ef87c266121d01d92446c6f8665ca16b31d8517ea256eb22e9a5025a5e2d",
1126
+ "expert_id": 222
1127
+ },
1128
+ {
1129
+ "path": "experts/expert-223.gguf",
1130
+ "sha256": "sha256:99cbf7801333653e21823b0b4ee9216be191ee4f5f1a8dc913d570940aabd44f",
1131
+ "expert_id": 223
1132
+ },
1133
+ {
1134
+ "path": "experts/expert-224.gguf",
1135
+ "sha256": "sha256:f91f76ae8f5a241fbb5c00919405aa61437cdbe668e91c4c59986aa0518fbaae",
1136
+ "expert_id": 224
1137
+ },
1138
+ {
1139
+ "path": "experts/expert-225.gguf",
1140
+ "sha256": "sha256:556f570b62676521e78cfbb085fa9272ed26d54b98dfc215b75f98d89ac5a0c2",
1141
+ "expert_id": 225
1142
+ },
1143
+ {
1144
+ "path": "experts/expert-226.gguf",
1145
+ "sha256": "sha256:2fe4a25df5bd58ceffd8af12c7e8191374b26f03e37549d687098d2b6230bdae",
1146
+ "expert_id": 226
1147
+ },
1148
+ {
1149
+ "path": "experts/expert-227.gguf",
1150
+ "sha256": "sha256:9b02905ece1de47a921049b26bc6b4fbad78307da337bcc22dc15241d9c57b73",
1151
+ "expert_id": 227
1152
+ },
1153
+ {
1154
+ "path": "experts/expert-228.gguf",
1155
+ "sha256": "sha256:4741b18365d9d186042d87aab4a80619560bd4fdf74b8d908f3e60e77106f108",
1156
+ "expert_id": 228
1157
+ },
1158
+ {
1159
+ "path": "experts/expert-229.gguf",
1160
+ "sha256": "sha256:8d9d61a8ca2ea76d3f7991500300431569c02029db70975bc3a8795b957ddbbf",
1161
+ "expert_id": 229
1162
+ },
1163
+ {
1164
+ "path": "experts/expert-230.gguf",
1165
+ "sha256": "sha256:d49d76d5100af4a746553f176d4812c790ec2187037c3741aebb040472676c12",
1166
+ "expert_id": 230
1167
+ },
1168
+ {
1169
+ "path": "experts/expert-231.gguf",
1170
+ "sha256": "sha256:970a2bb09bf1511f32281e71d9da89fb844958ed47c8a64e5fc613acffe8d5df",
1171
+ "expert_id": 231
1172
+ },
1173
+ {
1174
+ "path": "experts/expert-232.gguf",
1175
+ "sha256": "sha256:e3cabf92c43c13c75c7b1c034d4d84facf399d8a097237150f7a9c6f68cb50b8",
1176
+ "expert_id": 232
1177
+ },
1178
+ {
1179
+ "path": "experts/expert-233.gguf",
1180
+ "sha256": "sha256:85c630738fe531d2570cef84e0fe7f7c003eccb8c7405e10447a2cba686ff1a5",
1181
+ "expert_id": 233
1182
+ },
1183
+ {
1184
+ "path": "experts/expert-234.gguf",
1185
+ "sha256": "sha256:4ab0c523d537c5df8d1f9314156b0f73b27da1e55bc58e8514f7a9bc4081bb1a",
1186
+ "expert_id": 234
1187
+ },
1188
+ {
1189
+ "path": "experts/expert-235.gguf",
1190
+ "sha256": "sha256:0bd6611b2e29ce94958bc2f8ed469a2a99a89f9458a36a9ad020801315f93826",
1191
+ "expert_id": 235
1192
+ },
1193
+ {
1194
+ "path": "experts/expert-236.gguf",
1195
+ "sha256": "sha256:46bbe3a45b4f0489f9ba4f1c0b8964f3b196a79978b14009f50f2f474a66f5b2",
1196
+ "expert_id": 236
1197
+ },
1198
+ {
1199
+ "path": "experts/expert-237.gguf",
1200
+ "sha256": "sha256:bd404221cf23645579471a62f968ae90f96b0ddb7a900335d7339eb629c6fda6",
1201
+ "expert_id": 237
1202
+ },
1203
+ {
1204
+ "path": "experts/expert-238.gguf",
1205
+ "sha256": "sha256:89c579cbff048cede5bcd75588c3d9e52cf1395f3cebd43d8691f0547db9fbef",
+ "expert_id": 238
+ },
+ {
+ "path": "experts/expert-239.gguf",
+ "sha256": "sha256:1eea6227599a9b5a1fba3af1aba19b48f827ae83b70c6dd29df1d0910082dbc9",
+ "expert_id": 239
+ },
+ {
+ "path": "experts/expert-240.gguf",
+ "sha256": "sha256:13c00288f65dc5b23f8c3ff21fd3b8445371ddab9c64f871c3dd34d60b1407db",
+ "expert_id": 240
+ },
+ {
+ "path": "experts/expert-241.gguf",
+ "sha256": "sha256:89049300d9a03047a3fbaa2fa2bb9113337aa63a75254c9ee08c410a8bf37943",
+ "expert_id": 241
+ },
+ {
+ "path": "experts/expert-242.gguf",
+ "sha256": "sha256:b2767d9772fa27af2deb0143cd326c16b10d2ef61620b6e1ac1c867cadf3641d",
+ "expert_id": 242
+ },
+ {
+ "path": "experts/expert-243.gguf",
+ "sha256": "sha256:132fffad3a6708f1d169e78183e3578044b07f96edd97258415a9d2ae04c225e",
+ "expert_id": 243
+ },
+ {
+ "path": "experts/expert-244.gguf",
+ "sha256": "sha256:6e39083421a4ca6ac5bfd450c98460d3099ad8857a78f82af8f3da774056cb37",
+ "expert_id": 244
+ },
+ {
+ "path": "experts/expert-245.gguf",
+ "sha256": "sha256:34f4c784262faacc4ad334b54b587a57222ec426315648b7e8aa53622d6c95ad",
+ "expert_id": 245
+ },
+ {
+ "path": "experts/expert-246.gguf",
+ "sha256": "sha256:05847744920574b537e1d04e4b81a4519c16d3aec8d598f3d7aeeb57a3b4e064",
+ "expert_id": 246
+ },
+ {
+ "path": "experts/expert-247.gguf",
+ "sha256": "sha256:c7e45ace7ae5c6c65c2bd3f66e4c4b0cfd294c13812d6ee25823e69822f832a5",
+ "expert_id": 247
+ },
+ {
+ "path": "experts/expert-248.gguf",
+ "sha256": "sha256:89a1859713e96f37f3b3e9dc272fb2091ff16514727d9d135ec6d571e6abcf7a",
+ "expert_id": 248
+ },
+ {
+ "path": "experts/expert-249.gguf",
+ "sha256": "sha256:2acc01996220071e21e39ae0936f0f237d1a7b7393f275b00dd40cd0c56814cc",
+ "expert_id": 249
+ },
+ {
+ "path": "experts/expert-250.gguf",
+ "sha256": "sha256:c83a55990597af4c63caeaca907df5fdcf1a95c3b19934c2b9e0b8b38273002d",
+ "expert_id": 250
+ },
+ {
+ "path": "experts/expert-251.gguf",
+ "sha256": "sha256:fd411a199fb0f66264c770f288291321ce2d4e12784799d2cd201130029f4a22",
+ "expert_id": 251
+ },
+ {
+ "path": "experts/expert-252.gguf",
+ "sha256": "sha256:ad6ac9c730c379da4d0dccc888eac25d002e9d2ece8aae366611f752fe6b7286",
+ "expert_id": 252
+ },
+ {
+ "path": "experts/expert-253.gguf",
+ "sha256": "sha256:dd07ba6c2ffe784435dc75ec652f93a6ee36616c3294486f2345de679d689d91",
+ "expert_id": 253
+ },
+ {
+ "path": "experts/expert-254.gguf",
+ "sha256": "sha256:3008c59c58b0b7c829af28a75bd6231a88a6cb72ee16a56b3d9c6cdd4960f5a5",
+ "expert_id": 254
+ },
+ {
+ "path": "experts/expert-255.gguf",
+ "sha256": "sha256:6b6d49a10a6d8c10bde98daa8db96bc23fcd8f721055ba88d15593b1067409e1",
+ "expert_id": 255
+ }
+ ]
+ }
variants/Q4_K_XL/ranking.csv ADDED
@@ -0,0 +1,265 @@
+ # MoE expert ranking by gate mass
+ # Model: /Users/jdumay/.cache/huggingface/hub/models--unsloth--Qwen3.6-35B-A3B-GGUF/snapshots/9280dd353ab587157920d5bd391ada414d84e552/Qwen3.6-35B-A3B-UD-Q4_K_XL.gguf
+ # Experts: 256 (top-8)
+ # Prompts: 10 x 32 tokens
+ # Layers logged: all
+ # Total token-layer observations: 3304225
+ #
+ # Format: expert_id,gate_mass,mass_pct,selection_count
+ # Sorted by gate_mass descending (hottest first)
+ 0,12897.1,32.0625,3277738
+ 243,153.063,0.380518,1539
+ 89,150.49,0.37412,1298
+ 60,149.057,0.370558,1645
+ 224,146.013,0.362991,1489
+ 64,142.587,0.354475,1426
+ 95,140.235,0.348627,1550
+ 189,140.116,0.348331,1045
+ 229,136.135,0.338433,1522
+ 36,136.132,0.338425,1348
+ 125,134.277,0.333815,1138
+ 108,133.868,0.332798,1271
+ 254,133.33,0.33146,1197
+ 167,133.256,0.331275,1316
+ 43,132.86,0.330292,1652
+ 88,132.632,0.329726,1366
+ 137,132.245,0.328764,1186
+ 165,131.32,0.326463,1255
+ 160,129.608,0.322208,1217
+ 46,129.056,0.320835,1160
+ 35,128.984,0.320657,1510
+ 134,128.404,0.319215,1327
+ 65,127.282,0.316426,1397
+ 103,126.612,0.314761,1334
+ 191,126.241,0.313836,1438
+ 182,126.235,0.313821,1221
+ 72,125.683,0.31245,1384
+ 56,125.096,0.31099,1116
+ 204,125.043,0.31086,1170
+ 203,125.007,0.31077,1321
+ 158,125.007,0.310769,1112
+ 20,124.022,0.308322,1580
+ 248,123.246,0.306392,1154
+ 220,122.568,0.304705,1275
+ 61,122.235,0.303879,1274
+ 163,121.789,0.30277,1178
+ 221,121.181,0.301257,1050
+ 206,121.022,0.300863,1151
+ 105,120.96,0.300708,769
+ 201,120.941,0.30066,976
+ 47,120.627,0.29988,1263
+ 151,119.378,0.296776,1568
+ 127,119.272,0.296513,1033
+ 99,119.161,0.296236,1090
+ 208,119.019,0.295884,924
+ 219,118.918,0.295632,1047
+ 171,118.732,0.29517,1007
+ 111,118.732,0.29517,1115
+ 185,118.714,0.295126,1265
+ 251,118.705,0.295103,1041
+ 121,118.695,0.295078,978
+ 116,118.639,0.294939,1415
+ 87,118.281,0.294049,1160
+ 209,117.858,0.292998,1164
+ 217,117.032,0.290944,921
+ 107,116.513,0.289654,953
+ 42,116.391,0.289351,976
+ 239,115.628,0.287454,937
+ 2,115.512,0.287164,1055
+ 250,115.34,0.286737,1130
+ 130,115.325,0.286701,1245
+ 128,115.31,0.286663,981
+ 205,115.303,0.286645,1045
+ 28,115.267,0.286555,1104
+ 252,115.165,0.286301,907
+ 30,114.915,0.285681,798
+ 162,114.525,0.284711,986
+ 44,114.387,0.284369,1270
+ 138,114.295,0.284139,970
+ 32,114.198,0.283898,1098
+ 225,114.078,0.283601,793
+ 41,113.612,0.282441,683
+ 154,112.79,0.280397,950
+ 176,112.788,0.280392,967
+ 69,112.288,0.279151,791
+ 37,112.096,0.278674,897
+ 26,112.058,0.278579,1135
+ 86,111.989,0.278406,1013
+ 202,111.718,0.277733,1047
+ 63,111.708,0.277709,1091
+ 247,111.689,0.277661,872
+ 218,111.589,0.277412,1028
+ 112,111.27,0.276619,838
+ 169,111.268,0.276614,984
+ 13,111.201,0.276447,1305
+ 133,110.741,0.275304,951
+ 22,110.69,0.275178,1174
+ 91,110.66,0.275102,863
+ 173,110.387,0.274423,876
+ 39,110.049,0.273584,996
+ 124,110.037,0.273554,1042
+ 98,110.025,0.273525,902
+ 106,109.999,0.27346,642
+ 233,109.938,0.273307,1053
+ 70,109.569,0.27239,949
+ 166,109.373,0.271903,944
+ 180,109.137,0.271316,840
+ 51,108.977,0.27092,1057
+ 198,108.871,0.270656,685
+ 177,108.763,0.270386,663
+ 238,108.697,0.270223,1056
+ 196,108.531,0.269811,923
+ 92,108.279,0.269184,1172
+ 215,108.18,0.268938,785
+ 58,108.128,0.268809,1041
+ 79,108.052,0.268618,836
+ 15,108.027,0.268558,1011
+ 110,108.011,0.268517,972
+ 11,107.797,0.267986,938
+ 172,107.401,0.267001,914
+ 40,107.34,0.26685,837
+ 153,107.159,0.266398,890
+ 143,106.979,0.265951,764
+ 242,106.872,0.265685,932
+ 144,106.858,0.265651,813
+ 3,106.841,0.26561,1008
+ 75,106.839,0.265603,952
+ 54,106.475,0.264698,922
+ 132,106.44,0.264612,938
+ 17,106.359,0.26441,598
+ 199,106.254,0.264148,899
+ 192,105.588,0.262493,772
+ 59,105.534,0.262359,945
+ 1,105.412,0.262055,1279
+ 214,105.398,0.26202,682
+ 194,105.087,0.261248,779
+ 184,105.021,0.261084,527
+ 255,104.999,0.261029,811
+ 78,104.945,0.260895,758
+ 84,104.917,0.260825,826
+ 27,104.76,0.260434,882
+ 71,104.571,0.259965,930
+ 52,104.5,0.259789,736
+ 175,104.442,0.259645,679
+ 200,104.317,0.259333,744
+ 195,104.209,0.259065,822
+ 10,104.207,0.259059,729
+ 152,103.981,0.258498,724
+ 174,103.938,0.258391,704
+ 102,103.895,0.258284,628
+ 66,103.891,0.258274,877
+ 83,103.844,0.258158,784
+ 226,103.62,0.257602,628
+ 93,103.383,0.257011,744
+ 49,103.371,0.256983,551
+ 57,103.199,0.256553,824
+ 73,103.193,0.25654,702
+ 117,103.142,0.256413,747
+ 115,103.094,0.256293,760
+ 45,102.958,0.255955,740
+ 149,102.6,0.255066,681
+ 123,102.465,0.254729,744
+ 129,102.158,0.253967,793
+ 25,102.056,0.253713,636
+ 159,101.731,0.252906,611
+ 53,101.159,0.251482,852
+ 232,101.149,0.251457,689
+ 236,101.135,0.251424,809
+ 100,101.131,0.251413,513
+ 147,101.089,0.251309,909
+ 150,100.953,0.25097,570
+ 228,100.72,0.250392,571
+ 212,100.615,0.25013,708
+ 148,100.426,0.24966,851
+ 241,99.8775,0.248297,684
+ 155,99.8737,0.248288,461
+ 24,99.7311,0.247933,655
+ 170,99.6893,0.247829,777
+ 82,99.6649,0.247768,465
+ 68,99.5295,0.247432,579
+ 85,99.5103,0.247384,863
+ 4,99.4539,0.247244,754
+ 19,99.405,0.247123,761
+ 126,99.3764,0.247051,677
+ 16,99.3484,0.246982,690
+ 230,99.1009,0.246366,518
+ 14,98.9578,0.246011,790
+ 141,98.7835,0.245577,603
+ 164,98.717,0.245412,837
+ 77,98.7115,0.245398,595
+ 145,98.7103,0.245395,798
+ 23,98.71,0.245395,801
+ 12,98.1749,0.244064,710
+ 146,98.1143,0.243914,761
+ 213,98.0514,0.243757,595
+ 55,98.0126,0.243661,857
+ 31,97.9284,0.243452,503
+ 227,97.8974,0.243374,570
+ 48,97.7811,0.243085,616
+ 234,97.7319,0.242963,737
+ 237,97.6486,0.242756,584
+ 34,97.6462,0.24275,558
+ 38,97.6346,0.242721,709
+ 113,97.3721,0.242069,657
+ 140,97.2809,0.241842,445
+ 253,97.2683,0.241811,448
+ 80,97.091,0.24137,498
+ 97,97.083,0.24135,677
+ 33,96.7513,0.240525,715
+ 94,96.7201,0.240448,735
+ 231,96.4082,0.239672,580
+ 90,96.3466,0.239519,678
+ 9,96.336,0.239493,601
+ 156,96.1985,0.239151,661
+ 157,96.0247,0.238719,798
+ 120,95.8212,0.238213,823
+ 18,95.4528,0.237297,871
+ 6,95.2122,0.236699,571
+ 210,95.2075,0.236687,508
+ 183,95.1461,0.236535,641
+ 187,95.0521,0.236301,492
+ 104,95.0467,0.236288,647
+ 223,94.8627,0.23583,433
+ 178,94.7889,0.235647,549
+ 119,94.7123,0.235456,575
+ 142,94.4449,0.234792,669
+ 193,94.3387,0.234528,707
+ 76,94.3314,0.234509,634
+ 197,94.3222,0.234487,772
+ 122,94.2152,0.234221,799
+ 168,94.0656,0.233849,317
+ 207,94.0196,0.233734,631
+ 161,94.0048,0.233697,709
+ 101,93.9512,0.233564,639
+ 114,93.8643,0.233348,641
+ 244,93.7773,0.233132,485
+ 74,93.7731,0.233121,709
+ 188,93.155,0.231585,599
+ 7,92.5198,0.230006,613
+ 222,92.278,0.229405,727
+ 139,92.1443,0.229072,642
+ 136,92.0111,0.228741,417
+ 179,91.9323,0.228545,511
+ 246,91.8896,0.228439,537
+ 118,91.8888,0.228437,481
+ 81,91.7182,0.228013,610
+ 5,91.053,0.226359,633
+ 186,90.6356,0.225322,521
+ 50,90.5887,0.225205,756
+ 235,90.5667,0.22515,602
+ 29,90.5197,0.225033,527
+ 109,90.1993,0.224237,590
+ 8,90.0863,0.223956,437
+ 245,89.7939,0.223229,347
+ 67,88.9806,0.221207,465
+ 131,88.8688,0.220929,468
+ 181,88.1751,0.219205,583
+ 211,87.618,0.21782,566
+ 135,87.2305,0.216856,499
+ 62,86.7473,0.215655,386
+ 96,85.6512,0.21293,557
+ 21,84.3823,0.209776,551
+ 249,84.0732,0.209007,460
+ 216,83.6198,0.20788,426
+ 240,81.7215,0.203161,381
+ 190,81.3988,0.202359,461
variants/Q4_K_XL/run.log ADDED
@@ -0,0 +1,333 @@
+ $ /Users/jdumay/.codex/worktrees/dddc/mesh-llm/llama.cpp/build/bin/llama-moe-analyze -m /Users/jdumay/.cache/huggingface/hub/models--unsloth--Qwen3.6-35B-A3B-GGUF/snapshots/9280dd353ab587157920d5bd391ada414d84e552/Qwen3.6-35B-A3B-UD-Q4_K_XL.gguf --all-layers --export-ranking /Users/jdumay/.cache/mesh-llm/packages/unsloth/Qwen3.6-35B-A3B-GGUF/9280dd353ab587157920d5bd391ada414d84e552/variants/Q4_K_XL/ranking.csv -n 32 -c 4096 -ngl 0
+
+ [stdout]
+
+ [stderr]
+ ggml_metal_device_init: tensor API disabled for pre-M5 and pre-A19 devices
+ ggml_metal_library_init: using embedded metal library
+ ggml_metal_library_init: loaded in 0.013 sec
+ ggml_metal_rsets_init: creating a residency set collection (keep_alive = 180 s)
+ ggml_metal_device_init: GPU name: MTL0
+ ggml_metal_device_init: GPU family: MTLGPUFamilyApple7 (1007)
+ ggml_metal_device_init: GPU family: MTLGPUFamilyCommon3 (3003)
+ ggml_metal_device_init: GPU family: MTLGPUFamilyMetal4 (5002)
+ ggml_metal_device_init: simdgroup reduction = true
+ ggml_metal_device_init: simdgroup matrix mul. = true
+ ggml_metal_device_init: has unified memory = true
+ ggml_metal_device_init: has bfloat = true
+ ggml_metal_device_init: has tensor = false
+ ggml_metal_device_init: use residency sets = true
+ ggml_metal_device_init: use shared buffers = true
+ ggml_metal_device_init: recommendedMaxWorkingSetSize = 115448.73 MB
+ common_init_result: fitting params to device memory, for bugs during this step try to reproduce them with -fit off, or provide --verbose logs if the bug only occurs with -fit on
+ llama_params_fit_impl: projected to use 16 MiB of device memory vs. 110100 MiB of free device memory
+ llama_params_fit_impl: will leave 110084 >= 1024 MiB of free device memory, no changes needed
+ llama_params_fit: successfully fit params to free device memory
+ llama_params_fit: fitting params to free memory took 0.41 seconds
+ llama_model_load_from_file_impl: using device MTL0 (Apple M1 Ultra) (unknown id) - 110100 MiB free
+ llama_model_loader: loaded meta data with 54 key-value pairs and 733 tensors from /Users/jdumay/.cache/huggingface/hub/models--unsloth--Qwen3.6-35B-A3B-GGUF/snapshots/9280dd353ab587157920d5bd391ada414d84e552/Qwen3.6-35B-A3B-UD-Q4_K_XL.gguf (version GGUF V3 (latest))
+ llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
+ llama_model_loader: - kv 0: general.architecture str = qwen35moe
+ llama_model_loader: - kv 1: general.type str = model
+ llama_model_loader: - kv 2: general.sampling.top_k i32 = 20
+ llama_model_loader: - kv 3: general.sampling.top_p f32 = 0.950000
+ llama_model_loader: - kv 4: general.sampling.temp f32 = 1.000000
+ llama_model_loader: - kv 5: general.name str = Qwen3.6-35B-A3B
+ llama_model_loader: - kv 6: general.basename str = Qwen3.6-35B-A3B
+ llama_model_loader: - kv 7: general.quantized_by str = Unsloth
+ llama_model_loader: - kv 8: general.size_label str = 35B-A3B
+ llama_model_loader: - kv 9: general.license str = apache-2.0
+ llama_model_loader: - kv 10: general.license.link str = https://huggingface.co/Qwen/Qwen3.6-3...
+ llama_model_loader: - kv 11: general.repo_url str = https://huggingface.co/unsloth
+ llama_model_loader: - kv 12: general.base_model.count u32 = 1
+ llama_model_loader: - kv 13: general.base_model.0.name str = Qwen3.6 35B A3B
+ llama_model_loader: - kv 14: general.base_model.0.organization str = Qwen
+ llama_model_loader: - kv 15: general.base_model.0.repo_url str = https://huggingface.co/Qwen/Qwen3.6-3...
+ llama_model_loader: - kv 16: general.tags arr[str,3] = ["qwen3_5_moe", "qwen", "image-text-t...
+ llama_model_loader: - kv 17: qwen35moe.block_count u32 = 40
+ llama_model_loader: - kv 18: qwen35moe.context_length u32 = 262144
+ llama_model_loader: - kv 19: qwen35moe.embedding_length u32 = 2048
+ llama_model_loader: - kv 20: qwen35moe.attention.head_count u32 = 16
+ llama_model_loader: - kv 21: qwen35moe.attention.head_count_kv u32 = 2
+ llama_model_loader: - kv 22: qwen35moe.rope.dimension_sections arr[i32,4] = [11, 11, 10, 0]
+ llama_model_loader: - kv 23: qwen35moe.rope.freq_base f32 = 10000000.000000
+ llama_model_loader: - kv 24: qwen35moe.attention.layer_norm_rms_epsilon f32 = 0.000001
+ llama_model_loader: - kv 25: qwen35moe.expert_count u32 = 256
+ llama_model_loader: - kv 26: qwen35moe.expert_used_count u32 = 8
+ llama_model_loader: - kv 27: qwen35moe.attention.key_length u32 = 256
+ llama_model_loader: - kv 28: qwen35moe.attention.value_length u32 = 256
+ llama_model_loader: - kv 29: qwen35moe.expert_feed_forward_length u32 = 512
+ llama_model_loader: - kv 30: qwen35moe.expert_shared_feed_forward_length u32 = 512
+ llama_model_loader: - kv 31: qwen35moe.ssm.conv_kernel u32 = 4
+ llama_model_loader: - kv 32: qwen35moe.ssm.state_size u32 = 128
+ llama_model_loader: - kv 33: qwen35moe.ssm.group_count u32 = 16
+ llama_model_loader: - kv 34: qwen35moe.ssm.time_step_rank u32 = 32
+ llama_model_loader: - kv 35: qwen35moe.ssm.inner_size u32 = 4096
+ llama_model_loader: - kv 36: qwen35moe.full_attention_interval u32 = 4
+ llama_model_loader: - kv 37: qwen35moe.rope.dimension_count u32 = 64
+ llama_model_loader: - kv 38: tokenizer.ggml.model str = gpt2
+ llama_model_loader: - kv 39: tokenizer.ggml.pre str = qwen35
+ llama_model_loader: - kv 40: tokenizer.ggml.tokens arr[str,248320] = ["!", "\"", "#", "$", "%", "&", "'", ...
+ llama_model_loader: - kv 41: tokenizer.ggml.token_type arr[i32,248320] = [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, ...
+ llama_model_loader: - kv 42: tokenizer.ggml.merges arr[str,247587] = ["Ġ Ġ", "ĠĠ ĠĠ", "i n", "Ġ t",...
+ llama_model_loader: - kv 43: tokenizer.ggml.eos_token_id u32 = 248046
+ llama_model_loader: - kv 44: tokenizer.ggml.padding_token_id u32 = 248055
+ llama_model_loader: - kv 45: tokenizer.ggml.bos_token_id u32 = 248044
+ llama_model_loader: - kv 46: tokenizer.ggml.add_bos_token bool = false
+ llama_model_loader: - kv 47: tokenizer.chat_template str = {%- set image_count = namespace(value...
+ llama_model_loader: - kv 48: general.quantization_version u32 = 2
+ llama_model_loader: - kv 49: general.file_type u32 = 15
+ llama_model_loader: - kv 50: quantize.imatrix.file str = Qwen3.6-35B-A3B-GGUF/imatrix_unsloth....
+ llama_model_loader: - kv 51: quantize.imatrix.dataset str = unsloth_calibration_Qwen3.6-35B-A3B.txt
+ llama_model_loader: - kv 52: quantize.imatrix.entries_count u32 = 510
+ llama_model_loader: - kv 53: quantize.imatrix.chunks_count u32 = 76
+ llama_model_loader: - type f32: 361 tensors
+ llama_model_loader: - type q8_0: 252 tensors
+ llama_model_loader: - type q4_K: 78 tensors
+ llama_model_loader: - type q5_K: 38 tensors
+ llama_model_loader: - type q6_K: 4 tensors
+ print_info: file format = GGUF V3 (latest)
+ print_info: file type = Q4_K - Medium
+ print_info: file size = 20.81 GiB (5.16 BPW)
+ load: 0 unused tokens
+ load: printing all EOG tokens:
+ load: - 248044 ('<|endoftext|>')
+ load: - 248046 ('<|im_end|>')
+ load: - 248063 ('<|fim_pad|>')
+ load: - 248064 ('<|repo_name|>')
+ load: - 248065 ('<|file_sep|>')
+ load: special tokens cache size = 33
+ load: token to piece cache size = 1.7581 MB
+ print_info: arch = qwen35moe
+ print_info: vocab_only = 0
+ print_info: no_alloc = 0
+ print_info: n_ctx_train = 262144
+ print_info: n_embd = 2048
+ print_info: n_embd_inp = 2048
+ print_info: n_layer = 40
+ print_info: n_head = 16
+ print_info: n_head_kv = 2
+ print_info: n_rot = 64
+ print_info: n_swa = 0
+ print_info: is_swa_any = 0
+ print_info: n_embd_head_k = 256
+ print_info: n_embd_head_v = 256
+ print_info: n_gqa = 8
+ print_info: n_embd_k_gqa = 512
+ print_info: n_embd_v_gqa = 512
+ print_info: f_norm_eps = 0.0e+00
+ print_info: f_norm_rms_eps = 1.0e-06
+ print_info: f_clamp_kqv = 0.0e+00
+ print_info: f_max_alibi_bias = 0.0e+00
+ print_info: f_logit_scale = 0.0e+00
+ print_info: f_attn_scale = 0.0e+00
+ print_info: n_ff = 0
+ print_info: n_expert = 256
+ print_info: n_expert_used = 8
+ print_info: n_expert_groups = 0
+ print_info: n_group_used = 0
+ print_info: causal attn = 1
+ print_info: pooling type = -1
+ print_info: rope type = 40
+ print_info: rope scaling = linear
+ print_info: freq_base_train = 10000000.0
+ print_info: freq_scale_train = 1
+ print_info: n_ctx_orig_yarn = 262144
+ print_info: rope_yarn_log_mul = 0.0000
+ print_info: rope_finetuned = unknown
+ print_info: mrope sections = [11, 11, 10, 0]
+ print_info: ssm_d_conv = 4
+ print_info: ssm_d_inner = 4096
+ print_info: ssm_d_state = 128
+ print_info: ssm_dt_rank = 32
+ print_info: ssm_n_group = 16
+ print_info: ssm_dt_b_c_rms = 0
+ print_info: model type = 35B.A3B
+ print_info: model params = 34.66 B
+ print_info: general.name = Qwen3.6-35B-A3B
+ print_info: vocab type = BPE
+ print_info: n_vocab = 248320
+ print_info: n_merges = 247587
+ print_info: BOS token = 248044 '<|endoftext|>'
+ print_info: EOS token = 248046 '<|im_end|>'
+ print_info: EOT token = 248046 '<|im_end|>'
+ print_info: PAD token = 248055 '<|vision_pad|>'
+ print_info: LF token = 198 'Ċ'
+ print_info: FIM PRE token = 248060 '<|fim_prefix|>'
+ print_info: FIM SUF token = 248062 '<|fim_suffix|>'
+ print_info: FIM MID token = 248061 '<|fim_middle|>'
+ print_info: FIM PAD token = 248063 '<|fim_pad|>'
+ print_info: FIM REP token = 248064 '<|repo_name|>'
+ print_info: FIM SEP token = 248065 '<|file_sep|>'
+ print_info: EOG token = 248044 '<|endoftext|>'
+ print_info: EOG token = 248046 '<|im_end|>'
+ print_info: EOG token = 248063 '<|fim_pad|>'
+ print_info: EOG token = 248064 '<|repo_name|>'
+ print_info: EOG token = 248065 '<|file_sep|>'
+ print_info: max token length = 256
+ load_tensors: loading model tensors, this can take a while... (mmap = true, direct_io = false)
+ load_tensors: offloading 0 repeating layers to GPU
+ load_tensors: offloaded 0/41 layers to GPU
+ load_tensors: CPU_Mapped model buffer size = 20798.80 MiB
+ load_tensors: CPU_REPACK model buffer size = 20699.06 MiB
+ .................................................................................................
+ common_init_result: added <|endoftext|> logit bias = -inf
+ common_init_result: added <|im_end|> logit bias = -inf
+ common_init_result: added <|fim_pad|> logit bias = -inf
+ common_init_result: added <|repo_name|> logit bias = -inf
+ common_init_result: added <|file_sep|> logit bias = -inf
+ llama_context: constructing llama_context
+ llama_context: n_seq_max = 1
+ llama_context: n_ctx = 4096
+ llama_context: n_ctx_seq = 4096
+ llama_context: n_batch = 2048
+ llama_context: n_ubatch = 512
+ llama_context: causal_attn = 1
+ llama_context: flash_attn = auto
+ llama_context: kv_unified = false
+ llama_context: freq_base = 10000000.0
+ llama_context: freq_scale = 1
+ llama_context: n_ctx_seq (4096) < n_ctx_train (262144) -- the full capacity of the model will not be utilized
+ ggml_metal_init: allocating
+ ggml_metal_init: found device: Apple M1 Ultra
+ ggml_metal_init: picking default device: Apple M1 Ultra
+ ggml_metal_init: use fusion = true
+ ggml_metal_init: use concurrency = true
+ ggml_metal_init: use graph optimize = true
+ llama_context: CPU output buffer size = 0.95 MiB
+ llama_kv_cache: CPU KV buffer size = 80.00 MiB
+ llama_kv_cache: size = 80.00 MiB ( 4096 cells, 10 layers, 1/1 seqs), K (f16): 40.00 MiB, V (f16): 40.00 MiB
+ llama_kv_cache: attn_rot_k = 0, n_embd_head_k_all = 256
+ llama_kv_cache: attn_rot_v = 0, n_embd_head_k_all = 256
+ llama_memory_recurrent: CPU RS buffer size = 62.81 MiB
+ llama_memory_recurrent: size = 62.81 MiB ( 1 cells, 40 layers, 1 seqs), R (f32): 2.81 MiB, S (f32): 60.00 MiB
+ sched_reserve: reserving ...
+ sched_reserve: Flash Attention was auto, set to enabled
+ sched_reserve: resolving fused Gated Delta Net support:
+ sched_reserve: fused Gated Delta Net (autoregressive) enabled
+ sched_reserve: fused Gated Delta Net (chunked) enabled
+ sched_reserve: MTL0 compute buffer size = 16.01 MiB
+ sched_reserve: CPU compute buffer size = 493.00 MiB
+ sched_reserve: graph nodes = 3729
+ sched_reserve: graph splits = 281 (with bs=512), 1 (with bs=1)
+ sched_reserve: reserve took 8.30 ms, sched copies = 1
+
+ === MoE Expert Routing Analysis ===
+ Model experts: 256, used per token: 8
+ Logging ALL MoE layers (--all-layers)
+ Will export expert ranking to: /Users/jdumay/.cache/mesh-llm/packages/unsloth/Qwen3.6-35B-A3B-GGUF/9280dd353ab587157920d5bd391ada414d84e552/variants/Q4_K_XL/ranking.csv
+ Running 10 prompts, generating 32 tokens each
+ Logging first 9999 MoE layers per eval
+
+ Prompt 1/10: <|im_start|>user
+ Write a Python function to find the nth Fib...
+ collected 2640 layer snapshots (total: 2640)
+ Prompt 2/10: <|im_start|>user
+ Write a Rust function that reads a CSV file...
+ collected 2640 layer snapshots (total: 5280)
+ Prompt 3/10: <|im_start|>user
+ Explain how a B-tree index works in a datab...
+ collected 2640 layer snapshots (total: 7920)
+ Prompt 4/10: <|im_start|>user
+ If all roses are flowers and some flowers f...
+ collected 2640 layer snapshots (total: 10560)
+ Prompt 5/10: <|im_start|>user
+ A train travels 120km in 2 hours. It then s...
+ collected 2640 layer snapshots (total: 13200)
+ Prompt 6/10: <|im_start|>user
+ Hello! What's the best way to learn a new l...
+ collected 2640 layer snapshots (total: 15840)
+ Prompt 7/10: <|im_start|>user
+ Tell me a joke about programmers.<|im_end|>...
+ collected 1840 layer snapshots (total: 17680)
+ Prompt 8/10: <|im_start|>user
+ Summarize the key differences between TCP a...
+ collected 2640 layer snapshots (total: 20320)
+ Prompt 9/10: <|im_start|>user
+ Translate 'The weather is beautiful today' ...
+ collected 2640 layer snapshots (total: 22960)
+ Prompt 10/10: <|im_start|>user
+ List 5 healthy breakfast options with brief...
+ collected 2640 layer snapshots (total: 25600)
+
+ === Expert Popularity (gate mass, summed across all tokens & logged layers) ===
+ Total tokens × layers: 3304225
+
+ Top 20 experts by gate mass:
+ Expert Mass Mass% Selected
+ 0 12897.1294 32.06 3277738
+ 243 153.0632 0.38 1539
+ 89 150.4899 0.37 1298
+ 60 149.0570 0.37 1645
+ 224 146.0130 0.36 1489
+ 64 142.5874 0.35 1426
+ 95 140.2351 0.35 1550
+ 189 140.1161 0.35 1045
+ 229 136.1348 0.34 1522
+ 36 136.1316 0.34 1348
+ 125 134.2771 0.33 1138
+ 108 133.8681 0.33 1271
+ 254 133.3297 0.33 1197
+ 167 133.2555 0.33 1316
+ 43 132.8598 0.33 1652
+ 88 132.6322 0.33 1366
+ 137 132.2451 0.33 1186
+ 165 131.3198 0.33 1255
+ 160 129.6081 0.32 1217
+ 46 129.0560 0.32 1160
+
+ Concentration:
+ Top 4 experts: 33.2% of total gate mass
+ Top 8 experts: 34.6% of total gate mass
+ Top 16 experts: 37.3% of total gate mass
+ Top 32 experts: 42.3% of total gate mass
+ Top 64 experts: 51.8% of total gate mass
+
+ Exported expert ranking to: /Users/jdumay/.cache/mesh-llm/packages/unsloth/Qwen3.6-35B-A3B-GGUF/9280dd353ab587157920d5bd391ada414d84e552/variants/Q4_K_XL/ranking.csv
+ Use with moe-split: --group-map <file generated from this ranking>
+
+
+ === Group Masking Analysis (best-group capture ratio) ===
+ For each group count, what fraction of the unrestricted top-8 mass
+ is captured by the best single group?
+
+ Groups Replicas Exp/Grp Mean P25 P50 P5
+ 2 0 128 0.999 1.000 1.000 1.000
+ 2 1 128 0.999 1.000 1.000 1.000
+ 2 2 128 0.999 1.000 1.000 1.000
+ 2 4 128 0.999 1.000 1.000 1.000
+ 2 8 128 0.999 1.000 1.000 1.000
+
+ 4 0 64 0.997 1.000 1.000 1.000
+ 4 1 64 0.997 1.000 1.000 1.000
+ 4 2 64 0.998 1.000 1.000 1.000
+ 4 4 64 0.998 1.000 1.000 1.000
+ 4 8 64 0.998 1.000 1.000 1.000
+
+ 8 0 32 0.997 1.000 1.000 1.000
+ 8 1 32 0.997 1.000 1.000 1.000
+ 8 2 32 0.997 1.000 1.000 1.000
+ 8 4 32 0.997 1.000 1.000 1.000
+ 8 8 32 0.997 1.000 1.000 1.000
+
+ === Interpretation ===
+ Mean close to 1.0 = masking barely hurts (best group captures most of top-k mass)
+ Mean < 0.7 = significant quality risk from group restriction
+ P5 close to 1.0 = even worst-case tokens are OK
+ P5 < 0.5 = some tokens will be badly served by any single group
+
+ === Phase 1b: Masked Generation Quality (logprob comparison) ===
+ Testing 4 groups (64 experts/group) vs baseline (all 256 experts)
+ Using first 5 prompts, generating 32 tokens each
+
+ Group 0 (experts 0-63 + 2 hot replicas): avg logprob delta = -0.3751
+ Group 1 (experts 64-127 + 2 hot replicas): avg logprob delta = -0.1213
+ Group 2 (experts 128-191 + 2 hot replicas): avg logprob delta = -0.1904
+ Group 3 (experts 192-255 + 2 hot replicas): avg logprob delta = -0.3067
+
+ === Interpretation (logprob delta) ===
+ Delta near 0.0 = masking barely affects generation quality
+ Delta < -0.1 = noticeable quality loss
+ Delta < -0.5 = significant degradation
+
+ ggml_metal_free: deallocating
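
The `ranking.csv` added above documents its own schema in its header comments: `#`-prefixed comment lines followed by `expert_id,gate_mass,mass_pct,selection_count` rows sorted hottest-first. For consumers of this package, a minimal reader sketch might look like the following (the `parse_ranking` helper is illustrative only and is not part of the mesh-llm toolchain):

```python
def parse_ranking(text):
    """Parse an expert ranking CSV as exported by llama-moe-analyze.

    Lines starting with '#' are header comments; each data row is
    expert_id,gate_mass,mass_pct,selection_count, sorted by gate_mass
    descending (hottest expert first).
    """
    rows = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip comments and blank lines
        expert_id, gate_mass, mass_pct, count = line.split(",")
        rows.append({
            "expert_id": int(expert_id),
            "gate_mass": float(gate_mass),
            "mass_pct": float(mass_pct),
            "selection_count": int(count),
        })
    return rows

# Tiny sample using the first rows from the ranking above.
sample = """\
# Format: expert_id,gate_mass,mass_pct,selection_count
0,12897.1,32.0625,3277738
243,153.063,0.380518,1539
"""
hot = parse_ranking(sample)
print(hot[0]["expert_id"], hot[0]["selection_count"])  # hottest expert first
```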