Instructions to use diffunity/model_for_matt_2 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use diffunity/model_for_matt_2 with Transformers:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("diffunity/model_for_matt_2", dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
Update adapter_config.json
#1
by mtp0326 - opened
- adapter_config.json +1 -1

adapter_config.json CHANGED

```diff
@@ -29,7 +29,7 @@
     "query",
     "value"
   ],
-  "task_type": "
+  "task_type": "QUESTION_ANS",
   "use_dora": false,
   "use_rslora": false
 }
```
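With the change above, `task_type` carries a complete value again. A minimal sketch of sanity-checking the edited fragment with Python's standard `json` module — the surrounding keys shown here (`target_modules` and the rest) are assumptions based on the lines visible in this diff; the real adapter_config.json contains more fields:

```python
import json

# Hypothetical fragment mirroring only the keys visible in this diff;
# the actual adapter_config.json for diffunity/model_for_matt_2 has more.
fragment = """
{
  "target_modules": ["query", "value"],
  "task_type": "QUESTION_ANS",
  "use_dora": false,
  "use_rslora": false
}
"""

# json.loads raises json.JSONDecodeError if the file is malformed,
# e.g. if task_type were left as an unterminated string.
config = json.loads(fragment)
print(config["task_type"])  # QUESTION_ANS
```

A check like this catches a broken config before PEFT tries to load the adapter at runtime.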