Update README.md
README.md CHANGED

@@ -27,7 +27,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 # distilbert-base-uncased-finetuned-clinc
 
-This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the clinc_oos dataset.
+This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the clinc_oos dataset. The model is used in Chapter 8: Making Transformers Efficient in Production of the [NLP with Transformers book](https://learning.oreilly.com/library/view/natural-language-processing/9781098103231/). You can find the full code in the accompanying [GitHub repository](https://github.com/nlp-with-transformers/notebooks/blob/main/08_model-compression.ipynb).
+
 It achieves the following results on the evaluation set:
 - Loss: 0.7773
 - Accuracy: 0.9174
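For readers who want to try the checkpoint directly, below is a minimal usage sketch with the 🤗 Transformers `pipeline` API. The Hub repository ID `transformersbook/distilbert-base-uncased-finetuned-clinc` and the sample query are illustrative assumptions; substitute this model's actual Hub ID.

```python
from transformers import pipeline

# Assumed Hub ID for illustration -- replace with this repository's actual model ID.
model_id = "transformersbook/distilbert-base-uncased-finetuned-clinc"

# The model is a sequence classifier over the clinc_oos intents
# (150 in-scope intents plus an out-of-scope class).
intent_classifier = pipeline("text-classification", model=model_id)

query = "Hey, I'd like to rent a vehicle from Nov 1st to Nov 15th in Paris."
print(intent_classifier(query))
# Output shape: [{'label': '<intent name>', 'score': <float>}]
```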