pharaouk committed on
Commit
633bb65
1 Parent(s): 2fbb425

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -9,7 +9,7 @@ license_link: LICENSE
 (THIS IS MICROSOFT'S ORIGINAL MODEL, UPLOADED HERE ONLY FOR RESEARCH PURPOSES AND ACCESSIBILITY AS THE AI AZURE STUDIO IS NOT CONVENIENT FOR RESEARCH. RESEARCH ONLY. RESEARCH. RESEARCH, PLEASE DONT SUE US MSFT, THIS IS 100% FOR RESEARCH.)


-**Here is Microsoft's official Phi-2 repo:** https://huggingface.co/microsoft/phi-2
+**Here is Microsoft's official Phi-2 repo (float16):** https://huggingface.co/microsoft/phi-2

 The phi-2 is a language model with 2.7 billion parameters. The phi-2 model was trained using the same data sources as phi-1, augmented with a new data source that consists of various NLP synthetic texts and filtered websites (for safety and educational value). When assessed against benchmarks testing common sense, language understanding, and logical reasoning, the phi-2 showcased a nearly state-of-the-art performance among models with less than 10 billion parameters.

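Since the changed line notes that the official repo ships float16 weights, here is a minimal sketch of loading Phi-2 at that precision with the Hugging Face `transformers` library. The prompt format and generation settings are illustrative assumptions, not part of this commit; it assumes a CUDA device is available.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative: load Phi-2 with float16 weights, the precision noted
# in the updated README line. Assumes a CUDA-capable GPU; on CPU,
# drop the .to("cuda") calls and expect float32 by default.
model_id = "microsoft/phi-2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
).to("cuda")

# Hypothetical prompt, purely for demonstration.
prompt = "Instruct: Explain what a language model is.\nOutput:"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```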