Correct technical report link
README.md
CHANGED
```diff
@@ -41,7 +41,7 @@ This approach progressively trains from input embeddings to full parameters, eff
 Our method enhances the model's cross-linguistic applicability by carefully integrating new linguistic tokens, focusing on causal language modeling pre-training.
 We leverage the inherent capabilities of foundational models trained on English to efficiently transfer knowledge and reasoning to Korean, optimizing the adaptation process.
 
-For
+For more details, please refer to our technical report: [Efficient and Effective Vocabulary Expansion Towards Multilingual Large Language Models](https://arxiv.org/abs/2402.14714).
 
 Here’s an simplified code for our key approach:
 
```
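The context lines above mention integrating new linguistic (e.g. Korean) tokens into a model pretrained on English. As a rough, hypothetical illustration only (not the repository's actual code), one common heuristic for vocabulary expansion is to append new embedding rows initialized to the mean of the existing ones; the function and initialization below are assumptions for the sketch:

```python
# Hedged sketch of vocabulary expansion via mean-initialized embeddings.
# The embedding "matrix" is a plain list of vectors here; a real
# implementation would resize the model's embedding tensor instead.

def expand_vocabulary(embeddings, num_new_tokens):
    """Append rows for new tokens, each initialized to the mean of the
    existing embeddings -- a common heuristic; the paper's exact
    initialization may differ."""
    dim = len(embeddings[0])
    mean_vec = [
        sum(row[d] for row in embeddings) / len(embeddings)
        for d in range(dim)
    ]
    return embeddings + [list(mean_vec) for _ in range(num_new_tokens)]

# Toy usage: two existing 2-d token embeddings, two new tokens added.
base = [[1.0, 3.0], [3.0, 1.0]]
expanded = expand_vocabulary(base, 2)
# expanded has 4 rows; both new rows equal [2.0, 2.0]
```

Mean initialization keeps the new rows inside the distribution of the old ones, so the softmax over the expanded vocabulary starts out roughly uniform over the new tokens rather than assigning them extreme logits.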