ClueAI xuehaha committed
Commit 6e6af1f
1 Parent(s): ac50e58

Update README.md (#3)


- Update README.md (3b4a27f8c426bd22c36e73cc32ecbd250ee1fc89)


Co-authored-by: XueHang <[email protected]>

Files changed (1):
  1. README.md (+4 -3)
README.md CHANGED

@@ -52,6 +52,7 @@ widget:
 问题:小米的创始人是谁?
 答案:
 library_name: paddlenlp
+pipeline_tag: text2text-generation
 ---
 
 <a href="https://colab.research.google.com/drive/1hlSMYEq3pyX-fwTSqIOT1um80kU1yOJF?usp=sharing"><img src="https://colab.research.google.com/assets/colab-badge.svg"></a>
@@ -72,14 +73,14 @@ PromptCLUE:全中文任务零样本学习模型
 # 加载模型
 from paddlenlp.transformers import AutoTokenizer, T5ForConditionalGeneration
 
-tokenizer = AutoTokenizer.from_pretrained("ClueAI/PromptCLUE-base-paddle", from_hf_hub=True)
-model = T5ForConditionalGeneration.from_pretrained("ClueAI/PromptCLUE-base-paddle", from_hf_hub=True)
+tokenizer = AutoTokenizer.from_pretrained("ClueAI/PromptCLUE-base", from_hf_hub=False)
+model = T5ForConditionalGeneration.from_pretrained("ClueAI/PromptCLUE-base", from_hf_hub=False)
 ```
 
 使用模型进行预测推理方法:
 ```python
 import torch
-#这里使用的是paddle的cpu版本,使用paddle的gpu版本推理会更快
+#这里使用的是paddle的gpu版本,推理更快
 def preprocess(text):
     return text.replace("\n", "_")
 
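For context: after this change the README loads the weights through PaddleNLP's own model hub (`from_hf_hub=False` with the name "ClueAI/PromptCLUE-base") instead of pulling "ClueAI/PromptCLUE-base-paddle" from the Hugging Face Hub, and the updated comment notes that the GPU build of Paddle is used, which makes inference faster. A minimal sketch of the loading step as it reads after this commit; the `paddle.set_device` line is an illustrative addition and not part of the README:

```python
import paddle
from paddlenlp.transformers import AutoTokenizer, T5ForConditionalGeneration

# Illustrative: use the GPU if this Paddle build has CUDA, otherwise stay on CPU.
paddle.set_device("gpu" if paddle.is_compiled_with_cuda() else "cpu")

# from_hf_hub=False makes PaddleNLP resolve the name through its own model hub
# rather than downloading from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("ClueAI/PromptCLUE-base", from_hf_hub=False)
model = T5ForConditionalGeneration.from_pretrained("ClueAI/PromptCLUE-base", from_hf_hub=False)
model.eval()
```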
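The second hunk ends where the README's inference helper begins, so only `preprocess` is visible here. A hedged sketch of how such a helper could be wired up with PaddleNLP's `generate()` API, assuming the tokenizer and model loaded above; the `postprocess` and `answer` names and the generation parameters are illustrative assumptions, not taken from the README:

```python
def preprocess(text):
    # The step shown in the diff: flatten newlines to "_" before encoding.
    return text.replace("\n", "_")

def postprocess(text):
    # Assumed inverse of preprocess: restore newlines in the generated text.
    return text.replace("_", "\n")

def answer(text, max_length=128):
    # Assumed end-to-end helper: encode, generate with beam search, decode.
    inputs = tokenizer([preprocess(text)], return_tensors="pd")
    output_ids, _ = model.generate(input_ids=inputs["input_ids"],
                                   max_length=max_length,
                                   decode_strategy="beam_search",
                                   num_beams=4)
    return postprocess(tokenizer.decode(output_ids[0].tolist(),
                                        skip_special_tokens=True))

print(answer("问题:小米的创始人是谁?\n答案:"))
```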