---
license: apache-2.0
---

# SSCI-BERT: A pre-trained language model for social science text

## Introduction

Research on social science texts needs the support of natural language processing tools.

Pre-trained language models have greatly improved the accuracy of text mining on general-domain text. At present, there is an urgent need for a pre-trained language model dedicated to the automatic processing of scientific texts in the social sciences.

We used the abstracts of social science research articles as the training corpus. Based on the BERT deep language model framework, we built the [SSCI-BERT and SSCI-SciBERT](https://github.com/S-T-Full-Text-Knowledge-Mining/SSCI-BERT) pre-trained language models with [transformers/run_mlm.py](https://github.com/huggingface/transformers/blob/main/examples/pytorch/language-modeling/run_mlm.py).

We designed four downstream text classification tasks on different social science article corpora to verify the performance of the models.

- SSCI-BERT and SSCI-SciBERT are trained on the abstracts of articles published in SSCI journals from 1986 to 2021. The training set used in the experiments contains a total of `503,910,614 words`.
- Following the idea of domain-adaptive pretraining, `SSCI-BERT` and `SSCI-SciBERT` continue pretraining BERT and SciBERT, respectively, on this large collection of scientific-article abstracts, yielding pre-trained models for the automatic processing of social science research texts. A minimal pretraining sketch is shown after this list.
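For readers who want to reproduce this kind of domain-adaptive pretraining, the snippet below is only an illustrative sketch of continued masked-language-model pretraining with the Hugging Face `Trainer` API. It mirrors in spirit what `run_mlm.py` does, but it is not the authors' exact configuration: the corpus file name, sequence length, batch size, and epoch count are assumptions.

```python
# Minimal sketch of domain-adaptive MLM pretraining (not the authors' exact setup).
# "ssci_abstracts.txt" (one abstract per line) and all hyperparameters are placeholders.
from datasets import load_dataset
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base = "bert-base-cased"  # base checkpoint to continue pretraining from
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForMaskedLM.from_pretrained(base)

# Load the abstract corpus and tokenize it.
dataset = load_dataset("text", data_files={"train": "ssci_abstracts.txt"})
tokenized = dataset["train"].map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True, remove_columns=["text"])

# 15% random masking, the standard BERT masked-language-model objective.
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="ssci-bert", num_train_epochs=2,
                           per_device_train_batch_size=16),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
```

Replacing `base` with `allenai/scibert_scivocab_cased` would produce the SSCI-SciBERT variant in the same way.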

## News

- 2022-03-24: SSCI-BERT and SSCI-SciBERT were released for the first time.

## How to use

### Huggingface Transformers

The `from_pretrained` method of [Huggingface Transformers](https://github.com/huggingface/transformers) can load the SSCI-BERT and SSCI-SciBERT models directly from the Hub.

- SSCI-BERT

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("KM4STfulltext/SSCI-BERT-e2")
model = AutoModel.from_pretrained("KM4STfulltext/SSCI-BERT-e2")
```

- SSCI-SciBERT

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("KM4STfulltext/SSCI-SciBERT-e2")
model = AutoModel.from_pretrained("KM4STfulltext/SSCI-SciBERT-e2")
```
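Once loaded, the checkpoints behave like any other BERT encoder. As a quick, illustrative usage example (not part of the original documentation; the sample sentence is invented), the final hidden state of the `[CLS]` token can serve as a simple sentence-level representation of an abstract:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("KM4STfulltext/SSCI-SciBERT-e2")
model = AutoModel.from_pretrained("KM4STfulltext/SSCI-SciBERT-e2")

# Invented example sentence; any social science abstract would work.
abstract = "This study examines the effect of social capital on educational attainment."
inputs = tokenizer(abstract, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    outputs = model(**inputs)

# Take the [CLS] token's final hidden state as a sentence embedding.
cls_embedding = outputs.last_hidden_state[:, 0, :]  # shape: (1, hidden_size)
print(cls_embedding.shape)
```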

### Download Models

- The models we provide are in `PyTorch` format.

### From Huggingface

- Download directly from the Hugging Face Hub:
  - [KM4STfulltext/SSCI-BERT-e2](https://huggingface.co/KM4STfulltext/SSCI-BERT-e2)
  - [KM4STfulltext/SSCI-SciBERT-e2](https://huggingface.co/KM4STfulltext/SSCI-SciBERT-e2)
  - [KM4STfulltext/SSCI-BERT-e4](https://huggingface.co/KM4STfulltext/SSCI-BERT-e4)
  - [KM4STfulltext/SSCI-SciBERT-e4](https://huggingface.co/KM4STfulltext/SSCI-SciBERT-e4)

### From Google Drive

We have also put the models on Google Drive:

| Model | Dataset (years) | Base Model |
| ------------------------------------------------------------ | --------------- | ---------------------- |
| [SSCI-BERT-e2](https://drive.google.com/drive/folders/1xEDnovlwGO2JxqCaf3rdjS2cB6DOxhj4?usp=sharing) | 1986-2021 | Bert-base-cased |
| [SSCI-SciBERT-e2](https://drive.google.com/drive/folders/16DtIvnHvbrR_92MwgthRRsULW6An9te1?usp=sharing) (recommended) | 1986-2021 | Scibert-scivocab-cased |
| [SSCI-BERT-e4](https://drive.google.com/drive/folders/1sr6Av8p904Jrjps37g7E8aj4HnAHXSxW?usp=sharing) | 1986-2021 | Bert-base-cased |
| [SSCI-SciBERT-e4](https://drive.google.com/drive/folders/1ty-b4TIFu8FbilgC4VcI7Bgn_O5MDMVe?usp=sharing) | 1986-2021 | Scibert-scivocab-cased |

## Evaluation & Results

- We used SSCI-BERT and SSCI-SciBERT to perform text classification on different social science research corpora; a minimal fine-tuning sketch is shown right after this note. The experimental results are as follows, and the relevant datasets are available for download in the **Verification task datasets** folder of this project.
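Each downstream task is a standard sequence-classification fine-tuning run. The snippet below is only an illustrative sketch of how such a run could be set up with `AutoModelForSequenceClassification`; the CSV file name and column names, the number of labels, and the hyperparameters are placeholders and do not reproduce the exact experimental settings behind the tables below.

```python
# Illustrative fine-tuning sketch for one of the classification tasks.
# "jcr_titles.csv" (columns: "text", "label"), num_labels, and all
# hyperparameters are placeholders, not the authors' actual configuration.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

checkpoint = "KM4STfulltext/SSCI-SciBERT-e2"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=5)

# Load and tokenize the task data, then hold out 10% for evaluation.
data = load_dataset("csv", data_files={"train": "jcr_titles.csv"})["train"]
data = data.map(lambda b: tokenizer(b["text"], truncation=True, max_length=512),
                batched=True)
data = data.train_test_split(test_size=0.1)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="ssci-clf", num_train_epochs=3,
                           per_device_train_batch_size=16),
    train_dataset=data["train"],
    eval_dataset=data["test"],
    tokenizer=tokenizer,  # enables dynamic padding during batching
)
trainer.train()
print(trainer.evaluate())
```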

#### JCR Title Classify Dataset

| Model                  | accuracy | macro avg | weighted avg |
| ---------------------- | -------- | --------- | ------------ |
| Bert-base-cased        | 28.43    | 22.06     | 21.86        |
| Scibert-scivocab-cased | 38.48    | 33.89     | 33.92        |
| SSCI-BERT-e2           | 40.43    | 35.37     | 35.33        |
| SSCI-SciBERT-e2        | 41.35    | 37.27     | 37.25        |
| SSCI-BERT-e4           | 40.65    | 35.49     | 35.40        |
| SSCI-SciBERT-e4        | 41.13    | 36.96     | 36.94        |
| Support                | 2300     | 2300      | 2300         |

#### JCR Abstract Classify Dataset

| Model                  | accuracy | macro avg | weighted avg |
| ---------------------- | -------- | --------- | ------------ |
| Bert-base-cased        | 48.59    | 42.80     | 42.82        |
| Scibert-scivocab-cased | 55.59    | 51.40     | 51.81        |
| SSCI-BERT-e2           | 58.05    | 53.31     | 53.73        |
| SSCI-SciBERT-e2        | 59.95    | 56.51     | 57.12        |
| SSCI-BERT-e4           | 59.00    | 54.97     | 55.59        |
| SSCI-SciBERT-e4        | 60.00    | 56.38     | 56.90        |
| Support                | 2200     | 2200      | 2200         |

#### JCR Mixed Titles and Abstracts Dataset

| Model                  | accuracy | macro avg | weighted avg |
| ---------------------- | -------- | --------- | ------------ |
| Bert-base-cased        | 58.24    | 57.27     | 57.25        |
| Scibert-scivocab-cased | 59.58    | 58.65     | 58.68        |
| SSCI-BERT-e2           | 60.89    | 60.24     | 60.30        |
| SSCI-SciBERT-e2        | 60.96    | 60.54     | 60.51        |
| SSCI-BERT-e4           | 61.00    | 60.48     | 60.43        |
| SSCI-SciBERT-e4        | 61.24    | 60.71     | 60.75        |
| Support                | 4500     | 4500      | 4500         |

#### SSCI Abstract Structural Function Recognition (Classify Dataset)

|              | Bert-base-cased | SSCI-BERT-e2 | SSCI-BERT-e4 | support |
| ------------ | --------------- | ------------ | ------------ | ------- |
| B            | 63.77           | 64.29        | 64.63        | 224     |
| P            | 53.66           | 57.14        | 57.99        | 95      |
| M            | 87.63           | 88.43        | 89.06        | 323     |
| R            | 86.81           | 88.28        | **88.47**    | 419     |
| C            | 78.32           | 79.82        | 78.95        | 316     |
| accuracy     | 79.59           | 80.90        | 80.97        | 1377    |
| macro avg    | 74.04           | 75.59        | 75.82        | 1377    |
| weighted avg | 79.02           | 80.32        | 80.44        | 1377    |

|              | Scibert-scivocab-cased | SSCI-SciBERT-e2 | SSCI-SciBERT-e4 | support |
| ------------ | ---------------------- | --------------- | --------------- | ------- |
| B            | 69.98                  | **70.95**       | **70.95**       | 224     |
| P            | 58.89                  | **60.12**       | 58.96           | 95      |
| M            | 89.37                  | **90.12**       | 88.11           | 323     |
| R            | 87.66                  | 88.07           | 87.44           | 419     |
| C            | 80.70                  | 82.61           | **82.94**       | 316     |
| accuracy     | 81.63                  | **82.72**       | 82.06           | 1377    |
| macro avg    | 77.32                  | **78.37**       | 77.68           | 1377    |
| weighted avg | 81.60                  | **82.58**       | 81.92           | 1377    |

## Citation

- If our content is helpful for your research work, please cite our research in your article.
- Before our paper is published, you can cite this repository instead: https://github.com/S-T-Full-Text-Knowledge-Mining/SSCI-BERT

## Disclaimer

- The reported experimental results only reflect performance under specific datasets and hyperparameter combinations and do not fully characterize each model. Results may change with different random seeds or computing hardware.
- **Users may use the models freely within the scope of the license, but we are not responsible for direct or indirect losses caused by using the contents of this project.**

## Acknowledgment

- SSCI-BERT was trained based on [BERT-Base-Cased](https://github.com/google-research/bert).
- SSCI-SciBERT was trained based on [scibert-scivocab-cased](https://github.com/allenai/scibert).