nishan-chatterjee committed
Commit
e33ca78
1 Parent(s): 468c17d

adding example texts for the widget

Files changed (1): README.md +14 -1
README.md CHANGED
@@ -17,6 +17,15 @@ tags:
 - multilingual
 pipeline_tag: text-classification
 inference: True
+widget:
+- text: "THIS IS WHY YOU NEED. A SHARPIE WITH YOU AT ALL TIMES."
+  example_title: "Sentence 1"
+- text: "WHEN YOU'RE THE FBI, THEY LET YOU DO IT."
+  example_title: "Sentence 2"
+- text: "Move your ships away!\n\noooook\n\nMove your ships away!\n\nNo, and I just added 10 more"
+  example_title: "Sentence 3"
+- text: "Let's Make America Great Again!"
+  example_title: "Sentence 4"
 ---
 
 # Multilingual Persuasion Detection in Memes
@@ -26,12 +35,16 @@ Given only the “textual content” of a meme, the goal is to identify which of
 ### Hierarchy
 <img src="images/persuasion_techniques_hierarchy_graph.png" width="622" height="350">
 
-### Input Example
+### Usage Example
 - **Input:** "I HATE TRUMP\n\nMOST TERRORIST DO",
 - **Outputs:**
   - Child-only Label List: ['Name calling/Labeling', 'Loaded Language']
   - Complete Hierarchical Label List: ['Ethos', 'Ad Hominem', 'Name calling/Labeling', 'Pathos', 'Loaded Language']
 
+Note:
+- Make sure to have the dependencies installed in your environment from requirements.txt
+- Make sure to have the trained model and tokenizer in the same directory as inference.py
+
 ## Training Hyperparameters
 - Base Model: "facebook/mbart-large-50-many-to-many-mmt"
 - Learning Rate: 5e-05
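
The relationship between the two output lists in the usage example above can be sketched by walking each predicted child label up a parent map. The `PARENT` map below is a hypothetical fragment reconstructed only from the README's single example output, not the model's full taxonomy:

```python
# Sketch: expand child-only persuasion labels into the complete hierarchical
# list. PARENT is an assumed fragment of the hierarchy, inferred from the
# README's example; the real taxonomy contains more nodes.
PARENT = {
    "Name calling/Labeling": "Ad Hominem",
    "Ad Hominem": "Ethos",
    "Loaded Language": "Pathos",
}

def expand_labels(child_labels, parent=PARENT):
    """Return each label preceded by its ancestors (root first), de-duplicated."""
    complete = []
    for label in child_labels:
        chain = [label]
        while chain[-1] in parent:       # climb toward the root
            chain.append(parent[chain[-1]])
        for node in reversed(chain):     # emit in root -> leaf order
            if node not in complete:
                complete.append(node)
    return complete

print(expand_labels(["Name calling/Labeling", "Loaded Language"]))
# -> ['Ethos', 'Ad Hominem', 'Name calling/Labeling', 'Pathos', 'Loaded Language']
```

This reproduces the example's Complete Hierarchical Label List from its Child-only Label List.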