{
"csqa": {
"task_id": "csqa",
"few_shot_split": "train",
"few_shot_ids": [0, 1, 3, 4, 5],
"input_keys": ["question"],
"labels": {
"a": "a",
"b": "b",
"c": "c",
"d": "d",
"e": "e"
},
"task_description": "answer some multiple choice questions"
},
"gsm8k": {
"task_id": "gsm8k",
"few_shot_split": "train",
"few_shot_ids": [0, 1, 2, 3],
"input_keys": ["question"],
"task_description": "the following math problem"
},
"mmlu": {
"task_id": "mmlu",
"few_shot_split": "dev",
"few_shot_ids": [],
"few_shot_rationale": [],
"input_keys": ["question"],
"labels": {
"a": "a",
"b": "b",
"c": "c",
"d": "d"
},
"task_description": "answer some multiple choice questions"
},
"cola": {
"task_id": "cola",
"few_shot_split": "train",
"few_shot_ids": [0, 18],
"few_shot_rationale": [
"The sentence has a parallel structure connected by a comma. The subject-verb agreement is followed and the verb tenses are correct. Overall, it follows correct gramatical rules in English.",
"Pub is not a type of beverage that one can drink but rather the place where people drink. One correct example would be that `they drank at the pub`."
],
"input_keys": ["sentence"],
"labels": {
"acceptable": 1,
"unacceptable": 0
},
"task_description": "whether it is a grammatically acceptable English sentence"
},
"mnli": {
"task_id": "mnli",
"few_shot_split": "train",
"few_shot_ids": [0, 3, 8],
"few_shot_rationale": [
"The premise neither entails nor contradicts the hypothesis.",
"The premise entails the hypothesis, given that both point to 'them' as the source or owner of the information.",
"Gays and lesbians are homosexual, which is the opposite of heterosexual."
],
"input_keys": ["premise", "hypothesis"],
"labels": {
"entailment": 0,
"neutral": 1,
"contradiction": 2
},
"task_description": "whether the premise entails the hypothesis (entailment), contradicts the hypothesis (contradiction), or neither (neutral)"
},
"mnli_matched": {
"task_id": "mnli_matched",
"few_shot_split": "validation",
"few_shot_ids": [0, 1, 2],
"few_shot_rationale": [
"The premise neither entails nor contradicts the hypothesis.",
"The premise points to a site where Government Executive articles can be searched, where as the second sentence states the opposite.",
"The two sentences convey the same mixed feelings towards a person."
],
"input_keys": ["premise", "hypothesis"],
"labels": {
"entailment": 0,
"neutral": 1,
"contradiction": 2
},
"task_description": "whether the premise entails the hypothesis (entailment), contradicts the hypothesis (contradiction), or neither (neutral)"
},
"mnli_mismatched": {
"task_id": "mnli_mismatched",
"few_shot_split": "validation",
"few_shot_ids": [0, 2, 8],
"few_shot_rationale": [
"The premise states that the contribution was helpful, whereas the conclusion states that opposite.",
"The premise states that 'we' serve a meal that includes a florentine terrain, and the second sentence says the same.",
"The premise states a phenomenon and the hypothesis states people's attitude towards that phenomenon. The premise neither entails nor contradicts the hypothesis."
],
"input_keys": ["premise", "hypothesis"],
"labels": {
"entailment": 0,
"neutral": 1,
"contradiction": 2
},
"task_description": "whether the premise entails the hypothesis (entailment), contradicts the hypothesis (contradiction), or neither (neutral)"
},
"mrpc": {
"task_id": "mrpc",
"few_shot_split": "train",
"few_shot_ids": [0, 1],
"few_shot_rationale": [
"Both sentence are saying that Amrozi accuses of his brother who is the only witness of deliberately distorting the evidence.",
"The second sentence contains additional information of when Yucaipa bought the chain."
],
"input_keys": ["sentence1", "sentence2"],
"labels": {
"not_equivalent": 0,
"equivalent": 1
},
"task_description": "whether a pair of sentences are semantically equivalent"
},
"qnli": {
"task_id": "qnli",
"few_shot_split": "train",
"few_shot_ids": [0, 2],
"few_shot_rationale": [
"The sentence does not provide any information regarding the time when the third Digimon series begin.",
"The sentence provides the two things that Popper argue Tarski's theory involves in an evaluation of truth: assertions and the facts to which they refer."
],
"input_keys": ["question", "sentence"],
"labels": {
"entailment": 0,
"not_entailment": 1
},
"task_description": "whether the sentence contains the answer to the question"
},
"qqp": {
"task_id": "qqp",
"few_shot_split": "train",
"few_shot_ids": [0, 1],
"few_shot_rationale": [
"The two questions are asking different things.",
"Horny emotions is the same as horniness and both sentences are asking about how to control it."
],
"input_keys": ["question1", "question2"],
"labels": {
"not_duplicate": 0,
"duplicate": 1
},
"task_description": "whether a pair of questions are semantically equivalent"
},
"rte": {
"task_id": "rte",
"few_shot_split": "train",
"few_shot_ids": [0, 1],
"few_shot_rationale": [
"The two sentences have opposite meanings.",
"Pope Benedict XVI is the new pope, and pope is the leader of Roman Catholic Church. Therefore, the second sentence entails the first sentence."
],
"input_keys": ["sentence1", "sentence2"],
"labels": {
"entailment": 0,
"not_entailment": 1
},
"task_description": "whether the first sentence entails the second sentence"
},
"sst2": {
"task_id": "sst2",
"few_shot_split": "train",
"few_shot_ids": [1, 2],
"few_shot_rationale": [
"Saying something contains no wit expresses a negative attitude.",
"It says human nature is beautiful."
],
"input_keys": ["sentence"],
"labels": {
"negative": 0,
"positive": 1
},
"task_description": "whether the sentiment of the sentence is positive or negative"
},
"stsb": {
"task_id": "stsb",
"few_shot_split": "train",
"few_shot_ids": [0, 29, 32, 74, 45, 141],
"few_shot_rationale": [
"Both sentences are saying that an airplain is taking off.",
"Both sentences are saying that a girl is flying a kite. However, the second sentence contains a bit of extra information, i.e., that the girl is also running.",
"Both sentences mention that a women is dancing and singing. However, the first sentence metion that there is someone else while the second sentence talks about the weather.",
"The women is peeling something according to both sentences, but the sentences disagree as to what the woman is peeling.",
"Both sentences involves a person playing a musical instrument. But the gender of the person and the intrument are both different in the two sentences.",
"The two sentences are about entirely different things."
],
"input_keys": ["sentence1", "sentence2"],
"labels": {
"0": 0,
"1": 1,
"2": 2,
"3": 3,
"4": 4,
"5": 5
},
"task_description": "the similarity between a pair of sentences, with similarity score ranging between 0 and 5."
},
"wnli": {
"task_id": "wnli",
"few_shot_split": "train",
"few_shot_ids": [0, 3],
"few_shot_rationale": [
"The first sentence mentions that when the pin was pulled out from the carrot, 'it' had a hole. Clearly, 'it' refers to carrot given the context.",
"The first sentence mentions that Steve is being influenced by Fred, so it is not true that Steve influences someone."
],
"input_keys": ["sentence1", "sentence2"],
"labels": {
"not_entailment": 0,
"entailment": 1
},
"task_description": "whether the first sentence entails the second sentence"
}
}
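
A minimal sketch of how a harness might consume this config to assemble few-shot prompts. The field names (task_id, few_shot_split, few_shot_ids, few_shot_rationale, input_keys, labels, task_description) come from the file above; the dataset schema, the loader, and the prompt layout are assumptions for illustration, not the repository's actual code.

import json

def build_few_shot_prompt(task_cfg, dataset):
    # `dataset` is assumed to be an indexable sequence of dicts for the
    # split named in `few_shot_split`, each holding the keys listed in
    # `input_keys` plus a gold `label` field (hypothetical schema).
    rationales = task_cfg.get("few_shot_rationale", [])
    lines = [f"Task: {task_cfg['task_description']}"]
    for i, ex_id in enumerate(task_cfg["few_shot_ids"]):
        example = dataset[ex_id]
        for key in task_cfg["input_keys"]:
            lines.append(f"{key}: {example[key]}")
        # Rationales, when present, pair positionally with few_shot_ids.
        if i < len(rationales):
            lines.append(f"rationale: {rationales[i]}")
        lines.append(f"label: {example['label']}")
        lines.append("")
    return "\n".join(lines)

with open("tasks.json") as f:
    tasks = json.load(f)

# e.g. tasks["cola"]["labels"] maps label names to dataset label ids:
# {"acceptable": 1, "unacceptable": 0}. Note that gsm8k omits "labels"
# (free-form answers) and mmlu has empty few_shot_ids (zero-shot), so
# downstream code should treat both fields as optional.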