Paper: RoBERTa: A Robustly Optimized BERT Pretraining Approach (arXiv:1907.11692)
Work in progress
A SciBERT-based model pre-trained on materials science full-text publications.

Authors: Luca Foppiano, Pedro Ortiz Suarez
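For reference, a minimal sketch of loading the model with the Hugging Face `transformers` library; the Hub ID used below is an assumption for illustration and should be replaced with the model's actual repository path.

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

# NOTE: assumed Hub ID for illustration; substitute the real repository path.
MODEL_ID = "lfoppiano/MatTPUSciBERT"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)

# Quick sanity check: fill a masked token in a materials-science sentence.
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill(f"The sample was annealed at 900 {tokenizer.mask_token} for 24 hours."))
```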
Results were obtained via 10-fold cross-validation using DeLFT (https://github.com/kermitt2/delft); an illustrative sketch of the evaluation protocol follows the tables below.
| Model | Precision | Recall | F1 |
|---|---|---|---|
| SciBERT (baseline) | 81.62% | 84.23% | 82.90% |
| MatSciBERT (Gupta) | 81.45% | 84.36% | 82.88% |
| MatTPUSciBERT | 82.13% | 85.15% | 83.61% |
| MatBERT (Ceder) | 81.25% | 83.99% | 82.60% |
| BatterySciBERT-cased | 81.09% | 84.14% | 82.59% |

| Model | Precision | Recall | F1 |
|---|---|---|---|
| SciBERT (baseline) | 88.73% | 86.76% | 87.73% |
| MatSciBERT (Gupta) | 84.98% | 90.12% | 87.47% |
| MatTPUSciBERT | 88.62% | 86.33% | 87.46% |
| MatBERT (Ceder) | 85.08% | 89.93% | 87.44% |
| BatterySciBERT-cased | 85.02% | 89.30% | 87.11% |
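As a rough illustration of the evaluation protocol, here is a minimal sketch of 10-fold cross-validation reporting micro-averaged precision, recall, and F1. This is not DeLFT's actual implementation; the data and classifier are toy placeholders that only demonstrate how fold-level scores are computed and averaged.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_recall_fscore_support
from sklearn.model_selection import KFold

# Toy stand-ins for real features and labels; DeLFT applies the same
# 10-fold protocol to sequence-labelling models and averages over folds.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 20))
y = rng.integers(0, 2, size=500)

scores = []
for train_idx, test_idx in KFold(n_splits=10, shuffle=True, random_state=42).split(X):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    y_pred = clf.predict(X[test_idx])
    p, r, f1, _ = precision_recall_fscore_support(
        y[test_idx], y_pred, average="micro", zero_division=0
    )
    scores.append((p, r, f1))

# Final figures are the mean over the 10 folds.
p, r, f1 = np.mean(scores, axis=0)
print(f"Precision: {p:.2%}  Recall: {r:.2%}  F1: {f1:.2%}")
```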
This work was supported by Google through their researchers program (https://cloud.google.com/edu/researchers).
TBA