Model Card: GPT-2-DAPT-Ohsumed
A domain-adapted GPT-2, further pre-trained on the Ohsumed dataset text.
Model Details
Description
This model is based on the GPT-2 architecture and was further pre-trained (domain-adapted) on the text of the Ohsumed dataset, excluding its test split.
- Developed by: Cesar Gonzalez-Gutierrez
- Funded by: ERC
- Architecture: GPT-2
- Language: English
- License: MIT
- Base model: GPT-2
Checkpoints
Intermediate checkpoints from the pre-training process are available and can be accessed using specific tags, which correspond to training epochs and steps:
| Epoch | Step | Epoch tag | Step tag |
|---|---|---|---|
| 1 | 97 | epoch-1 | step-97 |
| 5 | 489 | epoch-5 | step-489 |
| 10 | 978 | epoch-10 | step-978 |
| 20 | 1956 | epoch-20 | step-1956 |
| 40 | 3913 | epoch-40 | step-3913 |
| 60 | 5870 | epoch-60 | step-5870 |
| 80 | 7826 | epoch-80 | step-7826 |
| 100 | 9783 | epoch-100 | step-9783 |
| 120 | 11740 | epoch-120 | step-11740 |
| 140 | 13696 | epoch-140 | step-13696 |
| 160 | 15653 | epoch-160 | step-15653 |
| 180 | 17610 | epoch-180 | step-17610 |
| 198 | 19400 | epoch-198 | step-19400 |
To load a model from a specific intermediate checkpoint, use the revision parameter with the corresponding tag:
from transformers import AutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("<model-name>", revision="<checkpoint-tag>")
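For illustration, the sketch below lists the available checkpoint tags on the Hub and loads the tokenizer together with one specific revision. It assumes the repository id cglez/gpt2-dapt-ohsumed and the epoch-100 tag from the table above; it is not part of the official training or release scripts.
```python
# Sketch: enumerate checkpoint tags on the Hub and load one specific revision.
from huggingface_hub import list_repo_refs
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "cglez/gpt2-dapt-ohsumed"  # assumed repository id
tags = [ref.name for ref in list_repo_refs(repo_id).tags]
print(sorted(tags))  # e.g. ['epoch-1', 'epoch-10', 'epoch-100', ...]

tokenizer = AutoTokenizer.from_pretrained(repo_id, revision="epoch-100")
model = AutoModelForCausalLM.from_pretrained(repo_id, revision="epoch-100")
```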
Sources
- Paper: [Information pending]
Training Details
For more details on the training procedure, please refer to the base model's documentation: Training procedure.
Training Data
All texts from the Ohsumed dataset, excluding the test partition.
Training Hyperparameters
- Precision: fp16
- Batch size: 8
- Gradient accumulation steps: 12
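As a rough illustration, the hyperparameters above map onto a Hugging Face Trainer configuration like the sketch below. The dataset file, tokenization settings, and every argument not listed in this card are assumptions, not the exact training script.
```python
# Minimal sketch of the training setup implied by the hyperparameters above.
# Dataset path and unspecified arguments are assumptions.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Hypothetical file: Ohsumed texts with the test partition already excluded.
dataset = load_dataset("text", data_files={"train": "ohsumed_train.txt"})
tokenized = dataset["train"].map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=1024),
    batched=True, remove_columns=["text"],
)

args = TrainingArguments(
    output_dir="gpt2-dapt-ohsumed",
    fp16=True,                      # Precision: fp16
    per_device_train_batch_size=8,  # Batch size: 8
    gradient_accumulation_steps=12, # Gradient accumulation steps: 12
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```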
Uses
For typical use cases and limitations, please refer to the base model's guidance: Intended uses & limitations.
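For illustration, a typical prompt-continuation call might look like the sketch below; the repository id and the prompt are assumptions for the example, not a recommended clinical use.
```python
# Sketch: prompt continuation with the domain-adapted model via a pipeline.
from transformers import pipeline

generator = pipeline("text-generation", model="cglez/gpt2-dapt-ohsumed")
result = generator("Treatment of chronic obstructive pulmonary disease", max_new_tokens=40)
print(result[0]["generated_text"])
```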
Bias, Risks, and Limitations
This model inherits potential risks and limitations from the base model. Refer to: Limitations and bias.
Environmental Impact
- Hardware Type: NVIDIA A100 PCIE 40GB
- Runtime: 35 h
- Cluster Provider: Artemisa
- Compute Region: EU
- Carbon Emitted: 5.42 kg CO2 eq.
Citation
BibTeX:
[More Information Needed]
Model tree for cglez/gpt2-dapt-ohsumed
- Base model: openai-community/gpt2