# SBLGNT_to_FALAM

LoRA adapter for Bible translation: SBLGNT → FALAM
## Model Details

- Base Model: google/madlad400-3b-mt
- Task: Machine Translation (English → Falam Chin)
- Training Data: Bible verse pairs
- Method: LoRA (Low-Rank Adaptation)
## Training Details

- Training Pairs: 6,788
- Validation Pairs: 61
- Validation Strategy: SingleBookStrategy(book=2PE), i.e. 2 Peter is held out as the validation set
- Epochs: 20
- Learning Rate: 0.0003
- LoRA Rank: 32
- LoRA Alpha: 64
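
For orientation, the hyperparameters above correspond to a PEFT `LoraConfig` along these lines. This is a minimal sketch, not the project's training script; the target modules and dropout value are assumptions not reported above.

```python
from transformers import T5ForConditionalGeneration
from peft import LoraConfig, get_peft_model

base = T5ForConditionalGeneration.from_pretrained("google/madlad400-3b-mt")

lora_config = LoraConfig(
    r=32,                       # LoRA Rank (from the table above)
    lora_alpha=64,              # LoRA Alpha (scaling factor alpha / r = 2.0)
    target_modules=["q", "v"],  # assumed: attention query/value projections
    lora_dropout=0.05,          # assumed; not reported in this card
    task_type="SEQ_2_SEQ_LM",
)

model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only the low-rank adapters are trained
```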
## Metrics

| Metric | Score |
|---|---|
| BLEU | 19.24 |
| chrF | 48.07 |
| Levenshtein Similarity | 0.54 |
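
For reproducibility, these metrics can be computed with standard tooling. A sketch, assuming `sacrebleu` for BLEU and chrF and `rapidfuzz` for the normalized Levenshtein similarity; the project's actual evaluation script is not published here, so tokenization and normalization details may differ.

```python
import sacrebleu
from rapidfuzz.distance import Levenshtein

hypotheses = ["..."]  # model translations, one per verse
references = ["..."]  # gold Falam translations, one per verse

# Corpus-level BLEU and chrF; sacrebleu takes a list of reference sets
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
chrf = sacrebleu.corpus_chrf(hypotheses, [references])

# Normalized Levenshtein similarity in [0, 1], averaged over verse pairs
lev_sim = sum(
    Levenshtein.normalized_similarity(h, r)
    for h, r in zip(hypotheses, references)
) / len(hypotheses)

print(f"BLEU: {bleu.score:.2f}  chrF: {chrf.score:.2f}  LevSim: {lev_sim:.2f}")
```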
## Usage

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer
from peft import PeftModel

# Load base model and tokenizer
model_name = "google/madlad400-3b-mt"
model = T5ForConditionalGeneration.from_pretrained(model_name)
tokenizer = T5Tokenizer.from_pretrained(model_name)

# Load the LoRA adapter on top of the base model
model = PeftModel.from_pretrained(model, "fotcode/SBLGNT_to_FALAM")

# Translate; MADLAD-400 expects a <2xx> target-language prefix token
# (cfm is the ISO 639-3 code for Falam Chin)
input_text = "<2cfm> In the beginning God created the heavens and the earth."
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=128)
translation = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(translation)
```
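
For deployment, the LoRA weights can optionally be folded into the base model so inference runs without the PEFT wrapper. A minimal sketch using PEFT's standard `merge_and_unload()`; the output directory name is illustrative:

```python
# Merge the LoRA weights into the base model and save the result.
# After merging, the model loads with plain transformers (no peft needed).
merged = model.merge_and_unload()
merged.save_pretrained("madlad400-3b-mt-falam-merged")     # hypothetical path
tokenizer.save_pretrained("madlad400-3b-mt-falam-merged")  # hypothetical path
```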
## Citation

If you use this model, please cite:

```bibtex
@misc{bible-translation-lora,
  title={Bible Translation LoRA Adapters},
  author={LRL Translation Project},
  year={2025},
  url={https://huggingface.co/fotcode/SBLGNT_to_FALAM}
}
```