SentenceTransformer based on thomaskim1130/stella_en_1.5B_v5-FinanceRAG
This is a sentence-transformers model finetuned from thomaskim1130/stella_en_1.5B_v5-FinanceRAG. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Paper
- Link: https://arxiv.org/abs/2503.15191
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: thomaskim1130/stella_en_1.5B_v5-FinanceRAG
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 1024 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: Qwen2Model
(1): Pooling({'word_embedding_dimension': 1536, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Dense({'in_features': 1536, 'out_features': 1024, 'bias': True, 'activation_function': 'torch.nn.modules.linear.Identity'})
)
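You can verify this stack after loading the model; a minimal sketch (the repository id is taken from this card, and the printed values should match the architecture above):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("thomaskim1130/stella_en_1.5B_v5-FinanceRAG-v2")

print(model)                                     # Transformer -> Pooling -> Dense stack
print(model.max_seq_length)                      # 512
print(model.get_sentence_embedding_dimension())  # 1024, i.e. the Dense layer's out_features
```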
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
'Instruct: Given a web search query, retrieve relevant passages that answer the query.\nQuery: Title: \nText: | _id | q616570e0 |\n| title | |\n| text | what was the ratio of the purchase in december 2012 to the purchase in january 2013\n\nwas ratio of purchase in december 2012 to purchase in january 2013\n\n\n',
'Title: \nText: | _id | d61657130 |\n| title | |\n| text | issuer purchases of equity securities during the three months ended december 31 , 2012 , we repurchased 619314 shares of our common stock for an aggregate of approximately $ 46.0 million , including commissions and fees , pursuant to our publicly announced stock repurchase program , as follows : period total number of shares purchased ( 1 ) average price paid per share ( 2 ) total number of shares purchased as part of publicly announced plans or programs approximate dollar value of shares that may yet be\n\ndollar value of shares that may yet be purchased under the plans or programs ( in millions ) .\n\nperiod | total number of shares purchased ( 1 ) | average price paid per share ( 2 ) | total number of shares purchased as part of publicly announced plans orprograms | approximate dollar value of shares that may yet be purchased under the plans orprograms ( in millions )\n\n-------------------- | -------------------------------------- | ---------------------------------- | ------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------\n\noctober 2012 | 27524 | $ 72.62 | 27524 | $ 1300.1\n\nnovember 2012 | 489390 | $ 74.22 | 489390 | $ 1263.7\n\ndecember 2012 | 102400 | $ 74.83 | 102400 | $ 1256.1\n\ntotal fourth quarter | 619314 | $ 74.25 | 619314 | $ 1256.1\n\n( 1 ) repurchases made pursuant to the $ 1.5 billion stock repurchase program approved by our board of directors in march 2011 ( the 201c2011 buyback 201d ) .\nunder this program , our management is authorized to purchase shares from time to time through open market purchases or privately negotiated transactions at prevailing prices as permitted by securities laws and other legal requirements , and subject to market conditions and other factors .\n\nto facilitate repurchases , we make purchases pursuant to trading plans under rule 10b5-1 of the exchange act , which allows us to repurchase shares during periods when we otherwise might be prevented from doing so under insider trading laws or because of self-imposed trading blackout periods .\nthis program may be discontinued at any time .\n( 2 ) average price per share is calculated using the aggregate price , excluding commissions and fees .\n\nwe continued to repurchase shares of our common stock pursuant to our 2011 buyback subsequent to december 31 , 2012 .\nbetween january 1 , 2013 and january 21 , 2013 , we repurchased an additional 15790 shares of our common stock for an aggregate of $ 1.2 million , including commissions and fees , pursuant to the 2011 buyback .\n\nas a result , as of january 21 , 2013 , we had repurchased a total of approximately 4.3 million shares of our common stock under the 2011 buyback for an aggregate of $ 245.2 million , including commissions and fees .\nwe expect to continue to manage the pacing of the remaining $ 1.3 billion under the 2011 buyback in response to general market conditions and other relevant factors.\n\nissuer purchases of equity securities during three months ended december 31, 2012 repurchased 619314 shares of common stock for aggregate approximately $ 46. 0 million , including commissions and fees pursuant to our publicly announced stock repurchase program follows : period total number of shares purchased 1 ) average price paid per share ( 2 ) total number of shares purchased as part of publicly announced plans or programs approximate dollar value of shares be purchased under plans or programs ( in\n\nshares be purchased under plans or programs ( in millions ).\n\nrepurchases made to $ 1. 5 billion stock repurchase program approved by board of directors in march 2011 ( 201c2011 buyback 201d ).\n under program management authorized to purchase shares through open market purchases or privately negotiated transactions at prevailing prices as permitted by securities laws other legal requirements subject to market conditions other factors.\n\nfacilitate repurchases make purchases pursuant trading plans under rule 10b5-1 of exchange act allows to repurchase shares during periods prevented from under insider trading laws or self-imposed trading blackout periods.\n program may be discontinued any time.\n average price per share calculated using aggregate price excluding commissions and fees.\n continued to repurchase shares common stock pursuant to 2011 buyback subsequent to december 31 , 2012.\n\nbetween january 1 , 2013 and january 21 , 2013 repurchased additional 15790 shares common stock for aggregate of $ 1. 2 million , including commissions and fees pursuant to 2011 buyback.\n as of january 21 , 2013 had repurchased total of approximately 4. 3 million shares of common stock under 2011 buyback for aggregate of $ 245. 2 million , including commissions and fees.\n\nexpect to continue manage pacing remaining $ 1. 3 billion under 2011 buyback in response to general market conditions other relevant factors.\n\nperiod | total number of shares purchased ( 1 ) | average price paid per share ( 2 ) | total number of shares purchased as part of publicly announced plans orprograms | approximate dollar value of shares that may yet be purchased under the plans orprograms ( in millions )\n\n-------------------- | -------------------------------------- | ---------------------------------- | ------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------\n\noctober 2012 | 27524 | $ 72.62 | 27524 | $ 1300.1\n\nnovember 2012 | 489390 | $ 74.22 | 489390 | $ 1263.7\n\ndecember 2012 | 102400 | $ 74.83 | 102400 | $ 1256.1 \ntotal fourth quarter | 619314 | $ 74.25 | 619314 | $ 1256.1\n\n\n',
'Title: \nText: | _id | d1a729e44 |\n| title | |\n| text | Adjusted Revenue has limitations as a financial measure, should be considered as supplemental in nature, and is not meant as a substitute for the related financial information prepared in accordance with GAAP. These limitations include the following:\n\n• Adjusted Revenue is net of transaction-based costs, which is our largest cost of revenue item; • Adjusted Revenue is net of bitcoin costs, which could be a significant cost; • The deferred revenue adjustment that is added back to Adjusted Revenue will never be recognized as revenue by the Company; and • other companies, including companies in our industry, may calculate Adjusted Revenue differently or not at all, which reduces its usefulness as a comparative measure.\n\nBecause of these limitations, you should consider Adjusted Revenue alongside other financial performance measures, including total net revenue and our financial results presented in accordance with GAAP.\nThe following table presents a reconciliation of total net revenue to Adjusted Revenue for each of the periods indicated:\n\n| | | Year Ended December 31, | | \n--------------------------------------------------------------- | ---------- | ---------- | ----------------------- | ---------- | --------\n | 2018 | 2017 | 2016 | 2015 | 2014\n\n| | | (in thousands) | | \nTotal net revenue | $3,298,177 | $2,214,253 | $1,708,721 | $1,267,118 | $850,192\nLess: Starbucks transaction-based revenue | — | — | 78,903 | 142,283 | 123,024\n\nLess: transaction-based costs | 1,558,562 | 1,230,290 | 943,200 | 672,667 | 450,858 \nLess: bitcoin costs | 164,827 | — | — | — | — \nAdd: deferred revenue adjustment related to purchase accounting | $12,853 | $— | $— | $— | $—\n\nAdjusted Revenue | $1,587,641 | $983,963 | $686,618 | $452,168 | $276,310\n\nAdjusted Revenue has limitations as financial measure, should be considered as supplemental in not substitute for related financial information prepared in accordance with GAAP. limitations include:\n\n• Adjusted Revenue is net of transaction-based costs, our largest cost of revenue item; • Adjusted Revenue net of bitcoin costs, could be significant cost; • deferred revenue adjustment added back to Adjusted Revenue never recognized as revenue by Company; • other companies, including companies in our industry, may calculate Adjusted Revenue differently or not, reduces its usefulness as comparative measure.\n\nof limitations consider Adjusted Revenue alongside other financial performance measures, including total net revenue and our financial results presented in accordance with GAAP.\n following table presents reconciliation of total net revenue to Adjusted Revenue for each periods indicated:\n\n| | | Year Ended December 31, | | \n--------------------------------------------------------------- | ---------- | ---------- | ----------------------- | ---------- | --------\n | 2018 | 2017 | 2016 | 2015 | 2014 \n | | | (in thousands) | |\n\nTotal net revenue | $3,298,177 | $2,214,253 | $1,708,721 | $1,267,118 | $850,192\nLess: Starbucks transaction-based revenue | — | — | 78,903 | 142,283 | 123,024 \nLess: transaction-based costs | 1,558,562 | 1,230,290 | 943,200 | 672,667 | 450,858\n\nLess: bitcoin costs | 164,827 | — | — | — | — \nAdd: deferred revenue adjustment related to purchase accounting | $12,853 | $— | $— | $— | $— \nAdjusted Revenue | $1,587,641 | $983,963 | $686,618 | $452,168 | $276,310\n\n\n',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
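For retrieval-style use, note that the training samples shown below prefix each query with an instruction while passages are encoded as-is; a sketch of semantic search under that assumption, with an illustrative query and passages (not taken from the evaluation set):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("thomaskim1130/stella_en_1.5B_v5-FinanceRAG-v2")

# Instruction prefix as it appears in the training samples; queries get it, passages do not.
instruction = (
    "Instruct: Given a web search query, retrieve relevant passages "
    "that answer the query.\nQuery: "
)

query = "what was the ratio of the purchase in december 2012 to the purchase in january 2013"
passages = [
    "issuer purchases of equity securities during the three months ended december 31, 2012 ...",
    "Adjusted Revenue has limitations as a financial measure ...",
]

query_embedding = model.encode(instruction + query)
passage_embeddings = model.encode(passages)

# Cosine similarity, the model's configured similarity function; shape [1, 2]
scores = model.similarity(query_embedding, passage_embeddings)
print(passages[int(scores.argmax())])
```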
Evaluation
Metrics
Information Retrieval
- Dataset: Evaluate
- Evaluated with InformationRetrievalEvaluator
| Metric | Value |
|---|---|
| cosine_accuracy@1 | 0.0 |
| cosine_accuracy@3 | 0.0 |
| cosine_accuracy@5 | 0.0049 |
| cosine_accuracy@10 | 0.0097 |
| cosine_precision@1 | 0.0 |
| cosine_precision@3 | 0.0 |
| cosine_precision@5 | 0.001 |
| cosine_precision@10 | 0.001 |
| cosine_recall@1 | 0.0 |
| cosine_recall@3 | 0.0 |
| cosine_recall@5 | 0.0049 |
| cosine_recall@10 | 0.0097 |
| cosine_ndcg@10 | 0.0037 |
| cosine_mrr@10 | 0.0019 |
| cosine_map@100 | 0.0202 |
| dot_accuracy@1 | 0.0 |
| dot_accuracy@3 | 0.0 |
| dot_accuracy@5 | 0.0049 |
| dot_accuracy@10 | 0.0121 |
| dot_precision@1 | 0.0 |
| dot_precision@3 | 0.0 |
| dot_precision@5 | 0.001 |
| dot_precision@10 | 0.0012 |
| dot_recall@1 | 0.0 |
| dot_recall@3 | 0.0 |
| dot_recall@5 | 0.0049 |
| dot_recall@10 | 0.0121 |
| dot_ndcg@10 | 0.0046 |
| dot_mrr@10 | 0.0023 |
| dot_map@100 | 0.0174 |
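These scores were produced by InformationRetrievalEvaluator. The evaluation queries and corpus are not published with this card, so the following is only a sketch of how such an evaluation is wired up, using hypothetical toy data:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("thomaskim1130/stella_en_1.5B_v5-FinanceRAG-v2")

# Hypothetical toy data standing in for the real "Evaluate" split.
queries = {"q1": "what was the ratio of the purchase in december 2012 to the purchase in january 2013"}
corpus = {
    "d1": "issuer purchases of equity securities during the three months ended december 31, 2012 ...",
    "d2": "Adjusted Revenue has limitations as a financial measure ...",
}
relevant_docs = {"q1": {"d1"}}  # query id -> set of relevant corpus ids

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="Evaluate")
results = evaluator(model)  # dict of accuracy@k, precision@k, recall@k, ndcg@10, mrr@10, map@100
print(results)
```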
Training Details
Training Dataset
Unnamed Dataset
- Size: 2,268 training samples
- Columns: sentence_0 and sentence_1
- Approximate statistics based on the first 1000 samples:

| | sentence_0 | sentence_1 |
|---|---|---|
| type | string | string |
| details | min: 49 tokens, mean: 88.95 tokens, max: 228 tokens | min: 60 tokens, mean: 472.9 tokens, max: 512 tokens |
- Samples:
| sentence_0 | sentence_1 |
|---|---|
| Instruct: Given a web search query, retrieve relevant passages that answer the query. Query: Title: Text: _id … | … |
| Instruct: Given a web search query, retrieve relevant passages that answer the query. Query: Title: Text: _id … | … |
| Instruct: Given a web search query, retrieve relevant passages that answer the query. Query: Title: Text: _id … | … |
- Loss: MultipleNegativesRankingLoss with these parameters: { "scale": 20.0, "similarity_fct": "cos_sim" }
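As a sketch, these parameters correspond to the following loss construction (scale=20.0 and cosine similarity are the values listed above; for each query, the other in-batch sentence_1 passages act as negatives):

```python
from sentence_transformers import SentenceTransformer, losses
from sentence_transformers.util import cos_sim

model = SentenceTransformer("thomaskim1130/stella_en_1.5B_v5-FinanceRAG")

# Ranks the paired passage above all other in-batch passages for each query.
loss = losses.MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=cos_sim)
```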
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 4
- per_device_eval_batch_size: 4
- num_train_epochs: 2
- fp16: True
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: round_robin
All Hyperparameters
Click to expand
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 4
- per_device_eval_batch_size: 4
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1
- num_train_epochs: 2
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: False
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: round_robin
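A minimal sketch of reproducing this run with the SentenceTransformerTrainer API; the explicit arguments mirror the non-default values above, while output_dir and the toy train_dataset are hypothetical placeholders:

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    losses,
)
from sentence_transformers.training_args import BatchSamplers, MultiDatasetBatchSamplers

model = SentenceTransformer("thomaskim1130/stella_en_1.5B_v5-FinanceRAG")
loss = losses.MultipleNegativesRankingLoss(model, scale=20.0)

# Hypothetical two-pair dataset standing in for the 2,268 real (sentence_0, sentence_1) pairs.
train_dataset = Dataset.from_dict({
    "sentence_0": ["Instruct: ...\nQuery: example query A", "Instruct: ...\nQuery: example query B"],
    "sentence_1": ["passage answering query A", "passage answering query B"],
})

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # hypothetical path
    eval_strategy="steps",
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    num_train_epochs=2,
    fp16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
    multi_dataset_batch_sampler=MultiDatasetBatchSamplers.ROUND_ROBIN,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=train_dataset,  # eval_strategy="steps" requires an eval set; reusing the toy set here
    loss=loss,
)
trainer.train()
```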
Training Logs
| Epoch | Step | Training Loss | Evaluate_cosine_map@100 |
|---|---|---|---|
| 0 | 0 | - | 0.5622 |
| 0.8818 | 500 | 0.1941 | - |
| 1.0 | 567 | - | 0.0202 |
| 1.7637 | 1000 | 0.0832 | - |
| 2.0 | 1134 | - | 0.0202 |
Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.1.1
- Transformers: 4.45.2
- PyTorch: 2.5.1+cu121
- Accelerate: 1.1.1
- Datasets: 3.1.0
- Tokenizers: 0.20.3
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
MultipleNegativesRankingLoss
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}