SentenceTransformer based on thomaskim1130/stella_en_1.5B_v5-FinanceRAG

This is a sentence-transformers model finetuned from thomaskim1130/stella_en_1.5B_v5-FinanceRAG. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Paper

arXiv: https://arxiv.org/abs/2503.15191

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: thomaskim1130/stella_en_1.5B_v5-FinanceRAG
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 1024 dimensions
  • Similarity Function: Cosine Similarity

Model Sources

  • Documentation: https://www.sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers
  • Hugging Face: https://huggingface.co/models?library=sentence-transformers

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: Qwen2Model 
  (1): Pooling({'word_embedding_dimension': 1536, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Dense({'in_features': 1536, 'out_features': 1024, 'bias': True, 'activation_function': 'torch.nn.modules.linear.Identity'})
)
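
This printout is what print(model) returns: a Qwen2 backbone with inputs truncated at 512 tokens, mean pooling over token embeddings, and a Dense layer projecting the pooled 1536-dimensional vector down to the advertised 1024 dimensions. The key numbers can be read back through the standard Sentence Transformers attributes:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("thomaskim1130/stella_en_1.5B_v5-FinanceRAG-v2")
print(model.max_seq_length)                      # 512
print(model.get_sentence_embedding_dimension())  # 1024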

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("thomaskim1130/stella_en_1.5B_v5-FinanceRAG-v2")
# Run inference
sentences = [
    'Instruct: Given a web search query, retrieve relevant passages that answer the query.\nQuery: Title: \nText: | _id | q616570e0 |\n| title |  |\n| text | what was the ratio of the purchase in december 2012 to the purchase in january 2013\n\nwas ratio of purchase in december 2012 to purchase in january 2013\n\n\n',
    'Title: \nText: | _id | d61657130 |\n| title |  |\n| text | issuer purchases of equity securities during the three months ended december 31 , 2012 , we repurchased 619314 shares of our common stock for an aggregate of approximately $ 46.0 million , including commissions and fees , pursuant to our publicly announced stock repurchase program , as follows : period total number of shares purchased ( 1 ) average price paid per share ( 2 ) total number of shares purchased as part of publicly announced plans or programs approximate dollar value of shares that may yet be\n\ndollar value of shares that may yet be purchased under the plans or programs ( in millions ) .\n\nperiod               | total number of shares purchased ( 1 ) | average price paid per share ( 2 ) | total number of shares purchased as part of publicly announced plans orprograms | approximate dollar value of shares that may yet be purchased under the plans orprograms ( in millions )\n\n-------------------- | -------------------------------------- | ---------------------------------- | ------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------\n\noctober 2012         | 27524                                  | $ 72.62                            | 27524                                                                           | $ 1300.1\n\nnovember 2012        | 489390                                 | $ 74.22                            | 489390                                                                          | $ 1263.7\n\ndecember 2012        | 102400                                 | $ 74.83                            | 102400                                                                          | $ 1256.1\n\ntotal fourth quarter | 619314                                 | $ 74.25                            | 619314                                                                          | $ 1256.1\n\n( 1 ) repurchases made pursuant to the $ 1.5 billion stock repurchase program approved by our board of directors in march 2011 ( the 201c2011 buyback 201d ) .\nunder this program , our management is authorized to purchase shares from time to time through open market purchases or privately negotiated transactions at prevailing prices as permitted by securities laws and other legal requirements , and subject to market conditions and other factors .\n\nto facilitate repurchases , we make purchases pursuant to trading plans under rule 10b5-1 of the exchange act , which allows us to repurchase shares during periods when we otherwise might be prevented from doing so under insider trading laws or because of self-imposed trading blackout periods .\nthis program may be discontinued at any time .\n( 2 ) average price per share is calculated using the aggregate price , excluding commissions and fees .\n\nwe continued to repurchase shares of our common stock pursuant to our 2011 buyback subsequent to december 31 , 2012 .\nbetween january 1 , 2013 and january 21 , 2013 , we repurchased an additional 15790 shares of our common stock for an aggregate of $ 1.2 million , including commissions and fees , pursuant to the 2011 buyback .\n\nas a result , as of january 21 , 2013 , we had repurchased a total of approximately 4.3 million shares of our common stock under the 2011 buyback for an aggregate of $ 245.2 million , including commissions and fees .\nwe expect to continue to manage the pacing of the remaining $ 1.3 billion under the 2011 buyback in response to general market conditions and other relevant factors.\n\nissuer purchases of equity securities during three months ended december 31, 2012  repurchased 619314 shares of common stock for aggregate approximately $ 46. 0 million , including commissions and fees pursuant to our publicly announced stock repurchase program follows : period total number of shares purchased 1 ) average price paid per share ( 2 ) total number of shares purchased as part of publicly announced plans or programs approximate dollar value of shares be purchased under plans or programs ( in\n\nshares be purchased under plans or programs ( in millions ).\n\nrepurchases made to $ 1. 5 billion stock repurchase program approved by board of directors in march 2011 ( 201c2011 buyback 201d ).\n under program management authorized to purchase shares through open market purchases or privately negotiated transactions at prevailing prices as permitted by securities laws other legal requirements subject to market conditions other factors.\n\nfacilitate repurchases make purchases pursuant trading plans under rule 10b5-1 of exchange act  allows to repurchase shares during periods prevented from under insider trading laws or self-imposed trading blackout periods.\n program may be discontinued any time.\n average price per share calculated using aggregate price  excluding commissions and fees.\n continued to repurchase shares common stock pursuant to 2011 buyback subsequent to december 31 , 2012.\n\nbetween january 1 , 2013 and january 21 , 2013  repurchased additional 15790 shares common stock for aggregate of $ 1. 2 million , including commissions and fees pursuant to 2011 buyback.\n as of january 21 , 2013  had repurchased total of approximately 4. 3 million shares of common stock under 2011 buyback for aggregate of $ 245. 2 million , including commissions and fees.\n\nexpect to continue manage pacing remaining $ 1. 3 billion under 2011 buyback in response to general market conditions other relevant factors.\n\nperiod               | total number of shares purchased ( 1 ) | average price paid per share ( 2 ) | total number of shares purchased as part of publicly announced plans orprograms | approximate dollar value of shares that may yet be purchased under the plans orprograms ( in millions )\n\n-------------------- | -------------------------------------- | ---------------------------------- | ------------------------------------------------------------------------------- | -------------------------------------------------------------------------------------------------------\n\noctober 2012         | 27524                                  | $ 72.62                            | 27524                                                                           | $ 1300.1\n\nnovember 2012        | 489390                                 | $ 74.22                            | 489390                                                                          | $ 1263.7\n\ndecember 2012        | 102400                                 | $ 74.83                            | 102400                                                                          | $ 1256.1\ntotal fourth quarter | 619314                                 | $ 74.25                            | 619314                                                                          | $ 1256.1\n\n\n',
    'Title: \nText: | _id | d1a729e44 |\n| title |  |\n| text | Adjusted Revenue has limitations as a financial measure, should be considered as supplemental in nature, and is not meant as a substitute for the related financial information prepared in accordance with GAAP. These limitations include the following:\n\n• Adjusted Revenue is net of transaction-based costs, which is our largest cost of revenue item; • Adjusted Revenue is net of bitcoin costs, which could be a significant cost; • The deferred revenue adjustment that is added back to Adjusted Revenue will never be recognized as revenue by the Company; and • other companies, including companies in our industry, may calculate Adjusted Revenue differently or not at all, which reduces its usefulness as a comparative measure.\n\nBecause of these limitations, you should consider Adjusted Revenue alongside other financial performance measures, including total net revenue and our financial results presented in accordance with GAAP.\nThe following table presents a reconciliation of total net revenue to Adjusted Revenue for each of the periods indicated:\n\n|            |            | Year Ended December 31, |            |         \n--------------------------------------------------------------- | ---------- | ---------- | ----------------------- | ---------- | --------\n                                                                | 2018       | 2017       | 2016                    | 2015       | 2014\n\n|            |            | (in thousands)          |            |         \nTotal net revenue                                               | $3,298,177 | $2,214,253 | $1,708,721              | $1,267,118 | $850,192\nLess: Starbucks transaction-based revenue                       | —          | —          | 78,903                  | 142,283    | 123,024\n\nLess: transaction-based costs                                   | 1,558,562  | 1,230,290  | 943,200                 | 672,667    | 450,858 \nLess: bitcoin costs                                             | 164,827    | —          | —                       | —          | —       \nAdd: deferred revenue adjustment related to purchase accounting | $12,853    | $—         | $—                      | $—         | $—\n\nAdjusted Revenue                                                | $1,587,641 | $983,963   | $686,618                | $452,168   | $276,310\n\nAdjusted Revenue has limitations as financial measure, should be considered as supplemental in not substitute for related financial information prepared in accordance with GAAP. limitations include:\n\n• Adjusted Revenue is net of transaction-based costs, our largest cost of revenue item; • Adjusted Revenue net of bitcoin costs, could be significant cost; • deferred revenue adjustment added back to Adjusted Revenue never recognized as revenue by Company; • other companies, including companies in our industry, may calculate Adjusted Revenue differently or not, reduces its usefulness as comparative measure.\n\nof limitations consider Adjusted Revenue alongside other financial performance measures, including total net revenue and our financial results presented in accordance with GAAP.\n following table presents reconciliation of total net revenue to Adjusted Revenue for each periods indicated:\n\n|            |            | Year Ended December 31, |            |         \n--------------------------------------------------------------- | ---------- | ---------- | ----------------------- | ---------- | --------\n                                                                | 2018       | 2017       | 2016                    | 2015       | 2014    \n                                                                |            |            | (in thousands)          |            |\n\nTotal net revenue                                               | $3,298,177 | $2,214,253 | $1,708,721              | $1,267,118 | $850,192\nLess: Starbucks transaction-based revenue                       | —          | —          | 78,903                  | 142,283    | 123,024 \nLess: transaction-based costs                                   | 1,558,562  | 1,230,290  | 943,200                 | 672,667    | 450,858\n\nLess: bitcoin costs                                             | 164,827    | —          | —                       | —          | —       \nAdd: deferred revenue adjustment related to purchase accounting | $12,853    | $—         | $—                      | $—         | $—      \nAdjusted Revenue                                                | $1,587,641 | $983,963   | $686,618                | $452,168   | $276,310\n\n\n',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 1024)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
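
Because the query side of the training data carries an instruction prefix (see the training samples below) while passages do not, a retrieval loop should encode the two sides asymmetrically. A minimal search sketch using the library's util.semantic_search; the two passages are hypothetical stand-ins for a real corpus:

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("thomaskim1130/stella_en_1.5B_v5-FinanceRAG-v2")

# The instruction prefix mirrors the training format; passages get no prefix.
prefix = (
    "Instruct: Given a web search query, retrieve relevant passages that answer the query.\n"
    "Query: "
)
query = prefix + "what was the ratio of the purchase in december 2012 to the purchase in january 2013"
passages = [  # hypothetical corpus entries
    "Title: \nText: between january 1, 2013 and january 21, 2013 we repurchased an additional 15790 shares",
    "Title: \nText: Adjusted Revenue is net of transaction-based costs, our largest cost of revenue item",
]

query_emb = model.encode(query, convert_to_tensor=True)
corpus_emb = model.encode(passages, convert_to_tensor=True)

# Rank passages by cosine similarity (the default score function)
hits = util.semantic_search(query_emb, corpus_emb, top_k=2)[0]
for hit in hits:
    print(round(hit["score"], 3), passages[hit["corpus_id"]][:60])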

Evaluation

Metrics

Information Retrieval

Metric Value
cosine_accuracy@1 0.0
cosine_accuracy@3 0.0
cosine_accuracy@5 0.0049
cosine_accuracy@10 0.0097
cosine_precision@1 0.0
cosine_precision@3 0.0
cosine_precision@5 0.001
cosine_precision@10 0.001
cosine_recall@1 0.0
cosine_recall@3 0.0
cosine_recall@5 0.0049
cosine_recall@10 0.0097
cosine_ndcg@10 0.0037
cosine_mrr@10 0.0019
cosine_map@100 0.0202
dot_accuracy@1 0.0
dot_accuracy@3 0.0
dot_accuracy@5 0.0049
dot_accuracy@10 0.0121
dot_precision@1 0.0
dot_precision@3 0.0
dot_precision@5 0.001
dot_precision@10 0.0012
dot_recall@1 0.0
dot_recall@3 0.0
dot_recall@5 0.0049
dot_recall@10 0.0121
dot_ndcg@10 0.0046
dot_mrr@10 0.0023
dot_map@100 0.0174
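
The cosine_* rows score retrieval with cosine similarity, the dot_* rows with the raw dot product. Metrics in this shape are what the library's InformationRetrievalEvaluator emits; a sketch with toy relevance data (the ids echo the Usage example, but the texts and judgments here are illustrative, not the actual evaluation set):

from sentence_transformers.evaluation import InformationRetrievalEvaluator

queries = {"q616570e0": "what was the ratio of the purchase in december 2012 to the purchase in january 2013"}
corpus = {
    "d61657130": "issuer purchases of equity securities during the three months ended december 31, 2012 ...",
    "d1a729e44": "Adjusted Revenue has limitations as a financial measure ...",
}
relevant_docs = {"q616570e0": {"d61657130"}}  # qid -> set of relevant doc ids

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="Evaluate")
results = evaluator(model)  # dict of metrics, e.g. results["Evaluate_cosine_map@100"]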

Training Details

Training Dataset

Unnamed Dataset

  • Size: 2,268 training samples
  • Columns: sentence_0 and sentence_1
  • Approximate statistics based on the first 1000 samples:
    • sentence_0: string; min 49, mean 88.95, max 228 tokens
    • sentence_1: string; min 60, mean 472.9, max 512 tokens
  • Samples: each sentence_0 opens with the instruction prompt
    Instruct: Given a web search query, retrieve relevant passages that answer the query.
    Query: Title:
    Text:
    _id
    followed by the query text, and each sentence_1 holds the matching passage (the Usage example above shows a full-length pair).
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
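
MultipleNegativesRankingLoss treats every other passage in a batch as a negative for each query, which is also why the no_duplicates batch sampler below matters: a passage duplicated within a batch would act as a false negative. A sketch of this configuration, with placeholder pair data in the card's sentence_0/sentence_1 layout:

from datasets import Dataset
from sentence_transformers import losses, util

# Placeholder pairs: sentence_0 is the instructed query, sentence_1 the positive passage.
train_dataset = Dataset.from_dict({
    "sentence_0": ["Instruct: Given a web search query, retrieve relevant passages that answer the query.\nQuery: ..."],
    "sentence_1": ["Title: \nText: ..."],
})

loss = losses.MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=util.cos_sim)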
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 4
  • per_device_eval_batch_size: 4
  • num_train_epochs: 2
  • fp16: True
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: round_robin
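
In sentence-transformers 3.x these options map directly onto the trainer API. A hedged reconstruction of the run, wiring together the dataset, loss, and evaluator sketched earlier (output_dir is a placeholder; everything not listed keeps its default):

from sentence_transformers import SentenceTransformerTrainer, SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers, MultiDatasetBatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="stella-financerag-v2",  # placeholder
    eval_strategy="steps",
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    num_train_epochs=2,
    fp16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
    multi_dataset_batch_sampler=MultiDatasetBatchSamplers.ROUND_ROBIN,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=loss,
    evaluator=evaluator,
)
trainer.train()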

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 4
  • per_device_eval_batch_size: 4
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 2
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: round_robin

Training Logs

Epoch   Step  Training Loss  Evaluate_cosine_map@100
0       0     -              0.5622
0.8818  500   0.1941         -
1.0     567   -              0.0202
1.7637  1000  0.0832         -
2.0     1134  -              0.0202

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.1.1
  • Transformers: 4.45.2
  • PyTorch: 2.5.1+cu121
  • Accelerate: 1.1.1
  • Datasets: 3.1.0
  • Tokenizers: 0.20.3
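
To approximate this environment, the versions above can be pinned at install time; the PyTorch wheel depends on your CUDA setup, so it is installed separately:

pip install sentence-transformers==3.1.1 transformers==4.45.2 accelerate==1.1.1 datasets==3.1.0 tokenizers==0.20.3
pip install torch==2.5.1  # choose the cu121 build to match 2.5.1+cu121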

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
