|
|
--- |
|
|
language: |
|
|
- en |
|
|
tags: |
|
|
- sentence-transformers |
|
|
- cross-encoder |
|
|
- reranker |
|
|
- generated_from_trainer |
|
|
- dataset_size:39770704 |
|
|
- loss:MarginMSELoss |
|
|
base_model: jhu-clsp/ettin-encoder-150m |
|
|
datasets: |
|
|
- sentence-transformers/msmarco |
|
|
pipeline_tag: text-ranking |
|
|
library_name: sentence-transformers |
|
|
metrics: |
|
|
- map |
|
|
- mrr@10 |
|
|
- ndcg@10 |
|
|
co2_eq_emissions: |
|
|
emissions: 9007.676857965895 |
|
|
energy_consumed: 24.402161915767365 |
|
|
source: codecarbon |
|
|
training_type: fine-tuning |
|
|
on_cloud: false |
|
|
cpu_model: AMD EPYC 7R13 Processor |
|
|
ram_total_size: 1999.9855308532715 |
|
|
hours_used: 4.849 |
|
|
hardware_used: 8 x NVIDIA H100 80GB HBM3 |
|
|
model-index: |
|
|
- name: CrossEncoder based on jhu-clsp/ettin-encoder-150m |
|
|
results: |
|
|
- task: |
|
|
type: cross-encoder-reranking |
|
|
name: Cross Encoder Reranking |
|
|
dataset: |
|
|
name: NanoMSMARCO R100 |
|
|
type: NanoMSMARCO_R100 |
|
|
metrics: |
|
|
- type: map |
|
|
value: 0.6522 |
|
|
name: Map |
|
|
- type: mrr@10 |
|
|
value: 0.6477 |
|
|
name: Mrr@10 |
|
|
- type: ndcg@10 |
|
|
value: 0.718 |
|
|
name: Ndcg@10 |
|
|
- task: |
|
|
type: cross-encoder-reranking |
|
|
name: Cross Encoder Reranking |
|
|
dataset: |
|
|
name: NanoNFCorpus R100 |
|
|
type: NanoNFCorpus_R100 |
|
|
metrics: |
|
|
- type: map |
|
|
value: 0.3763 |
|
|
name: Map |
|
|
- type: mrr@10 |
|
|
value: 0.6256 |
|
|
name: Mrr@10 |
|
|
- type: ndcg@10 |
|
|
value: 0.4451 |
|
|
name: Ndcg@10 |
|
|
- task: |
|
|
type: cross-encoder-reranking |
|
|
name: Cross Encoder Reranking |
|
|
dataset: |
|
|
name: NanoNQ R100 |
|
|
type: NanoNQ_R100 |
|
|
metrics: |
|
|
- type: map |
|
|
value: 0.7584 |
|
|
name: Map |
|
|
- type: mrr@10 |
|
|
value: 0.7881 |
|
|
name: Mrr@10 |
|
|
- type: ndcg@10 |
|
|
value: 0.8011 |
|
|
name: Ndcg@10 |
|
|
- task: |
|
|
type: cross-encoder-nano-beir |
|
|
name: Cross Encoder Nano BEIR |
|
|
dataset: |
|
|
name: NanoBEIR R100 mean |
|
|
type: NanoBEIR_R100_mean |
|
|
metrics: |
|
|
- type: map |
|
|
value: 0.5957 |
|
|
name: Map |
|
|
- type: mrr@10 |
|
|
value: 0.6871 |
|
|
name: Mrr@10 |
|
|
- type: ndcg@10 |
|
|
value: 0.6548 |
|
|
name: Ndcg@10 |
|
|
--- |
|
|
|
|
|
# CrossEncoder based on jhu-clsp/ettin-encoder-150m |
|
|
|
|
|
This is a [Cross Encoder](https://www.sbert.net/docs/cross_encoder/usage/usage.html) model finetuned from [jhu-clsp/ettin-encoder-150m](https://huggingface.co/jhu-clsp/ettin-encoder-150m) on the [msmarco](https://huggingface.co/datasets/sentence-transformers/msmarco) dataset using the [sentence-transformers](https://www.SBERT.net) library. It computes scores for pairs of texts, which can be used for text reranking and semantic search. |
|
|
|
|
|
## Model Details |
|
|
|
|
|
### Model Description |
|
|
- **Model Type:** Cross Encoder |
|
|
- **Base model:** [jhu-clsp/ettin-encoder-150m](https://huggingface.co/jhu-clsp/ettin-encoder-150m) <!-- at revision 45d08642849e5c5701b162671ac811b7654bfd9f --> |
|
|
- **Maximum Sequence Length:** 512 tokens |
|
|
- **Number of Output Labels:** 1 label |
|
|
- **Training Dataset:** |
|
|
- [msmarco](https://huggingface.co/datasets/sentence-transformers/msmarco) |
|
|
- **Language:** en |
|
|
<!-- - **License:** Unknown --> |
|
|
|
|
|
### Model Sources |
|
|
|
|
|
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net) |
|
|
- **Documentation:** [Cross Encoder Documentation](https://www.sbert.net/docs/cross_encoder/usage/usage.html) |
|
|
- **Repository:** [Sentence Transformers on GitHub](https://github.com/huggingface/sentence-transformers) |
|
|
- **Hugging Face:** [Cross Encoders on Hugging Face](https://huggingface.co/models?library=sentence-transformers&other=cross-encoder) |
|
|
|
|
|
## Usage |
|
|
|
|
|
### Direct Usage (Sentence Transformers) |
|
|
|
|
|
First install the Sentence Transformers library: |
|
|
|
|
|
```bash |
|
|
pip install -U sentence-transformers |
|
|
``` |
|
|
|
|
|
Then you can load this model and run inference. |
|
|
```python |
|
|
from sentence_transformers import CrossEncoder |
|
|
|
|
|
# Download from the 🤗 Hub |
|
|
model = CrossEncoder("tomaarsen/ms-marco-ettin-150m-reranker") |
|
|
# Get scores for pairs of texts |
|
|
pairs = [ |
|
|
['which constitutional amendment required that u.s. senators be directly elected by the people instead of being chosen by state legislatures?', 'Full text of the Constitution and Amendments. The Seventeenth Amendment (Amendment XVII) to the United States Constitution established the popular election of United States Senators by the people of the states. The amendment supersedes Article I, §3, Clauses 1 and 2 of the Constitution, under which senators were elected by state legislatures. It also alters the procedure for filling vacancies in the Senate, allowing for state legislatures to permit their governors to make temporary appointments until a special election can be held.'], |
|
|
['where is the tigris river?', 'The Tigris is one of the two main rivers of Mesopotamia (modern Iraq). The Tigris is the river to the east (towards Persia [modern Iran]); the Euphrates, to the west. The Tigris runs from Lake Hazar, in the Taurus Mountains, in Turkey, joins the Euphrates, and flows into the Persian Gulf. The Encyclopedia Britannica says the Tigris is 1,180 miles (1,900 km) in length.'], |
|
|
['what tectonic plate is japan on', 'Japan lies over 4 tectonic plates. These are: The North American Plate The Eurasian Plate The Philippine Sea Plate The Pacific Plate.+ 5 others found this useful.Nathaniel Preece.apan lies on the North American, Eurasian and Pacific plates, and is close to the Phillipine plate. Tokyo and Sendai, for example are on the North American plate.'], |
|
|
['how many seasons of portlandia are there', 'We monitor the news to keep you updated on the release date of Portlandia season 7. To the delight of the fans, IFC has officially renewed the series. The release date has not been scheduled yet. If you want to get automatically notified of the show’s premiere date, please, sign up for updates below.'],
|
|
['how to delete messenger messages from iphone', 'Tap the Facebook Messenger application and select Messages tab. 2. Locate the message or messages that you wish to delete. Press and hold on the message until a list of options displays, including the option to delete or copy the message. 3. Choose the Delete tab, not the Archive tab. If you wish to permanently delete Facebook messages, tap the Delete option.'],
|
|
] |
|
|
scores = model.predict(pairs) |
|
|
print(scores.shape) |
|
|
# (5,) |
|
|
|
|
|
# Or rank different texts based on similarity to a single text |
|
|
ranks = model.rank( |
|
|
'which constitutional amendment required that u.s. senators be directly elected by the people instead of being chosen by state legislatures?', |
|
|
[ |
|
|
'Full text of the Constitution and Amendments. The Seventeenth Amendment (Amendment XVII) to the United States Constitution established the popular election of United States Senators by the people of the states. The amendment supersedes Article I, §3, Clauses 1 and 2 of the Constitution, under which senators were elected by state legislatures. It also alters the procedure for filling vacancies in the Senate, allowing for state legislatures to permit their governors to make temporary appointments until a special election can be held.', |
|
|
'The Tigris is one of the two main rivers of Mesopotamia (modern Iraq). The Tigris is the river to the east (towards Persia [modern Iran]); the Euphrates, to the west. The Tigris runs from Lake Hazar, in the Taurus Mountains, in Turkey, joins the Euphrates, and flows into the Persian Gulf. The Encyclopedia Britannica says the Tigris is 1,180 miles (1,900 km) in length.', |
|
|
'Japan lies over 4 tectonic plates. These are: The North American Plate The Eurasian Plate The Philippine Sea Plate The Pacific Plate.+ 5 others found this useful.Nathaniel Preece.apan lies on the North American, Eurasian and Pacific plates, and is close to the Phillipine plate. Tokyo and Sendai, for example are on the North American plate.', |
|
|
'We monitor the news to keep you updated on the release date of Portlandia season 7. To the delight of the fans, IFC has officially renewed the series. The release date has not been scheduled yet. If you want to get automatically notified of the show’s premiere date, please, sign up for updates below.',
|
|
'Tap the Facebook Messenger application and select Messages tab. 2. Locate the message or messages that you wish to delete. Press and hold on the message until a list of options displays, including the option to delete or copy the message. 3. Choose the Delete tab, not the Archive tab. If you wish to permanently delete Facebook messages, tap the Delete option.',
|
|
] |
|
|
) |
|
|
# [{'corpus_id': ..., 'score': ...}, {'corpus_id': ..., 'score': ...}, ...] |
|
|
``` |
|
|
|
|
|
<!-- |
|
|
### Direct Usage (Transformers) |
|
|
|
|
|
<details><summary>Click to see the direct usage in Transformers</summary> |
|
|
|
|
|
</details> |
|
|
--> |
|
|
|
|
|
<!-- |
|
|
### Downstream Usage (Sentence Transformers) |
|
|
|
|
|
You can finetune this model on your own dataset. |
|
|
|
|
|
<details><summary>Click to expand</summary> |
|
|
|
|
|
</details> |
|
|
--> |
|
|
|
|
|
<!-- |
|
|
### Out-of-Scope Use |
|
|
|
|
|
*List how the model may foreseeably be misused and address what users ought not to do with the model.* |
|
|
--> |
|
|
|
|
|
## Evaluation |
|
|
|
|
|
### Metrics |
|
|
|
|
|
#### Cross Encoder Reranking |
|
|
|
|
|
* Datasets: `NanoMSMARCO_R100`, `NanoNFCorpus_R100` and `NanoNQ_R100` |
|
|
* Evaluated with [<code>CrossEncoderRerankingEvaluator</code>](https://sbert.net/docs/package_reference/cross_encoder/evaluation.html#sentence_transformers.cross_encoder.evaluation.CrossEncoderRerankingEvaluator) with these parameters: |
|
|
```json |
|
|
{ |
|
|
"at_k": 10, |
|
|
"always_rerank_positives": true |
|
|
} |
|
|
``` |
|
|
|
|
|
| Metric | NanoMSMARCO_R100 | NanoNFCorpus_R100 | NanoNQ_R100 | |
|
|
|:------------|:---------------------|:---------------------|:---------------------| |
|
|
| map | 0.6522 (+0.1627) | 0.3763 (+0.1153) | 0.7584 (+0.3388) | |
|
|
| mrr@10 | 0.6477 (+0.1702) | 0.6256 (+0.1258) | 0.7881 (+0.3614) | |
|
|
| **ndcg@10** | **0.7180 (+0.1776)** | **0.4451 (+0.1201)** | **0.8011 (+0.3005)** | |
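The same evaluator can also be instantiated directly on a custom candidate pool. The sketch below is illustrative only: the sample texts are reused from the Usage section, and the sample dictionary keys (`query`, `positive`, `documents`) reflect my reading of the evaluator's documentation, so please check them against the installed sentence-transformers version.

```python
from sentence_transformers import CrossEncoder
from sentence_transformers.cross_encoder.evaluation import CrossEncoderRerankingEvaluator

model = CrossEncoder("tomaarsen/ms-marco-ettin-150m-reranker")

# One reranking sample: a query, its known-relevant passage(s), and the candidate pool to rerank
samples = [
    {
        "query": "where is the tigris river?",
        "positive": ["The Tigris is one of the two main rivers of Mesopotamia (modern Iraq)."],
        "documents": [
            "The Tigris is one of the two main rivers of Mesopotamia (modern Iraq).",
            "Japan lies over 4 tectonic plates.",
            "We monitor the news to keep you updated on the release date of Portlandia season 7.",
        ],
    },
]

evaluator = CrossEncoderRerankingEvaluator(
    samples=samples,
    at_k=10,
    always_rerank_positives=True,  # same settings as reported above
    name="example-rerank",
)
print(evaluator(model))  # map, mrr@10, ndcg@10 for this toy set
```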
|
|
|
|
|
#### Cross Encoder Nano BEIR |
|
|
|
|
|
* Dataset: `NanoBEIR_R100_mean` |
|
|
* Evaluated with [<code>CrossEncoderNanoBEIREvaluator</code>](https://sbert.net/docs/package_reference/cross_encoder/evaluation.html#sentence_transformers.cross_encoder.evaluation.CrossEncoderNanoBEIREvaluator) with these parameters: |
|
|
```json |
|
|
{ |
|
|
"dataset_names": [ |
|
|
"msmarco", |
|
|
"nfcorpus", |
|
|
"nq" |
|
|
], |
|
|
"rerank_k": 100, |
|
|
"at_k": 10, |
|
|
"always_rerank_positives": true |
|
|
} |
|
|
``` |
|
|
|
|
|
| Metric | Value | |
|
|
|:------------|:---------------------| |
|
|
| map | 0.5957 (+0.2056) | |
|
|
| mrr@10 | 0.6871 (+0.2191) | |
|
|
| **ndcg@10** | **0.6548 (+0.1994)** | |
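These aggregate scores come from the `CrossEncoderNanoBEIREvaluator` configuration shown above. As a minimal sketch of how to reproduce them, assuming the published checkpoint `tomaarsen/ms-marco-ettin-150m-reranker` and a sentence-transformers version that ships this evaluator (exact numbers may vary slightly across versions):

```python
from sentence_transformers import CrossEncoder
from sentence_transformers.cross_encoder.evaluation import CrossEncoderNanoBEIREvaluator

# Load the published reranker (same checkpoint as in the Usage section)
model = CrossEncoder("tomaarsen/ms-marco-ettin-150m-reranker")

# Same configuration as reported above: rerank the top 100 candidates, report metrics at k=10
evaluator = CrossEncoderNanoBEIREvaluator(
    dataset_names=["msmarco", "nfcorpus", "nq"],
    rerank_k=100,
    at_k=10,
    always_rerank_positives=True,
)

results = evaluator(model)
print(results)  # per-dataset and mean map / mrr@10 / ndcg@10
```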
|
|
|
|
|
<!-- |
|
|
## Bias, Risks and Limitations |
|
|
|
|
|
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* |
|
|
--> |
|
|
|
|
|
<!-- |
|
|
### Recommendations |
|
|
|
|
|
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* |
|
|
--> |
|
|
|
|
|
## Training Details |
|
|
|
|
|
### Training Dataset |
|
|
|
|
|
#### msmarco |
|
|
|
|
|
* Dataset: [msmarco](https://huggingface.co/datasets/sentence-transformers/msmarco) at [9e329ed](https://huggingface.co/datasets/sentence-transformers/msmarco/tree/9e329ed2e649c9d37b0d91dd6b764ff6fe671d83) |
|
|
* Size: 39,770,704 training samples |
|
|
* Columns: <code>query_id</code>, <code>positive_id</code>, <code>negative_id</code>, and <code>score</code> |
|
|
* Approximate statistics based on the first 1000 samples: |
|
|
| | query_id | positive_id | negative_id | score | |
|
|
|:--------|:------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|:------------------------------------------------------------------| |
|
|
| type | string | string | string | float | |
|
|
| details | <ul><li>min: 10 characters</li><li>mean: 34.72 characters</li><li>max: 108 characters</li></ul> | <ul><li>min: 71 characters</li><li>mean: 351.58 characters</li><li>max: 919 characters</li></ul> | <ul><li>min: 81 characters</li><li>mean: 344.41 characters</li><li>max: 992 characters</li></ul> | <ul><li>min: -1.0</li><li>mean: 13.5</li><li>max: 22.59</li></ul> | |
|
|
* Samples: |
|
|
| query_id | positive_id | negative_id | score | |
|
|
|:------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------| |
|
|
| <code>where is jade city bc</code> | <code>Jade City, British Columbia. From Wikipedia, the free encyclopedia. Jade City is a spot on the road in northwestern British Columbia, Canada, near the Yukon, located on Highway 37, west of Good Hope Lake and close to Cassiar, in the Cassiar Highlands.The region around Jade City is rich with serpentinite (a jade precursor), greenstone (jade look-a-likes), and jade.ade City is by road about 24 hours north of Vancouver, and 1 hour south of the Yukon border. As of 2015, it has a population of about 30 people. Jade City is a very small town.</code> | <code>China. Few gems have the mystique of jade, a stone that has been revered in China for more than 4000 years. Jade is also one of the most misunderstood of gems -- there is widespread confusion about the types of jade, about the most valuable colors, and the standards used to grade it.</code> | <code>15.705843766530355</code> | |
|
|
| <code>is asparagus good for your kidneys</code> | <code>Is Asparagus Good For Your Kidneys? Those who want to keep their kidneys functioning properly will definitely want to include certain foods in their diet, including asparagus. This particular vegetable is able to sooth the urinary system as well as increase urine production and support kidney function overall. Some of the different properties that asparagus has which makes it good for the kidneys includes asparagin, glycosides, glycolic acid, tyrosin, vitamin A, B, C, E and folic acid.</code> | <code>Last year, I planted 20 asparagus crowns and only had 1 fern over the entire year. This year, I read a lot of articles on planting asparagus and decided to follow a recommendation to soak the crowns in luke warm water for 30 minutes before planting.reat article! Last year, I planted 20 asparagus crowns and only had 1 fern over the entire year. This year, I read a lot of articles on planting asparagus and decided to follow a recommendation to soak the crowns in luke warm water for 30 minutes before planting. Success! I now have ferns along the entire row!</code> | <code>21.735484917958576</code> | |
|
|
| <code>what does a urine culture tell you</code> | <code>A urine culture is a test to find germs (such as bacteria) in the urine that can cause an infection. Urine in the bladder is normally sterile. This means it does not contain any bacteria or other organisms (such as fungi). But bacteria can enter the urethra and cause a urinary tract infection (UTI).</code> | <code>What Is The Procedure To Conduct A Urine Culture Test? Urine culture refers to a urine test that is conducted to find bacteria that could cause a urinary tract infection. Bladder urine is supposed to be sterile and devoid of any bacteria or fungi, but sometimes bacteria enters the urine through the urethra, causing a urinary infection.</code> | <code>1.1982590357462568</code> | |
|
|
* Loss: [<code>MarginMSELoss</code>](https://sbert.net/docs/package_reference/cross_encoder/losses.html#marginmseloss) with these parameters: |
|
|
```json |
|
|
{ |
|
|
"activation_fn": "torch.nn.modules.linear.Identity" |
|
|
} |
|
|
``` |
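For reference, here is a minimal sketch of how a loss with this configuration can be constructed and how the `score` column is used. It is illustrative only, not the exact training script; the base checkpoint and the Identity activation are taken from the details above.

```python
import torch
from sentence_transformers import CrossEncoder
from sentence_transformers.cross_encoder.losses import MarginMSELoss

# Single-logit cross-encoder initialized from the base model
model = CrossEncoder("jhu-clsp/ettin-encoder-150m", num_labels=1)

# Identity activation, as in the parameters above: the raw logit is regressed
# directly against the teacher margin rather than being squashed first
loss = MarginMSELoss(model, activation_fn=torch.nn.Identity())

# Each training sample is (query, positive passage, negative passage) plus a float
# "score" label: the teacher's score margin between the positive and the negative.
# MarginMSELoss trains the model so that
#   model(query, positive) - model(query, negative) ≈ score
```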
|
|
|
|
|
### Evaluation Dataset |
|
|
|
|
|
#### msmarco |
|
|
|
|
|
* Dataset: [msmarco](https://huggingface.co/datasets/sentence-transformers/msmarco) at [9e329ed](https://huggingface.co/datasets/sentence-transformers/msmarco/tree/9e329ed2e649c9d37b0d91dd6b764ff6fe671d83) |
|
|
* Size: 10,000 evaluation samples |
|
|
* Columns: <code>query_id</code>, <code>positive_id</code>, <code>negative_id</code>, and <code>score</code> |
|
|
* Approximate statistics based on the first 1000 samples: |
|
|
| | query_id | positive_id | negative_id | score | |
|
|
|:--------|:-----------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------| |
|
|
| type | string | string | string | float | |
|
|
| details | <ul><li>min: 8 characters</li><li>mean: 34.72 characters</li><li>max: 169 characters</li></ul> | <ul><li>min: 65 characters</li><li>mean: 356.17 characters</li><li>max: 968 characters</li></ul> | <ul><li>min: 36 characters</li><li>mean: 340.3 characters</li><li>max: 982 characters</li></ul> | <ul><li>min: -2.7</li><li>mean: 13.48</li><li>max: 22.44</li></ul> | |
|
|
* Samples: |
|
|
| query_id | positive_id | negative_id | score | |
|
|
|:---------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------| |
|
|
| <code>which constitutional amendment required that u.s. senators be directly elected by the people instead of being chosen by state legislatures?</code> | <code>Full text of the Constitution and Amendments. The Seventeenth Amendment (Amendment XVII) to the United States Constitution established the popular election of United States Senators by the people of the states. The amendment supersedes Article I, §3, Clauses 1 and 2 of the Constitution, under which senators were elected by state legislatures. It also alters the procedure for filling vacancies in the Senate, allowing for state legislatures to permit their governors to make temporary appointments until a special election can be held.</code> | <code>Explanation: The original text of the Constitution called for the election of a state s senators to be dome by the state s legislature. This was changed in the 17th amendment that called for Senators to be elected directly by the people of the states.</code> | <code>-0.32251540819803814</code> | |
|
|
| <code>where is the tigris river?</code> | <code>The Tigris is one of the two main rivers of Mesopotamia (modern Iraq). The Tigris is the river to the east (towards Persia [modern Iran]); the Euphrates, to the west. The Tigris runs from Lake Hazar, in the Taurus Mountains, in Turkey, joins the Euphrates, and flows into the Persian Gulf. The Encyclopedia Britannica says the Tigris is 1,180 miles (1,900 km) in length.</code> | <code>The oldest known civilization in South America, as well as in the Western Hemisphere as a whole, the Norte Chico civilization-c. 3200 BC - 1800 BC-comprised several interconnected settlements on the Peruvian coast, including the urban centers at Aspero and Caral.he civilizations that emerged around these rivers are among the earliest known non-nomadic agrarian societies. Because Ubaid, Sumer, Akkad, Assyria and Babylon civilizations all emerged around the Tigris-Euphrates, the theory that Mesopotamia is the cradle of civilization is widely accepted.</code> | <code>12.891789237658184</code> | |
|
|
| <code>what tectonic plate is japan on</code> | <code>Japan lies over 4 tectonic plates. These are: The North American Plate The Eurasian Plate The Philippine Sea Plate The Pacific Plate.+ 5 others found this useful.Nathaniel Preece.apan lies on the North American, Eurasian and Pacific plates, and is close to the Phillipine plate. Tokyo and Sendai, for example are on the North American plate.</code> | <code>History[edit] In the past two decades the steel plate shear wall (SPSW), also known as the steel plate wall (SPW), has been used in a number of buildings in Japan and North America as part of the lateral force resisting system.</code> | <code>12.691253264745075</code> | |
|
|
* Loss: [<code>MarginMSELoss</code>](https://sbert.net/docs/package_reference/cross_encoder/losses.html#marginmseloss) with these parameters: |
|
|
```json |
|
|
{ |
|
|
"activation_fn": "torch.nn.modules.linear.Identity" |
|
|
} |
|
|
``` |
|
|
|
|
|
### Training Hyperparameters |
|
|
#### Non-Default Hyperparameters |
|
|
|
|
|
- `eval_strategy`: steps |
|
|
- `per_device_train_batch_size`: 128 |
|
|
- `per_device_eval_batch_size`: 128 |
|
|
- `learning_rate`: 2e-05 |
|
|
- `num_train_epochs`: 1 |
|
|
- `warmup_ratio`: 0.1 |
|
|
- `seed`: 12 |
|
|
- `bf16`: True |
|
|
- `load_best_model_at_end`: True |
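
Put together, a run with these settings could look roughly like the sketch below. It assumes the documented `CrossEncoderTrainer` API and a bf16-capable GPU, uses a tiny stand-in dataset where the real run used the 39.7M-sample msmarco split described above, and the `output_dir` is a placeholder.

```python
from datasets import Dataset
from sentence_transformers import CrossEncoder
from sentence_transformers.cross_encoder import CrossEncoderTrainer, CrossEncoderTrainingArguments
from sentence_transformers.cross_encoder.losses import MarginMSELoss

# Tiny stand-in dataset in the (query, positive, negative, score) layout described above
train_dataset = Dataset.from_dict({
    "query": ["where is the tigris river?"],
    "positive": ["The Tigris is one of the two main rivers of Mesopotamia (modern Iraq)."],
    "negative": ["The Norte Chico civilization comprised settlements on the Peruvian coast."],
    "score": [12.89],
})
eval_dataset = train_dataset  # placeholder; the real run used a 10,000-sample held-out split

model = CrossEncoder("jhu-clsp/ettin-encoder-150m", num_labels=1)
loss = MarginMSELoss(model)

# Mirrors the non-default hyperparameters listed above; output_dir is a placeholder
args = CrossEncoderTrainingArguments(
    output_dir="models/ettin-150m-reranker",
    eval_strategy="steps",
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=2e-5,
    num_train_epochs=1,
    warmup_ratio=0.1,
    seed=12,
    bf16=True,
    load_best_model_at_end=True,
)

trainer = CrossEncoderTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()
```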
|
|
|
|
|
#### All Hyperparameters |
|
|
<details><summary>Click to expand</summary> |
|
|
|
|
|
- `overwrite_output_dir`: False |
|
|
- `do_predict`: False |
|
|
- `eval_strategy`: steps |
|
|
- `prediction_loss_only`: True |
|
|
- `per_device_train_batch_size`: 128 |
|
|
- `per_device_eval_batch_size`: 128 |
|
|
- `per_gpu_train_batch_size`: None |
|
|
- `per_gpu_eval_batch_size`: None |
|
|
- `gradient_accumulation_steps`: 1 |
|
|
- `eval_accumulation_steps`: None |
|
|
- `torch_empty_cache_steps`: None |
|
|
- `learning_rate`: 2e-05 |
|
|
- `weight_decay`: 0.0 |
|
|
- `adam_beta1`: 0.9 |
|
|
- `adam_beta2`: 0.999 |
|
|
- `adam_epsilon`: 1e-08 |
|
|
- `max_grad_norm`: 1.0 |
|
|
- `num_train_epochs`: 1 |
|
|
- `max_steps`: -1 |
|
|
- `lr_scheduler_type`: linear |
|
|
- `lr_scheduler_kwargs`: {} |
|
|
- `warmup_ratio`: 0.1 |
|
|
- `warmup_steps`: 0 |
|
|
- `log_level`: passive |
|
|
- `log_level_replica`: warning |
|
|
- `log_on_each_node`: True |
|
|
- `logging_nan_inf_filter`: True |
|
|
- `save_safetensors`: True |
|
|
- `save_on_each_node`: False |
|
|
- `save_only_model`: False |
|
|
- `restore_callback_states_from_checkpoint`: False |
|
|
- `no_cuda`: False |
|
|
- `use_cpu`: False |
|
|
- `use_mps_device`: False |
|
|
- `seed`: 12 |
|
|
- `data_seed`: None |
|
|
- `jit_mode_eval`: False |
|
|
- `bf16`: True |
|
|
- `fp16`: False |
|
|
- `fp16_opt_level`: O1 |
|
|
- `half_precision_backend`: auto |
|
|
- `bf16_full_eval`: False |
|
|
- `fp16_full_eval`: False |
|
|
- `tf32`: None |
|
|
- `local_rank`: 0 |
|
|
- `ddp_backend`: None |
|
|
- `tpu_num_cores`: None |
|
|
- `tpu_metrics_debug`: False |
|
|
- `debug`: [] |
|
|
- `dataloader_drop_last`: True |
|
|
- `dataloader_num_workers`: 0 |
|
|
- `dataloader_prefetch_factor`: None |
|
|
- `past_index`: -1 |
|
|
- `disable_tqdm`: False |
|
|
- `remove_unused_columns`: True |
|
|
- `label_names`: None |
|
|
- `load_best_model_at_end`: True |
|
|
- `ignore_data_skip`: False |
|
|
- `fsdp`: [] |
|
|
- `fsdp_min_num_params`: 0 |
|
|
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} |
|
|
- `fsdp_transformer_layer_cls_to_wrap`: None |
|
|
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} |
|
|
- `parallelism_config`: None |
|
|
- `deepspeed`: None |
|
|
- `label_smoothing_factor`: 0.0 |
|
|
- `optim`: adamw_torch_fused |
|
|
- `optim_args`: None |
|
|
- `adafactor`: False |
|
|
- `group_by_length`: False |
|
|
- `length_column_name`: length |
|
|
- `project`: huggingface |
|
|
- `trackio_space_id`: trackio |
|
|
- `ddp_find_unused_parameters`: None |
|
|
- `ddp_bucket_cap_mb`: None |
|
|
- `ddp_broadcast_buffers`: False |
|
|
- `dataloader_pin_memory`: True |
|
|
- `dataloader_persistent_workers`: False |
|
|
- `skip_memory_metrics`: True |
|
|
- `use_legacy_prediction_loop`: False |
|
|
- `push_to_hub`: False |
|
|
- `resume_from_checkpoint`: None |
|
|
- `hub_model_id`: None |
|
|
- `hub_strategy`: every_save |
|
|
- `hub_private_repo`: None |
|
|
- `hub_always_push`: False |
|
|
- `hub_revision`: None |
|
|
- `gradient_checkpointing`: False |
|
|
- `gradient_checkpointing_kwargs`: None |
|
|
- `include_inputs_for_metrics`: False |
|
|
- `include_for_metrics`: [] |
|
|
- `eval_do_concat_batches`: True |
|
|
- `fp16_backend`: auto |
|
|
- `push_to_hub_model_id`: None |
|
|
- `push_to_hub_organization`: None |
|
|
- `mp_parameters`: |
|
|
- `auto_find_batch_size`: False |
|
|
- `full_determinism`: False |
|
|
- `torchdynamo`: None |
|
|
- `ray_scope`: last |
|
|
- `ddp_timeout`: 1800 |
|
|
- `torch_compile`: False |
|
|
- `torch_compile_backend`: None |
|
|
- `torch_compile_mode`: None |
|
|
- `include_tokens_per_second`: False |
|
|
- `include_num_input_tokens_seen`: no |
|
|
- `neftune_noise_alpha`: None |
|
|
- `optim_target_modules`: None |
|
|
- `batch_eval_metrics`: False |
|
|
- `eval_on_start`: False |
|
|
- `use_liger_kernel`: False |
|
|
- `liger_kernel_config`: None |
|
|
- `eval_use_gather_object`: False |
|
|
- `average_tokens_across_devices`: True |
|
|
- `prompts`: None |
|
|
- `batch_sampler`: batch_sampler |
|
|
- `multi_dataset_batch_sampler`: proportional |
|
|
- `router_mapping`: {} |
|
|
- `learning_rate_mapping`: {} |
|
|
|
|
|
</details> |
|
|
|
|
|
### Training Logs |
|
|
<details><summary>Click to expand</summary> |
|
|
|
|
|
| Epoch | Step | Training Loss | Validation Loss | NanoMSMARCO_R100_ndcg@10 | NanoNFCorpus_R100_ndcg@10 | NanoNQ_R100_ndcg@10 | NanoBEIR_R100_mean_ndcg@10 | |
|
|
|:----------:|:--------:|:-------------:|:---------------:|:------------------------:|:-------------------------:|:--------------------:|:--------------------------:| |
|
|
| -1 | -1 | - | - | 0.0321 (-0.5083) | 0.1979 (-0.1271) | 0.0227 (-0.4780) | 0.0842 (-0.3711) | |
|
|
| 0.0000 | 1 | 207.4745 | - | - | - | - | - | |
|
|
| 0.0020 | 78 | 208.2147 | - | - | - | - | - | |
|
|
| 0.0040 | 156 | 206.6548 | - | - | - | - | - | |
|
|
| 0.0060 | 234 | 200.7388 | - | - | - | - | - | |
|
|
| 0.0080 | 312 | 182.4953 | - | - | - | - | - | |
|
|
| 0.0100 | 389 | - | 75.3840 | 0.3932 (-0.1472) | 0.2493 (-0.0758) | 0.4066 (-0.0940) | 0.3497 (-0.1057) | |
|
|
| 0.0100 | 390 | 130.2694 | - | - | - | - | - | |
|
|
| 0.0121 | 468 | 43.6472 | - | - | - | - | - | |
|
|
| 0.0141 | 546 | 24.7341 | - | - | - | - | - | |
|
|
| 0.0161 | 624 | 18.6241 | - | - | - | - | - | |
|
|
| 0.0181 | 702 | 15.3865 | - | - | - | - | - | |
|
|
| 0.0200 | 778 | - | 12.5421 | 0.6049 (+0.0645) | 0.3811 (+0.0561) | 0.6413 (+0.1406) | 0.5424 (+0.0871) | |
|
|
| 0.0201 | 780 | 13.24 | - | - | - | - | - | |
|
|
| 0.0221 | 858 | 11.5539 | - | - | - | - | - | |
|
|
| 0.0241 | 936 | 10.4097 | - | - | - | - | - | |
|
|
| 0.0261 | 1014 | 9.5802 | - | - | - | - | - | |
|
|
| 0.0281 | 1092 | 8.8313 | - | - | - | - | - | |
|
|
| 0.0300 | 1167 | - | 8.2054 | 0.6507 (+0.1103) | 0.3997 (+0.0746) | 0.7045 (+0.2039) | 0.5850 (+0.1296) | |
|
|
| 0.0301 | 1170 | 8.3729 | - | - | - | - | - | |
|
|
| 0.0321 | 1248 | 7.8437 | - | - | - | - | - | |
|
|
| 0.0341 | 1326 | 7.4383 | - | - | - | - | - | |
|
|
| 0.0362 | 1404 | 6.9899 | - | - | - | - | - | |
|
|
| 0.0382 | 1482 | 6.7353 | - | - | - | - | - | |
|
|
| 0.0401 | 1556 | - | 6.4089 | 0.6421 (+0.1016) | 0.4112 (+0.0861) | 0.7289 (+0.2283) | 0.5941 (+0.1387) | |
|
|
| 0.0402 | 1560 | 6.4088 | - | - | - | - | - | |
|
|
| 0.0422 | 1638 | 6.1957 | - | - | - | - | - | |
|
|
| 0.0442 | 1716 | 6.0008 | - | - | - | - | - | |
|
|
| 0.0462 | 1794 | 5.7517 | - | - | - | - | - | |
|
|
| 0.0482 | 1872 | 5.5172 | - | - | - | - | - | |
|
|
| 0.0501 | 1945 | - | 5.5248 | 0.6704 (+0.1300) | 0.4163 (+0.0913) | 0.7344 (+0.2337) | 0.6070 (+0.1517) | |
|
|
| 0.0502 | 1950 | 5.38 | - | - | - | - | - | |
|
|
| 0.0522 | 2028 | 5.3785 | - | - | - | - | - | |
|
|
| 0.0542 | 2106 | 5.1515 | - | - | - | - | - | |
|
|
| 0.0562 | 2184 | 5.0263 | - | - | - | - | - | |
|
|
| 0.0582 | 2262 | 4.9028 | - | - | - | - | - | |
|
|
| 0.0601 | 2334 | - | 4.8510 | 0.6637 (+0.1233) | 0.4187 (+0.0936) | 0.7418 (+0.2412) | 0.6081 (+0.1527) | |
|
|
| 0.0603 | 2340 | 4.8283 | - | - | - | - | - | |
|
|
| 0.0623 | 2418 | 4.6962 | - | - | - | - | - | |
|
|
| 0.0643 | 2496 | 4.6482 | - | - | - | - | - | |
|
|
| 0.0663 | 2574 | 4.5224 | - | - | - | - | - | |
|
|
| 0.0683 | 2652 | 4.4137 | - | - | - | - | - | |
|
|
| 0.0701 | 2723 | - | 4.4682 | 0.6785 (+0.1381) | 0.4142 (+0.0892) | 0.7679 (+0.2672) | 0.6202 (+0.1648) | |
|
|
| 0.0703 | 2730 | 4.3639 | - | - | - | - | - | |
|
|
| 0.0723 | 2808 | 4.2902 | - | - | - | - | - | |
|
|
| 0.0743 | 2886 | 4.2308 | - | - | - | - | - | |
|
|
| 0.0763 | 2964 | 4.1607 | - | - | - | - | - | |
|
|
| 0.0783 | 3042 | 4.0809 | - | - | - | - | - | |
|
|
| 0.0801 | 3112 | - | 3.9937 | 0.6766 (+0.1362) | 0.4238 (+0.0988) | 0.7817 (+0.2810) | 0.6274 (+0.1720) | |
|
|
| 0.0803 | 3120 | 4.0516 | - | - | - | - | - | |
|
|
| 0.0823 | 3198 | 3.9772 | - | - | - | - | - | |
|
|
| 0.0844 | 3276 | 3.9662 | - | - | - | - | - | |
|
|
| 0.0864 | 3354 | 3.8161 | - | - | - | - | - | |
|
|
| 0.0884 | 3432 | 3.8557 | - | - | - | - | - | |
|
|
| 0.0901 | 3501 | - | 3.9840 | 0.6680 (+0.1276) | 0.4198 (+0.0947) | 0.7710 (+0.2703) | 0.6196 (+0.1642) | |
|
|
| 0.0904 | 3510 | 3.7875 | - | - | - | - | - | |
|
|
| 0.0924 | 3588 | 3.7164 | - | - | - | - | - | |
|
|
| 0.0944 | 3666 | 3.6808 | - | - | - | - | - | |
|
|
| 0.0964 | 3744 | 3.6347 | - | - | - | - | - | |
|
|
| 0.0984 | 3822 | 3.5847 | - | - | - | - | - | |
|
|
| 0.1002 | 3890 | - | 3.6954 | 0.6761 (+0.1357) | 0.4364 (+0.1114) | 0.7863 (+0.2856) | 0.6329 (+0.1776) | |
|
|
| 0.1004 | 3900 | 3.5902 | - | - | - | - | - | |
|
|
| 0.1024 | 3978 | 3.509 | - | - | - | - | - | |
|
|
| 0.1044 | 4056 | 3.5064 | - | - | - | - | - | |
|
|
| 0.1064 | 4134 | 3.4219 | - | - | - | - | - | |
|
|
| 0.1085 | 4212 | 3.366 | - | - | - | - | - | |
|
|
| 0.1102 | 4279 | - | 3.3804 | 0.6863 (+0.1459) | 0.4224 (+0.0973) | 0.7947 (+0.2941) | 0.6345 (+0.1791) | |
|
|
| 0.1105 | 4290 | 3.3347 | - | - | - | - | - | |
|
|
| 0.1125 | 4368 | 3.3026 | - | - | - | - | - | |
|
|
| 0.1145 | 4446 | 3.2498 | - | - | - | - | - | |
|
|
| 0.1165 | 4524 | 3.255 | - | - | - | - | - | |
|
|
| 0.1185 | 4602 | 3.1982 | - | - | - | - | - | |
|
|
| 0.1202 | 4668 | - | 3.2256 | 0.7017 (+0.1613) | 0.4293 (+0.1043) | 0.7822 (+0.2816) | 0.6377 (+0.1824) | |
|
|
| 0.1205 | 4680 | 3.1525 | - | - | - | - | - | |
|
|
| 0.1225 | 4758 | 3.1405 | - | - | - | - | - | |
|
|
| 0.1245 | 4836 | 3.0912 | - | - | - | - | - | |
|
|
| 0.1265 | 4914 | 3.0559 | - | - | - | - | - | |
|
|
| 0.1285 | 4992 | 3.0431 | - | - | - | - | - | |
|
|
| 0.1302 | 5057 | - | 3.0894 | 0.6808 (+0.1404) | 0.4356 (+0.1106) | 0.7973 (+0.2967) | 0.6379 (+0.1825) | |
|
|
| 0.1305 | 5070 | 3.0138 | - | - | - | - | - | |
|
|
| 0.1326 | 5148 | 3.0115 | - | - | - | - | - | |
|
|
| 0.1346 | 5226 | 2.9885 | - | - | - | - | - | |
|
|
| 0.1366 | 5304 | 2.933 | - | - | - | - | - | |
|
|
| 0.1386 | 5382 | 2.885 | - | - | - | - | - | |
|
|
| 0.1402 | 5446 | - | 2.9048 | 0.7014 (+0.1610) | 0.4382 (+0.1131) | 0.7868 (+0.2861) | 0.6421 (+0.1867) | |
|
|
| 0.1406 | 5460 | 2.8851 | - | - | - | - | - | |
|
|
| 0.1426 | 5538 | 2.9002 | - | - | - | - | - | |
|
|
| 0.1446 | 5616 | 2.8765 | - | - | - | - | - | |
|
|
| 0.1466 | 5694 | 2.8202 | - | - | - | - | - | |
|
|
| 0.1486 | 5772 | 2.847 | - | - | - | - | - | |
|
|
| 0.1502 | 5835 | - | 2.9050 | 0.7169 (+0.1764) | 0.4233 (+0.0983) | 0.7844 (+0.2838) | 0.6415 (+0.1862) | |
|
|
| 0.1506 | 5850 | 2.8285 | - | - | - | - | - | |
|
|
| 0.1526 | 5928 | 2.7882 | - | - | - | - | - | |
|
|
| 0.1546 | 6006 | 2.7507 | - | - | - | - | - | |
|
|
| 0.1567 | 6084 | 2.7457 | - | - | - | - | - | |
|
|
| 0.1587 | 6162 | 2.7313 | - | - | - | - | - | |
|
|
| **0.1603** | **6224** | **-** | **2.7971** | **0.7180 (+0.1776)** | **0.4451 (+0.1201)** | **0.8011 (+0.3005)** | **0.6548 (+0.1994)** | |
|
|
| 0.1607 | 6240 | 2.7239 | - | - | - | - | - | |
|
|
| 0.1627 | 6318 | 2.6975 | - | - | - | - | - | |
|
|
| 0.1647 | 6396 | 2.6854 | - | - | - | - | - | |
|
|
| 0.1667 | 6474 | 2.6714 | - | - | - | - | - | |
|
|
| 0.1687 | 6552 | 2.6476 | - | - | - | - | - | |
|
|
| 0.1703 | 6613 | - | 2.6787 | 0.7106 (+0.1702) | 0.4325 (+0.1075) | 0.7970 (+0.2964) | 0.6467 (+0.1914) | |
|
|
| 0.1707 | 6630 | 2.6565 | - | - | - | - | - | |
|
|
| 0.1727 | 6708 | 2.5863 | - | - | - | - | - | |
|
|
| 0.1747 | 6786 | 2.6027 | - | - | - | - | - | |
|
|
| 0.1767 | 6864 | 2.606 | - | - | - | - | - | |
|
|
| 0.1787 | 6942 | 2.5634 | - | - | - | - | - | |
|
|
| 0.1803 | 7002 | - | 2.6158 | 0.7166 (+0.1762) | 0.4361 (+0.1111) | 0.7984 (+0.2977) | 0.6504 (+0.1950) | |
|
|
| 0.1808 | 7020 | 2.548 | - | - | - | - | - | |
|
|
| 0.1828 | 7098 | 2.5719 | - | - | - | - | - | |
|
|
| 0.1848 | 7176 | 2.5375 | - | - | - | - | - | |
|
|
| 0.1868 | 7254 | 2.5263 | - | - | - | - | - | |
|
|
| 0.1888 | 7332 | 2.5312 | - | - | - | - | - | |
|
|
| 0.1903 | 7391 | - | 2.5446 | 0.7002 (+0.1598) | 0.4387 (+0.1137) | 0.7865 (+0.2859) | 0.6418 (+0.1864) | |
|
|
| 0.1908 | 7410 | 2.4945 | - | - | - | - | - | |
|
|
| 0.1928 | 7488 | 2.4464 | - | - | - | - | - | |
|
|
| 0.1948 | 7566 | 2.4738 | - | - | - | - | - | |
|
|
| 0.1968 | 7644 | 2.4752 | - | - | - | - | - | |
|
|
| 0.1988 | 7722 | 2.4583 | - | - | - | - | - | |
|
|
| 0.2003 | 7780 | - | 2.4624 | 0.6970 (+0.1565) | 0.4396 (+0.1145) | 0.7976 (+0.2970) | 0.6447 (+0.1893) | |
|
|
| 0.2008 | 7800 | 2.4284 | - | - | - | - | - | |
|
|
| 0.2028 | 7878 | 2.4296 | - | - | - | - | - | |
|
|
| 0.2049 | 7956 | 2.4268 | - | - | - | - | - | |
|
|
| 0.2069 | 8034 | 2.4424 | - | - | - | - | - | |
|
|
| 0.2089 | 8112 | 2.4084 | - | - | - | - | - | |
|
|
| 0.2103 | 8169 | - | 2.4326 | 0.6985 (+0.1580) | 0.4347 (+0.1097) | 0.7968 (+0.2961) | 0.6433 (+0.1879) | |
|
|
| 0.2109 | 8190 | 2.3929 | - | - | - | - | - | |
|
|
| 0.2129 | 8268 | 2.3961 | - | - | - | - | - | |
|
|
| 0.2149 | 8346 | 2.3766 | - | - | - | - | - | |
|
|
| 0.2169 | 8424 | 2.3712 | - | - | - | - | - | |
|
|
| 0.2189 | 8502 | 2.3447 | - | - | - | - | - | |
|
|
| 0.2204 | 8558 | - | 2.3136 | 0.7005 (+0.1600) | 0.4400 (+0.1149) | 0.7844 (+0.2837) | 0.6416 (+0.1862) | |
|
|
| 0.2209 | 8580 | 2.3248 | - | - | - | - | - | |
|
|
| 0.2229 | 8658 | 2.3181 | - | - | - | - | - | |
|
|
| 0.2249 | 8736 | 2.3264 | - | - | - | - | - | |
|
|
| 0.2269 | 8814 | 2.3092 | - | - | - | - | - | |
|
|
| 0.2290 | 8892 | 2.2868 | - | - | - | - | - | |
|
|
| 0.2304 | 8947 | - | 2.3536 | 0.7082 (+0.1678) | 0.4401 (+0.1150) | 0.7904 (+0.2897) | 0.6462 (+0.1908) | |
|
|
| 0.2310 | 8970 | 2.2946 | - | - | - | - | - | |
|
|
| 0.2330 | 9048 | 2.2849 | - | - | - | - | - | |
|
|
| 0.2350 | 9126 | 2.2389 | - | - | - | - | - | |
|
|
| 0.2370 | 9204 | 2.2426 | - | - | - | - | - | |
|
|
| 0.2390 | 9282 | 2.2654 | - | - | - | - | - | |
|
|
| 0.2404 | 9336 | - | 2.2665 | 0.6990 (+0.1586) | 0.4479 (+0.1229) | 0.7904 (+0.2898) | 0.6458 (+0.1904) | |
|
|
| 0.2410 | 9360 | 2.2348 | - | - | - | - | - | |
|
|
| 0.2430 | 9438 | 2.2268 | - | - | - | - | - | |
|
|
| 0.2450 | 9516 | 2.2216 | - | - | - | - | - | |
|
|
| 0.2470 | 9594 | 2.2366 | - | - | - | - | - | |
|
|
| 0.2490 | 9672 | 2.2292 | - | - | - | - | - | |
|
|
| 0.2504 | 9725 | - | 2.2219 | 0.7040 (+0.1636) | 0.4322 (+0.1071) | 0.7859 (+0.2853) | 0.6407 (+0.1853) | |
|
|
| 0.2510 | 9750 | 2.2018 | - | - | - | - | - | |
|
|
| 0.2531 | 9828 | 2.1947 | - | - | - | - | - | |
|
|
| 0.2551 | 9906 | 2.1809 | - | - | - | - | - | |
|
|
| 0.2571 | 9984 | 2.2151 | - | - | - | - | - | |
|
|
| 0.2591 | 10062 | 2.1765 | - | - | - | - | - | |
|
|
| 0.2604 | 10114 | - | 2.1775 | 0.6820 (+0.1415) | 0.4325 (+0.1075) | 0.7861 (+0.2855) | 0.6335 (+0.1782) | |
|
|
| 0.2611 | 10140 | 2.1634 | - | - | - | - | - | |
|
|
| 0.2631 | 10218 | 2.1752 | - | - | - | - | - | |
|
|
| 0.2651 | 10296 | 2.1746 | - | - | - | - | - | |
|
|
| 0.2671 | 10374 | 2.129 | - | - | - | - | - | |
|
|
| 0.2691 | 10452 | 2.1452 | - | - | - | - | - | |
|
|
| 0.2704 | 10503 | - | 2.1497 | 0.7014 (+0.1610) | 0.4440 (+0.1190) | 0.7896 (+0.2890) | 0.6450 (+0.1896) | |
|
|
| 0.2711 | 10530 | 2.1273 | - | - | - | - | - | |
|
|
| 0.2731 | 10608 | 2.1441 | - | - | - | - | - | |
|
|
| 0.2751 | 10686 | 2.1364 | - | - | - | - | - | |
|
|
| 0.2772 | 10764 | 2.1504 | - | - | - | - | - | |
|
|
| 0.2792 | 10842 | 2.116 | - | - | - | - | - | |
|
|
| 0.2804 | 10892 | - | 2.1506 | 0.7022 (+0.1618) | 0.4360 (+0.1109) | 0.7891 (+0.2885) | 0.6424 (+0.1871) | |
|
|
| 0.2812 | 10920 | 2.1029 | - | - | - | - | - | |
|
|
| 0.2832 | 10998 | 2.0787 | - | - | - | - | - | |
|
|
| 0.2852 | 11076 | 2.1119 | - | - | - | - | - | |
|
|
| 0.2872 | 11154 | 2.0874 | - | - | - | - | - | |
|
|
| 0.2892 | 11232 | 2.0925 | - | - | - | - | - | |
|
|
| 0.2905 | 11281 | - | 2.1008 | 0.6965 (+0.1560) | 0.4397 (+0.1146) | 0.7937 (+0.2931) | 0.6433 (+0.1879) | |
|
|
| 0.2912 | 11310 | 2.067 | - | - | - | - | - | |
|
|
| 0.2932 | 11388 | 2.0569 | - | - | - | - | - | |
|
|
| 0.2952 | 11466 | 2.0698 | - | - | - | - | - | |
|
|
| 0.2972 | 11544 | 2.061 | - | - | - | - | - | |
|
|
| 0.2992 | 11622 | 2.0437 | - | - | - | - | - | |
|
|
| 0.3005 | 11670 | - | 2.1165 | 0.6991 (+0.1587) | 0.4332 (+0.1082) | 0.7979 (+0.2973) | 0.6434 (+0.1880) | |
|
|
| 0.3013 | 11700 | 2.0703 | - | - | - | - | - | |
|
|
| 0.3033 | 11778 | 2.0376 | - | - | - | - | - | |
|
|
| 0.3053 | 11856 | 2.0282 | - | - | - | - | - | |
|
|
| 0.3073 | 11934 | 2.0309 | - | - | - | - | - | |
|
|
| 0.3093 | 12012 | 2.0278 | - | - | - | - | - | |
|
|
| 0.3105 | 12059 | - | 2.0215 | 0.7035 (+0.1631) | 0.4328 (+0.1078) | 0.8016 (+0.3010) | 0.6460 (+0.1906) | |
|
|
| 0.3113 | 12090 | 2.02 | - | - | - | - | - | |
|
|
| 0.3133 | 12168 | 2.0036 | - | - | - | - | - | |
|
|
| 0.3153 | 12246 | 1.9998 | - | - | - | - | - | |
|
|
| 0.3173 | 12324 | 1.9902 | - | - | - | - | - | |
|
|
| 0.3193 | 12402 | 2.0017 | - | - | - | - | - | |
|
|
| 0.3205 | 12448 | - | 2.0026 | 0.7053 (+0.1649) | 0.4374 (+0.1124) | 0.7872 (+0.2865) | 0.6433 (+0.1879) | |
|
|
| 0.3213 | 12480 | 1.9936 | - | - | - | - | - | |
|
|
| 0.3233 | 12558 | 1.9797 | - | - | - | - | - | |
|
|
| 0.3254 | 12636 | 1.9993 | - | - | - | - | - | |
|
|
| 0.3274 | 12714 | 1.9675 | - | - | - | - | - | |
|
|
| 0.3294 | 12792 | 1.9707 | - | - | - | - | - | |
|
|
| 0.3305 | 12837 | - | 1.9845 | 0.6987 (+0.1583) | 0.4428 (+0.1177) | 0.7978 (+0.2972) | 0.6464 (+0.1911) | |
|
|
| 0.3314 | 12870 | 1.9597 | - | - | - | - | - | |
|
|
| 0.3334 | 12948 | 1.972 | - | - | - | - | - | |
|
|
| 0.3354 | 13026 | 1.9621 | - | - | - | - | - | |
|
|
| 0.3374 | 13104 | 1.9656 | - | - | - | - | - | |
|
|
| 0.3394 | 13182 | 1.942 | - | - | - | - | - | |
|
|
| 0.3405 | 13226 | - | 1.9720 | 0.7032 (+0.1627) | 0.4371 (+0.1120) | 0.7909 (+0.2903) | 0.6437 (+0.1884) | |
|
|
| 0.3414 | 13260 | 1.9184 | - | - | - | - | - | |
|
|
| 0.3434 | 13338 | 1.9362 | - | - | - | - | - | |
|
|
| 0.3454 | 13416 | 1.9075 | - | - | - | - | - | |
|
|
| 0.3474 | 13494 | 1.9191 | - | - | - | - | - | |
|
|
| 0.3495 | 13572 | 1.9234 | - | - | - | - | - | |
|
|
| 0.3506 | 13615 | - | 1.9565 | 0.7179 (+0.1774) | 0.4441 (+0.1191) | 0.7873 (+0.2866) | 0.6498 (+0.1944) | |
|
|
| 0.3515 | 13650 | 1.9019 | - | - | - | - | - | |
|
|
| 0.3535 | 13728 | 1.9003 | - | - | - | - | - | |
|
|
| 0.3555 | 13806 | 1.9249 | - | - | - | - | - | |
|
|
| 0.3575 | 13884 | 1.9253 | - | - | - | - | - | |
|
|
| 0.3595 | 13962 | 1.9325 | - | - | - | - | - | |
|
|
| 0.3606 | 14004 | - | 1.8728 | 0.6957 (+0.1552) | 0.4306 (+0.1055) | 0.7976 (+0.2970) | 0.6413 (+0.1859) | |
|
|
| 0.3615 | 14040 | 1.8806 | - | - | - | - | - | |
|
|
| 0.3635 | 14118 | 1.877 | - | - | - | - | - | |
|
|
| 0.3655 | 14196 | 1.8853 | - | - | - | - | - | |
|
|
| 0.3675 | 14274 | 1.8759 | - | - | - | - | - | |
|
|
| 0.3695 | 14352 | 1.8652 | - | - | - | - | - | |
|
|
| 0.3706 | 14393 | - | 1.8743 | 0.7072 (+0.1668) | 0.4305 (+0.1054) | 0.7942 (+0.2936) | 0.6440 (+0.1886) | |
|
|
| 0.3715 | 14430 | 1.8737 | - | - | - | - | - | |
|
|
| 0.3736 | 14508 | 1.8627 | - | - | - | - | - | |
|
|
| 0.3756 | 14586 | 1.8504 | - | - | - | - | - | |
|
|
| 0.3776 | 14664 | 1.8553 | - | - | - | - | - | |
|
|
| 0.3796 | 14742 | 1.8819 | - | - | - | - | - | |
|
|
| 0.3806 | 14782 | - | 1.8626 | 0.6979 (+0.1575) | 0.4347 (+0.1097) | 0.7939 (+0.2932) | 0.6422 (+0.1868) | |
|
|
| 0.3816 | 14820 | 1.8535 | - | - | - | - | - | |
|
|
| 0.3836 | 14898 | 1.8452 | - | - | - | - | - | |
|
|
| 0.3856 | 14976 | 1.8402 | - | - | - | - | - | |
|
|
| 0.3876 | 15054 | 1.8568 | - | - | - | - | - | |
|
|
| 0.3896 | 15132 | 1.828 | - | - | - | - | - | |
|
|
| 0.3906 | 15171 | - | 1.8430 | 0.7060 (+0.1656) | 0.4412 (+0.1162) | 0.7855 (+0.2848) | 0.6442 (+0.1888) | |
|
|
| 0.3916 | 15210 | 1.835 | - | - | - | - | - | |
|
|
| 0.3936 | 15288 | 1.8453 | - | - | - | - | - | |
|
|
| 0.3956 | 15366 | 1.8354 | - | - | - | - | - | |
|
|
| 0.3977 | 15444 | 1.8252 | - | - | - | - | - | |
|
|
| 0.3997 | 15522 | 1.8272 | - | - | - | - | - | |
|
|
| 0.4006 | 15560 | - | 1.8492 | 0.7016 (+0.1612) | 0.4332 (+0.1082) | 0.7863 (+0.2857) | 0.6404 (+0.1850) | |
|
|
| 0.4017 | 15600 | 1.8083 | - | - | - | - | - | |
|
|
| 0.4037 | 15678 | 1.8132 | - | - | - | - | - | |
|
|
| 0.4057 | 15756 | 1.7857 | - | - | - | - | - | |
|
|
| 0.4077 | 15834 | 1.8222 | - | - | - | - | - | |
|
|
| 0.4097 | 15912 | 1.7911 | - | - | - | - | - | |
|
|
| 0.4107 | 15949 | - | 1.7805 | 0.6863 (+0.1459) | 0.4452 (+0.1201) | 0.7911 (+0.2904) | 0.6408 (+0.1855) | |
|
|
| 0.4117 | 15990 | 1.8027 | - | - | - | - | - | |
|
|
| 0.4137 | 16068 | 1.8112 | - | - | - | - | - | |
|
|
| 0.4157 | 16146 | 1.795 | - | - | - | - | - | |
|
|
| 0.4177 | 16224 | 1.7912 | - | - | - | - | - | |
|
|
| 0.4197 | 16302 | 1.7574 | - | - | - | - | - | |
|
|
| 0.4207 | 16338 | - | 1.7685 | 0.6899 (+0.1495) | 0.4375 (+0.1124) | 0.7935 (+0.2928) | 0.6403 (+0.1849) | |
|
|
| 0.4218 | 16380 | 1.7746 | - | - | - | - | - | |
|
|
| 0.4238 | 16458 | 1.7878 | - | - | - | - | - | |
|
|
| 0.4258 | 16536 | 1.7794 | - | - | - | - | - | |
|
|
| 0.4278 | 16614 | 1.7586 | - | - | - | - | - | |
|
|
| 0.4298 | 16692 | 1.7565 | - | - | - | - | - | |
|
|
| 0.4307 | 16727 | - | 1.7783 | 0.6999 (+0.1595) | 0.4433 (+0.1182) | 0.7836 (+0.2829) | 0.6423 (+0.1869) | |
|
|
| 0.4318 | 16770 | 1.7578 | - | - | - | - | - | |
|
|
| 0.4338 | 16848 | 1.7402 | - | - | - | - | - | |
|
|
| 0.4358 | 16926 | 1.7549 | - | - | - | - | - | |
|
|
| 0.4378 | 17004 | 1.7556 | - | - | - | - | - | |
|
|
| 0.4398 | 17082 | 1.7479 | - | - | - | - | - | |
|
|
| 0.4407 | 17116 | - | 1.7780 | 0.6830 (+0.1426) | 0.4377 (+0.1127) | 0.7820 (+0.2814) | 0.6343 (+0.1789) | |
|
|
| 0.4418 | 17160 | 1.7505 | - | - | - | - | - | |
|
|
| 0.4438 | 17238 | 1.7515 | - | - | - | - | - | |
|
|
| 0.4459 | 17316 | 1.7301 | - | - | - | - | - | |
|
|
| 0.4479 | 17394 | 1.7521 | - | - | - | - | - | |
|
|
| 0.4499 | 17472 | 1.7397 | - | - | - | - | - | |
|
|
| 0.4507 | 17505 | - | 1.7556 | 0.6781 (+0.1377) | 0.4357 (+0.1107) | 0.8007 (+0.3001) | 0.6382 (+0.1828) | |
|
|
| 0.4519 | 17550 | 1.734 | - | - | - | - | - | |
|
|
| 0.4539 | 17628 | 1.7251 | - | - | - | - | - | |
|
|
| 0.4559 | 17706 | 1.7292 | - | - | - | - | - | |
|
|
| 0.4579 | 17784 | 1.7269 | - | - | - | - | - | |
|
|
| 0.4599 | 17862 | 1.6889 | - | - | - | - | - | |
|
|
| 0.4607 | 17894 | - | 1.7416 | 0.6845 (+0.1441) | 0.4323 (+0.1072) | 0.7945 (+0.2938) | 0.6371 (+0.1817) | |
|
|
| 0.4619 | 17940 | 1.7034 | - | - | - | - | - | |
|
|
| 0.4639 | 18018 | 1.6913 | - | - | - | - | - | |
|
|
| 0.4659 | 18096 | 1.7312 | - | - | - | - | - | |
|
|
| 0.4679 | 18174 | 1.6997 | - | - | - | - | - | |
|
|
| 0.4700 | 18252 | 1.6941 | - | - | - | - | - | |
|
|
| 0.4708 | 18283 | - | 1.6900 | 0.6796 (+0.1392) | 0.4382 (+0.1132) | 0.7881 (+0.2874) | 0.6353 (+0.1799) | |
|
|
| 0.4720 | 18330 | 1.6976 | - | - | - | - | - | |
|
|
| 0.4740 | 18408 | 1.7138 | - | - | - | - | - | |
|
|
| 0.4760 | 18486 | 1.6885 | - | - | - | - | - | |
|
|
| 0.4780 | 18564 | 1.7152 | - | - | - | - | - | |
|
|
| 0.4800 | 18642 | 1.6676 | - | - | - | - | - | |
|
|
| 0.4808 | 18672 | - | 1.6843 | 0.6842 (+0.1438) | 0.4380 (+0.1129) | 0.7832 (+0.2826) | 0.6351 (+0.1798) | |
|
|
| 0.4820 | 18720 | 1.7004 | - | - | - | - | - | |
|
|
| 0.4840 | 18798 | 1.6834 | - | - | - | - | - | |
|
|
| 0.4860 | 18876 | 1.6955 | - | - | - | - | - | |
|
|
| 0.4880 | 18954 | 1.6971 | - | - | - | - | - | |
|
|
| 0.4900 | 19032 | 1.7012 | - | - | - | - | - | |
|
|
| 0.4908 | 19061 | - | 1.6958 | 0.6969 (+0.1565) | 0.4353 (+0.1103) | 0.7831 (+0.2824) | 0.6384 (+0.1831) | |
|
|
| 0.4920 | 19110 | 1.699 | - | - | - | - | - | |
|
|
| 0.4941 | 19188 | 1.6602 | - | - | - | - | - | |
|
|
| 0.4961 | 19266 | 1.6515 | - | - | - | - | - | |
|
|
| 0.4981 | 19344 | 1.6488 | - | - | - | - | - | |
|
|
| 0.5001 | 19422 | 1.6533 | - | - | - | - | - | |
|
|
| 0.5008 | 19450 | - | 1.6820 | 0.6937 (+0.1533) | 0.4359 (+0.1108) | 0.7847 (+0.2841) | 0.6381 (+0.1827) | |
|
|
| 0.5021 | 19500 | 1.6465 | - | - | - | - | - | |
|
|
| 0.5041 | 19578 | 1.6682 | - | - | - | - | - | |
|
|
| 0.5061 | 19656 | 1.6591 | - | - | - | - | - | |
|
|
| 0.5081 | 19734 | 1.6603 | - | - | - | - | - | |
|
|
| 0.5101 | 19812 | 1.6469 | - | - | - | - | - | |
|
|
| 0.5108 | 19839 | - | 1.6590 | 0.7084 (+0.1680) | 0.4303 (+0.1053) | 0.7996 (+0.2990) | 0.6461 (+0.1907) | |
|
|
| 0.5121 | 19890 | 1.6462 | - | - | - | - | - | |
|
|
| 0.5141 | 19968 | 1.648 | - | - | - | - | - | |
|
|
| 0.5161 | 20046 | 1.6577 | - | - | - | - | - | |
|
|
| 0.5182 | 20124 | 1.6456 | - | - | - | - | - | |
|
|
| 0.5202 | 20202 | 1.6321 | - | - | - | - | - | |
|
|
| 0.5208 | 20228 | - | 1.6421 | 0.7064 (+0.1660) | 0.4300 (+0.1050) | 0.7759 (+0.2753) | 0.6375 (+0.1821) | |
|
|
| 0.5222 | 20280 | 1.6187 | - | - | - | - | - | |
|
|
| 0.5242 | 20358 | 1.6326 | - | - | - | - | - | |
|
|
| 0.5262 | 20436 | 1.6286 | - | - | - | - | - | |
|
|
| 0.5282 | 20514 | 1.6071 | - | - | - | - | - | |
|
|
| 0.5302 | 20592 | 1.6112 | - | - | - | - | - | |
|
|
| 0.5308 | 20617 | - | 1.6292 | 0.7030 (+0.1626) | 0.4334 (+0.1083) | 0.7812 (+0.2806) | 0.6392 (+0.1838) | |
|
|
| 0.5322 | 20670 | 1.6242 | - | - | - | - | - | |
|
|
| 0.5342 | 20748 | 1.613 | - | - | - | - | - | |
|
|
| 0.5362 | 20826 | 1.6209 | - | - | - | - | - | |
|
|
| 0.5382 | 20904 | 1.6224 | - | - | - | - | - | |
|
|
| 0.5402 | 20982 | 1.5982 | - | - | - | - | - | |
|
|
| 0.5409 | 21006 | - | 1.6298 | 0.7073 (+0.1669) | 0.4342 (+0.1092) | 0.7795 (+0.2789) | 0.6403 (+0.1850) | |
|
|
| 0.5423 | 21060 | 1.6032 | - | - | - | - | - | |
|
|
| 0.5443 | 21138 | 1.6099 | - | - | - | - | - | |
|
|
| 0.5463 | 21216 | 1.599 | - | - | - | - | - | |
|
|
| 0.5483 | 21294 | 1.6098 | - | - | - | - | - | |
|
|
| 0.5503 | 21372 | 1.5978 | - | - | - | - | - | |
|
|
| 0.5509 | 21395 | - | 1.6169 | 0.6832 (+0.1428) | 0.4393 (+0.1142) | 0.7890 (+0.2883) | 0.6372 (+0.1818) | |
|
|
| 0.5523 | 21450 | 1.6116 | - | - | - | - | - | |
|
|
| 0.5543 | 21528 | 1.5971 | - | - | - | - | - | |
|
|
| 0.5563 | 21606 | 1.5883 | - | - | - | - | - | |
|
|
| 0.5583 | 21684 | 1.5852 | - | - | - | - | - | |
|
|
| 0.5603 | 21762 | 1.6024 | - | - | - | - | - | |
|
|
| 0.5609 | 21784 | - | 1.5942 | 0.6892 (+0.1488) | 0.4323 (+0.1072) | 0.7858 (+0.2851) | 0.6357 (+0.1804) | |
|
|
| 0.5623 | 21840 | 1.6046 | - | - | - | - | - | |
|
|
| 0.5643 | 21918 | 1.5723 | - | - | - | - | - | |
|
|
| 0.5664 | 21996 | 1.5583 | - | - | - | - | - | |
|
|
| 0.5684 | 22074 | 1.5917 | - | - | - | - | - | |
|
|
| 0.5704 | 22152 | 1.5819 | - | - | - | - | - | |
|
|
| 0.5709 | 22173 | - | 1.5676 | 0.6901 (+0.1496) | 0.4377 (+0.1127) | 0.7944 (+0.2938) | 0.6407 (+0.1854) | |
|
|
| 0.5724 | 22230 | 1.5842 | - | - | - | - | - | |
|
|
| 0.5744 | 22308 | 1.5815 | - | - | - | - | - | |
|
|
| 0.5764 | 22386 | 1.5972 | - | - | - | - | - | |
|
|
| 0.5784 | 22464 | 1.568 | - | - | - | - | - | |
|
|
| 0.5804 | 22542 | 1.5798 | - | - | - | - | - | |
|
|
| 0.5809 | 22562 | - | 1.5817 | 0.6882 (+0.1478) | 0.4352 (+0.1102) | 0.7858 (+0.2852) | 0.6364 (+0.1810) | |
|
|
| 0.5824 | 22620 | 1.5523 | - | - | - | - | - | |
|
|
| 0.5844 | 22698 | 1.5821 | - | - | - | - | - | |
|
|
| 0.5864 | 22776 | 1.5812 | - | - | - | - | - | |
|
|
| 0.5884 | 22854 | 1.5837 | - | - | - | - | - | |
|
|
| 0.5905 | 22932 | 1.5731 | - | - | - | - | - | |
|
|
| 0.5909 | 22951 | - | 1.5726 | 0.6916 (+0.1512) | 0.4291 (+0.1041) | 0.7904 (+0.2898) | 0.6370 (+0.1817) | |
|
|
| 0.5925 | 23010 | 1.588 | - | - | - | - | - | |
|
|
| 0.5945 | 23088 | 1.5709 | - | - | - | - | - | |
|
|
| 0.5965 | 23166 | 1.5522 | - | - | - | - | - | |
|
|
| 0.5985 | 23244 | 1.5469 | - | - | - | - | - | |
|
|
| 0.6005 | 23322 | 1.549 | - | - | - | - | - | |
|
|
| 0.6010 | 23340 | - | 1.5656 | 0.6924 (+0.1520) | 0.4289 (+0.1039) | 0.7854 (+0.2847) | 0.6356 (+0.1802) | |
|
|
| 0.6025 | 23400 | 1.5528 | - | - | - | - | - | |
|
|
| 0.6045 | 23478 | 1.5671 | - | - | - | - | - | |
|
|
| 0.6065 | 23556 | 1.5491 | - | - | - | - | - | |
|
|
| 0.6085 | 23634 | 1.558 | - | - | - | - | - | |
|
|
| 0.6105 | 23712 | 1.5406 | - | - | - | - | - | |
|
|
| 0.6110 | 23729 | - | 1.5599 | 0.6984 (+0.1580) | 0.4335 (+0.1084) | 0.7815 (+0.2809) | 0.6378 (+0.1825) | |
|
|
| 0.6125 | 23790 | 1.5491 | - | - | - | - | - | |
|
|
| 0.6146 | 23868 | 1.5452 | - | - | - | - | - | |
|
|
| 0.6166 | 23946 | 1.5455 | - | - | - | - | - | |
|
|
| 0.6186 | 24024 | 1.5487 | - | - | - | - | - | |
|
|
| 0.6206 | 24102 | 1.5639 | - | - | - | - | - | |
|
|
| 0.6210 | 24118 | - | 1.5381 | 0.6892 (+0.1488) | 0.4343 (+0.1092) | 0.7861 (+0.2855) | 0.6365 (+0.1812) | |
|
|
| 0.6226 | 24180 | 1.5223 | - | - | - | - | - | |
|
|
| 0.6246 | 24258 | 1.5293 | - | - | - | - | - | |
|
|
| 0.6266 | 24336 | 1.5441 | - | - | - | - | - | |
|
|
| 0.6286 | 24414 | 1.535 | - | - | - | - | - | |
|
|
| 0.6306 | 24492 | 1.5151 | - | - | - | - | - | |
|
|
| 0.6310 | 24507 | - | 1.5285 | 0.6896 (+0.1492) | 0.4347 (+0.1097) | 0.7967 (+0.2960) | 0.6403 (+0.1850) | |
|
|
| 0.6326 | 24570 | 1.524 | - | - | - | - | - | |
|
|
| 0.6346 | 24648 | 1.5383 | - | - | - | - | - | |
|
|
| 0.6366 | 24726 | 1.5218 | - | - | - | - | - | |
|
|
| 0.6387 | 24804 | 1.5176 | - | - | - | - | - | |
|
|
| 0.6407 | 24882 | 1.5136 | - | - | - | - | - | |
|
|
| 0.6410 | 24896 | - | 1.5087 | 0.6776 (+0.1372) | 0.4326 (+0.1076) | 0.7880 (+0.2874) | 0.6327 (+0.1774) | |
|
|
| 0.6427 | 24960 | 1.5151 | - | - | - | - | - | |
|
|
| 0.6447 | 25038 | 1.5177 | - | - | - | - | - | |
|
|
| 0.6467 | 25116 | 1.5054 | - | - | - | - | - | |
|
|
| 0.6487 | 25194 | 1.5206 | - | - | - | - | - | |
|
|
| 0.6507 | 25272 | 1.4956 | - | - | - | - | - | |
|
|
| 0.6510 | 25285 | - | 1.5238 | 0.6972 (+0.1568) | 0.4325 (+0.1075) | 0.7842 (+0.2836) | 0.6380 (+0.1826) | |
|
|
| 0.6527 | 25350 | 1.4985 | - | - | - | - | - | |
|
|
| 0.6547 | 25428 | 1.51 | - | - | - | - | - | |
|
| 0.6567 | 25506 | 1.4913 | - | - | - | - | - |
| 0.6587 | 25584 | 1.5001 | - | - | - | - | - |
| 0.6607 | 25662 | 1.4997 | - | - | - | - | - |
| 0.6611 | 25674 | - | 1.5059 | 0.6950 (+0.1546) | 0.4310 (+0.1060) | 0.7887 (+0.2881) | 0.6382 (+0.1829) |
| 0.6628 | 25740 | 1.492 | - | - | - | - | - |
| 0.6648 | 25818 | 1.4816 | - | - | - | - | - |
| 0.6668 | 25896 | 1.4959 | - | - | - | - | - |
| 0.6688 | 25974 | 1.5026 | - | - | - | - | - |
| 0.6708 | 26052 | 1.4936 | - | - | - | - | - |
| 0.6711 | 26063 | - | 1.4728 | 0.6938 (+0.1534) | 0.4373 (+0.1123) | 0.7935 (+0.2929) | 0.6416 (+0.1862) |
| 0.6728 | 26130 | 1.481 | - | - | - | - | - |
| 0.6748 | 26208 | 1.4999 | - | - | - | - | - |
| 0.6768 | 26286 | 1.5008 | - | - | - | - | - |
| 0.6788 | 26364 | 1.47 | - | - | - | - | - |
| 0.6808 | 26442 | 1.4855 | - | - | - | - | - |
| 0.6811 | 26452 | - | 1.4808 | 0.6864 (+0.1460) | 0.4343 (+0.1092) | 0.7736 (+0.2729) | 0.6314 (+0.1761) |
| 0.6828 | 26520 | 1.479 | - | - | - | - | - |
| 0.6848 | 26598 | 1.4814 | - | - | - | - | - |
| 0.6869 | 26676 | 1.4696 | - | - | - | - | - |
| 0.6889 | 26754 | 1.4776 | - | - | - | - | - |
| 0.6909 | 26832 | 1.4662 | - | - | - | - | - |
| 0.6911 | 26841 | - | 1.4597 | 0.6842 (+0.1438) | 0.4369 (+0.1119) | 0.7928 (+0.2922) | 0.6380 (+0.1826) |
| 0.6929 | 26910 | 1.4744 | - | - | - | - | - |
| 0.6949 | 26988 | 1.4684 | - | - | - | - | - |
| 0.6969 | 27066 | 1.4658 | - | - | - | - | - |
| 0.6989 | 27144 | 1.4686 | - | - | - | - | - |
| 0.7009 | 27222 | 1.4785 | - | - | - | - | - |
| 0.7011 | 27230 | - | 1.4598 | 0.6980 (+0.1576) | 0.4330 (+0.1080) | 0.7886 (+0.2879) | 0.6399 (+0.1845) |
| 0.7029 | 27300 | 1.4823 | - | - | - | - | - |
| 0.7049 | 27378 | 1.4697 | - | - | - | - | - |
| 0.7069 | 27456 | 1.4564 | - | - | - | - | - |
| 0.7089 | 27534 | 1.4506 | - | - | - | - | - |
| 0.7110 | 27612 | 1.4452 | - | - | - | - | - |
| 0.7111 | 27619 | - | 1.4513 | 0.7083 (+0.1679) | 0.4298 (+0.1048) | 0.7848 (+0.2842) | 0.6410 (+0.1856) |
| 0.7130 | 27690 | 1.4585 | - | - | - | - | - |
| 0.7150 | 27768 | 1.4485 | - | - | - | - | - |
| 0.7170 | 27846 | 1.4641 | - | - | - | - | - |
| 0.7190 | 27924 | 1.4557 | - | - | - | - | - |
| 0.7210 | 28002 | 1.4573 | - | - | - | - | - |
| 0.7211 | 28008 | - | 1.4622 | 0.6934 (+0.1530) | 0.4379 (+0.1128) | 0.7971 (+0.2964) | 0.6428 (+0.1874) |
| 0.7230 | 28080 | 1.4557 | - | - | - | - | - |
| 0.7250 | 28158 | 1.4568 | - | - | - | - | - |
| 0.7270 | 28236 | 1.4508 | - | - | - | - | - |
| 0.7290 | 28314 | 1.459 | - | - | - | - | - |
| 0.7310 | 28392 | 1.4636 | - | - | - | - | - |
| 0.7312 | 28397 | - | 1.4458 | 0.6894 (+0.1490) | 0.4323 (+0.1072) | 0.7942 (+0.2935) | 0.6386 (+0.1832) |
| 0.7330 | 28470 | 1.4486 | - | - | - | - | - |
| 0.7351 | 28548 | 1.4706 | - | - | - | - | - |
| 0.7371 | 28626 | 1.4511 | - | - | - | - | - |
| 0.7391 | 28704 | 1.4665 | - | - | - | - | - |
| 0.7411 | 28782 | 1.4437 | - | - | - | - | - |
| 0.7412 | 28786 | - | 1.4317 | 0.6888 (+0.1483) | 0.4374 (+0.1124) | 0.7952 (+0.2945) | 0.6405 (+0.1851) |
| 0.7431 | 28860 | 1.4436 | - | - | - | - | - |
| 0.7451 | 28938 | 1.4211 | - | - | - | - | - |
| 0.7471 | 29016 | 1.4313 | - | - | - | - | - |
| 0.7491 | 29094 | 1.4353 | - | - | - | - | - |
| 0.7511 | 29172 | 1.4218 | - | - | - | - | - |
| 0.7512 | 29175 | - | 1.4379 | 0.6906 (+0.1502) | 0.4371 (+0.1120) | 0.7900 (+0.2893) | 0.6392 (+0.1838) |
| 0.7531 | 29250 | 1.4302 | - | - | - | - | - |
| 0.7551 | 29328 | 1.4294 | - | - | - | - | - |
| 0.7571 | 29406 | 1.4255 | - | - | - | - | - |
| 0.7592 | 29484 | 1.4374 | - | - | - | - | - |
| 0.7612 | 29562 | 1.4278 | - | - | - | - | - |
| 0.7612 | 29564 | - | 1.4246 | 0.6983 (+0.1579) | 0.4328 (+0.1077) | 0.7886 (+0.2879) | 0.6399 (+0.1845) |
| 0.7632 | 29640 | 1.4133 | - | - | - | - | - |
| 0.7652 | 29718 | 1.4351 | - | - | - | - | - |
| 0.7672 | 29796 | 1.4215 | - | - | - | - | - |
| 0.7692 | 29874 | 1.4331 | - | - | - | - | - |
| 0.7712 | 29952 | 1.4226 | - | - | - | - | - |
| 0.7712 | 29953 | - | 1.4197 | 0.7034 (+0.1630) | 0.4313 (+0.1063) | 0.7938 (+0.2932) | 0.6428 (+0.1875) |
| 0.7732 | 30030 | 1.4374 | - | - | - | - | - |
| 0.7752 | 30108 | 1.4181 | - | - | - | - | - |
| 0.7772 | 30186 | 1.4228 | - | - | - | - | - |
| 0.7792 | 30264 | 1.4054 | - | - | - | - | - |
| 0.7812 | 30342 | 1.4225 | 1.4250 | 0.7096 (+0.1692) | 0.4364 (+0.1114) | 0.7905 (+0.2899) | 0.6455 (+0.1902) |
| 0.7833 | 30420 | 1.4216 | - | - | - | - | - |
| 0.7853 | 30498 | 1.4137 | - | - | - | - | - |
| 0.7873 | 30576 | 1.4233 | - | - | - | - | - |
| 0.7893 | 30654 | 1.4139 | - | - | - | - | - |
| 0.7913 | 30731 | - | 1.4091 | 0.6835 (+0.1430) | 0.4351 (+0.1101) | 0.7894 (+0.2887) | 0.6360 (+0.1806) |
| 0.7913 | 30732 | 1.4071 | - | - | - | - | - |
| 0.7933 | 30810 | 1.4261 | - | - | - | - | - |
| 0.7953 | 30888 | 1.4255 | - | - | - | - | - |
| 0.7973 | 30966 | 1.4011 | - | - | - | - | - |
| 0.7993 | 31044 | 1.4125 | - | - | - | - | - |
| 0.8013 | 31120 | - | 1.4071 | 0.6894 (+0.1489) | 0.4338 (+0.1088) | 0.7784 (+0.2777) | 0.6339 (+0.1785) |
| 0.8013 | 31122 | 1.4023 | - | - | - | - | - |
| 0.8033 | 31200 | 1.4043 | - | - | - | - | - |
| 0.8053 | 31278 | 1.4123 | - | - | - | - | - |
| 0.8074 | 31356 | 1.4206 | - | - | - | - | - |
| 0.8094 | 31434 | 1.4043 | - | - | - | - | - |
| 0.8113 | 31509 | - | 1.3989 | 0.6856 (+0.1452) | 0.4354 (+0.1103) | 0.7752 (+0.2746) | 0.6321 (+0.1767) |
| 0.8114 | 31512 | 1.4099 | - | - | - | - | - |
| 0.8134 | 31590 | 1.3995 | - | - | - | - | - |
| 0.8154 | 31668 | 1.4002 | - | - | - | - | - |
| 0.8174 | 31746 | 1.3961 | - | - | - | - | - |
| 0.8194 | 31824 | 1.3848 | - | - | - | - | - |
| 0.8213 | 31898 | - | 1.3922 | 0.6864 (+0.1460) | 0.4341 (+0.1090) | 0.7734 (+0.2728) | 0.6313 (+0.1760) |
| 0.8214 | 31902 | 1.418 | - | - | - | - | - |
| 0.8234 | 31980 | 1.4076 | - | - | - | - | - |
| 0.8254 | 32058 | 1.3818 | - | - | - | - | - |
| 0.8274 | 32136 | 1.3747 | - | - | - | - | - |
| 0.8294 | 32214 | 1.3872 | - | - | - | - | - |
| 0.8313 | 32287 | - | 1.3914 | 0.6930 (+0.1526) | 0.4386 (+0.1135) | 0.7725 (+0.2718) | 0.6347 (+0.1793) |
| 0.8315 | 32292 | 1.3882 | - | - | - | - | - |
| 0.8335 | 32370 | 1.4111 | - | - | - | - | - |
| 0.8355 | 32448 | 1.3677 | - | - | - | - | - |
| 0.8375 | 32526 | 1.3726 | - | - | - | - | - |
| 0.8395 | 32604 | 1.377 | - | - | - | - | - |
| 0.8413 | 32676 | - | 1.3778 | 0.6842 (+0.1438) | 0.4387 (+0.1136) | 0.7823 (+0.2817) | 0.6351 (+0.1797) |
| 0.8415 | 32682 | 1.3853 | - | - | - | - | - |
| 0.8435 | 32760 | 1.3851 | - | - | - | - | - |
| 0.8455 | 32838 | 1.3724 | - | - | - | - | - |
| 0.8475 | 32916 | 1.3845 | - | - | - | - | - |
| 0.8495 | 32994 | 1.3827 | - | - | - | - | - |
| 0.8514 | 33065 | - | 1.3790 | 0.6848 (+0.1443) | 0.4374 (+0.1124) | 0.7797 (+0.2791) | 0.6340 (+0.1786) |
| 0.8515 | 33072 | 1.388 | - | - | - | - | - |
| 0.8535 | 33150 | 1.377 | - | - | - | - | - |
| 0.8556 | 33228 | 1.3762 | - | - | - | - | - |
| 0.8576 | 33306 | 1.3716 | - | - | - | - | - |
| 0.8596 | 33384 | 1.3763 | - | - | - | - | - |
| 0.8614 | 33454 | - | 1.3811 | 0.6874 (+0.1469) | 0.4388 (+0.1138) | 0.7727 (+0.2721) | 0.6330 (+0.1776) |
| 0.8616 | 33462 | 1.3755 | - | - | - | - | - |
| 0.8636 | 33540 | 1.3733 | - | - | - | - | - |
| 0.8656 | 33618 | 1.3621 | - | - | - | - | - |
| 0.8676 | 33696 | 1.3648 | - | - | - | - | - |
| 0.8696 | 33774 | 1.3665 | - | - | - | - | - |
| 0.8714 | 33843 | - | 1.3638 | 0.6905 (+0.1500) | 0.4394 (+0.1143) | 0.7755 (+0.2749) | 0.6351 (+0.1797) |
| 0.8716 | 33852 | 1.3636 | - | - | - | - | - |
| 0.8736 | 33930 | 1.3654 | - | - | - | - | - |
| 0.8756 | 34008 | 1.365 | - | - | - | - | - |
| 0.8776 | 34086 | 1.3769 | - | - | - | - | - |
| 0.8797 | 34164 | 1.3679 | - | - | - | - | - |
| 0.8814 | 34232 | - | 1.3558 | 0.6891 (+0.1486) | 0.4378 (+0.1128) | 0.7802 (+0.2796) | 0.6357 (+0.1803) |
| 0.8817 | 34242 | 1.3614 | - | - | - | - | - |
| 0.8837 | 34320 | 1.3672 | - | - | - | - | - |
| 0.8857 | 34398 | 1.3632 | - | - | - | - | - |
| 0.8877 | 34476 | 1.3759 | - | - | - | - | - |
| 0.8897 | 34554 | 1.3704 | - | - | - | - | - |
| 0.8914 | 34621 | - | 1.3621 | 0.6851 (+0.1447) | 0.4381 (+0.1131) | 0.7810 (+0.2804) | 0.6348 (+0.1794) |
| 0.8917 | 34632 | 1.3523 | - | - | - | - | - |
| 0.8937 | 34710 | 1.3444 | - | - | - | - | - |
| 0.8957 | 34788 | 1.3419 | - | - | - | - | - |
| 0.8977 | 34866 | 1.3616 | - | - | - | - | - |
| 0.8997 | 34944 | 1.3519 | - | - | - | - | - |
| 0.9014 | 35010 | - | 1.3627 | 0.6878 (+0.1474) | 0.4377 (+0.1127) | 0.7732 (+0.2726) | 0.6329 (+0.1776) |
| 0.9017 | 35022 | 1.3614 | - | - | - | - | - |
| 0.9038 | 35100 | 1.3606 | - | - | - | - | - |
| 0.9058 | 35178 | 1.3401 | - | - | - | - | - |
| 0.9078 | 35256 | 1.3503 | - | - | - | - | - |
| 0.9098 | 35334 | 1.3422 | - | - | - | - | - |
| 0.9115 | 35399 | - | 1.3563 | 0.6938 (+0.1534) | 0.4382 (+0.1132) | 0.7775 (+0.2768) | 0.6365 (+0.1811) |
| 0.9118 | 35412 | 1.3397 | - | - | - | - | - |
| 0.9138 | 35490 | 1.3592 | - | - | - | - | - |
| 0.9158 | 35568 | 1.3687 | - | - | - | - | - |
| 0.9178 | 35646 | 1.3452 | - | - | - | - | - |
| 0.9198 | 35724 | 1.3685 | - | - | - | - | - |
| 0.9215 | 35788 | - | 1.3459 | 0.6877 (+0.1473) | 0.4330 (+0.1080) | 0.7924 (+0.2917) | 0.6377 (+0.1824) |
| 0.9218 | 35802 | 1.3542 | - | - | - | - | - |
| 0.9238 | 35880 | 1.3567 | - | - | - | - | - |
| 0.9258 | 35958 | 1.3627 | - | - | - | - | - |
| 0.9279 | 36036 | 1.3476 | - | - | - | - | - |
| 0.9299 | 36114 | 1.3556 | - | - | - | - | - |
| 0.9315 | 36177 | - | 1.3442 | 0.6951 (+0.1547) | 0.4382 (+0.1132) | 0.7749 (+0.2742) | 0.6361 (+0.1807) |
| 0.9319 | 36192 | 1.3444 | - | - | - | - | - |
| 0.9339 | 36270 | 1.3625 | - | - | - | - | - |
| 0.9359 | 36348 | 1.3428 | - | - | - | - | - |
| 0.9379 | 36426 | 1.3575 | - | - | - | - | - |
| 0.9399 | 36504 | 1.3592 | - | - | - | - | - |
| 0.9415 | 36566 | - | 1.3468 | 0.6848 (+0.1443) | 0.4389 (+0.1138) | 0.7922 (+0.2916) | 0.6386 (+0.1832) |
| 0.9419 | 36582 | 1.3317 | - | - | - | - | - |
| 0.9439 | 36660 | 1.3394 | - | - | - | - | - |
| 0.9459 | 36738 | 1.3411 | - | - | - | - | - |
| 0.9479 | 36816 | 1.339 | - | - | - | - | - |
| 0.9499 | 36894 | 1.3346 | - | - | - | - | - |
| 0.9515 | 36955 | - | 1.3424 | 0.6877 (+0.1473) | 0.4382 (+0.1132) | 0.7851 (+0.2845) | 0.6370 (+0.1816) |
| 0.9520 | 36972 | 1.348 | - | - | - | - | - |
| 0.9540 | 37050 | 1.3462 | - | - | - | - | - |
| 0.9560 | 37128 | 1.3339 | - | - | - | - | - |
| 0.9580 | 37206 | 1.3218 | - | - | - | - | - |
| 0.9600 | 37284 | 1.3461 | - | - | - | - | - |
| 0.9615 | 37344 | - | 1.3382 | 0.6877 (+0.1473) | 0.4390 (+0.1139) | 0.7851 (+0.2845) | 0.6373 (+0.1819) |
| 0.9620 | 37362 | 1.3392 | - | - | - | - | - |
| 0.9640 | 37440 | 1.3415 | - | - | - | - | - |
| 0.9660 | 37518 | 1.3423 | - | - | - | - | - |
| 0.9680 | 37596 | 1.3357 | - | - | - | - | - |
| 0.9700 | 37674 | 1.333 | - | - | - | - | - |
| 0.9715 | 37733 | - | 1.3409 | 0.6877 (+0.1473) | 0.4394 (+0.1144) | 0.7825 (+0.2819) | 0.6366 (+0.1812) |
| 0.9720 | 37752 | 1.3111 | - | - | - | - | - |
| 0.9740 | 37830 | 1.3506 | - | - | - | - | - |
| 0.9761 | 37908 | 1.3472 | - | - | - | - | - |
| 0.9781 | 37986 | 1.3273 | - | - | - | - | - |
| 0.9801 | 38064 | 1.337 | - | - | - | - | - |
| 0.9816 | 38122 | - | 1.3355 | 0.6877 (+0.1473) | 0.4383 (+0.1133) | 0.7751 (+0.2745) | 0.6337 (+0.1784) |
| 0.9821 | 38142 | 1.3393 | - | - | - | - | - |
| 0.9841 | 38220 | 1.3205 | - | - | - | - | - |
| 0.9861 | 38298 | 1.3313 | - | - | - | - | - |
| 0.9881 | 38376 | 1.3408 | - | - | - | - | - |
| 0.9901 | 38454 | 1.3463 | - | - | - | - | - |
| 0.9916 | 38511 | - | 1.3325 | 0.6877 (+0.1473) | 0.4366 (+0.1115) | 0.7751 (+0.2745) | 0.6331 (+0.1778) |
| 0.9921 | 38532 | 1.3356 | - | - | - | - | - |
| 0.9941 | 38610 | 1.3352 | - | - | - | - | - |
| 0.9961 | 38688 | 1.3489 | - | - | - | - | - |
| 0.9981 | 38766 | 1.3365 | - | - | - | - | - |
| -1 | -1 | - | - | 0.7180 (+0.1776) | 0.4451 (+0.1201) | 0.8011 (+0.3005) | 0.6548 (+0.1994) |

* The bold row denotes the saved checkpoint.
</details>

### Environmental Impact
Carbon emissions were measured using [CodeCarbon](https://github.com/mlco2/codecarbon).
- **Energy Consumed**: 24.402 kWh
- **Carbon Emitted**: 9.008 kg of CO2
- **Hours Used**: 4.849 hours
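
As a rough illustration of how figures like these are collected, the sketch below wraps a training run in CodeCarbon's `EmissionsTracker`. It is a minimal sketch, not the actual training script: the `run_training` function and the `project_name` value are placeholders.

```python
# Minimal sketch: measuring energy use and emissions of a training run with CodeCarbon.
# `run_training` is a hypothetical placeholder for the real training entry point.
from codecarbon import EmissionsTracker


def run_training() -> None:
    ...  # placeholder for the actual CrossEncoder training run


tracker = EmissionsTracker(project_name="ettin-reranker-msmarco")  # illustrative name
tracker.start()
try:
    run_training()
finally:
    emissions_kg = tracker.stop()  # estimated emissions in kg of CO2-equivalent
    print(f"Estimated emissions: {emissions_kg:.3f} kg CO2eq")
```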

### Training Hardware
- **On Cloud**: No
- **GPU Model**: 8 x NVIDIA H100 80GB HBM3
- **CPU Model**: AMD EPYC 7R13 Processor
- **RAM Size**: 1999.99 GB

### Framework Versions
- Python: 3.10.14
- Sentence Transformers: 5.1.2
- Transformers: 4.57.1
- PyTorch: 2.9.1+cu126
- Accelerate: 1.12.0
- Datasets: 4.4.1
- Tokenizers: 0.22.1
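
If you want your environment to roughly match the versions listed above, the small sketch below compares locally installed packages against them using only the Python standard library. It assumes the usual PyPI distribution names (e.g. PyTorch is installed as `torch`); exact matches are not strictly required, but large version gaps are a common source of differing scores.

```python
# Sketch: compare locally installed versions against the versions this model was trained with.
from importlib.metadata import PackageNotFoundError, version

expected = {
    "sentence-transformers": "5.1.2",
    "transformers": "4.57.1",
    "torch": "2.9.1+cu126",
    "accelerate": "1.12.0",
    "datasets": "4.4.1",
    "tokenizers": "0.22.1",
}

for name, wanted in expected.items():
    try:
        found = version(name)
    except PackageNotFoundError:
        found = "not installed"
    status = "ok" if found == wanted else "differs"
    print(f"{name:22s} expected {wanted:14s} found {found} [{status}]")
```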

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MarginMSELoss
```bibtex
@misc{hofstätter2021improving,
    title={Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation},
    author={Sebastian Hofstätter and Sophia Althammer and Michael Schröder and Mete Sertkan and Allan Hanbury},
    year={2021},
    eprint={2010.02666},
    archivePrefix={arXiv},
    primaryClass={cs.IR}
}
```
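
For context on the loss cited above: Margin-MSE, as described in that paper, trains the student to reproduce a teacher's score margin over (query, positive passage, negative passage) triples. Writing $S$ for this cross-encoder's score and $T$ for the teacher's score, the objective can be summarized as:

$$
\mathcal{L}_{\text{MarginMSE}} = \operatorname{MSE}\bigl(S(q, p^{+}) - S(q, p^{-}),\; T(q, p^{+}) - T(q, p^{-})\bigr)
$$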

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->