SentenceTransformer based on jhu-clsp/ettin-encoder-17m

This is a sentence-transformers model finetuned from jhu-clsp/ettin-encoder-17m. It maps sentences & paragraphs to a 256-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: jhu-clsp/ettin-encoder-17m
  • Maximum Sequence Length: 7999 tokens
  • Output Dimensionality: 256 dimensions
  • Similarity Function: Cosine Similarity
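
These properties can be checked directly on the loaded model; a minimal sketch (expected values taken from the list above):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("gabrielloiseau/ettin-17m-crossnews")
print(model.max_seq_length)                      # 7999; longer inputs are truncated
print(model.get_sentence_embedding_dimension())  # 256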

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 7999, 'do_lower_case': False, 'architecture': 'ModernBertModel'})
  (1): Pooling({'word_embedding_dimension': 256, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
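
The Pooling module above performs attention-masked mean pooling over the token embeddings produced by the ModernBERT encoder. As a rough illustration of that step, the sketch below loads the underlying encoder with 🤗 Transformers and applies the same mean pooling by hand; it assumes the checkpoint loads directly via AutoModel, and the result should match model.encode only up to numerical details:

import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("gabrielloiseau/ettin-17m-crossnews")
encoder = AutoModel.from_pretrained("gabrielloiseau/ettin-17m-crossnews")

def mean_pool(last_hidden_state, attention_mask):
    # Zero out padding positions, then average the remaining token embeddings
    mask = attention_mask.unsqueeze(-1).to(last_hidden_state.dtype)
    return (last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

batch = tokenizer(["An example sentence."], padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    output = encoder(**batch)
embedding = mean_pool(output.last_hidden_state, batch["attention_mask"])
print(embedding.shape)  # torch.Size([1, 256])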

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("gabrielloiseau/ettin-17m-crossnews")
# Run inference
sentences = [
    'Chandigarh:  Devender Pal Singh  Bhullar, a convict in the 1993 Delhi bomb blast case, has withdrawn his plea seeking premature release from the Punjab and Haryana high court after the sentence review board (SRB) of National Capital Territory of Delhi (NCT) rejected his petition.\nHis counsel Vipul Jindal confirmed to TOI that they would challenge the SRB decision passed on January 19 through which his request for premature release was rejected by the Delhi government."Counsel for NCT has placed on record a copy of the January 19 letter of the home (general) department NCT, Delhi...In view of the order, counsel for the petitioner seeks permission to withdraw the present petition," Justice Jasjit Singh Bedi of the HC has observed in its order released on Wednesday.',
    'HARDA: Eleven people were killed and nearly 200 injured Tuesday morning in a series of explosions at a fireworks factory on the outskirts of  Harda  that was ordered closed in 2022. Sources said over 200 people worked in the factory, with 70-odd in the morning shift. But in the chaos and panic, their fate was unknown till Tuesday midnight.\nThree people including the factory owners - brothers  Rajesh  and  Somesh Agrawal  - and general manager  Rafiq Khan  have been arrested.Ironically, Rajesh was sentenced to 10 years\' imprisonment in 2021 after a blast killed two workers in 2015, but he challenged the verdict.\nThe three-storey factory was obliterated in blasts that went on for an hour, leaving behind a heap of concrete and rubble that was so hot that it was impossible to get near it till late in the night. The true toll may be known only in the morning. Govt officials claim the body count is unlikely to rise as most workers had "fled after the first blast".\n\n\nThis was a tragedy waiting to happen: a factory that shouldn\'t have been open at all, and run by an owner who was sentenced to jail for the death of two workers in a blast eight years ago.\nThe factory\'s licence was first suspended and later cancelled, but it kept running, allegedly under the patronage of powerful people. In Oct last year, an IAS officer inspected the factory and was shocked to find far more explosives stored than allowed. Action on the inspection report is still pending.\nThe factory has a history of deaths and failed inspections, yet it stayed in business until a catastrophic explosion reduced it to rubble, killing at least 11 people and injuring 200 on Feb 6.\nOn July 5, 2015, two people - identified as  Sheikh Iqbal , 27 and Rakesh, 21 had died in an explosion in a rented house where explosives were stored by Rajesh Agarwal, one of the owners of the factory. A Harda court sentenced him to 10 years\' imprisonment in 2021, but he filed an appeal and was out on bail.\nIn 2017-18, the then collector suspended the factory\'s licence after finding violation of norms. The suspension remained in effect for around six months - until the collector got transferred. Again, in 2021, three people died in a blast in a house in the same locality. Police registered a case, which is on.\nOn Sept 26, 2022, after an inspection by district officials, the collector suspended all licenses issued to the factory by the district administration. The collector also wrote to the Petroleum and Explosives Safety Organisation to cancel the two licences it had given.\nDuring the 2022 inspection, officials had found several alarming lapses - stocks of explosives beyond permissible limits and violation of security protocols. As per the licences issued to the factory owners, they were allowed to store only 15kg of explosives but officials found 7.5 lakh crackers.\nHowever, the factory was again up and running after owners Rajesh Agarwal and his brother  Somesh  approached the then Narmadapuram divisional commissioner, who ordered a stay on the collector\'s order until further orders. It failed yet another inspection in Oct 2023 but stayed in business, with deadly consequences.',
    '#MSMEs to play a key role in exporting defence equipment in coming years said @nawegate on the first day of #DefenceExpo #Maharashtra #MaharashtraMSMEDefenceExpo #Pune https://t.co/SvpwwpklpC',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 256)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.7542, 0.6583],
#         [0.7542, 1.0000, 0.5909],
#         [0.6583, 0.5909, 1.0000]])
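
Beyond pairwise similarity, the same embeddings support the semantic-search use case mentioned in the introduction. A small sketch using the library's util.semantic_search helper; the corpus and query strings below are invented for illustration:

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("gabrielloiseau/ettin-17m-crossnews")

corpus = [
    "A fireworks factory explosion killed several workers.",
    "An ambassador was recalled after a diplomatic incident.",
    "Mobile data usage is growing rapidly in the region.",
]
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)

query_embedding = model.encode("Blast at a cracker unit leaves casualties", convert_to_tensor=True)
# Rank the corpus by cosine similarity and keep the two best hits
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(corpus[hit["corpus_id"]], round(hit["score"], 3))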

Training Details

Training Dataset

Unnamed Dataset

  • Size: 1,122,160 training samples
  • Columns: text1, text2, and label
  • Approximate statistics based on the first 1000 samples:
                 text1                  text2                  label
    type         string                 string                 float
    details      min: 7 tokens          min: 5 tokens          min: 0.0
                 mean: 383.84 tokens    mean: 272.65 tokens    mean: 0.3
                 max: 7999 tokens       max: 7999 tokens       max: 1.0
  • Samples:
    Sample 1
      text1: The Africa group within the United Nations said it would never work again with the Romanian ambassador to Kenya in certain forums after he allegedly compared them to a monkey while attending a meeting in Nairobi. The group also demanded an unconditional and public apology to the people of Africa. Dragos Viorel Tigau was at the weekend recalled by the Romanian foreign ministry after complaints that followed a meeting at the UN building in Kenya's capital on 26 April when a monkey appeared at the window of the conference room. "The African group has joined us," Tigau allegedly said, according to a note from the South Sudanese embassy in Kenya. The Romanian foreign ministry said at the weekend it had only been informed of the incident this week and "began a procedure to recall its ambassador". "We deeply regret this situation and offer our apologies to all those who have been affected," it added, saying racist behaviour or comments were "absolutely unacceptable". The ambassador of South S...
      text2: Is it any wonder why Trump wants to terminate the Constitution? Could it be that Section 3 of the 14th Amendment bars anyone from holding office who "engaged in insurrection" against the United States? https://t.co/ToMoNG6Hal
      label: 0.0
    Sample 2
      text1: @KEdge23 Come back soon, you're the funniest person here
      text2: Lucknow: "Data usage in UP East circle is a great story," says the business head of a service provider adding that, "the people here are adopting newer and faster technology at a very faster pace and hence we are witnessing more than 100% YoY growth since the past two years". While in an interaction with TOI, Pankaj Thapliyal, business head UP(E) at Vodafone India says that the data story is interesting and is strengthening itself every single week. With the per capita usage anything around 450MB per month, people from the eastern part of UP are readily joining the data usage category. Thapliyal credits it to the increased use of social media platforms and the reach of exclusive stores like Vodafone Mini Store, which enables subscribers to clear their doubts and solve their technical problems. "The mini stores act as a go-to place to clear tech doubts. We regularly solve queries like installation of WhatsApp or Skype, their usage etc. People come to these stores to clear doubts and the...
      label: 0.0
    Sample 3
      text1: NEW DELHI - More than four million people in India, mostly Muslims, are at risk of being declared foreign migrants as the government pushes a hard-line Hindu nationalist agenda that has challenged the country's pluralist traditions and aims to redefine what it means to be Indian.
      text2: Local lad Ricky Donison emerged triumphant by a big margin in the final race of the Senior Max category in the fifth round of the 12th JK Tyre-FMSCI National Karting Championship at the Meco Kartopia on Sunday. The victory ensured Donison also took the overall championship and the individual awards for the maximum number of wins and pole positions. Donison, who finished with a timing of 18:09.290 in the 20-lap final race, finished ahead of Vishnu Prasad (18:17.966) and Nayan Chatterjee (18:22.488). Nikhil Bohra, who won his final race in the Micro Max category, couldn't prevent his rival Shahan Ali Mohsin from running away with the overall crown. In the Junior Max category, Mohammed Nallwala claimed the honours but it was local lad Akash Gowda who took the overall title. The overall champions of the Junior Max and Senior Max will participate in the World Karting finals to be held in Portugal this year. The top three overall champions in the Junior Max and Senior Max category will be s...
      label: 0.0
  • Loss: CosineSimilarityLoss with these parameters:
    {
        "loss_fct": "torch.nn.modules.loss.MSELoss"
    }
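
For context, CosineSimilarityLoss computes the cosine similarity of the two text embeddings and regresses it against the float label using the MSE loss listed above. A minimal sketch of how this loss is typically constructed (not the author's actual training code):

from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("jhu-clsp/ettin-encoder-17m")
# Predicts cosine(u, v) for each (text1, text2) pair and regresses it against
# the float label; loss_fct defaults to torch.nn.MSELoss()
train_loss = losses.CosineSimilarityLoss(model)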
    

Evaluation Dataset

Unnamed Dataset

  • Size: 480,927 evaluation samples
  • Columns: text1, text2, and label
  • Approximate statistics based on the first 1000 samples:
                 text1                  text2                  label
    type         string                 string                 float
    details      min: 5 tokens          min: 7 tokens          min: 0.0
                 mean: 400.25 tokens    mean: 270.56 tokens    mean: 0.34
                 max: 7999 tokens       max: 7999 tokens       max: 1.0
  • Samples:
    Sample 1
      text1: @vamelina ❤️ you do xxx
      text2: Many of our best presidents have been underestimated. Truman was seen as the tool of a corrupt political machine. Eisenhower was supposedly a bumbling middlebrow. Grant was thought a taciturn simpleton. Even F.D.R. was once considered a lightweight feather duster.
      label: 0.0
    Sample 2
      text1: Five of the best ... films 1 Better Watch Out (15) (Chris Peckover, 2017, US) 89 mins If you're already feeling ground down by the festive season, this could be what you're after: a smart, seasonal horror-comedy that's undemandingly entertaining yet full of surprises. It works best if you know nothing beyond the set-up: a precocious pre-teen (Levi Miller) feels this could be his lucky night with the long-fancied babysitter (Olivia DeJonge). 2 Happy End (15) (Michael Haneke, 2017, Fra/Aus/Ger) 107 mins Another snapshot of family dysfunction from the master of the genre, who folds social media and sociopathic tendencies into this study of a Calais dynasty, none of whom is without their secrets - from patriarch Jean-Louis Trintignant to business-minded daughter Isabelle Huppert. 3 The Disaster Artist (15) (James Franco, 2017, US) 103 mins A movie about "The Godfather of bad movies", The Room, whose cult status owes much to its astoundingly self-unaware director-writer-star, Tommy Wiseau. ...
      text2: @billhorton1 Is that in Glasgow ?
      label: 0.0
    Sample 3
      text1: Madhur Jaffrey is the accidental cook. "I have always been suspicious of my cookery career," she says, "in the sense that I feel it's not my real career. I can cook but I'm an actress." Indeed she is. Famed for winning the best actress award at the 1965 Berlin film festival for her performance as a haughty Bollywood star in Merchant Ivory's movie Shakespeare Wallah. But she is also very much a culinary trailblazer; a profound educator who took generations of western cooks gently by the hand and introduced them to the joys, subtleties, and regional variations of the India of her birth. Next month sees the publication of a gorgeous 40th anniversary edition of her seminal book Indian Cookery, updated to include 11 new recipes. The original was groundbreaking in so many ways. It accompanied a 1982 BBC sleeper series of the same name, made by the education department, and went on to sell hundreds of thousands of copies. In the process, it created the market for the TV tie-in. More important...
      text2: WASHINGTON - The hurricane was accelerating away from the Mid-Atlantic coast. In the Bahamas, victims were picking through the devastation. In the Southeast, they were cleaning up debris. And in Washington, President Trump waged war over his forecasting skills.
      label: 0.0
  • Loss: CosineSimilarityLoss with these parameters:
    {
        "loss_fct": "torch.nn.modules.loss.MSELoss"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • learning_rate: 2e-05
  • num_train_epochs: 1
  • warmup_ratio: 0.1
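
Assembled into a run, these non-default values would look roughly like the sketch below. This is not the actual training script: the dataset is a one-row placeholder standing in for the 1,122,160-pair dataset described above, and output_dir is invented:

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    losses,
)

model = SentenceTransformer("jhu-clsp/ettin-encoder-17m")
# One-row placeholder with the card's column names (text1, text2, label)
train_dataset = Dataset.from_dict(
    {"text1": ["first text"], "text2": ["second text"], "label": [0.0]}
)

args = SentenceTransformerTrainingArguments(
    output_dir="ettin-17m-crossnews",  # illustrative path
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    learning_rate=2e-5,
    num_train_epochs=1,
    warmup_ratio=0.1,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=losses.CosineSimilarityLoss(model),
)
trainer.train()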

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss
0.0007 50 0.689
0.0014 100 0.6941
0.0021 150 0.6675
0.0029 200 0.6739
0.0036 250 0.6503
0.0043 300 0.6551
0.0050 350 0.5878
0.0057 400 0.5235
0.0064 450 0.4818
0.0071 500 0.4282
0.0078 550 0.4375
0.0086 600 0.4258
0.0093 650 0.3998
0.0100 700 0.4103
0.0107 750 0.4026
0.0114 800 0.3772
0.0121 850 0.3745
0.0128 900 0.3481
0.0135 950 0.3231
0.0143 1000 0.2929
0.0150 1050 0.2499
0.0157 1100 0.2472
0.0164 1150 0.2222
0.0171 1200 0.2166
0.0178 1250 0.2168
0.0185 1300 0.2088
0.0192 1350 0.2077
0.0200 1400 0.2
0.0207 1450 0.2065
0.0214 1500 0.2121
0.0221 1550 0.1886
0.0228 1600 0.2015
0.0235 1650 0.1921
0.0242 1700 0.1979
0.0250 1750 0.1839
0.0257 1800 0.177
0.0264 1850 0.185
0.0271 1900 0.1787
0.0278 1950 0.1704
0.0285 2000 0.1682
0.0292 2050 0.185
0.0299 2100 0.1709
0.0307 2150 0.1748
0.0314 2200 0.1656
0.0321 2250 0.1835
0.0328 2300 0.1674
0.0335 2350 0.1745
0.0342 2400 0.1767
0.0349 2450 0.1676
0.0356 2500 0.1655
0.0364 2550 0.1716
0.0371 2600 0.1637
0.0378 2650 0.1709
0.0385 2700 0.1689
0.0392 2750 0.16
0.0399 2800 0.1664
0.0406 2850 0.1753
0.0413 2900 0.1754
0.0421 2950 0.1696
0.0428 3000 0.156
0.0435 3050 0.153
0.0442 3100 0.1585
0.0449 3150 0.1556
0.0456 3200 0.1688
0.0463 3250 0.1543
0.0471 3300 0.1674
0.0478 3350 0.1514
0.0485 3400 0.1538
0.0492 3450 0.1514
0.0499 3500 0.161
0.0506 3550 0.1586
0.0513 3600 0.1564
0.0520 3650 0.1506
0.0528 3700 0.1679
0.0535 3750 0.1583
0.0542 3800 0.1621
0.0549 3850 0.1464
0.0556 3900 0.144
0.0563 3950 0.1506
0.0570 4000 0.1638
0.0577 4050 0.1596
0.0585 4100 0.158
0.0592 4150 0.1569
0.0599 4200 0.1566
0.0606 4250 0.1621
0.0613 4300 0.1461
0.0620 4350 0.1635
0.0627 4400 0.1696
0.0634 4450 0.16
0.0642 4500 0.1551
0.0649 4550 0.16
0.0656 4600 0.1536
0.0663 4650 0.1518
0.0670 4700 0.1448
0.0677 4750 0.1614
0.0684 4800 0.1543
0.0692 4850 0.1453
0.0699 4900 0.1457
0.0706 4950 0.1583
0.0713 5000 0.1457
0.0720 5050 0.1446
0.0727 5100 0.1428
0.0734 5150 0.1472
0.0741 5200 0.1617
0.0749 5250 0.1531
0.0756 5300 0.1552
0.0763 5350 0.1388
0.0770 5400 0.1497
0.0777 5450 0.155
0.0784 5500 0.1518
0.0791 5550 0.1563
0.0798 5600 0.1543
0.0806 5650 0.1501
0.0813 5700 0.1366
0.0820 5750 0.1472
0.0827 5800 0.139
0.0834 5850 0.1599
0.0841 5900 0.1439
0.0848 5950 0.1454
0.0855 6000 0.1346
0.0863 6050 0.1419
0.0870 6100 0.1408
0.0877 6150 0.1381
0.0884 6200 0.1578
0.0891 6250 0.1467
0.0898 6300 0.1393
0.0905 6350 0.1478
0.0913 6400 0.1514
0.0920 6450 0.153
0.0927 6500 0.1543
0.0934 6550 0.1341
0.0941 6600 0.1471
0.0948 6650 0.1393
0.0955 6700 0.1423
0.0962 6750 0.1555
0.0970 6800 0.1368
0.0977 6850 0.1391
0.0984 6900 0.1532
0.0991 6950 0.1527
0.0998 7000 0.1417
0.1005 7050 0.1339
0.1012 7100 0.1414
0.1019 7150 0.1526
0.1027 7200 0.1327
0.1034 7250 0.1354
0.1041 7300 0.1388
0.1048 7350 0.1512
0.1055 7400 0.1473
0.1062 7450 0.1399
0.1069 7500 0.1509
0.1076 7550 0.1337
0.1084 7600 0.1433
0.1091 7650 0.1384
0.1098 7700 0.1519
0.1105 7750 0.1463
0.1112 7800 0.1447
0.1119 7850 0.1462
0.1126 7900 0.1479
0.1134 7950 0.1487
0.1141 8000 0.1414
0.1148 8050 0.1434
0.1155 8100 0.145
0.1162 8150 0.1379
0.1169 8200 0.144
0.1176 8250 0.1493
0.1183 8300 0.1368
0.1191 8350 0.1436
0.1198 8400 0.1351

Framework Versions

  • Python: 3.12.11
  • Sentence Transformers: 5.0.0
  • Transformers: 4.55.0
  • PyTorch: 2.7.1+cu126
  • Accelerate: 1.12.0
  • Datasets: 4.0.0
  • Tokenizers: 0.21.4

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}