---
base_model: BAAI/bge-large-en
datasets: []
language: []
library_name: sentence-transformers
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:626
- loss:CosineSimilarityLoss
widget:
- source_sentence: What determines the completion of performance of the contract?
  sentences:
  - In a tender/contract, in case of any difference, contradiction, discrepancy, with regard to conditions of tender/contract, specifications, drawings, bill of quantities etc.
  - The Contractor shall at all times during the progress and continuance of the works and also for the period of maintenance specified in the Tender Form
  - What determines the completion of performance of the contract?
- source_sentence: Early completion bonus
  sentences:
  - In case of ambiguity, order of precedence shall be referred.
  - Contractor shall be entitled for a bonus of 1% for each 30 days early completion of work.
  - The Railway shall have the right to let other contracts in connection with the works. The Contractor shall afford other Contractors reasonable opportunity for the storage of their materials and the execution of their works and shall properly connect and coordinate his work with theirs. If any part of the Contractor's work depends upon proper execution or result upon the work of another Contractor(s), the Contractor shall inspect and promptly report to the Engineer any defects in such works that render it unsuitable for such proper execution and results. The Contractor's failure so to inspect and report shall constitute an acceptance of the other Contractor's work as fit and proper for the reception of his work, except as to defects which may develop in the other Contractor's work after the execution of his work.
- source_sentence: Out of scope works
  sentences:
  - as to execution or quality of any work or material, or as to the measurements of the works the decision of the Engineer thereon shall be final subject to the appeal (within 7 days of such decision being intimated to the Contractor) to the Chief Engineer
  - Should works over and above those included in the contract require to be executed at the site, the Contractor shall have no right to be entrusted with the execution of such works which may be carried out by another Contractor or Contractors or by other means at the option of the Railway.
  - What is the order of precedence in the case of ambiguity between drawings and technical specifications?
- source_sentence: Deadline
  sentences:
  - shall be read in conjunction with the Standard General Conditions of Contract which are referred to herein and shall be subject to modifications additions or suppression by Special Conditions of Contract and/or Special Specifications, if any, annexed to the Tender Forms.
  - the sand, stone, clay ballast, earth, trees, rock
  - not later than 30 days after the date of receipt
- source_sentence: Can the stones/rocks/bounders obtained during excavation be used for construction if found technically satisfactory?
  sentences:
  - use the same for the purpose of the works either free of cost or pay the cost
  - Any material found during excavation should be reported to the engineer.
  - No certificate other than Maintenance Certificate, if applicable, referred to in Clause 50 of the Conditions shall be deemed to constitute approval
---

# SentenceTransformer based on BAAI/bge-large-en

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-large-en](https://huggingface.co./BAAI/bge-large-en). It maps sentences and paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description

- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-large-en](https://huggingface.co./BAAI/bge-large-en)
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co./models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Ananthu357/Ananthus-BAAI-for-contracts10.0")
# Run inference
sentences = [
    'Can the stones/rocks/bounders obtained during excavation be used for construction if found technically satisfactory?',
    'use the same for the purpose of the works either free of cost or pay the cost',
    'No certificate other than Maintenance Certificate, if applicable, referred to in Clause 50 of the Conditions shall be deemed to constitute approval',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
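Because the intended use is semantic search over contract clauses, the following minimal retrieval sketch shows how the embeddings can rank candidate clauses against a query. It uses only the API demonstrated above; the query and clause texts are taken from the widget examples and are purely illustrative.

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Ananthu357/Ananthus-BAAI-for-contracts10.0")

# Illustrative query and candidate clauses (taken from the widget examples above)
query = "Early completion bonus"
clauses = [
    "Contractor shall be entitled for a bonus of 1% for each 30 days early completion of work.",
    "In case of ambiguity, order of precedence shall be referred.",
    "not later than 30 days after the date of receipt",
]

# Embeddings are L2-normalized by the Normalize module, so cosine similarity is appropriate
query_embedding = model.encode(query)
clause_embeddings = model.encode(clauses)

# similarity() applies the model's similarity function (cosine) -> tensor of shape [1, len(clauses)]
scores = model.similarity(query_embedding, clause_embeddings)
best = int(scores.argmax())
print(f"Best match ({float(scores[0, best]):.3f}): {clauses[best]}")
```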
## Training Details

### Training Hyperparameters

#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `num_train_epochs`: 15
- `warmup_ratio`: 0.1
- `fp16`: True
- `batch_sampler`: no_duplicates

#### All Hyperparameters

<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 15
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>
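The tags and hyperparameters above indicate the model was fine-tuned on 626 sentence pairs with `CosineSimilarityLoss`. The sketch below shows how such a run could look with the Sentence Transformers v3 trainer; the dataset contents and column names (`sentence1`, `sentence2`, `score`) are placeholders, since the actual training data is not published with this card, and the 100-step logging/evaluation interval is inferred from the training logs below.

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CosineSimilarityLoss
from sentence_transformers.training_args import BatchSamplers

# Placeholder pairs; the actual 626-pair contract dataset behind this card is not published.
# The column names "sentence1"/"sentence2"/"score" are assumptions for illustration.
train_dataset = Dataset.from_dict({
    "sentence1": ["Early completion bonus", "Out of scope works"],
    "sentence2": [
        "Contractor shall be entitled for a bonus of 1% for each 30 days early completion of work.",
        "In case of ambiguity, order of precedence shall be referred.",
    ],
    "score": [1.0, 0.0],
})
eval_dataset = train_dataset  # placeholder; a real run would use a held-out split

model = SentenceTransformer("BAAI/bge-large-en")
loss = CosineSimilarityLoss(model)  # matches the loss:CosineSimilarityLoss tag

args = SentenceTransformerTrainingArguments(
    output_dir="bge-large-en-contracts",  # hypothetical output path
    num_train_epochs=15,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    warmup_ratio=0.1,
    fp16=True,  # as in the card; requires a CUDA GPU
    eval_strategy="steps",
    eval_steps=100,     # inferred from the 100-step interval in the training logs
    logging_steps=100,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
)
trainer.train()
trainer.save_model("bge-large-en-contracts/final")
```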
### Training Logs

| Epoch | Step | Training Loss | Validation Loss |
|:-----:|:----:|:-------------:|:---------------:|
| 2.5   | 100  | 0.0568        | 0.1144          |
| 5.0   | 200  | 0.0099        | 0.0947          |
| 7.5   | 300  | 0.0039        | 0.1039          |
| 10.0  | 400  | 0.0021        | 0.1027          |
| 12.5  | 500  | 0.0014        | 0.1017          |
| 15.0  | 600  | 0.0012        | 0.1019          |

### Framework Versions

- Python: 3.10.12
- Sentence Transformers: 3.0.1
- Transformers: 4.42.4
- PyTorch: 2.3.1+cu121
- Accelerate: 0.32.1
- Datasets: 2.21.0
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```