How to use Serega6678/Test_with_new_script with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model, then attach the LoRA adapter weights on top of it
base_model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")
model = PeftModel.from_pretrained(base_model, "Serega6678/Test_with_new_script")
```

This model is a fine-tuned version of Serega6678/M7Bv01_SFT_50pct_LORA_BF16_FA_2H100 on the HuggingFaceH4/ultrachat_200k dataset. It achieves the following results on the evaluation set:
- Loss: 1.0115
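Since the adapter was fine-tuned on UltraChat-style conversations, prompts at inference time should follow a chat template. As a minimal sketch only: the exact template is not stated in this card, the `<|user|>`/`<|assistant|>` markup below is the Zephyr-style convention commonly paired with HuggingFaceH4/ultrachat_200k, and `format_chat` is a hypothetical helper, not part of this repository:

```python
def format_chat(messages):
    """Render a list of {"role": ..., "content": ...} dicts into a
    Zephyr-style prompt string, ending with the assistant tag so the
    model continues as the assistant. The template is an assumption,
    not confirmed by this model card."""
    prompt = ""
    for message in messages:
        prompt += f"<|{message['role']}|>\n{message['content']}</s>\n"
    return prompt + "<|assistant|>\n"

print(format_chat([{"role": "user", "content": "Hello!"}]))
```

The resulting string can be tokenized and passed to `model.generate` in the usual way.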
Model description: More information needed

Intended uses & limitations: More information needed

Training and evaluation data: More information needed
Training results:
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 1.0177 | 1.0 | 907 | 1.0115 |
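Since the loss here is the standard causal-language-modeling cross-entropy, the validation loss can be read as a perplexity via exp(loss). A quick check on the table's value:

```python
import math

val_loss = 1.0115  # validation loss from the training results table
perplexity = math.exp(val_loss)
print(round(perplexity, 2))  # roughly 2.75
```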
Base model: mistralai/Mistral-7B-v0.1