jmprcp committed on
Commit c133ac7 · verified · 1 Parent(s): 56e7013

Update README.md

Files changed (1)
  1. README.md +0 -9
README.md CHANGED
@@ -55,15 +55,6 @@ When using the model, make sure your prompt is formated correctly!
 
  Also, we recommend using VLLM rather than Hugging Face.
 
- ## Prompt Format:
-
- ```python
- chat_template = "{% for message in messages %}{% if message['role'] == 'assistant' %}{% set role = 'model' %}{% else %}{% set role = message['role'] %}{% endif %}<start_of_turn>{{ role }}\n{{ message['content'] | trim }}<end_of_turn>\n{% endfor %}{% if add_generation_prompt %}{{'<start_of_turn>model\n'}}{% endif %}"
- with_system_prompt = "<start_of_turn>system\nYou are a helpful assistant.<end_of_turn>\n<start_of_turn>user\nTranslate: Hello, world! into Portuguese.<end_of_turn>\n"
- # System prompts are optional.
- without_system_prompt = "<start_of_turn>user\nTranslate: Hello, world! into Portuguese.<end_of_turn>\n"
- ```
-
  ### Using on VLLM:
 
  ```python
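
The `Using on VLLM` code block is cut off at the end of the hunk above. For reference, the prompt format that this commit removes from the README can still be exercised directly with Jinja2. The snippet below is a minimal sketch written for this page, not code from the repository; with `transformers` you would instead assign the string to `tokenizer.chat_template` and call `tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)`.

```python
# Minimal sketch (not part of the commit): render the chat template that this
# commit removes from the README, using Jinja2 directly so no model files are needed.
from jinja2 import Template

chat_template = (
    "{% for message in messages %}"
    "{% if message['role'] == 'assistant' %}{% set role = 'model' %}"
    "{% else %}{% set role = message['role'] %}{% endif %}"
    "<start_of_turn>{{ role }}\n{{ message['content'] | trim }}<end_of_turn>\n"
    "{% endfor %}"
    "{% if add_generation_prompt %}{{'<start_of_turn>model\n'}}{% endif %}"
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},  # system prompts are optional
    {"role": "user", "content": "Translate: Hello, world! into Portuguese."},
]

prompt = Template(chat_template).render(messages=messages, add_generation_prompt=True)
print(prompt)
# <start_of_turn>system
# You are a helpful assistant.<end_of_turn>
# <start_of_turn>user
# Translate: Hello, world! into Portuguese.<end_of_turn>
# <start_of_turn>model
```

The rendered string matches the `with_system_prompt` example from the removed section, plus the trailing `<start_of_turn>model` generation prompt.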
 
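Likewise, a call through vLLM, which the README recommends over plain Hugging Face inference, might look roughly like the sketch below. This is not the README's own snippet (that code is not visible in this hunk), and the model id is a placeholder.

```python
# Hypothetical sketch: generate with vLLM from a prompt that is already formatted
# with the chat template. "org/model-name" is a placeholder, not the real repo id.
from vllm import LLM, SamplingParams

llm = LLM(model="org/model-name")
params = SamplingParams(temperature=0.0, max_tokens=128)

# Prompt formatted as in the removed README section (no system prompt),
# followed by the generation prompt for the model turn.
prompt = (
    "<start_of_turn>user\n"
    "Translate: Hello, world! into Portuguese.<end_of_turn>\n"
    "<start_of_turn>model\n"
)

outputs = llm.generate([prompt], params)
print(outputs[0].outputs[0].text)
```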