llama.cpp command and prompt file for running Vicuna
./main --color --threads 7 --batch_size 256 --n_predict -1 --top_k 12 --top_p 1 \
  --temp 0.36 --repeat_penalty 1.05 --ctx_size 2048 --instruct \
  --reverse-prompt "### Human:" \
  --model ./models/13B/ggml-vicuna-13b-4bit.bin \
  -f prompts/vicuna.txt

# prompts/vicuna.txt must exist before running ./main above. Double quotes keep
# the apostrophe in "human's" from ending the string early.
printf "A chat between a curious human and an artificial intelligence assistant. \
The assistant gives helpful, detailed, and polite answers to the human's questions." > prompts/vicuna.txt
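
For a single question without an interactive session, the same sampling settings can be reused with an inline prompt instead of --instruct. The sketch below is not part of the original gist: the question is made up, --n_predict is capped at 256 so the run ends on its own, and the flag spellings are assumed to match the same llama.cpp build used above.

#!/usr/bin/env bash
# One-shot sketch (assumption, not from the gist): build the Vicuna-style prompt
# inline with -p, using the system prompt written by the printf above plus a
# single "### Human:" turn, and cap generation so the run terminates by itself.
set -euo pipefail

SYSTEM=$(cat prompts/vicuna.txt)                     # written by the printf above
QUESTION="How does 4-bit quantization affect answer quality?"   # hypothetical question

./main --color --threads 7 --batch_size 256 --ctx_size 2048 \
  --temp 0.36 --repeat_penalty 1.05 --top_k 12 --top_p 1 --n_predict 256 \
  --model ./models/13B/ggml-vicuna-13b-4bit.bin \
  -p "$SYSTEM
### Human: $QUESTION
### Assistant:"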