@tlrmchlsmth
Created July 15, 2024 15:43
Simple sweep over request rates, for running e2e benchmarks for vLLM
#!/bin/bash

# Default values
model="meta-llama/Meta-Llama-3-8B"
port=8192
num_prompts=500

# TODO: Add other datasets
dataset_path="ShareGPT_V3_unfiltered_cleaned_split.json"
dataset_url="https://huggingface.co/datasets/anon8231489123/ShareGPT_Vicuna_unfiltered/resolve/main/$dataset_path"

# Parse command-line arguments for model, port, and num-prompts
while getopts m:p:n: flag; do
    case "${flag}" in
        m) model=${OPTARG};;
        p) port=${OPTARG};;
        n) num_prompts=${OPTARG};;
        *) echo "Usage: $0 [-m model] [-p port] [-n num_prompts]" >&2; exit 1;;
    esac
done

# Check that the vLLM OpenAI-compatible server is up by probing /v1/models
check_server() {
    response=$(curl -s -o /dev/null -w "%{http_code}" "http://localhost:$port/v1/models")
    if [ "$response" -ne 200 ]; then
        echo "OpenAI server is not running. To start the server, use the following command:"
        echo "python3 -m vllm.entrypoints.openai.api_server --model $model --disable-log-requests --port $port"
        exit 1
    fi
}

# Download the dataset if it is not already present
if [ ! -f "$dataset_path" ]; then
    echo "$dataset_path does not exist. Downloading..."
    if ! wget "$dataset_url" -O "$dataset_path"; then
        echo "Failed to download $dataset_path. Exiting."
        exit 1
    fi
fi

# Check if the server is running
check_server

# Sweep request rates from 1 to 10 requests per second
for (( rr = 1; rr <= 10; rr++ )); do
    python benchmarks/benchmark_serving.py \
        --backend openai \
        --model "$model" \
        --dataset-path "$dataset_path" \
        --request-rate "$rr" \
        --num-prompts "$num_prompts" \
        --port "$port"
done
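The option handling above is the standard `getopts` pattern: each flag letter followed by `:` takes an argument, which lands in `OPTARG`. A minimal standalone sketch of the same parsing (with the script's default values) that can be run and inspected in isolation:

```shell
#!/bin/bash
# Standalone demo of the getopts pattern used in the sweep script.
# Defaults mirror the script above.
model="meta-llama/Meta-Llama-3-8B"
port=8192
num_prompts=500

# m:, p:, n: each declare a flag that consumes one argument (OPTARG).
while getopts m:p:n: flag; do
    case "${flag}" in
        m) model=${OPTARG};;
        p) port=${OPTARG};;
        n) num_prompts=${OPTARG};;
        *) echo "Usage: $0 [-m model] [-p port] [-n num_prompts]" >&2; exit 1;;
    esac
done

echo "model=$model port=$port num_prompts=$num_prompts"
```

A typical invocation of the sweep script itself (assuming it is saved as, say, `sweep.sh` — the filename is not fixed by the gist) would be `bash sweep.sh -m meta-llama/Meta-Llama-3-8B -p 8192 -n 500`, run after starting the vLLM server with the command the script prints.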