
@awni
Created June 7, 2025 19:25
MLX LM + OpenAI Client

First install the dependencies:

pip install mlx-lm openai

Then start the server (it serves an OpenAI-compatible API on localhost:8080 by default):

mlx_lm.server
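The default port matches the base_url used in the client example below. If you want to pin a particular model or port up front, the server takes command-line flags for that (a sketch, assuming the --model and --port options of recent mlx-lm releases; check mlx_lm.server --help on your install):

```shell
# Serve a specific model on a specific port; adjust both to taste.
mlx_lm.server --model mlx-community/qwen3-4b-4bit-DWQ --port 8080
```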

And now you can use the OpenAI client like so:

from openai import OpenAI

# Point the client at the local server. Any string works as the API key,
# since mlx_lm.server does not authenticate requests.
client = OpenAI(
    base_url="http://localhost:8080/v1",
    api_key="not-needed",
)

response = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "How many letter r are in strawberry?",
        }
    ],
    # Any model mlx-lm can load; it is fetched from Hugging Face on first use.
    model="mlx-community/qwen3-4b-4bit-DWQ",
)
print(response.choices[0].message.content)
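Because mlx_lm.server exposes an OpenAI-compatible HTTP API, the openai package is optional; here is a minimal sketch using only the Python standard library. The endpoint path and payload shape mirror the client example above (the helper names are my own, not part of mlx-lm):

```python
import json
import urllib.request

def build_payload(prompt, model):
    # The request body follows the OpenAI chat completions format.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(prompt, model="mlx-community/qwen3-4b-4bit-DWQ",
         url="http://localhost:8080/v1/chat/completions"):
    """POST one chat request to the local server and return the reply text."""
    data = json.dumps(build_payload(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# With the server running:
# print(chat("How many letter r are in strawberry?"))
```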