@dcalsky
Created August 30, 2024 08:19
apple mlx for Phi-3.5-mini-instruct
from mlx_lm import load, generate

# Load the model and tokenizer. Phi-3.5's eos_token must be set manually in
# tokenizer_config, and trust_remote_code is needed for its custom tokenizer.
model, tokenizer = load(
    "microsoft/Phi-3.5-mini-instruct",
    tokenizer_config={"eos_token": "<|end|>", "trust_remote_code": True},
)

messages = [
    {"role": "user", "content": "Hello, how are you?"},
]

# Render the chat messages into the model's prompt format.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# verbose=True streams the generated text to stdout as it is produced.
response = generate(model, tokenizer, prompt=prompt, verbose=True, max_tokens=1024)
dcalsky commented Aug 30, 2024

As of 2024/08/30, mlx doesn't pick up the eos_token for Phi-3.5 automatically, so we have to set it ourselves in tokenizer_config.
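For context, here is a rough sketch of the prompt string that apply_chat_template is expected to render for Phi-3.5's chat format, and why `<|end|>` is the right eos_token: each turn is closed by `<|end|>`, so generation should stop when the model emits it. The exact template comes from the model's tokenizer config; this helper is only an illustration based on the Phi-3 family's documented format, not part of mlx_lm.

```python
# Hypothetical illustration of the Phi-3.5 chat prompt layout
# (assumption: same <|role|> ... <|end|> structure as Phi-3).
def build_phi35_prompt(messages, add_generation_prompt=True):
    parts = []
    for m in messages:
        # Each message is wrapped as <|role|>\n{content}<|end|>\n
        parts.append(f"<|{m['role']}|>\n{m['content']}<|end|>\n")
    if add_generation_prompt:
        # Cue the model to answer as the assistant.
        parts.append("<|assistant|>\n")
    return "".join(parts)

prompt = build_phi35_prompt([{"role": "user", "content": "Hello, how are you?"}])
print(prompt)
# <|user|>
# Hello, how are you?<|end|>
# <|assistant|>
```

Because every turn ends with `<|end|>`, setting it as eos_token makes generate() stop at the end of the assistant's reply instead of running on.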
