Created
August 30, 2024 08:19
Apple MLX for Phi-3.5-mini-instruct
from mlx_lm import load, generate

# Override the EOS token in the tokenizer config; without it, generation
# for Phi-3.5 does not stop at <|end|>.
model, tokenizer = load(
    "microsoft/Phi-3.5-mini-instruct",
    tokenizer_config={"eos_token": "<|end|>", "trust_remote_code": True},
)

messages = [
    {"role": "user", "content": "Hello, how are you?"},
]

# Render the chat messages into a plain prompt string and append the
# generation prompt so the model answers as the assistant.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
response = generate(model, tokenizer, prompt=prompt, verbose=True, max_tokens=1024)
As of 2024/08/30, mlx-lm does not set the correct eos_token for Phi-3.5, so we have to set it in the tokenizer_config ourselves.
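To see why the <|end|> token matters, it helps to look at what apply_chat_template produces. The sketch below hand-rolls a Phi-3-style prompt from the same messages list, assuming the <|user|>/<|assistant|>/<|end|> layout used by the Phi-3 family; it is an illustration of the format, not the real tokenizer's template.

```python
# A minimal sketch of Phi-3-style chat prompt rendering (assumed layout,
# not the actual tokenizer template shipped with the model).
def render_phi3_prompt(messages):
    parts = []
    for m in messages:
        # Each turn is wrapped in a role tag and terminated by <|end|>,
        # which is why <|end|> must be the EOS token at generation time.
        parts.append(f"<|{m['role']}|>\n{m['content']}<|end|>\n")
    # The generation prompt: the model continues from the assistant tag.
    parts.append("<|assistant|>\n")
    return "".join(parts)

messages = [{"role": "user", "content": "Hello, how are you?"}]
print(render_phi3_prompt(messages))
```

Because every assistant turn in training ended with <|end|>, the model emits that token when it finishes its reply; if the tokenizer's eos_token is something else, generate keeps sampling past the end of the answer until max_tokens is hit.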