@vrushankportkey
Last active October 26, 2023 07:27
Langchain <> Anyscale <> Portkey
import os
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage, SystemMessage

# Route OpenAI-compatible calls through Portkey's proxy
os.environ["OPENAI_API_BASE"] = "https://api.portkey.ai/v1/proxy"

anyscale_key = "..."

# Portkey headers: authenticate with Portkey and proxy the request to Anyscale
headers = {
    "x-portkey-api-key": "...",
    "x-portkey-mode": "proxy anyscale",
}

chat = ChatOpenAI(openai_api_key=anyscale_key, headers=headers, model="meta-llama/Llama-2-7b-chat-hf")

messages = [
    SystemMessage(content="You are a helpful assistant"),
    HumanMessage(content="Who is Doraemon?"),
]

# Capture the message returned by the model and print its content
response = chat(messages)
print(response.content)
@roh26it commented Oct 26, 2023

Can we do it like this?

headers = Portkey.Config(
    api_key = "<PORTKEY_API_KEY>",
    trace_id = "fef659"
)
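
For context, a minimal sketch of what that suggestion would look like end to end, assuming the Portkey.Config helper from langchain.utilities returns the header mapping that ChatOpenAI's headers argument expects (whether it applies here is addressed in the replies below, since that integration manages the OpenAI API base itself):

from langchain.utilities import Portkey

# Sketch only: Portkey.Config standing in for the hand-written header dict above
headers = Portkey.Config(
    api_key = "<PORTKEY_API_KEY>",
    trace_id = "fef659"
)
chat = ChatOpenAI(openai_api_key=anyscale_key, headers=headers, model="meta-llama/Llama-2-7b-chat-hf")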

@roh26it commented Oct 26, 2023

Also, why do we need to talk about trace_id here?

@vrushankportkey (Author) commented

Yeah 🤔 removing trace_id, it's not needed.

We can't use Portkey's LangChain integration here because it sets the OpenAI API base upstream in the Portkey module, and that can't be changed.
