
remote ollama support via API #8

Open
adatepitesz opened this issue Apr 17, 2024 · 1 comment
Labels
enhancement New feature or request

Comments

@adatepitesz

I would like to use Ollama as the LLM provider, but hosted on a remote cloud provider. I was unable to find where to put the URL for this. Please provide some instructions on what to modify.

@jaluoma
Owner
jaluoma commented Apr 24, 2024

This is not supported, I'm afraid, but it should definitely be a feature. It looks like it's possible.

https://api.python.langchain.com/en/latest/chat_models/langchain_community.chat_models.ollama.ChatOllama.html

I believe

chat = ChatOllama(model=default_model, temperature=0)

should be:

chat = ChatOllama(model=default_model, temperature=0, base_url=custom_url)

with custom_url set to your remote model endpoint.
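For reference, a rough, untested sketch of how that could be wired up, assuming a recent langchain-community install and a reachable remote Ollama server; the model name and URL below are placeholders, not values from this repo:

```python
# Rough sketch, not tested against this repo. Assumes langchain-community is
# installed and a remote Ollama server is reachable at the placeholder URL.
from langchain_community.chat_models import ChatOllama

default_model = "llama2"                    # placeholder: any model pulled on the remote server
custom_url = "http://my-ollama-host:11434"  # placeholder: your remote Ollama endpoint

chat = ChatOllama(model=default_model, temperature=0, base_url=custom_url)

# Quick check that the remote server responds
reply = chat.invoke("Reply with a single short sentence.")
print(reply.content)
```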

@jaluoma jaluoma added the enhancement New feature or request label Apr 24, 2024