ollama not looking for model in remote (VS Code tunnel) #3552

Open · 3 tasks done
jasperhyp opened this issue Dec 27, 2024 · 2 comments
Assignees: sestinj
Labels: area:configuration (Relates to configuration options), ide:vscode (Relates specifically to VS Code extension), kind:bug (Indicates an unexpected problem or unintended behavior), priority:medium (Indicates medium priority)

Comments


jasperhyp commented Dec 27, 2024

Before submitting your bug report

Relevant environment info

- OS: (local) Windows 11; (remote) CentOS Linux 7
- Continue version: 0.8.66
- IDE version: VS Code 1.96.2
- Model: ollama
- config.json:
  
{
  "models": [
    {
      "title": "Claude 3.5 Sonnet (Free Trial)",
      "provider": "free-trial",
      "model": "claude-3-5-sonnet-latest",
      "systemMessage": "You are an expert software developer. You give helpful and concise responses."
    },
    {
      "title": "GPT-4o (Free Trial)",
      "provider": "free-trial",
      "model": "gpt-4o",
      "systemMessage": "You are an expert software developer. You give helpful and concise responses."
    },
    {
      "title": "Llama3.1 70b (Free Trial)",
      "provider": "free-trial",
      "model": "llama3.1-70b",
      "systemMessage": "You are an expert software developer. You give helpful and concise responses."
    },
    {
      "title": "Codestral (Free Trial)",
      "provider": "free-trial",
      "model": "codestral-latest",
      "systemMessage": "You are an expert software developer. You give helpful and concise responses."
    },
    {
      "model": "claude-3-5-sonnet-latest",
      "provider": "anthropic",
      "apiKey": "",
      "title": "Claude 3.5 Sonnet"
    },
    {
      "title": "Gemini 1.5 Pro",
      "model": "gemini-1.5-pro-latest",
      "contextLength": 2000000,
      "apiKey": "AIzaSyDvIX92W6yMj-T6kQ8DvlsJ5RVGc8Q4tgw",
      "provider": "gemini"
    },
    {
      "title": "deepseek-coder-v2-16b",
      "provider": "ollama",
      "model": "deepseek-coder-v2"
    },
    {
      "title": "qwen2.5-coder-32b",
      "provider": "ollama",
      "model": "qwen2.5-coder:32b"
    },
    {
      "title": "qwq",
      "provider": "ollama",
      "model": "qwq"
    },
    {
      "title": "llama3.2",
      "provider": "ollama",
      "model": "llama3.2"
    },
    {
      "title": "Qwen 2.5 Coder 32b",
      "model": "qwen2.5-coder-32b",
      "provider": "ollama"
    },
    {
      "model": "AUTODETECT",
      "title": "Autodetect",
      "provider": "ollama"
    }
  ],
  "tabAutocompleteModel": {
    "title": "qwen2.5-coder-32b",
    "provider": "ollama",
    "model": "qwen2.5-coder:32b"
  },
  "contextProviders": [
    {
      "name": "code",
      "params": {}
    },
    {
      "name": "docs",
      "params": {}
    },
    {
      "name": "diff",
      "params": {}
    },
    {
      "name": "terminal",
      "params": {}
    },
    {
      "name": "problems",
      "params": {}
    },
    {
      "name": "folder",
      "params": {}
    },
    {
      "name": "codebase",
      "params": {}
    }
  ],
  "slashCommands": [
    {
      "name": "share",
      "description": "Export the current chat session to markdown"
    },
    {
      "name": "cmd",
      "description": "Generate a shell command"
    },
    {
      "name": "commit",
      "description": "Generate a git commit message"
    }
  ],
  "embeddingsProvider": {
    "provider": "ollama",
    "model": "nomic-embed-text"
  },
  "reranker": {
    "name": "free-trial"
  }
}

Description

I am currently using a VS Code tunnel to access a remote cluster. However, the Ollama models configured in the Continue plugin (installed on the cluster) do not connect to the Ollama instance running on the cluster. When Ollama is running locally, Continue connects to my local instance instead, which is undesirable because my local machine lacks GPU compute.

Message:

Failed to connect to local Ollama instance, please ensure Ollama is both installed and running. You can download Ollama from https://ollama.ai.

Port info:

COMMAND     PID   USER   FD   TYPE    DEVICE SIZE/OFF NODE NAME
ollama    18146 yeh803    3u  IPv4 109254510      0t0  TCP 127.0.0.1:11434 (LISTEN)
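
The lsof output shows Ollama listening only on the loopback interface of the compute node, so nothing outside that node can reach it. As a sketch of one possible fix (not something confirmed in this thread): Ollama reads its bind address from the OLLAMA_HOST environment variable, so restarting the server bound to all interfaces would expose the port, assuming the cluster's firewall allows it:

# Bind the Ollama server to every interface instead of loopback only
OLLAMA_HOST=0.0.0.0:11434 ollama serve

With that change, lsof should report *:11434 (LISTEN) rather than 127.0.0.1:11434, and the API would be reachable at http://<compute-node>:11434 from any machine that can route to the node.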

To reproduce

No response

Log output

No response

sestinj self-assigned this Dec 27, 2024
dosubot bot added the area:configuration, ide:vscode, and kind:bug labels Dec 27, 2024
TomaraRedfox commented

#3514 (comment)

This might be helpful. I'm also working remotely, and setting the apiBase URL for the Ollama models seemed to work for me.
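
For reference, a minimal sketch of what that change looks like in config.json (the apiBase host below is a placeholder; Continue falls back to http://localhost:11434 when the field is omitted):

{
  "title": "qwen2.5-coder-32b",
  "provider": "ollama",
  "model": "qwen2.5-coder:32b",
  "apiBase": "http://<remote-host>:11434"
}

The same apiBase field would go on each ollama entry that should use the remote instance, including tabAutocompleteModel and the embeddingsProvider.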

sestinj added the priority:medium label and removed the needs-triage label Jan 3, 2025
jasperhyp (Author) commented

@TomaraRedfox Thank you for the suggestion! I see that OLLAMA_HOST is always http://127.0.0.1, which means it is bound to the loopback interface of the compute node and not directly accessible from machines outside the cluster. This is due to the (typical academic) cluster structure, where I have to jump from the login node onto the compute node when using SSH.

I also tried entering the IP address of the compute node directly (which is unlikely to be reachable from outside), and it failed as expected. Hmm. I was looking at the difference between "tunnel" and SSH here -- maybe that is part of the reason it's not working?
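
One workaround consistent with this topology (a sketch only; the usernames and hostnames are placeholders) is classic SSH local port forwarding through the login node, which makes the compute node's loopback-only Ollama port appear as a local port on the machine where the command runs:

# Forward local port 11434 to the compute node's loopback Ollama,
# jumping through the login node (-J, OpenSSH ProxyJump)
ssh -J user@login-node -L 11434:127.0.0.1:11434 user@compute-node

While that tunnel is open, pointing Continue at "apiBase": "http://127.0.0.1:11434" on the forwarding side would reach the remote Ollama, independently of how the VS Code tunnel itself routes traffic.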
