
Shutting Down MCP Server #23


Open
nimit2801 opened this issue Apr 8, 2025 · 3 comments


@nimit2801

(screenshot attached in the original issue)
```json
{
  "mcpServers": {
    "google-calendar": {
      "command": "node",
      "args": [
        "/Users/nimitsavant/Desktop/nimit-devrev/google-calendar-mcp/build/index.js"
      ]
    }
  }
}
```
@mimi3421 (Contributor) commented Apr 14, 2025

I think there is a bug somewhere in the request POST JSON message related to the null string.

I tested mcphost locally against a local llama.cpp server with an empty mcpServers setting and sent a simple prompt, "hi". The log in mcphost was:

```
mcphost --model openai:phi4 --openai-url http://localhost:8080/v1 --openai-api-key 000000 --debug
2025/04/15 00:29:32 INFO <cmd/root.go:487> Model loaded provider=openai model=phi4

You: hi
2025/04/15 00:29:34 DEBU <openai/provider.go:46> creating message prompt=hi num_messages=1 num_tools=0
2025/04/15 00:29:34 DEBU <openai/provider.go:55> converting message role=user content=hi is_tool_response=false
2025/04/15 00:29:34 DEBU <openai/provider.go:143> sending messages to OpenAI messages="[{Role:user Content:0xc000247560 ReasoningContent: FunctionCall: ToolCalls:[] Name: ToolCallID:}]" num_tools=0
2025/04/15 00:29:34 INFO <cmd/root.go:502> Shutting down MCP servers...
Error: error response with status 500
```

While the log in llama.cpp was:

```
srv log_server_r: request: POST /v1/chat/completions 127.0.0.1 500
srv log_server_r: request: {"model":"phi4","messages":[{"role":"user","content":"hi","reasoning_content":null},{"role":"user","content":"hi","reasoning_content":null}],"max_tokens":4096,"temperature":0.7}
srv log_server_r: response: {"error":{"code":500,"message":"Failed to parse messages: [json.exception.type_error.302] type must be string, but is null; messages = [\n {\n "role": "user",\n "content": "hi",\n "reasoning_content": null\n },\n {\n "role": "user",\n "content": "hi",\n "reasoning_content": null\n }\n]","type":"server_error"}}
```

I think the null values in the request JSON should be "", and that is what triggered the error in llama.cpp's JSON parser.

Edited:
I'm new to this, but I think the empty reasoning_content field should be omitted from the POST request entirely rather than being set to null.

```go
// pkg/llm/openai/types.go
type MessageParam struct {
	Role             string        `json:"role"`
	Content          *string       `json:"content"`
	ReasoningContent *string       `json:"reasoning_content,omitempty"` // Add omitempty here, inside the tag string
	FunctionCall     *FunctionCall `json:"function_call,omitempty"`
	ToolCalls        []ToolCall    `json:"tool_calls,omitempty"`
	Name             string        `json:"name,omitempty"`
	ToolCallID       string        `json:"tool_call_id,omitempty"`
}
```

Note that the comma must be inside the tag string: `json:"reasoning_content",omitempty` compiles but silently leaves `omitempty` outside the json tag, so it has no effect.

Similar issues have been reported in link and link.

@Waasi commented Apr 14, 2025

Pitching in just in case this is still an issue. For me, it turned out that the model I was using was not found. I got to this by logging the error returned here. I hope this helps.

mimi3421 added a commit to mimi3421/mcphost that referenced this issue Apr 16, 2025
Add `omitempty` to `reasoning_content` in OpenAI message API. See issue [link](mark3labs#23)
ezynda3 pushed a commit that referenced this issue Apr 16, 2025
…27)

Add `omitempty` to `reasoning_content` in OpenAI message API. See issue [link](#23)
@semidark commented

> Pitching in just in case this is still an issue. For me, it turned out that the model I was using was not found. I discovered this by logging the error returned here. I hope this helps.

I ran into the same issue, and at least in my case, @Waasi was correct. For me, not only was the model not found, but I also could not connect to the Ollama server. I had to correctly set my OLLAMA_HOST environment variable. After that, I got my first MCP server up and running.

So, I believe better logging when this error occurs would be really helpful.
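A hypothetical sketch of what that could look like (this is not mcphost's actual code; the `apiError` type and its fields are invented for illustration): instead of surfacing only "error response with status 500", wrap the provider's status and response body in the error so misconfiguration is visible at a glance.

```go
package main

import (
	"fmt"
	"log"
)

// apiError is a stand-in for whatever error an OpenAI-compatible client
// returns. The point is that it carries the response body, not just a code.
type apiError struct {
	Status int
	Body   string
}

func (e *apiError) Error() string {
	return fmt.Sprintf("provider returned %d: %s", e.Status, e.Body)
}

func main() {
	err := &apiError{Status: 500, Body: `{"error":{"message":"model not found"}}`}
	// Logging the full error (status plus body) would have pointed straight
	// at the misconfigured model or OLLAMA_HOST, rather than a bare 500.
	log.Printf("chat completion failed: %v", err)
}
```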
