Description
I keep getting "HeadersTimeoutError: Headers Timeout Error" whenever I try to request the homellm model:
cause: HeadersTimeoutError: Headers Timeout Error
at Timeout.onParserTimeout [as callback] (node:internal/deps/undici/undici:8228:32)
at Timeout.onTimeout [as _onTimeout] (node:internal/deps/undici/undici:6310:17)
at listOnTimeout (node:internal/timers:573:17)
at processTimers (node:internal/timers:514:7) {
code: 'UND_ERR_HEADERS_TIMEOUT'
}
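UND_ERR_HEADERS_TIMEOUT comes from undici (the HTTP client behind Node's fetch), which gives up if response headers have not arrived within its default 300000 ms. Since I call generate with stream: false, the server only answers once generation is done, so a slow model could exceed that. A sketch of a workaround I am considering (not verified on my setup; it assumes the npm undici package's global dispatcher is the one picked up by the fetch that ollama-js uses):

import { Agent, setGlobalDispatcher } from 'undici';

// Sketch only: raise undici's headers/body timeouts (default 300000 ms each) so a slow
// generation does not trip UND_ERR_HEADERS_TIMEOUT. Assumes the global dispatcher set
// here is the one used by the fetch that ollama-js calls under the hood.
setGlobalDispatcher(
  new Agent({
    headersTimeout: 15 * 60 * 1000, // wait up to 15 minutes for response headers
    bodyTimeout: 15 * 60 * 1000,    // allow up to 15 minutes of idle time on the body
  }),
);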
Sometimes I get this instead:
Error: Expected a completed response.
at Ollama.processStreamableRequest (/usr/src/app/node_modules/ollama/dist/shared/ollama.a247cdd6.cjs:211:15)
at processTicksAndRejections (node:internal/process/task_queues:95:5)
at OllamaService.responseFromOllama (/usr/src/app/src/ollama/ollama.service.ts:21:24)
at AppGateway.pubSubMessageAI (/usr/src/app/src/gateway/app.gateway.ts:98:18)
at AppGateway.<anonymous> (/usr/src/app/node_modules/@nestjs/websockets/context/ws-proxy.js:11:32)
at WebSocketsController.pickResult (/usr/src/app/node_modules/@nestjs/websockets/web-sockets-controller.js:91:24)
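To check whether the model itself is simply taking longer than the five-minute default, I put together this untested sketch that calls the same /api/generate endpoint directly with the per-request timeouts raised. The host and model are the ones from my setup; the prompt is just a placeholder:

import { request } from 'undici';

// Diagnostic sketch: hit the Ollama REST endpoint directly (the same one the library
// uses) and measure how long a single non-streaming generation actually takes.
async function probeGenerate() {
  const started = Date.now();
  const { statusCode, body } = await request('http://host.docker.internal:11434/api/generate', {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify({
      model: 'homellm:latest',
      prompt: 'Turn off the office light',
      format: 'json',
      stream: false,
    }),
    headersTimeout: 15 * 60 * 1000, // per-request override of undici's 300000 ms default
    bodyTimeout: 15 * 60 * 1000,
  });
  console.log('status:', statusCode, 'took:', (Date.now() - started) / 1000, 's');
  console.log(await body.json());
}

probeGenerate().catch(console.error);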
The code that makes the request (ollama.service.ts, simplified):

import { Ollama } from 'ollama';

// Shape of `data` is inferred from how it is used below.
async responseFromOllama(data: { customerId: string; message: string; systemPromptName?: string }) {
  try {
    const { customerId, message } = data;
    const systemPromptName = data.systemPromptName || 'Al'; // not used further down yet
    const ollama = new Ollama({ host: 'http://host.docker.internal:11434' });
    const response = await ollama.generate({
      model: 'homellm:latest',
      prompt: message,
      format: 'json',
      stream: false,
      system: `You are 'Al', a helpful AI Assistant that controls the devices in a house. Complete the following task as instructed with the information provided only.
Services: light.turn_off(), light.turn_on(brightness,rgb_color), fan.turn_on(), fan.turn_off()
Devices:
light.office 'Office Light' = on;80%
fan.office 'Office fan' = off
light.kitchen 'Kitchen Light' = on;80%;red
light.bedroom 'Bedroom Light' = off`,
    });
    return response;
  } catch (e) {
    console.log(e);
  }
}
I was getting responses before, but now I always get either the Headers Timeout error or "Expected a completed response."
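One thing I am considering trying: with stream: false the server only answers once the whole generation is finished, so a slow model can sit past the headers timeout before a single byte arrives. A rough, unverified sketch of a streaming variant (system prompt omitted for brevity) that should start receiving headers as soon as the first token is produced:

import { Ollama } from 'ollama';

// Sketch of a streaming call: chunks arrive as they are generated, so the headers
// timeout should not be hit even when the model is slow to finish.
async function generateStreaming(message: string): Promise<string> {
  const ollama = new Ollama({ host: 'http://host.docker.internal:11434' });
  const stream = await ollama.generate({
    model: 'homellm:latest',
    prompt: message,
    format: 'json',
    stream: true,
  });

  let full = '';
  for await (const part of stream) {
    full += part.response; // each chunk carries a piece of the generated text
  }
  return full;
}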