
[LocalNet] Add infrastructure to run LLM inference #1815

Triggered via pull_request on April 29, 2024 at 20:07
@Olshansk synchronize #508
Branch: dk-ollama
Status: Cancelled
Total duration: 34s
Artifacts

reviewdog.yml

on: pull_request
Check TODO_IN_THIS_ — 8s
Check stdlog in off-chain source code — 8s
Check for non-standard interface implementation statements — 9s
Check misspelling — 24s

Annotations

2 errors
Check misspelling: Canceling since a higher priority waiting request for 'reviewdog-dk-ollama' exists
Check misspelling: The operation was canceled.
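The cancellation message ("a higher priority waiting request for 'reviewdog-dk-ollama' exists") is the standard wording GitHub Actions emits when a newer run in the same concurrency group supersedes an in-progress one. A minimal sketch of how reviewdog.yml likely achieves this, assuming a concurrency group keyed on the workflow name plus the PR head branch (the exact group expression is an assumption; only the resulting group name 'reviewdog-dk-ollama' is visible in this run):

```yaml
# Hypothetical excerpt from .github/workflows/reviewdog.yml.
# The group key below is an assumption inferred from the observed
# group name 'reviewdog-dk-ollama' (workflow name + head branch).
name: reviewdog

on: pull_request

concurrency:
  # One group per branch; a new push to dk-ollama cancels the run
  # already in progress for that branch.
  group: reviewdog-${{ github.head_ref }}
  cancel-in-progress: true
```

With `cancel-in-progress: true`, pushing a new commit to the PR (a `synchronize` event) queues a fresh run and cancels the older one, which is consistent with this run ending in "The operation was canceled." after 34s.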