
Failed to fetch job log: Error: ENOENT: no such file or directory, stat 'logs/jobs/jlq6docgk16.log' for running jobs. #694

Open
iamumairayub opened this issue Dec 16, 2023 · 3 comments


@iamumairayub

Goal

The goal is to download or view the full log of a running job, instead of waiting for the job to finish.

Summary

Getting

{
    "code": "job",
    "description": "Failed to fetch job log: Error: ENOENT: no such file or directory, stat 'logs/jobs/jlq6docgk16.log'"
}

when clicking on View Full Log or Download Log of a running job. It works fine for finished jobs.

Steps to reproduce the problem

Create an event
Select the Shell Plugin
Inside it, set the command to run a Python script, e.g. python -u myscript.py
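For context, a minimal stand-in for myscript.py (hypothetical; the report does not include the original script) would be a long-running script that emits one line at a time:

```python
import time

# Hypothetical myscript.py: prints one line at a time, with a pause,
# so the job log should grow while the job is still running.
for i in range(1, 21):
    print(f"Iteration: {i}")
    time.sleep(0.25)
```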

Your Setup

Operating system and version?

Ubuntu 22.04.3 LTS

Node.js version?

v20.10.0

Cronicle software version?

Version 0.9.39

Are you using a multi-server setup, or just a single server?

single

Are you using the filesystem as back-end storage, or S3/Couchbase?

Not sure; I just ran the /opt/cronicle/bin/control.sh start command to start the UI with default settings.

Can you reproduce the crash consistently?

Yes, this issue happens for every job I run.

@jhuckaby
Owner

jhuckaby commented Dec 17, 2023

Well, it sounds like your script isn't flushing its output buffers, so Cronicle isn't able to grab the output and actually log it until the script exits.

Try enabling output autoflush for your python script. Several options here: https://chat.openai.com/share/2910e2d6-9e9c-4288-84cf-32de5867d51b
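For reference, a few common ways to enable autoflush in Python (a sketch of standard options, not specific to this report):

```python
import sys

# Option 1: flush each print explicitly
print("Iteration: 1", flush=True)

# Option 2: make stdout line-buffered for the whole script (Python 3.7+);
# guard with hasattr in case stdout has been replaced by a non-file object
if hasattr(sys.stdout, "reconfigure"):
    sys.stdout.reconfigure(line_buffering=True)
print("Iteration: 2")

# Option 3: no code change -- run with `python -u myscript.py`,
# or set the environment variable PYTHONUNBUFFERED=1
```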

If that isn't it, then maybe your script is outputting JSON? Cronicle may be "eating" your output lines, thinking they are part of its STDIO JSON API. Try turning off JSON interp mode in the Shell Plugin options in your event.

If that isn't it, maybe your script is accumulating everything onto one line? Cronicle only grabs one "line" at a time for the log append, delimited by a Unix EOL.

For example, if your script is emitting ASCII graphical progress bars or other "overwriting" lines or progress indicators, these are NOT picked up by Cronicle, and not logged. Only fully formed lines that end in a true Linux EOL (ASCII 10) are picked up.

Could your script be using Windows/DOS line endings perhaps?
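One quick way to check for DOS line endings (a sketch; "myscript.py" stands in for the user's script, which is not shown in the report) is to capture the raw output bytes and look for CRLF:

```python
import subprocess
import sys

# Run the script unbuffered and capture its raw stdout bytes.
# "myscript.py" is hypothetical; any long-running script works here.
out = subprocess.run(
    [sys.executable, "-u", "myscript.py"],
    capture_output=True,
).stdout

if b"\r\n" in out:
    print("output uses Windows/DOS (CRLF) line endings")
else:
    print("output uses Unix (LF) line endings")
```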

That's all I can think of.

Oh, try running a Shell Plugin job with this as the script source:

#!/bin/bash

for i in {1..20}
do
    echo "Iteration: $i"
    sleep 1
done

Can you view that job's log while it is running? If so, then your python script is doing something funky with its output. If not, then something is wrong with your Cronicle installation.

You can also try piping your python script's output directly to the Cronicle job log, and routing STDERR to STDOUT, so they all go to the log and only the log:

python -u myscript.py > $JOB_LOG 2>&1

$JOB_LOG is a predefined env var -- docs here.

That's all I can think of. Good luck!

@iamumairayub
Author

Hi, I just noticed that it was due to the Detached (Uninterruptible) setting being enabled.

[screenshot: the Detached (Uninterruptible) option in the event settings]

I have disabled it and can now view the logs of running processes.

@jhuckaby
Owner

Ah, yes, that's a known bug. It's fixed in v2 (coming out in 2024).

Thanks!
