Header not CSV but data is CSV with -o <fmt> #449
Comments
Yes - true - but it was not meant to create a csv output :) but I will check that the change does not break other things.
for csv, I just use
In order to be more flexible, I propose to replace the old csv code with a user-defined format. It needs some work to implement.
I agree completely here, because an all-columns CSV export is bloat for everyone's use; only a limited number of columns is needed for practical jobs. Currently I'm adding the header line with a simple bash script, but that approach is not very robust.

A second opinion: CSV export should not be dropped (it is marked as obsolete in the current version)! JSON export has very large overhead for this type of data: more overhead, more disk utilization, more I/O, more RAM- and CPU-intensive operations. So CSV should be made more flexible, as mentioned by the topic starter, and not dropped. For example, the workflow nfcapd -> nfdump -> csv -> ClickHouse (time-series DB import from a CSV file) gets the job done flawlessly!
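The overhead point above can be illustrated with a minimal Python sketch (the field names and records are hypothetical, not actual nfdump output): in CSV the column names appear once in the header, while in JSON they repeat in every record.

```python
import csv
import io
import json

# Hypothetical flow records with a limited set of columns, as the
# comment suggests practical jobs only need a few fields.
records = [
    {"ts": "2023-01-01 00:00:00", "sa": "10.0.0.1", "da": "10.0.0.2", "byt": 1234},
    {"ts": "2023-01-01 00:00:01", "sa": "10.0.0.3", "da": "10.0.0.4", "byt": 5678},
]

# CSV: field names are written once, in the header line.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["ts", "sa", "da", "byt"])
writer.writeheader()
writer.writerows(records)
csv_size = len(buf.getvalue())

# JSON: field names are repeated in every record.
json_size = len(json.dumps(records))

print(csv_size < json_size)  # CSV is smaller for the same data
```

The gap grows with the number of records, since the per-record key overhead in JSON is paid for every row.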
New csv format implemented. Now the headers are also formatted properly.
Specifying the output format using the `-o <fmt>` option results in the body being CSV, but the header is not. This makes further processing with CSV tools (such as xsv) more difficult than it should be, because the header line is seen as a single field rather than a field per formatted item.

Example: I would have expected a comma immediately after `(raw)`, which leads to an error from `xsv`.

I believe the issue occurs as the format is parsed (in `ParseOutputFormat`) and `header_string` is created. It looks like commas between fields should be inserted at that time.
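The reported mismatch can be shown with a minimal Python sketch (the column titles and body row below are hypothetical stand-ins, not actual nfdump output): a space-joined header parses as a single CSV field, while the comma-separated body parses into one field per item, and joining the titles with commas when the header string is built resolves the mismatch.

```python
import csv
import io

# Hypothetical column titles derived from a "-o fmt:..." specification.
titles = ["ts", "td", "sa", "da", "(raw)", "byt"]

# Current behaviour (illustrative): header joined with spaces,
# while the body rows are comma-separated.
broken_header = " ".join(titles)
body_row = "2023-01-01 00:00:00,0.5,10.0.0.1,10.0.0.2,flags,1234"

rows = list(csv.reader(io.StringIO(broken_header + "\n" + body_row)))
print(len(rows[0]))  # 1 -- the whole header is one field
print(len(rows[1]))  # 6 -- the body parses correctly

# Proposed fix: insert commas between fields when the header string
# is built, so header and body have the same field count.
fixed_header = ",".join(titles)
rows = list(csv.reader(io.StringIO(fixed_header + "\n" + body_row)))
print(len(rows[0]))  # 6 -- header now matches the body
```

Tools like xsv require the header and every body row to have the same number of fields, which is why the space-joined header triggers an error.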