I have the following CSV (truncated):
"adfebb","a-f-A-M-F-R-Z","95-00123","C560","USAF"
"ae0133","a-f-A-M-F-R-Z","97-00102","C560","USAF"
I need the following JSON:
{
"adfebb":["a-f-A-M-F-R-Z","95-00123","C560","USAF"],
"ae0133":["a-f-A-M-F-R-Z","97-00102","C560","USAF"]
}
I have been tinkering with jq and came close but I haven’t managed to get it just right. What I currently have is:
jq -R '
inputs
| . / "\n"
| (.[] | select(length > 0) | . / ",") as $fields
| [$fields[1], $fields[2], $fields[3], $fields[4]] as $aircraft
| {($fields[0]): $aircraft}
' >outfile.json
That produces
{
"adfebb":["a-f-A-M-F-R-Z","95-00123","C560","USAF"]
}
{
"ae0133":["a-f-A-M-F-R-Z","97-00102","C560","USAF"]
}
(Note the missing comma)…
I’m clearly missing something, but I’m at my wits’ end…
2 Answers
I’m never sure if this is the simplest way, but I would write
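One way to sketch that filter, assuming every field is a JSON-quoted string and no value contains a comma (the file name is a placeholder):

```shell
jq -R '
  . / ","                         # split the raw line on commas
  | map(fromjson)                 # parse each quoted field as a JSON string
  | [{key: .[0], value: .[1:]}]   # first field as key, remaining fields as value
  | from_entries                  # one single-key object per row
' infile.csv |
jq -n '[inputs] | add'            # gather the per-row objects and merge them
```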
An explanation…
The first part of the filter produces a list of strings for each row. (It assumes you can treat a row as a comma-separated stream of JSON strings; `jq` isn’t really designed to parse CSV files.)

The second part of the filter creates a list of objects with `key` and `value` fields, using the first element of a list as the key and the remaining elements of the list as the value. These lists are designed to be processed by `from_entries` to create the desired JSON objects.

The second `jq` command reads the objects into a single array, then "sums" them to create a single object with two keys.

With `. / ","` you are splitting at every occurring comma, regardless of whether it actually separates two columns or is part of a column value. If you can assert that the latter won’t happen (the data never contains commas), here’s an approach using `reduce` to successively build up your target object:
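A sketch of what that could look like (file names are placeholders; fields are again assumed to be JSON-quoted strings with no embedded commas):

```shell
jq -Rn '
  reduce inputs as $line ({};                # fold each raw input line into {}
    ($line / "," | map(fromjson)) as $fields
    | .[$fields[0]] = $fields[1:]            # first field keys the remaining fields
  )
' infile.csv > outfile.json
```

Because each field already carries JSON string quotes, `fromjson` both strips them and validates the field; the `reduce` then assigns each row into the accumulating object, so the result is a single object rather than a stream.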