I have the following table:
┌────────────────┬───────────┬───────────┬──────────┬──────────────────────────────────────────────────────────────────┐
│     Column     │   Type    │ Collation │ Nullable │                             Default                              │
├────────────────┼───────────┼───────────┼──────────┼──────────────────────────────────────────────────────────────────┤
│ id             │ bigint    │           │ not null │ nextval('"HistoricalDataAggregatorWorkOrders_id_seq"'::regclass) │
│ inputTimeRange │ tstzrange │           │ not null │                                                                  │
│ outputTags     │ tag[]     │           │ not null │                                                                  │
└────────────────┴───────────┴───────────┴──────────┴──────────────────────────────────────────────────────────────────┘
I want to bulk insert a bunch of rows into it in one parameterized query. I thought I'd be able to do it the usual way with `unnest`:
```sql
INSERT INTO "HistoricalDataAggregatorWorkOrders"
  ("inputTimeRange", "outputTags")
SELECT * FROM unnest($1::tstzrange[], $2::tag[][])
RETURNING id;
```
But unfortunately, `unnest` doesn't work the way I want here: it fully flattens the `tag[][]` argument instead of peeling off just one dimension.
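To illustrate the flattening (a standalone toy query, not from the original post):

```sql
-- unnest fully flattens a 2-D array into scalar elements:
SELECT unnest(ARRAY[[1,2],[3,4]]);
-- 4 rows: 1, 2, 3, 4 (not the sub-arrays {1,2} and {3,4})
```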
I gather from https://stackoverflow.com/a/8142998/200224 that there’s no builtin unnest-one-level function.
Is there really no simple way to bulk insert with a parameterized query when one column is an array type? Seems like a missing feature…
2 Answers
I don't love it, but I went with a `jsonb[]` parameter.

Update: I realized I could unnest a `text[]` and then cast each element of that to `::tag[]`:
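A minimal sketch of what that can look like (my reconstruction; it assumes each element of `$2` is an array literal such as `'{alpha,beta}'` that casts cleanly to `tag[]`):

```sql
-- $1: tstzrange[]; $2: text[], one tag[] literal per row
INSERT INTO "HistoricalDataAggregatorWorkOrders"
  ("inputTimeRange", "outputTags")
SELECT u.tr, u.tags::tag[]
FROM unnest($1::tstzrange[], $2::text[]) AS u(tr, tags)
RETURNING id;
```

Because both parameters are plain 1-D arrays here, the two-argument `unnest` pairs them up row by row; the 2-D problem is pushed into the per-element cast.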
It still doesn't feel that clean... if there isn't a cleaner way to do this with builtin features, then a way should be added...
Ideally I'd like to leave serialization of the parameters completely up to the db driver (`node-pg`, which may have limitations). If anyone knows a more elegant way to do it, let me know.
If the lengths of both arrays are designed to match, you can unpack just one of them, requesting each element's index along with it (`WITH ORDINALITY` clause), then use that index to address slices of the other array (demo at db<>fiddle).
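A sketch of that technique under the same assumptions (the fiddle's exact query may differ): unnest `$1` with ordinality, slice the 2-D `$2` by the ordinal, and rebuild each slice as a 1-D array:

```sql
-- $1: tstzrange[]; $2: tag[][], with row n holding the tags for range n
INSERT INTO "HistoricalDataAggregatorWorkOrders"
  ("inputTimeRange", "outputTags")
SELECT u.tr,
       array(SELECT unnest(($2::tag[][])[u.n::int:u.n::int]))
FROM unnest($1::tstzrange[]) WITH ORDINALITY AS u(tr, n)
RETURNING id;
```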
Since SQL arrays can only return either individual elements or whole slices of the same dimensionality as the input array (not arrays of N−K dimensions when you skip K subscripts), `array(select unnest(<slice>))` has to peel that one dimension away – otherwise you'd get a "nested" 2-D `{{(abc),(def)}}` instead of the 1-D `{(abc),(def)}` you want.
The two `unnest` calls should perform comparably to the single call on both arrays at once, but the null-padding will only work one way (for a larger/longer `$1`), compared to the single multi-array `unnest`.

Except for the fact that it saves you from switching to `jsonb` and back, or going through the `text[]::tag[]` casts, it's still not much cleaner than what you had already.
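The padding difference can be seen with a toy example (standalone, not from the demo):

```sql
-- Multi-array unnest pads the shorter input with NULLs, whichever side it is:
SELECT * FROM unnest(ARRAY[1,2,3], ARRAY['a','b']) AS t(n, s);
-- (1,a), (2,b), (3,NULL)

-- WITH ORDINALITY over one array yields exactly one row per element of it,
-- regardless of the other parameter's length:
SELECT i, x FROM unnest(ARRAY[10,20,30]) WITH ORDINALITY AS t(x, i);
-- (1,10), (2,20), (3,30)
```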