For example, Josh gave me this little problem yesterday:
given:
serial | name      | type
847001 | t15       | GE1043
847002 | t16       | GE1043
847003 | sub-alpha | GESS90
I want to produce:
{ "turbines" :
  {
    "847001" : { "name" : "t15", "type" : "GE1043" },
    "847002" : { "name" : "t16", "type" : "GE1043" },
    "847003" : { "name" : "sub-alpha", "type" : "GESS90" }
  }
}
I think this is going to call for a small addition to the json_build package: a json object aggregator. With that, we'd be able to do this:
select json_build_object('turbines',
           json_object_agg(serial, json_build_object('name', name, 'type', type)))
from mytable;
Probably we'd want a similar aggregator for arrays. This would be different from the builtin json_agg function which takes a record as its argument and always produces an array of hashes.
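To make the distinction concrete, here is a sketch of both side by side. The builtin json_agg call is real; json_array_agg is the hypothetical aggregator suggested above, so its name and behavior are assumptions:

```sql
-- Builtin: json_agg wraps each input row in an object and
-- always collects the rows into an array of objects.
select json_agg(t)
from mytable t;
-- produces something like:
-- [{"serial":847001,"name":"t15","type":"GE1043"}, ...]

-- Proposed (hypothetical): an array aggregator over arbitrary
-- values, not whole rows -- e.g. just the names.
select json_array_agg(name)
from mytable;
```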
I make heavy use of JSONSelect, which I either use in my middleware layer or as a custom function in PLV8. I'd love it if this, or some similar query mechanism, was integrated into postgres. Is there something like this coming down the pipeline?
I have a situation where I am generating GeoJSON data via PostGIS but need to merge in additional info from other columns with regular data types into the final generated JSON.
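json_build_object can handle exactly this kind of merge. A sketch, assuming a hypothetical table `features` with a PostGIS `geom` column plus ordinary attribute columns `name` and `height`:

```sql
-- Build a GeoJSON Feature, combining the geometry with
-- properties drawn from regular columns. ST_AsGeoJSON returns
-- text, so cast it to json before nesting it.
select json_build_object(
         'type',       'Feature',
         'geometry',   ST_AsGeoJSON(geom)::json,
         'properties', json_build_object('name', name, 'height', height))
from features;
```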
I would very much like a function to replace a subobject of a JSON document with a new one, something like:
json_replace_path(from_json json, path text[], new_object json);
This would enable modifying (potentially large) JSON documents stored in Postgres without having to ship the whole document back and forth to the client.
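A sketch of how the proposed function might be used, assuming a hypothetical `docs` table with a json `body` column; json_replace_path itself does not exist yet, so the signature above is taken as given:

```sql
-- Replace one turbine's sub-object inside the stored document,
-- entirely on the server side, without round-tripping the whole
-- document to the client.
update docs
   set body = json_replace_path(
                body,
                array['turbines', '847001'],
                '{"name":"t15","type":"GE1043-B"}'::json)
 where id = 1;
```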