PostgreSQL: Find values in a JSON array using wildcards and comparison operators (with an index)

I have a table with JSON array data that I would like to search.

 CREATE TABLE data (id SERIAL, json JSON);
 INSERT INTO data (id, json) VALUES (1, '[{"name": "Value A", "value": 10}]');
 INSERT INTO data (id, json) VALUES (2, '[{"name": "Value B1", "value": 5}, {"name": "Value B2", "value": 15}]');

As described in this answer, I created a function that also makes it possible to create an index on the array data (which is important).

 CREATE OR REPLACE FUNCTION json_val_arr(_j json, _key text)
   RETURNS text[] AS
 $$
   SELECT array_agg(elem->>_key)
   FROM json_array_elements(_j) AS x(elem)
 $$ LANGUAGE sql IMMUTABLE;
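For reference, the kind of index this makes possible is a GIN index over the function result (the index name below is my own choice, not from the linked answer):

 -- GIN can index the text[] returned by the IMMUTABLE function above,
 -- so the <@ containment query further down can make use of this index.
 CREATE INDEX data_json_name_idx ON data USING GIN (json_val_arr(json, 'name'));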

This works well if I want to find an exact value (for example, "Value B1"):

 SELECT * FROM data WHERE '{"Value B1"}'::text[] <@ (json_val_arr(json, 'name')); 

Now my questions are:

  • Can values be found using a wildcard (for example, "Value *")? Something like the following (naive) approach:

     ... WHERE '{"Value%"}'::text[] <@ (json_val_arr(json, 'name')); 
  • Is it possible to find numeric values with comparison operators (e.g. >= 10)? Again, a naive and obviously wrong approach:

     ... WHERE '{10}'::int[] >= (json_val_arr(json, 'value')); 

    I tried to create a new function returning int[], but that did not work (see the sketch after this list for the kind of function I mean).
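
For illustration, a minimal sketch of what I mean by such a function (the name json_val_arr_int is mine, and it assumes every "value" element casts cleanly to integer):

 -- Hypothetical int[] variant of json_val_arr: same shape, but each
 -- element is cast to integer so the result could be compared as int[].
 CREATE OR REPLACE FUNCTION json_val_arr_int(_j json, _key text)
   RETURNS int[] AS
 $$
   SELECT array_agg((elem->>_key)::int)
   FROM json_array_elements(_j) AS x(elem)
 $$ LANGUAGE sql IMMUTABLE;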

I created a SQL Fiddle to illustrate my problem.

Or would it be better to use a different approach? For example, the following queries do work:

 SELECT * FROM data, json_array_elements(json) jsondata WHERE jsondata ->> 'name' LIKE 'Value%'; 

and

 ... WHERE cast(jsondata ->> 'value' as integer) <= 10; 

However, I was not able to create any index that the planner would actually use for these queries.
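
For what it is worth, this is how I checked whether an index gets picked up (plain EXPLAIN ANALYZE on one of the queries above):

 EXPLAIN ANALYZE
 SELECT *
 FROM data, json_array_elements(json) jsondata
 WHERE jsondata ->> 'name' LIKE 'Value%';
 -- In my case the plan showed a sequential scan on "data", not an index scan.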

Eventually I would like to implement all of this with JSONB in PostgreSQL 9.4, but I think that should not matter for the questions above.

Many thanks!

json sql indexing postgresql
1 answer

I know it has been a while, but I was just looking for something like this (using wildcards to query JSON data types) and thought I'd share what I found.

First off, this was a huge pointer in the right direction: http://schinckel.net/2014/05/25/querying-json-in-postgres/

The gist of it is that converting your JSON elements into something else (a recordset) is the way to go. It lets you query the JSON elements with regular Postgres tools.

In my case:

 # Table: test
 ID | jsonb_column
 ---+----------------------------------------------------------------------
  1 | {"name": "", "value": "reserved", "expires_in": 13732}
  2 | {"name": "poop", "value": "{\"ns\":[\"Whaaat.\"]}", "expires_in": 4554}
  3 | {"name": "dog", "value": "{\"ns\":[\"woof.\"]}", "expires_in": 4554}

Example query:

 select * from test where jsonb_column->>'name' like '%o%';

 -- Returns:
 -- 2 | {"name": "poop", "value": "{\"ns\":[\"Whaaat.\"]}", "expires_in": 4554}
 -- 3 | {"name": "dog", "value": "{\"ns\":[\"woof.\"]}", "expires_in": 4554}
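
And this is roughly how the recordset conversion from the linked post looks on my table (a sketch: the column definition list is an assumption based on the keys shown above; for the question's original table, which stores arrays, json_to_recordset would be the analogous call):

 -- jsonb_to_record turns each object into a row with typed columns,
 -- so the WHERE clause can use ordinary text/int operators.
 SELECT r.*
 FROM test t,
      jsonb_to_record(t.jsonb_column) AS r(name text, value text, expires_in int)
 WHERE r.name LIKE '%o%'
   AND r.expires_in >= 4000;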

And to answer your question about jsonb: it looks like jsonb is the better route MOST of the time. It has more methods and reads faster (but writes more slowly).


Happy hunting!
