Improve SQL Server query to convert arbitrary table to JSON

After repeatedly searching through and piecing together several of the very good FOR XML and .nodes() techniques floating around on the Internet, I was able to create this single query (rather than a stored procedure) that does a pretty good job of transforming any arbitrary SQL query into a JSON array.

The query encodes each row of data as a single JSON object with a leading comma. The data rows are wrapped in brackets, and the expectation is that the entire result set will be exported to a file.
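
Concretely, given the example table defined in the query below, the emitted lines look roughly like this (illustrative only; the exact textual formatting of the real column may differ):

    [{}
    ,{"col1":"","col2":null,"col3":null,"colNull":null}
    ,{"col1":"ItemA","col2":123,"col3":123.123,"colNull":null}
    ,{"col1":"ItemB","col2":456,"col3":456.456,"colNull":null}
    ,{"col1":7890,"col2":789,"col3":789.789,"colNull":null}
    ]

Note that the varchar value '7890' ends up unquoted; that is the second criticism below.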

I would like to see whether anyone can find ways to improve its performance.

Here's a query with an example table:

    declare @xd table (col1 varchar(max), col2 int, col3 real, colNull int)

    insert into @xd
    select '', null, null, null
    UNION ALL select 'ItemA', 123, 123.123, null
    UNION ALL select 'ItemB', 456, 456.456, null
    UNION ALL select '7890', 789, 789.789, null

    -- opening bracket, plus an empty object to absorb the leading comma on the first data row
    select '[{}'
    UNION ALL
    select ',{'
        + STUFF((
            (select ','
                + '"' + r.value('local-name(.)', 'varchar(max)') + '":'
                + case
                    -- xsi:nil="true" marks a NULL column value
                    when r.value('./@xsi:nil', 'varchar(max)') = 'true' then 'null'
                    -- anything ISNUMERIC accepts is emitted unquoted
                    when isnumeric(r.value('.', 'varchar(max)')) = 1
                        then r.value('.', 'varchar(max)')
                    else '"' + r.value('.', 'varchar(max)') + '"'
                end
            from rows.nodes('/row/*') as x(r)
            for xml path(''))
        ), 1, 1, '')
        + '}'
    from (
        -- Arbitrary query goes here (fields go where t.* is, table where @xd t is)
        select (select t.* for xml raw, type, elements XSINIL) rows
        from @xd t
    ) xd
    UNION ALL
    select ']'
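
To point this at your own data, only the derived table at the bottom changes; for example, against a hypothetical dbo.Orders table (the name is purely illustrative):

    from (
        -- Arbitrary query goes here
        select (select t.* for xml raw, type, elements XSINIL) rows
        from dbo.Orders t
    ) xd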

My biggest criticism is that it is insanely slow.
It currently takes around 3:30 for ~42,000 rows.

My other big criticism is that it currently assumes that anything that looks like a number is a number. It makes no attempt to detect column types (and I'm not even sure how it could).
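
To make that concrete, ISNUMERIC accepts far more than clean numeric literals, so varchar values can escape quoting and even break the JSON (these results reflect ISNUMERIC's documented behavior, as far as I can tell):

    select isnumeric('7890')  -- 1: the varchar value '7890' is emitted unquoted
    select isnumeric('$12')   -- 1: '$12' would be emitted unquoted, which is invalid JSON
    select isnumeric('.')     -- 1: likewise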

The last, minor criticism is that the first data row gets a leading comma when technically it shouldn't. To compensate for that, the query emits an empty JSON object on the first line to open the JSON array.

Any other criticisms (preferably with solutions) are welcome. The only real constraint I have is that the solution must be reproducible across many arbitrary SQL queries without having to explicitly identify column names.

I am using SQL Server 2012.

And to anyone who found this while searching for a generalized SQL results -> JSON array converter: enjoy!

+8
json sql sql-server converter
2 answers

I'd say if you really want to improve performance, use metaprogramming. The example below runs against 40,000 rows and returns results in less than a second (not counting the insertion of the initial 40k rows, which takes only about 2 seconds in this example). It also takes your data types into account, so numbers are not quoted.

    declare @xd table (col1 varchar(max), col2 int, col3 real, colDate datetime, colNull int);

    -- build 40,000 sample rows
    declare @i int = 0;
    while @i < 10000
    begin
        set @i += 1;
        insert into @xd
        select '', null, null, null, null
        union all select 'ItemA', 123, 123.123, getDate(), null
        union all select 'ItemB', 456, 456.456, getDate(), null
        union all select '7890', 789, 789.789, getDate(), null;
    end;

    -- materialize the arbitrary query so its column metadata is visible in tempdb
    select *
    into #json_base
    from (
        -- Insert SQL Statement here
        select * from @xd
    ) t;

    -- capture each column's name and whether it is numeric or date-typed
    declare @columns table (
        id int identity primary key,
        name sysname,
        datatype sysname,
        is_number bit,
        is_date bit);

    insert into @columns(name, datatype, is_number, is_date)
    select columns.name,
        types.name,
        case when number_types.name is not NULL then 1 else 0 end as is_number,
        case when date_types.name is not NULL then 1 else 0 end as is_date
    from tempdb.sys.columns
    join tempdb.sys.types
        on (columns.system_type_id = types.system_type_id)
    left join (values ('int'), ('real'), ('numeric'), ('decimal'), ('bigint'), ('tinyint')) as number_types(name)
        on (types.name = number_types.name)
    left join (values ('date'), ('datetime'), ('datetime2'), ('smalldatetime'), ('time'), ('datetimeoffset')) as date_types(name)
        on (types.name = date_types.name)
    where object_id = OBJECT_ID('tempdb..#json_base');

    -- generate one concatenation expression per column, typed at generation time
    declare @field_list varchar(max) = STUFF((
        select '+'',''+'
            + QUOTENAME(QUOTENAME(name, '"') + ':', '''')
            + '+'
            + case
                when is_number = 1
                    then 'COALESCE(LTRIM(' + QUOTENAME(name) + '),''null'')'
                when is_date = 1
                    then 'COALESCE(QUOTENAME(LTRIM(convert(varchar(max), ' + QUOTENAME(name) + ', 126)),''"''),''null'')'
                else 'COALESCE(QUOTENAME(' + QUOTENAME(name) + ',''"''),''null'')'
            end
        from @columns
        for xml path('')), 1, 5, '');

    create table #json_result (
        id int identity primary key,
        line varchar(max));

    declare @sql varchar(max) = REPLACE(
        'insert into #json_result '
        + 'select '',{''+{f}+''}'' '
        + 'from #json_base',
        '{f}', @field_list);

    exec(@sql);

    -- strip the leading comma from the first row
    update #json_result
    set line = STUFF(line, 1, 1, '')
    where id = 1;

    select '[' UNION ALL
    select line from #json_result UNION ALL
    select ']';

    drop table #json_base;
    drop table #json_result;
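
The speed comes from making all the type decisions once, while building @field_list, rather than once per value. For the sample table, the generated @sql resembles the following single set-based statement (a sketch, not the exact generated text):

    insert into #json_result
    select ',{'
        + '"col1":' + COALESCE(QUOTENAME([col1], '"'), 'null')
        + ',' + '"col2":' + COALESCE(LTRIM([col2]), 'null')
        + ',' + '"col3":' + COALESCE(LTRIM([col3]), 'null')
        + ',' + '"colDate":' + COALESCE(QUOTENAME(LTRIM(convert(varchar(max), [colDate], 126)), '"'), 'null')
        + ',' + '"colNull":' + COALESCE(LTRIM([colNull]), 'null')
        + '}'
    from #json_base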
+11

From Firoz Ansari:

    CREATE PROCEDURE [dbo].[GetJSON] (
        @ParameterSQL AS VARCHAR(MAX)
    )
    AS
    BEGIN
    DECLARE @SQL NVARCHAR(MAX)
    DECLARE @XMLString VARCHAR(MAX)
    DECLARE @XML XML
    DECLARE @Paramlist NVARCHAR(1000)
    SET @Paramlist = N'@XML XML OUTPUT'
    SET @SQL = 'WITH PrepareTable (XMLString) '
    SET @SQL = @SQL + 'AS ( '
    SET @SQL = @SQL + @ParameterSQL + ' FOR XML RAW, TYPE, ELEMENTS '
    SET @SQL = @SQL + ') '
    SET @SQL = @SQL + 'SELECT @XML = XMLString FROM PrepareTable '
    EXEC sp_executesql @SQL, @Paramlist, @XML = @XML OUTPUT
    SET @XMLString = CAST(@XML AS VARCHAR(MAX))

    DECLARE @JSON VARCHAR(MAX)
    DECLARE @Row VARCHAR(MAX)
    DECLARE @RowStart INT
    DECLARE @RowEnd INT
    DECLARE @FieldStart INT
    DECLARE @FieldEnd INT
    DECLARE @Key VARCHAR(MAX)
    DECLARE @Value VARCHAR(MAX)

    DECLARE @StartRoot VARCHAR(100); SET @StartRoot = '<row>'
    DECLARE @EndRoot VARCHAR(100); SET @EndRoot = '</row>'
    DECLARE @StartField VARCHAR(100); SET @StartField = '<'
    DECLARE @EndField VARCHAR(100); SET @EndField = '>'

    SET @RowStart = CharIndex(@StartRoot, @XMLString, 0)
    SET @JSON = ''

    WHILE @RowStart > 0
    BEGIN
        SET @RowStart = @RowStart + Len(@StartRoot)
        SET @RowEnd = CharIndex(@EndRoot, @XMLString, @RowStart)
        SET @Row = SubString(@XMLString, @RowStart, @RowEnd - @RowStart)
        SET @JSON = @JSON + '{'

        -- for each field in the row
        SET @FieldStart = CharIndex(@StartField, @Row, 0)
        WHILE @FieldStart > 0
        BEGIN
            -- parse node key
            SET @FieldStart = @FieldStart + Len(@StartField)
            SET @FieldEnd = CharIndex(@EndField, @Row, @FieldStart)
            SET @Key = SubString(@Row, @FieldStart, @FieldEnd - @FieldStart)
            SET @JSON = @JSON + '"' + @Key + '":'

            -- parse node value
            SET @FieldStart = @FieldEnd + 1
            SET @FieldEnd = CharIndex('</' + @Key, @Row, @FieldStart)
            SET @Value = SubString(@Row, @FieldStart, @FieldEnd - @FieldStart)
            SET @JSON = @JSON + '"' + @Value + '",'

            -- advance to the next field
            SET @FieldStart = @FieldStart + Len('</' + @Key + '>')
            SET @FieldEnd = CharIndex(@EndField, @Row, @FieldStart)
            SET @FieldStart = CharIndex(@StartField, @Row, @FieldEnd)
        END
        IF LEN(@JSON) > 0 SET @JSON = SubString(@JSON, 0, LEN(@JSON)) -- trim trailing comma
        SET @JSON = @JSON + '},'

        --/ for each row
        SET @RowStart = CharIndex(@StartRoot, @XMLString, @RowEnd)
    END
    IF LEN(@JSON) > 0 SET @JSON = SubString(@JSON, 0, LEN(@JSON)) -- trim trailing comma
    SET @JSON = '[' + @JSON + ']'
    SELECT @JSON
    END
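
Usage is then a single call, passing the arbitrary SELECT as a string (the table name here is purely illustrative):

    EXEC [dbo].[GetJSON] 'SELECT col1, col2 FROM dbo.MyTable'

Note that this version quotes every value, numbers included, since it makes no type distinction when concatenating @Value into the output.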
+1
