Why does casting AVG(integer_column) to DECIMAL return a minimum of six decimal places?

Consider this query:

WITH Scores (score) AS (
    SELECT CAST(score AS INTEGER)
    FROM ( VALUES (0), (10), (10) ) AS Scores (score)
)
SELECT
    AVG(CAST(score AS DECIMAL(19, 8))) AS precision_eight,
    AVG(CAST(score AS DECIMAL(19, 7))) AS precision_seven,
    AVG(CAST(score AS DECIMAL(19, 6))) AS precision_six,
    AVG(CAST(score AS DECIMAL(19, 5))) AS precision_five,
    AVG(CAST(score AS DECIMAL(19, 4))) AS precision_four
FROM Scores;

Results:

 precision_eight | precision_seven | precision_six | precision_five | precision_four
 6.66666666      | 6.6666666       | 6.666666     | 6.666666       | 6.666666

Why do I always get at least six decimal places? Is this documented behavior?

(I am running SQL Server 2008)

+6
sql sql-server sql-server-2008
2 answers

AVG over a decimal always returns the "decimal(38, s) divided by decimal(10, 0)" data type (see here).

You need to cast the result of AVG to the desired precision and scale.
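The six-place floor falls out of SQL Server's documented type rules for decimal division, plus the rule that when the computed precision exceeds 38 it is capped and the scale is reduced, but never below 6. A sketch of those rules (the function name is mine, not part of any API):

```python
# Sketch of SQL Server's documented type rules for decimal division.
# divide_type is an illustrative name, not a real SQL Server API.

def divide_type(p1, s1, p2, s2):
    """Result type of decimal(p1, s1) / decimal(p2, s2)."""
    precision = p1 - s1 + s2 + max(6, s1 + p2 + 1)
    scale = max(6, s1 + p2 + 1)
    if precision > 38:
        # Precision is capped at 38; scale is reduced to preserve
        # the integral digits, but never below 6 -- this is where
        # the six-decimal-place minimum comes from.
        scale = max(6, min(scale, 38 - (precision - scale)))
        precision = 38
    return precision, scale

# AVG over decimal(19, s) computes decimal(38, s) / decimal(10, 0):
for s in (8, 7, 6, 5, 4):
    print(s, divide_type(38, s, 10, 0))
```

For s = 8 and 7 the original scale survives (38, 8) and (38, 7); for s = 6, 5, 4 the result bottoms out at (38, 6), matching the query output above.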

+8

I was going to post something similar to Jappie's answer along with this example to illustrate what's going on behind the scenes.

declare @a int set @a = 0
declare @b int set @b = 10
declare @c int set @c = 10
declare @n decimal(10,0) set @n = 3

select cast((@a + @b + @c) as decimal(38,4)) / @n
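The same rule explains this hand-rolled version: decimal(38,4) divided by decimal(10,0) comes out as decimal(38,6), so six decimal places survive. A rough Python analogue using the decimal module, illustrating only the resulting scale (Python's rounding behavior may differ from SQL Server's):

```python
from decimal import Decimal

# Mimic (0 + 10 + 10) cast to decimal, divided by @n = 3.
avg = Decimal(0 + 10 + 10) / Decimal(3)

# Present the quotient at scale 6, the floor imposed by the
# decimal-division type rules (decimal(38,4) / decimal(10,0)).
result = avg.quantize(Decimal("1.000000"))
print(result)
```

The quantized value carries exactly six digits after the decimal point, just as the SELECT above does.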
+2
