I would like to know how NULL values affect query performance in SQL Server 2005.
I have a table like this (simplified):
ID | ImportantData | QuickPickOrder
-----------------------------------
 1 | 'Some Text'   | NULL
 2 | 'Other Text'  | 3
 3 | 'abcdefg'     | NULL
 4 | 'whatever'    | 4
 5 | 'it is'       | 2
 6 | 'technically' | NULL
 7 | 'a varchar'   | NULL
 8 | 'of course'   | 1
 9 | 'but that'    | NULL
10 | 'is not'      | NULL
11 | 'important'   | 5
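For reference, a rough sketch of the table definition (the INT type for QuickPickOrder and the column length are assumptions on my part, not the real schema):

CREATE TABLE MyTable (
    ID INT NOT NULL PRIMARY KEY,
    ImportantData VARCHAR(50) NOT NULL,
    QuickPickOrder INT NULL  -- NULL = row does not appear in the quick pick list
);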
And I query it as follows:
SELECT * FROM MyTable WHERE QuickPickOrder IS NOT NULL ORDER BY QuickPickOrder
So, QuickPickOrder is basically a column used to highlight some frequently selected items from a larger list. It also provides the order in which they will be displayed to the user. NULL values mean that they do not appear in the quick pick list.
I've always been told that NULL values in a database are somehow evil, at least in terms of normalization, but is it acceptable to simply filter out the unwanted rows in the WHERE clause like this?
Would it be better to use a specific numeric value, such as -1 or 0, to mark items that are not in the quick pick list? Are there other alternatives?
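For comparison, this is roughly how the sentinel-value alternative I am asking about would look (using 0 as the "not in the quick pick list" marker is just an assumption for illustration):

SELECT * FROM MyTable WHERE QuickPickOrder > 0 ORDER BY QuickPickOrder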
EDIT: The example does not accurately reflect the ratio of real values to NULLs. A better example would show at least 10 NULLs for each non-NULL value. The table size is between 100 and 200 rows. It is a lookup table, so updates are rare.
performance sql database sql-server
SurroundedByFish