I have a SQL Server table with the following structure:
    CREATE TABLE [dbo].[Log](
        [LogID]   [bigint]   IDENTITY(1,1) NOT NULL,
        [A]       [int]      NOT NULL,
        [B]       [int]      NOT NULL,
        [C]       [int]      NOT NULL,
        [D]       [int]      NOT NULL,
        [E]       [int]      NOT NULL,
        [Flag1]   [bit]      NOT NULL,
        [Flag2]   [bit]      NOT NULL,
        [Flag3]   [bit]      NOT NULL,
        [Counter] [int]      NOT NULL,
        [Start]   [datetime] NOT NULL,
        [End]     [datetime] NOT NULL
    )
The table is used to record actions. Columns A through E are foreign keys, Flag1 through Flag3 indicate specific log states, and the Start and End columns record when the action began and ended.
On average, the table is written to every ~30 seconds, and each batch consists of ~50 inserts/updates.
From the user interface, users can query the table and filter the data on any column and on any combination of columns, regardless of column type.
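To illustrate the kind of ad-hoc filtering involved, here is a hypothetical query a user might generate (the specific columns and values are made up for illustration only):

    SELECT [LogID], [A], [D], [E], [Flag1], [Start], [End]
    FROM [dbo].[Log]
    WHERE [A] = 42
      AND [Flag1] = 1
      AND [Start] >= '2019-01-01'
      AND [End] < '2019-02-01';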
What would be the best way to optimize data retrieval for this table:
- Create one "main" index that will contain all of these columns.
- Identify some of the most commonly used filter combinations, e.g. [A, D, E], [A, Start, End], etc., and create indexes for them (see the sketch after this list).
- Something else...
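For concreteness, here is a rough sketch of what the two index-based options might look like; the index names and key column orders are assumptions for illustration, not a final design:

    -- Option 1 (assumption: one wide nonclustered index keyed on every filterable column)
    CREATE NONCLUSTERED INDEX [IX_Log_Main]
        ON [dbo].[Log] ([A], [B], [C], [D], [E], [Flag1], [Flag2], [Flag3], [Counter], [Start], [End]);

    -- Option 2 (assumption: narrower indexes matching the most common filter
    -- combinations, taken from the examples above)
    CREATE NONCLUSTERED INDEX [IX_Log_A_D_E]
        ON [dbo].[Log] ([A], [D], [E]);

    CREATE NONCLUSTERED INDEX [IX_Log_A_Start_End]
        ON [dbo].[Log] ([A], [Start], [End]);

Both sets of statements are plain T-SQL DDL; which approach is actually better will depend on the real query mix and on the write load described above.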