#StackBounty: #sql-server #performance #query-optimization #sql-server-2014 #database-performance Why doesn't SQL Server use the in…

Bounty: 50

Given the following in a SQL Server 2014 DB:

create table t 
(
    c1 int primary key,
    c2 datetime2(7),
    c3 nvarchar(20),
    c4 as cast(dbo.toTimeZone(c2, c3, 'UTC') as date) persisted
);

create index i on t (c4);

declare @i int = 0;

while @i < 10000 
begin
    insert into t (c1, c2, c3) values
        (@i, dateadd(day, @i, '1970-01-02 03:04:05.6'), 'Asia/Manila');
    set @i = @i + 1;
end;
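(As an aside, the row-by-row loop above can also be written as a single set-based insert. This is only a sketch, using a recursive CTE to generate 0..9999; it is not part of the original repro.)

with n as
(
    select 0 as i
    union all
    select i + 1 from n where i < 9999
)
insert into t (c1, c2, c3)
select i, dateadd(day, i, '1970-01-02 03:04:05.6'), 'Asia/Manila'
from n
option (maxrecursion 10000);  -- default recursion limit of 100 is too low here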

toTimeZone is a CLR UDF that converts a datetime2 value from one time zone to another.
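For readers without the ZIP file, a minimal T-SQL stand-in for the CLR mock might look like the following. This is an assumption, not the real UDF: it ignores the time-zone names and applies a fixed offset, which is enough for the computed column to be persisted and indexed.

create function dbo.toTimeZone
    (@dt datetime2(7), @fromTz nvarchar(20), @toTz nvarchar(20))
returns datetime2(7)
with schemabinding
as
begin
    -- Mock only: shift by a constant offset instead of doing a real
    -- time-zone conversion. schemabinding lets SQL Server treat the
    -- function as deterministic, as required for a persisted
    -- computed column.
    return dateadd(hour, -8, @dt);
end;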

When I run the following query:

select c1 
from t 
where c4 >= '1970-01-02'
    and c4 <= '1970-03-04';

The execution plan produced by SQL Server shows that the index i isn't used.

Instead, there is a scan of the clustered index implicitly created for the primary key, followed by a couple of Compute Scalar operators and, finally, a Filter applying the query's predicates. The execution plan I was expecting is a seek on i for the range of c4 values.
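One plausible explanation (an assumption, not confirmed by the plan alone) is that the optimizer expands the persisted computed column back into the underlying CLR UDF call and then fails to match the expanded expression to the index on c4. If that is the cause, trace flag 176, which suppresses expansion of persisted computed columns, may allow the index to be used. A sketch; note that OPTION (QUERYTRACEON ...) requires elevated permissions:

select c1
from t
where c4 >= '1970-01-02'
    and c4 <= '1970-03-04'
option (querytraceon 176);  -- suppress persisted computed-column expansion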

Use the SSDT project in this ZIP file to try to replicate the problem. It includes a mock definition of the CLR UDF. Also included is the execution plan I get.
