Submit to speak at SQLSaturday Houston by March 15

SQLSaturday #107 in Houston is coming up on April 21, 2012.  The list of submitted sessions so far is fantastic, but ultimately incomplete if you have not yet added yours to the mix.  The deadline for Houston is a week from today on March 15, so don’t delay!

SQLSaturdays are a great first-time speaking opportunity, especially for local professionals, and once you’ve done one you’ll be craving another.  If you’ve been on the fence, take a moment today and submit: only good can come of it.  And if Houston is not convenient for you, take a look at the upcoming SQLSaturdays around the world and perhaps find one that is.

March 8, 2012

Using Excel to parse Set Statistics IO output

When tuning T-SQL, “set statistics io on” is definitely your friend. However, I know I cannot be the only developer whose eyes glaze over when I flip over to the Messages tab in SSMS and am greeted with this:

Table 'sysobjrdb'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'sysschobjs'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'sysobjrdb'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'spt_values'. Scan count 1, logical reads 3, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'sysschobjs'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'syscolpars'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'syscolrdb'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'syscolpars'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'syscolpars'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'syscolpars'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'syscolrdb'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'syscolpars'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'sysschobjs'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'sysclsobjs'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'sysidxstats'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 8, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'sysidxstats'. Scan count 1, logical reads 12, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'sysclsobjs'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table '#spindtab___________________________________________________________________________________________________________000000001259'. Scan count 0, logical reads 1, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'sysclsobjs'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table '#spindtab___________________________________________________________________________________________________________000000001259'. Scan count 0, logical reads 1, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'sysclsobjs'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table '#spindtab___________________________________________________________________________________________________________000000001259'. Scan count 0, logical reads 1, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'sysclsobjs'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table '#spindtab___________________________________________________________________________________________________________000000001259'. Scan count 0, logical reads 1, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table '#spindtab___________________________________________________________________________________________________________000000001259'. Scan count 1, logical reads 1, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 3, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'sysschobjs'. Scan count 1, logical reads 3, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'sysidxstats'. Scan count 0, logical reads 4, physical reads 1, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table '#spcnsttab__________________________________________________________________________________________________________00000000125A'. Scan count 0, logical reads 1, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table '#spcnsttab__________________________________________________________________________________________________________00000000125A'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 1, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'sysobjvalues'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'syscolpars'. Scan count 2, logical reads 14, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'sysschobjs'. Scan count 1, logical reads 3, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table '#spcnsttab__________________________________________________________________________________________________________00000000125A'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 1, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'syscolpars'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table '#spcnsttab__________________________________________________________________________________________________________00000000125A'. Scan count 1, logical reads 1, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table '#spcnsttab__________________________________________________________________________________________________________00000000125A'. Scan count 1, logical reads 1, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'syssingleobjrefs'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
No foreign keys reference table 'Person', or you do not have permissions on referencing tables.
Table 'sysobjvalues'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'sysmultiobjrefs'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
No views with schema binding reference table 'Person'.

That 58-line monstrosity is brought to us by our friend sp_help. Scanning through all that output to find the meaningful few values that I am trying to tweak can make my head spin. So, like most developers would, I got lazy.  No, not lazy in that I stopped using set statistics io.  Rather, I got so lazy that I started to use Excel.

Aha, much easier!

SQL Server set statistics io output formatted into columns in Excel

It’s a quick and dirty couple of formulas, but it has worked reasonably well for my purposes. Here they are, for the interested. These start at row 2 (assuming a header row) and should be filled down. Note that I am not an Excel developer so these may not be the world’s most efficient formulas.


Object Name: =IFERROR(MID(A2,FIND("'",A2,1)+1,FIND("'",A2,FIND("'",A2,1)+1)-FIND("'",A2,1)-1),"")

Scan Count: =IF(LEN(A2)>0,IFERROR(MID(A2,FIND("Scan count",A2,1)+10,FIND(",",A2,1)-FIND("Scan count",A2,1)-10)+0,""),"")

Logical Reads: =IF(LEN(A2)>0,IFERROR(MID(A2,FIND("logical reads",A2,1)+13,FIND(",",A2,FIND("logical reads",A2,1))-FIND("logical reads",A2,1)-13)+0,""),"")

Physical Reads: =IF(LEN(A2)>0,IFERROR(MID(A2,FIND("physical reads",A2,1)+14,FIND(",",A2,FIND("physical reads",A2,1))-FIND("physical reads",A2,1)-14)+0,""),"")

If you’re interested in read-ahead reads, lob logical reads, lob physical reads, or lob read-ahead reads it would not take a lot of doing to add columns for those as well. I’m usually just using this to narrow down an IO issue in a large dump so I have not included them.

If you feel like downloading the Excel workbook I use, it is here. I can make no warranty and warn you to use at your own risk, but it’s really just a nicely formatted version of the formulas above.

March 7, 2012

Speaking at PASS SQLRally 2012 in Dallas, TX

I am humbled and delighted to announce that my session Edge Case Testing for the Database Professional was chosen as a community pick for SQLRally 2012 in Dallas.  I feel honored and privileged to have this opportunity and I thank each of you for your support through the voting process.

SQLRally Dallas 2012 will be held May 8-11 in Dallas, Texas, with optional pre-conferences on the 8th and 9th and regular sessions on the 10th and 11th.  You can find more details and register here.

You can see this and all my past and upcoming speaking events on my SQL Server Events page.

Again, thank you for the opportunity, and I’ll see you in Dallas!

February 27, 2012

Data Type Conversions with Coalesce and IsNull

One topic I have spoken on a few times in the past few months, but have not elaborated on a great deal, is the issue of bugs caused by data type conversions when using the IsNull expression in T-SQL.  This is a relatively subtle issue, though, and deserving of some attention, so I thought I would throw together a few quick examples.

The expressions IsNull and Coalesce are built-in T-SQL expressions that are concerned with the handling of null values.  Coalesce is an ANSI-standard SQL expression, while IsNull is a proprietary, SQL Server-specific function.  Definitionally, IsNull(value1, value2) returns value1 unless it is null, in which case it returns value2, even if it is null.  The Coalesce expression behaves similarly but allows further fall-through – Coalesce(value1, value2, value3, … valueN) would follow the same first two steps as IsNull, but continues on through the list of values until reaching the first non-null value.  As with IsNull, Coalesce will return null if all values in the list are null.
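A minimal sketch of the fall-through behavior (the variable names here are my own):

```sql
declare @a varchar(10) = null,
        @b varchar(10) = null,
        @c varchar(10) = 'third'

select isnull(@a, @b)        -- NULL: only two arguments, and both are null
select coalesce(@a, @b, @c)  -- 'third': falls through to the first non-null value
select coalesce(@a, @b)      -- NULL: every value in the list is null
```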

Many people would consider Coalesce(value1,value2) to be identical to IsNull(value1,value2), and I think that the documentation for the commands could be clearer in showing that that is not the case.  Besides the difference in the number of possible parameters, the two commands also differ in a few other respects:

  • The query plans for IsNull and Coalesce may differ, with Coalesce often having the less efficient plan
  • The output type for a Coalesce expression is considered nullable even if that is not practically possible – for example, Coalesce(value1,1) will never return null, but the return type will be considered nullable.  The equivalent expression using IsNull would be considered not-nullable.  This has implications in some cases, including computed columns, as described in the documentation for Coalesce.
  • The result of IsNull always returns as the datatype of the first argument to IsNull, while Coalesce will return a datatype that accommodates all possible output types.  This is the difference I want to discuss today.
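The return-type difference in that last bullet can be observed directly with sql_variant_property.  This is a quick sketch of my own, not taken from the documentation:

```sql
-- IsNull takes its return type from the first argument; Coalesce uses
-- data type precedence across all arguments (int outranks char here).
select
 IsNullType   = sql_variant_property(isnull(cast(null as char(5)), 99), 'BaseType'),
 CoalesceType = sql_variant_property(coalesce(cast(null as char(5)), 99), 'BaseType')
-- IsNullType: char, CoalesceType: int
```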

Coalesce is identical in behavior to a Case statement, but neither Coalesce nor a Case statement is identical to IsNull.  By way of illustration, this example declares three string variables of different types – char, varchar, and nchar – and selects a value from them into temp tables.  For clarity let me point out that this and all examples in this post are using SQL Server 2008 R2.

declare @c10 char(10), @v20 varchar(20), @n30 nchar(30)

select
 MyColumn =
  case
   when @c10 is not null then @c10
   when @v20 is not null then @v20
   else @n30
  end
into
 #ExampleCase -- Case statement

select
 MyColumn = coalesce(@c10, @v20, @n30)
into
 #ExampleCoalesce -- Coalesce statement

select
 MyColumn = isnull(@c10, isnull(@v20, @n30))
into
 #ExampleIsNull -- IsNull statement

drop table #ExampleCase
drop table #ExampleCoalesce
drop table #ExampleIsNull

The query plan is as follows (click for larger view):

For each of these table inserts there is an expression being evaluated – in all three cases Expr1003:

Viewing the Properties for each of the table inserts allows us to see the details of these expressions:

The three expressions are as follows:

Case Expression
[Expr1003] = Scalar Operator(
CASE WHEN [@c10] IS NOT NULL
THEN CONVERT_IMPLICIT(nvarchar(30),[@c10],0)
ELSE
CASE WHEN [@v20] IS NOT NULL
THEN CONVERT_IMPLICIT(nvarchar(30),[@v20],0)
ELSE [@n30]
END
END)

Coalesce Expression
[Expr1003] = Scalar Operator(
CASE WHEN [@c10] IS NOT NULL
THEN CONVERT_IMPLICIT(nvarchar(30),[@c10],0)
ELSE
CASE WHEN [@v20] IS NOT NULL
THEN CONVERT_IMPLICIT(nvarchar(30),[@v20],0)
ELSE [@n30]
END
END)

IsNull Expression
[Expr1003] = Scalar Operator(
isnull([@c10],CONVERT_IMPLICIT(char(10),
isnull([@v20],CONVERT_IMPLICIT(varchar(20),[@n30],0)),0)))

I can save you some string comparisons and let you know that the Case and Coalesce expressions are identical, and the IsNull expression is entirely different.  I’ve highlighted the Convert_Implicit calls in the expressions for clarity: as you can see, the Case/Coalesce plan will return an nvarchar(30) while IsNull will return char(10).  But what if @n30 were larger than char(10)?  Funny you should ask!


declare @char char(10)

select [Coalesce] =
     coalesce(@char,N'ӫ This is my unicode string')

select [IsNull] =
     isnull(@char,N'ӫ This is my unicode string')

The resultset for this shows that coalesce maintains both the string length and the unicode encoding, while IsNull converts the data to the type of the first argument – and as a char(10) this means the data is truncated and the unicode character is lost.

Coalesce
---------------------------
ӫ This is my unicode string

IsNull
----------
? This is

We’ve been looking at strings so far, and while those are troublesome, the more in-your-face overflows and runtime errors really come into play when you start working with numeric data.  In this example the value 10000000000 is being used in an expression with an int data type, which is too small to store that large a value.


declare @int int

select [CoalesceColumn] =
 coalesce(@int,10000000000) into CoalesceTable

select [IsNullColumn] =
 isnull(@int,10000000000) into IsNullTable

select CoalesceColumn from CoalesceTable
select IsNullColumn from IsNullTable

select
 TableName = rtrim(object_name(c.object_id)),
 ColumnName = c.name ,
 ColumnType = t.name,
 MaxLength = c.max_length
from
 sys.columns c
 inner join sys.types t
 on c.system_type_id = t.system_type_id
where
 object_id in (
object_id('CoalesceTable'),
object_id('IsNullTable'))

drop table CoalesceTable, IsNullTable

The resultset shows an overflow for the IsNull expression, while the Coalesce expression proceeds without trouble.  The resulting tables have different column data types – int for IsNull, Numeric for Coalesce.

Msg 8115, Level 16, State 2, Line 5
Arithmetic overflow error converting expression to data type int.
The statement has been terminated.

CoalesceColumn
---------------------------------------
10000000000

IsNullColumn
------------

TableName          ColumnName        ColumnType   MaxLength
------------------ ----------------- ------------ ---------
IsNullTable        IsNullColumn      int          4
CoalesceTable      CoalesceColumn    numeric      9

This can be a killer bug to spot in production – it manifests as a data type overflow that only happens when a particular column or argument is null.  I’ve seen this crop up when a column or variable type was sized up to prevent an overflow but the sister data type in the IsNull statement was left alone, giving the impression that the original problem was not solved.  A lot of time can be wasted head-scratching and looking at the column sizes on base tables when it really just comes down to a fleeting variable in a stored procedure or a parameter with an outdated type.

One of the common code patterns I’ve seen using IsNull is to use it in the predicate of a select statement in a procedure to allow optional filtering of a resultset.  The logic is “if no filter, return everything, otherwise filter.”  It’s clever-looking and it saves typing but it is not very good for performance, as explained by Kimberly Tripp (b | t).  Be that as it may, it is commonly used.  However, it is susceptible to bugs due to data type mismatches.

This code snippet creates and populates a sample table and creates a stored procedure which filters on any input filter string, returning all rows if no string is provided.  For clarity, one resultset is returned filtering with IsNull and a second with Coalesce.


create table MySampleTable
(myKey int identity primary key clustered,
myValue nvarchar(30))

go

insert into MySampleTable(myValue)
select 'The'
union all select 'Quick'
union all select 'Brown'
union all select 'Fox'
union all select 'Jumped'
union all select 'Over'
union all select 'The'
union all select 'Lazy'
union all select 'Dog'

go

create procedure p_LookupOrReturnAll
(
 @filterString nvarchar(5)
)
as
begin

select
 FilterType = 'isnull',
 myKey,
 myValue
from
 MySampleTable
where
 myValue = isnull(@filterString,myValue)

select
 FilterType = 'coalesce',
 myKey,
 myValue
from
 MySampleTable
where
 myValue = coalesce(@filterString,myValue)
end

This code has a contrived bug in that the filterString parameter is nvarchar(5) while the table which is being filtered has a column type of nvarchar(30), and that will cause problems.  In the next code segment we execute the procedure with three different arguments:


set nocount off
exec p_LookupOrReturnAll 'Quick'
exec p_LookupOrReturnAll 'Jumped'
exec p_LookupOrReturnAll NULL

The results are as follows:

FilterType myKey       myValue
---------- ----------- ------------------------------
isnull     2           Quick

(1 row(s) affected)

FilterType myKey       myValue
---------- ----------- ------------------------------
coalesce   2           Quick

(1 row(s) affected)

FilterType myKey       myValue
---------- ----------- ------------------------------

(0 row(s) affected)

FilterType myKey       myValue
---------- ----------- ------------------------------

(0 row(s) affected)

FilterType myKey       myValue
---------- ----------- ------------------------------
isnull     1           The
isnull     2           Quick
isnull     3           Brown
isnull     4           Fox
isnull     6           Over
isnull     7           The
isnull     8           Lazy
isnull     9           Dog

(8 row(s) affected)

FilterType myKey       myValue
---------- ----------- ------------------------------
coalesce   1           The
coalesce   2           Quick
coalesce   3           Brown
coalesce   4           Fox
coalesce   5           Jumped
coalesce   6           Over
coalesce   7           The
coalesce   8           Lazy
coalesce   9           Dog

(9 row(s) affected)

The first two resultsets return exactly as expected.  The second two return nothing because the short parameter truncates the string “Jumped” to “Jumpe” and it fails to match.  The last two resultsets show that IsNull filters out one row – “Jumped” – because of the conversion of the column [myValue] to nvarchar(5), which again causes an attempt to match “Jumped” to “Jumpe” and returns no row.  By contrast, Coalesce returns all 9 rows as expected.

Now the takeaway here should not be to use Coalesce in preference to IsNull in all cases.  There are other differences between the expressions that should be considered, especially the differences between the query plans.  A good example of this can be found here on the SQL Server Engine Tips blog.  There is nothing that I have described here that cannot be solved through careful selection and handling of data types in your code.  However, in the rough and tumble world of database development, where multiple developers may be jostling for space in a database and making changes that are not well communicated to the whole team, it may be worth hardening your code to prevent these problems, be that by choosing a different expression, by using explicit type casting, or simply by coming up with a codepath that is less susceptible to failure when data types are mutable.
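For instance, one way to apply explicit type casting to the optional-filter pattern above – a sketch of my own using the same sample table, not the only approach – is to widen the parameter to the column’s type before IsNull sees it:

```sql
-- Casting @filterString up to nvarchar(30) makes isnull return nvarchar(30),
-- so the myValue column is no longer silently truncated to five characters
-- on the NULL path.  (The lookup path still truncates the incoming argument
-- at the parameter boundary; the cleanest fix is simply declaring the
-- parameter as nvarchar(30) to match the column.)
declare @filterString nvarchar(5) = NULL

select
 myKey,
 myValue
from
 MySampleTable
where
 myValue = isnull(cast(@filterString as nvarchar(30)), myValue)
```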

Thank you for your time, and if you have any questions or corrections, please let me know!

February 22, 2012

SQLRally Dallas 2012 – Community Voting

It’s that time of year!  The pre-cons and regular sessions for SQLRally Dallas 2012 have been announced and it is already looking like an exciting event.  I was fortunate enough to attend SQLRally Orlando 2011 and I had a fantastic time.

My session Edge Case Testing for the Database Professional has proceeded to the voting round, which allows for the SQL community at large to choose the final 20 sessions to be included in the schedule.  It has been my observation that many environments do not approach quality control for their database projects with nearly the same level of detail as they might with other software projects, and in smaller environments  there may be no QA whatsoever.  In my session I cover the basics of test case and test plan preparation and go over many of the edge case scenarios that developers and administrators alike should be aware of.  One of the members of the SQLdm QA team attended this presentation recently and told me they should make me an honorary QA engineer – high praise indeed!

If this sounds like something you would be interested in attending at SQLRally I would really appreciate your vote, as would many other speakers.  There are many worthy sessions and you may choose as many as you like.  Voting is open until 1 PM on Friday, February 24.

If you are a SQL Server professional and have not already done so, consider joining the Professional Association for SQL Server, PASS, at this link.  It is a free membership and a great way to keep informed about local and national SQL Server training events like SQLRally.

February 17, 2012

Speaking at Houston Area SQL Server User Group 2/14

I am speaking on Edge Case Testing for the Database Professional at the Houston Area SQL Server User Group meeting on 2/14/2012 at 11:30 AM.  Further details and directions are available here.

In this presentation I go over the basics of test case and test plan creation and cover the 9 main areas of edge case testing that I try to account for in all of my database application design, including:

  • Data Types
  • Language and Regionalization
  • Date/Time Considerations
  • Performance
  • Usability/Maintainability
  • Security
  • Integration
  • Recoverability
  • Compatibility

I will be updating this post with a link to my slide deck after the presentation is complete.  

EDIT: Thank you all for taking time to come to my presentation.  The slide deck is available here.

February 14, 2012