Monday, May 27, 2013

How to Size SQL Server Memory & Why Is It Important?

The following blog post has two objectives:

 1) How to size the memory correctly (or at least close to correctly)
 2) Why it is important to size the memory (and not leave it at the default after installation)



First let me address item 2.

Setting an upper limit on the memory that SQL Server can use is termed sizing memory on SQL Server. One of the common mistakes made on a SQL Server database server is not sizing the memory. By default, after a typical installation, SQL Server has no upper limit on the amount of memory it can use. In other words, after a default installation, SQL Server is configured to use any amount of memory that is available on the server. For example, if you have a 32 GB RAM server and set up SQL Server, the default settings allow SQL Server to use the entire 32 GB. After a default installation, the memory setting of a server looks like the picture below.

The number "2147483647" simply implies that SQL Server has no upper limit and will use all the memory on the server.
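You can confirm whether your own server is still at this default without opening SSMS; a minimal T-SQL sketch:

```sql
-- Inspect the current 'max server memory' setting.
-- value_in_use = 2147483647 means no upper limit has been configured.
SELECT name, value_in_use
FROM   sys.configurations
WHERE  name = N'max server memory (MB)';
```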

      As I explained previously over here, SQL Server by design uses all the memory that is allocated to it. For example, if SQL Server is allocated 26 GB of RAM (out of, say, 32 GB on the server), it utilizes the entire 26 GB. SQL Server is designed this way so that it reduces the need to read from disk frequently and makes maximum use of the memory allocated to it.

    So, why is it bad not to configure an upper limit? As SQL Server runs, its memory utilization keeps increasing (by design, as explained above) day by day, and at some point it leaves very little memory for the Windows operating system to function. Windows requires adequate memory for the healthy functioning of the server. With no upper limit, SQL Server consumes all the memory on the server, starving the operating system and slowing down the entire server. Under such circumstances, memory utilization is usually over 97 or 98%, with almost 90% used by SQL Server.

   When the operating system experiences memory pressure, though SQL Server may be functioning as usual, physically performing any operation on the server (logging in to the server, copying files, opening SQL Server client tools, etc.) may be extremely slow. Prolonged persistence of the problem (for a few days to a week) can result in an unexpected shutdown / restart of the server. Like many Microsoft problems, after a restart the problem vanishes for a while and comes back once memory utilization increases again.

  Now for the second part of the post: how to size the memory?

    Though sizing the memory accurately is a task to be done meticulously, taking quite a few parameters into account, I stumbled upon a dummies' guide to configuring the memory needed for SQL Server and the OS to function smoothly. The article, written by Jonathan Kehayias, an industry expert, provides a simple formula in layman's terms with which you can ensure you have configured enough memory for the OS & SQL to run smoothly. Please refer to the article here.

   A brief explanation of the technique to size the memory is given below. The amount of memory to be reserved for the operating system:

  1.  1 GB of memory as a base reservation for the operating system
  2.  1 GB for every 4 GB of RAM between 4 and 16 GB
  3.  1 GB for every 8 GB of RAM above 16 GB
i.e., if you have a 32 GB RAM database server, then the memory to be reserved for the operating system would be

  1. 1 GB, the minimum allocation +
  2. 3 GB, as 4 to 16 GB spans 12 GB, and 12 divided by 4 (each 4 GB gets 1 GB) is 3 GB +
  3. 2 GB, as 16 to 32 GB spans 16 GB, and 16 divided by 8 (each 8 GB above 16 GB gets 1 GB) is 2 GB
So in total, for a 32 GB server, 6 GB will be reserved for the operating system. This implies the maximum memory allocated to SQL Server should be 26 GB. Similarly, for a 64 GB server, 10 GB should be reserved for the operating system & 54 GB should be allocated to SQL Server.
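Applying the result is a one-time configuration change. A minimal T-SQL sketch for the 32 GB example above (26 GB = 26624 MB; adjust the value to your own server's reservation):

```sql
-- 'max server memory (MB)' is an advanced option, so expose it first.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Cap SQL Server at 26 GB (26624 MB), leaving 6 GB for the OS.
EXEC sp_configure 'max server memory (MB)', 26624;
RECONFIGURE;
```

The setting is dynamic, so it takes effect immediately without a restart of the SQL Server service.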

The above method ensures the operating system has enough memory to function smoothly and doesn't starve for memory. One may attempt to increase the max memory allocation to SQL Server beyond what the technique recommends when SQL Server internally is facing memory pressure. However, by doing so, one needs to ensure the operating system is not starved of memory. The above technique of sizing the memory is sufficient for both the OS & SQL in most cases.

To conclude, as a DBA, it is extremely important to ensure an upper limit for memory is configured for SQL Server, to prevent sudden performance issues.

Thursday, May 16, 2013

Xp_readerrorlog in SQL Server 2012


I have always been a big fan of xp_readerrorlog. The simple reason is that it is much faster to read the errorlog using a script than through SSMS. xp_readerrorlog also gives more options to filter the log, such as for specific dates or entries containing a specific string.

In SQL Server 2012, xp_readerrorlog has undergone a minor change. If you use xp_readerrorlog as you did before, you may encounter the following error.

For example:
 
EXEC master.dbo.Xp_readerrorlog
  0,
  1,
  '',
  '',
  '20130502 00:00',
  '20130503' 


throws an error saying:

Msg 22004, Level 12, State 1, Line 0
Error executing extended stored procedure: Invalid Parameter Type


If you are like me and like to capture the results of the errorlog into a table, then you may get a different error.

For example:
 
INSERT INTO #read_error_log
EXEC master.dbo.Xp_readerrorlog
  0,
  1,
  '',
  '',
  '20130502 00:00',
  '20130503' 



Msg 0, Level 11, State 0, Line 0
A severe error occurred on the current command. The results, if any, should be discarded.


The reason for the error is that the third and fourth parameters, the search strings, strictly accept only the nvarchar datatype. Prefixing the literal with the letter 'N' makes the parameter an nvarchar, as shown below.
 
EXEC master.dbo.Xp_readerrorlog
  0,
  1,
  N'',
  N'',
  '20130502 00:00',
  '20130503' 


The script provided below will help one read the error log quickly and effectively. The script is similar to the one shared on this blog previously over here. Instead of using sp_readerrorlog, the script below uses xp_readerrorlog with additional parameters, which makes it even more effective. sp_readerrorlog doesn't provide the time-based filters (the 6th & 7th parameters), which are the most important filters while reading the errorlog.
 
CREATE TABLE #read_error_log
  (
     logdate      DATETIME,
     processinfo  VARCHAR(200),
     errorlogtext VARCHAR(max)
  )

INSERT INTO #read_error_log
EXEC master.dbo.Xp_readerrorlog
  0,                -- log number (0 = current error log)
  1,                -- log type (1 = SQL Server error log, 2 = SQL Agent log)
  N'',              -- first search string (must be nvarchar)
  N'',              -- second search string (must be nvarchar)
  '20130502 00:00', -- start time filter
  '20130503'        -- end time filter

SELECT *
FROM   #read_error_log
ORDER  BY logdate DESC

TRUNCATE TABLE #read_error_log

DROP TABLE #read_error_log 

Friday, March 1, 2013

When was the last successful backup?

What is the fastest and perhaps easiest way to find the last successful database backup?
Just right-click the database in Management Studio and pick Properties.


 



At the top of the window, the last full backup completion time and the last log backup completion time are shown.
This is perhaps the simplest way of verifying the success of a daily backup without even typing a command.
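For servers you cannot click through, the same information can be read from the msdb backup history tables. A minimal sketch (the database name 'MyDatabase' is a placeholder):

```sql
-- Last successful backup finish time per backup type,
-- from the msdb backup history.
SELECT   bs.type,  -- D = full, I = differential, L = log
         MAX(bs.backup_finish_date) AS last_backup_finish
FROM     msdb.dbo.backupset AS bs
WHERE    bs.database_name = 'MyDatabase'  -- placeholder name
GROUP BY bs.type;
```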