Monday, December 31, 2012

In-Memory OLTP technology coming to SQL Server

High transaction throughput is vital in today’s applications. No matter how rich a database product’s feature set is, it is of little use if it cannot handle high transaction throughput. It seems the MS SQL Server product team has taken this seriously: they have developed an in-memory technology to achieve this goal, and it will be released with the next major version of SQL Server. The code name of the project is “Hekaton”. Below are some links I found about “Hekaton” and thought of sharing.

Hekaton Breaks Through

How Fast is Project Codenamed “Hekaton” – It’s ‘Wicked Fast’!

SQL Server In-Memory OLTP technology Project “Hekaton” Riles Oracle VP

Friday, November 30, 2012

MongoDB to the Windows Azure cloud

In my previous post, I mentioned the need for a NoSQL product from Microsoft. I did some Googling and found several interesting posts describing how Microsoft and 10gen (the company behind MongoDB, one of the best-known document-oriented databases) have collaborated to bring MongoDB to the Windows Azure cloud. Below are the links I found related to Microsoft and MongoDB.

MongoDB on Azure Cloud Services

Going NoSQL with MongoDB

Sunday, November 25, 2012

Future versions of SQL Server?

It is a publicly known fact that RDBMSs have well-known limitations when it comes to unstructured data management. This is why new database technologies like NoSQL emerged. It is true that NoSQL is not a replacement for RDBMS. However, the time has come for RDBMS vendors either to include NoSQL features in their RDBMS products (at which point they can no longer strictly be called RDBMSs) or to introduce brand new NoSQL products. We need to wait and see how Microsoft reacts to these new technology trends and how SQL Server changes accordingly. Oracle has already announced its NoSQL offering, called Oracle NoSQL Database. Will Microsoft come up with a new NoSQL product?

Thursday, November 1, 2012

Know about your SQL Server memory

Knowing how the operating system manages memory is the basis for learning SQL Server memory architecture. Troubleshooting SQL Server memory is a topic that is not discussed very often. Over the last couple of days I spent a lot of time reading SQL Server memory related articles, blogs and books. In the process I got to know about a deprecated memory feature used in SQL Server 2008 R2 and earlier versions: the “awe enabled” option. This feature is no longer available in SQL Server 2012. I found the posts below on SQL Server memory and thought of sharing them.

SQL Server 2012 Memory Manager KB articles
Pushing the Limits of Windows: Virtual Memory
Pushing the Limits of Windows: Physical Memory
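
As a starting point, you can check how much memory the instance is configured to use and how much physical memory it actually sees with a couple of simple queries. This is just a minimal sketch using the standard sp_configure option names and the sys.dm_os_sys_memory DMV:

-- Show the configured memory limits for the instance
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)';
EXEC sp_configure 'min server memory (MB)';

-- Physical memory visible to SQL Server and its current state
SELECT total_physical_memory_kb / 1024     AS total_physical_memory_mb,
       available_physical_memory_kb / 1024 AS available_physical_memory_mb,
       system_memory_state_desc
FROM sys.dm_os_sys_memory;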

Saturday, October 20, 2012

Test data generator using T-SQL

Sometimes DBAs want to generate test data to test applications they have designed and developed themselves. I have faced many situations like this in the past. Recently I had to generate some test data for one of my research projects. Below is the ERD of the application. It is just four tables.

  1. Customer ( master data)
  2. Item (master data)
  3. SalesOrder (Sales order header data)
  4. SalesOrderDetail (Sales order detail data)

[ERD image of the four tables]
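
The generation itself can be driven by a simple loop or a numbers table. Below is a minimal sketch for the Customer master table only; the table and column names are assumptions based on the ERD above, not the original script:

-- A minimal T-SQL test data generation sketch for the Customer table
-- (column names are assumptions based on the ERD, not the actual schema).
SET NOCOUNT ON;

DECLARE @i int = 1;
WHILE @i <= 1000
BEGIN
    INSERT INTO dbo.Customer (CustomerName, City, CreatedDate)
    VALUES ('Customer ' + CAST(@i AS varchar(10)),
            CASE @i % 3 WHEN 0 THEN 'Colombo' WHEN 1 THEN 'Kandy' ELSE 'Galle' END, -- rotate a few cities
            DATEADD(DAY, -(@i % 365), GETDATE()));                                   -- spread dates over a year
    SET @i += 1;
END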

Friday, October 5, 2012

Currently executing queries

It is important to know what is currently executing in SQL Server at any given time. You may need this for troubleshooting purposes. The query below is useful on such occasions;

SELECT sqltext.text,
       req.session_id,
       req.status,
       req.command,
       req.cpu_time,
       req.total_elapsed_time,
       s.login_name,
       s.host_name,
       req.wait_type,
       req.wait_resource,
       SUBSTRING(sqltext.text, (req.statement_start_offset / 2) + 1,
                 ((CASE req.statement_end_offset
                       WHEN -1 THEN DATALENGTH(sqltext.text)
                       ELSE req.statement_end_offset
                   END - req.statement_start_offset) / 2) + 1) AS statement_text,
       req.plan_handle
FROM sys.dm_exec_requests req
INNER JOIN sys.dm_exec_sessions s
    ON req.session_id = s.session_id
CROSS APPLY sys.dm_exec_sql_text(req.sql_handle) AS sqltext
ORDER BY req.cpu_time DESC

Wednesday, July 18, 2012

Meaning of auto generated STATISTIC name

Use the sys.stats catalog view to see all the statistics available in a database. To see the statistics in the AdventureWorks2012 database, use the following T-SQL statement.

USE AdventureWorks2012
GO
SELECT * FROM sys.stats

You would see output something like below (only the values of the name column are shown);

_WA_Sys_00000009_00000005
_WA_Sys_00000005_00000005
_WA_Sys_00000003_00000005
_WA_Sys_00000004_00000005

How do you interpret the above names?

_WA - Washington, the US state where the SQL Server development team is located.
All automatically generated statistics have names starting with _WA_Sys. The first number is the column id of the column the statistics are based on, and the next number is the hexadecimal representation of the object id of the table.
Source: Inside the SQL Server Query Optimizer by Benjamin Nevarez
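
As a quick check, the hexadecimal parts can be converted back to the table and column they refer to. A small sketch (the hex values here are hypothetical placeholders, not taken from the output above):

-- Decode an auto-created statistic name such as _WA_Sys_00000009_2F10007B
-- (hypothetical values): the last segment is the object id in hex,
-- the middle segment is the column id.
DECLARE @object_id int = CONVERT(int, 0x2F10007B);  -- hex segment after the last underscore
DECLARE @column_id int = 9;                          -- middle hex segment (0x00000009)
SELECT OBJECT_NAME(@object_id)          AS table_name,
       COL_NAME(@object_id, @column_id) AS column_name;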

Tuesday, July 10, 2012

Replication framework between a relational database and a document-oriented (NoSQL) database

I’m working on a research project building a replication framework from a relational database to a document-oriented (non-relational/NoSQL) database system. This is to fulfill the research requirement of my MSc degree. A POC will be developed to demonstrate the solution using MS SQL Server (relational) and MongoDB (document oriented). I would like to hear any comments/thoughts about this from the community.


I started a discussion on this in the LinkedIn MongoDB user group and received a couple of valuable comments. Many thanks to those who gave the comments.


You can see them here.

Thursday, July 5, 2012

sp_attach_db / sp_detach_db are deprecated

These two system stored procedures are frequently used by DBAs in general administrative work. However, they are marked as deprecated, so they will be removed from a future version of SQL Server. I checked MSDN, and they have been marked as deprecated since SQL Server 2005. Interestingly, they are still available in the SQL Server 2012 release. However, it is recommended not to use them in any development work and to use the new method based on the CREATE DATABASE statement instead.

sp_attach_db
sp_detach_db


Attaching a database using CREATE DATABASE;


CREATE DATABASE Archive
      ON (FILENAME = 'D:\SalesData\archdat1.mdf')
      FOR ATTACH ;
GO

Tuesday, June 26, 2012

How to use SQL Server Management Studio (SSMS) effectively

SSMS is the primary development tool used by SQL Server professionals to work with SQL Server. It was introduced in SQL Server 2005 and has evolved with rich features up to SQL Server 2012. In this article I discuss some of the valuable features of SSMS which increase the productivity and efficiency of users. The target audience is any user (beginner to intermediate) who uses SSMS to perform various tasks against SQL Server.
sql-server-performance.com

Tuesday, April 24, 2012

Log Shipping Error: The server 'LOGSHIPLINK_%' already exists

I received the error below while setting up log shipping with a monitoring server.


Save Log Shipping Configuration
- Saving secondary destination configuration [SERVER].[Database] (Error)
                Messages
                * SQL Server Management Studio could not save the configuration of 'SERVER' as a Secondary. (Microsoft SQL Server Management Studio)                
                ------------------------------
                ADDITIONAL INFORMATION:
                
                An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.ConnectionInfo)
                
                ------------------------------
                
                The server 'LOGSHIPLINK_% ' already exists. (Microsoft SQL Server, Error: 15028)
                
                For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=10.00.4316&EvtSrc=MSSQLServer&EvtID=15028&LinkId=20476
                
- Saving primary backup setup (Stopped)
- Saving Monitor configuration (Stopped)
- Rolled Back (Success)

Without the monitoring server it was successful. Then I dug into the issue and found that it is related to the linked server mentioned in the error (The server 'LOGSHIPLINK_% ' already exists.). The secondary server in this environment was recently built, so all the linked servers on the primary server were created manually on the secondary server as well. However, the linked server mentioned in the error is related to log shipping and should be created by the log shipping process itself. So I dropped that linked server from the secondary server, set up log shipping again, and this time it was successful. Then I checked the existence of the same linked server on the secondary server, and it was there again.
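
If you hit the same error, a quick way to find and remove the stale linked server on the secondary is sketched below. This is only an illustration: the actual linked server name follows the LOGSHIPLINK_% pattern shown in the error, so substitute the name returned by the first query for the placeholder.

-- Find the log shipping linked server left over on the secondary
SELECT name FROM sys.servers WHERE name LIKE 'LOGSHIPLINK_%';

-- Drop it so the log shipping wizard can recreate it
-- (replace the placeholder name below with the one returned above)
EXEC sp_dropserver @server = N'LOGSHIPLINK_PlaceholderName', @droplogins = 'droplogins';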

Just thought of sharing the information. Cheers for reading the blog post.  

Wednesday, March 28, 2012

SQL Server 2012 Tutorials and Sample Databases

Microsoft SQL Server 2012 was recently released to manufacturing (RTM). No doubt you're interested in learning the new features of the new version. If so, it is useful to have the tutorials and sample databases installed on your server. You can get them using the links below;

Tutorials here

Sample Databases here

Note that the sample databases are just data files (no log files). You need to attach them to your server to create the databases. Use the following T-SQL script.

CREATE DATABASE AdventureWorks2012
ON (FILENAME = 'D:\SQL Server\DATA\AdventureWorks2012_Data.mdf')
FOR ATTACH_REBUILD_LOG

Enjoy the new version of SQL Server.

Saturday, March 24, 2012

Reasons transaction log space cannot be reclaimed

The best practice is to set up periodic log backups (if the database is not in the SIMPLE recovery model) in order to keep the database log size to a minimum. However, there are occasions where, even with log backups in place, the log keeps growing. To find out the reason you can refer to log_reuse_wait in sys.databases.
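
A quick way to check this for all databases is a simple query against sys.databases:

-- Why is each database's log space not being reused?
SELECT name,
       recovery_model_desc,
       log_reuse_wait,
       log_reuse_wait_desc
FROM sys.databases
ORDER BY name;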

As per MSDN, the log_reuse_wait values are as follows (also look at the log_reuse_wait_desc column for a description):

Reuse of transaction log space is currently waiting on one of the following:
0 = Nothing
1 = Checkpoint
2 = Log backup
3 = Active backup or restore
4 = Active transaction
5 = Database mirroring
6 = Replication
7 = Database snapshot creation
8 = Log Scan
9 = Other (transient)

Cheers for reading this blog post.

Wednesday, March 21, 2012

How to get the execution plan of a stored procedure?

Think about a situation where you have identified a badly performing stored procedure in a Production environment, and now you want to see its execution plan for further analysis. Please note that executing the procedure with sample parameter values in Production should be done with extra care: if the procedure only contains SELECT statements there is no issue, but if it performs DML operations it can modify data.

You can use the following DMVs to get the execution plan, either in XML format or as a graphical plan.

USE AdventureWorks2008R2
GO

EXEC [dbo].[uspGetEmployeeManagers] @BusinessEntityID = 60

Note: I executed the SP first so that a plan would be created. In an actual environment this is not necessary.

SELECT plan_handle
FROM sys.dm_exec_procedure_stats ps
WHERE ps.database_id = DB_ID('AdventureWorks2008R2')
  AND ps.object_id = OBJECT_ID('dbo.uspGetEmployeeManagers')

--Copy and paste the plan handle into the query below
SELECT * FROM sys.dm_exec_query_plan(0x05000500EAE8807F40E13287000000000000000000000000)

Cheers for reading this blog post.


Friday, March 16, 2012

Multi-Subnet Clustering

This is a new feature in SQL Server 2012; Windows Server 2008 already supports it at the operating system level. With this feature, SQL Server can have a cluster node in a different subnet. A subnet is a logical grouping within a network; subnets enable network administrators to manage networks effectively across geographical areas.


Since subnets can span geographical locations, SQL Server clustering can now be used as a DR (Disaster Recovery) solution more effectively than ever before.


More information can be found here.


Tuesday, March 6, 2012

Parameter Sniffing

What is it? First, let's go through the scripts below;

Create the following index;

USE [AdventureWorks2008R2]
GO

/****** Object:  Index [NNI_SalesOrderDetail_ModifiedDate]    Script Date: 03/06/2012 15:09:41 ******/
CREATE NONCLUSTERED INDEX [NNI_SalesOrderDetail_ModifiedDate] ON [Sales].[SalesOrderDetail]
(
     [ModifiedDate] ASC
)

Then create the stored procedure below. It is a very simple SP with a single SELECT statement on the Sales.SalesOrderDetail table. It takes two parameters for two date values.
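
The original procedure script is not reproduced here, but based on the description (a single SELECT on Sales.SalesOrderDetail with two date parameters, presumably filtering on the ModifiedDate column indexed above), it would look something like the sketch below. The procedure name is hypothetical:

-- Hypothetical sketch of the kind of procedure described above
CREATE PROCEDURE dbo.uspGetSalesOrderDetail
    @StartDate datetime,
    @EndDate   datetime
AS
BEGIN
    SET NOCOUNT ON;

    -- Single SELECT filtered by the date range; the plan compiled for the
    -- first set of parameter values is reused for later calls, which is
    -- where parameter sniffing shows up.
    SELECT SalesOrderID, SalesOrderDetailID, ProductID, OrderQty, ModifiedDate
    FROM Sales.SalesOrderDetail
    WHERE ModifiedDate BETWEEN @StartDate AND @EndDate;
END
GO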

Friday, March 2, 2012

VLDB backups in a few seconds

VLDB - Very Large Databases, meaning databases of many gigabytes (50GB, 100GB, etc.). How about managing backups? You may need to spend 20-30 hours to take a full backup of such databases. Of course, SQL Server 2008 has an improved backup system compared with its earlier versions. Another alternative would be to use third-party tools such as LiteSpeed or Red Gate backup products. But still you can't complete it in a few seconds. How about a system that takes a backup of such a large DB in a few seconds? The ANSWER is to use SAN snapshot backups.

Wednesday, February 29, 2012

Reading a deadlock

Deadlocks are a common phenomenon in databases where users work concurrently. Deadlocks happen because of the pessimistic concurrency control mechanism, which is essentially the use of locks. You can't prevent deadlocks entirely, but you certainly need to minimize how often they occur. There are several ways of capturing deadlocks in SQL Server. They are;

1. Using SQL Server Profiler
2. Using SQL Server Error Log

The objective of this blog post is to read and understand a deadlock graph captured in the SQL Server Error log.
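
For deadlock information to be written to the error log at all, trace flag 1204 or 1222 normally has to be enabled first; a minimal example:

-- Write detailed deadlock information to the SQL Server error log
DBCC TRACEON (1222, -1);   -- -1 applies the flag globally

-- Check which trace flags are currently active
DBCC TRACESTATUS (-1);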

Friday, February 17, 2012

Pearson Lanka DBA team at SS SLUG

The Pearson Lanka DBA team presented six demos at the last meetup of SS SLUG (http://sqlserveruniverse.com/v2/ssslug.aspx). The series of demos was named the Whirlwind DBA session. It was really a mind-blowing session, with each demo limited to ten minutes including three minutes of Q&A.

Why are these presentations so special? Because they are based on our day-to-day DBA activities: problems we encountered and knowledge we gathered after a lot of research. So we thought of sharing this valuable information with the community of people who love databases.

There, I presented on how to read a deadlock graph, which I'm going to write up and publish in the future.

I have the videos of all the presentations and am hoping to upload them to YouTube.

How to interpret Disk Latency

I was analyzing IO stats on one of our SQL Servers and noticed that the IO latencies (Read / Write) were very high. As a rule of thumb, we know tha...