How to tell if your Azure SQL Stored Procedure is Running

One of the capabilities of the Cotega service is the ability to schedule execution of stored procedures in Azure SQL. This is handy for many administrators because Azure SQL does not have a SQL Agent capability. One interesting side effect I have seen from this is that long-running stored procedures can cause issues when they are scheduled to run frequently. If the previous execution of the stored procedure has not completed before the next one starts, you can hit problems such as locking, or in extreme cases the executions pile up and eventually cause connection drops when the maximum number of concurrent requests is reached.

One solution I came up with is to add code to the start of the stored procedure that detects whether it is already running; if it is, the procedure simply returns. Interestingly, I could not find much information on the internet about how to detect whether an Azure SQL stored procedure is running, so I decided to build something myself. This is what I came up with; it returns a count of how many instances of a specific stored procedure are currently running:

CREATE FUNCTION [dbo].[CheckStoredProcActive] (@StoredProc nvarchar(255))
RETURNS int
AS
BEGIN
    DECLARE @spCount int;
    SET @spCount =
        (SELECT COUNT(*)
         FROM sys.dm_exec_sessions s
         INNER JOIN sys.dm_exec_requests r
             ON r.session_id = s.session_id
         CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS SQL
         INNER JOIN sys.objects o
             ON SQL.objectid = o.object_id
         WHERE s.is_user_process = 1
           AND r.database_id = DB_ID()
           AND o.name = @StoredProc);
    RETURN @spCount;
END;

Then you can simply add the following to the start of your procedure, replacing [ENTER_STORED_PROC_NAME] with your stored procedure's name:

IF (dbo.CheckStoredProcActive('[ENTER_STORED_PROC_NAME]') > 0)
   RETURN;
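As a fuller sketch, a scheduled procedure (here a hypothetical dbo.NightlyCleanup, not from the original post) could guard itself like this:

```sql
CREATE PROCEDURE [dbo].[NightlyCleanup]
AS
BEGIN
    -- Bail out if a previous scheduled run of this procedure is still executing
    IF (dbo.CheckStoredProcActive('NightlyCleanup') > 0)
        RETURN;

    -- ... the actual long-running work goes here ...
END;
```

One caveat: depending on how the procedure is invoked, the current execution may itself show up in the count (for an ad-hoc EXEC batch it typically does not, since the batch text has no object id), so test the threshold against your own scheduling setup.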
I hope others find this helpful.
Liam

How to tell if your Azure SQL database performance issues are data center related

I am the founder of a startup called Cotega and also a Microsoft employee where I work as a Program Manager. This is a series of posts where I talk about my experience building a startup outside of Microsoft. I do my best to take my Microsoft hat off and tell both the good parts and the bad parts I experienced using Azure.

Working on a monitoring service for Azure SQL Database allows me to get a really good idea of the common issues DBAs have when working with their databases. If I had to choose the most common frustration I hear from DBAs, it is not knowing whether a performance issue is limited to their database or affects the entire data center where their database is hosted. It has been a goal of mine from the very early days of Cotega to help solve this problem. Today I am happy to announce that within Cotega, you can now see your performance data overlaid with that of the data center. Below is an example of connection latency measured from outside the Azure data centers. It shows the connection latency of my database (in blue) along with the average latency of all other customers in this data center (in orange).

[Image: connection latency chart]

This took me some time to build because, in order to get a true picture of the data center you are in, I needed enough customers in each data center to compute an aggregated average of all the data gathered from monitoring these databases without exposing any one customer's performance data.

As of right now, I have a really good picture of all the major Azure data centers (with the exception of East Asia). In fact, even without a Cotega account, you can see the status of each of the data centers here. Please note that this data relates only to SQL Database and does not reflect other Azure services.

I hope that all of you will consider creating an account to help everyone build a complete picture of the health of the Azure SQL data centers. To learn more about how to create an account, please visit www.cotega.com.

Liam

How to Perform Performance & Diagnostics Analysis for SQL Azure


A fully managed service such as SQL Azure has a number of advantages, but there are also a few disadvantages. One of these is diagnosing performance issues, which can be tricky since you do not have access to the machine like you would with an on-premises SQL Server. To date, I have limited Cotega to monitoring current database issues and notifying users when they occur. More recently, however, I have added diagnostics capabilities that allow DBAs to drill down and really understand what is causing an issue. To do this, I use many of the queries outlined below, many of which were gathered from the expertise of the SQL Server and SQL Azure MVPs. If you have others that you like to use, I would love to hear from you. For more information on how Cotega does this, please see the following page on Cotega Performance & Diagnostics Analysis.

Performance and Diagnostics Queries for SQL Azure Databases

Top 10 Most CPU-Intensive Queries

SELECT TOP 10 query_stats.query_hash AS "QueryHash",
SUM(query_stats.total_worker_time) / SUM(query_stats.execution_count) AS "AvgCPUTime",
MIN(query_stats.statement_text) AS "StatementText"
FROM
(SELECT QS.*,
SUBSTRING(ST.text, (QS.statement_start_offset/2) + 1,
((CASE QS.statement_end_offset
WHEN -1 THEN DATALENGTH(ST.text)
ELSE QS.statement_end_offset END
- QS.statement_start_offset)/2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS QS
CROSS APPLY sys.dm_exec_sql_text(QS.sql_handle) AS ST) AS query_stats
GROUP BY query_stats.query_hash
ORDER BY 2 DESC

Top 10 Most Resource Intensive Queries

SELECT highest_cpu_queries.plan_handle, highest_cpu_queries.total_worker_time, q.dbid,
q.objectid, q.number, q.encrypted, q.[text]
FROM
(SELECT TOP 10 qs.plan_handle, qs.total_worker_time
FROM sys.dm_exec_query_stats qs
ORDER BY qs.total_worker_time desc) AS highest_cpu_queries
CROSS APPLY sys.dm_exec_sql_text(plan_handle) AS q
ORDER BY highest_cpu_queries.total_worker_time desc”;

Current Connection Info

SELECT e.connection_id, s.session_id, s.login_name, s.last_request_end_time, s.cpu_time
FROM sys.dm_exec_sessions s INNER JOIN sys.dm_exec_connections e ON s.session_id = e.session_id

Top 10 Longest-Running Queries That Are Blocking Other Queries

SELECT TOP 10 r.session_id, r.plan_handle, r.sql_handle, r.request_id, r.start_time, r.status,
r.command, r.database_id, r.user_id, r.wait_type, r.wait_time, r.last_wait_type,
r.wait_resource, r.total_elapsed_time, r.cpu_time, r.transaction_isolation_level, r.row_count, st.text
FROM sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) as st
WHERE r.blocking_session_id = 0
and r.session_id in
(SELECT distinct(blocking_session_id) FROM sys.dm_exec_requests)
GROUP BY r.session_id, r.plan_handle, r.sql_handle, r.request_id, r.start_time, r.status,
r.command, r.database_id, r.user_id, r.wait_type, r.wait_time, r.last_wait_type,
r.wait_resource, r.total_elapsed_time, r.cpu_time, r.transaction_isolation_level,r.row_count, st.text
ORDER BY r.total_elapsed_time desc

Row Count Aggregate Information (total rows, min rows, max rows and last rows) for Queries

SELECT qs.execution_count,
SUBSTRING(qt.text,qs.statement_start_offset/2 +1,
(CASE WHEN qs.statement_end_offset = -1
THEN LEN(CONVERT(nvarchar(max), qt.text)) * 2
ELSE qs.statement_end_offset END -
qs.statement_start_offset
)/2
) AS query_text,
qt.dbid, dbname= DB_NAME (qt.dbid), qt.objectid,
qs.total_rows, qs.last_rows, qs.min_rows, qs.max_rows
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt
WHERE qt.text like '%SELECT%'
ORDER BY qs.execution_count DESC;

Top 10 Most Expensive Queries by Logical Reads

SELECT TOP 10 SUBSTRING(qt.TEXT, (qs.statement_start_offset/2)+1,
((CASE qs.statement_end_offset
WHEN -1 THEN DATALENGTH(qt.TEXT)
ELSE qs.statement_end_offset
END - qs.statement_start_offset)/2)+1) SQLStatement,
qs.execution_count,
qs.total_logical_reads, qs.last_logical_reads,
qs.total_logical_writes, qs.last_logical_writes,
qs.total_worker_time,
qs.last_worker_time,
qs.total_elapsed_time/1000000 total_elapsed_time_in_S,
qs.last_elapsed_time/1000000 last_elapsed_time_in_S,
qs.last_execution_time,
qp.query_plan
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) qt
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) qp
ORDER BY qs.total_logical_reads DESC

Top 10 Most Expensive Queries by Logical Writes

SELECT TOP 10 SUBSTRING(qt.TEXT, (qs.statement_start_offset/2)+1,
((CASE qs.statement_end_offset
WHEN -1 THEN DATALENGTH(qt.TEXT)
ELSE qs.statement_end_offset
END - qs.statement_start_offset)/2)+1) SQLStatement,
qs.execution_count,
qs.total_logical_reads, qs.last_logical_reads,
qs.total_logical_writes, qs.last_logical_writes,
qs.total_worker_time,
qs.last_worker_time,
qs.total_elapsed_time/1000000 total_elapsed_time_in_S,
qs.last_elapsed_time/1000000 last_elapsed_time_in_S,
qs.last_execution_time,
qp.query_plan
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) qt
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) qp
ORDER BY qs.total_logical_writes DESC

Top 10 Most Expensive Queries by Worker Time

SELECT TOP 10 SUBSTRING(qt.TEXT, (qs.statement_start_offset/2)+1,
((CASE qs.statement_end_offset
WHEN -1 THEN DATALENGTH(qt.TEXT)
ELSE qs.statement_end_offset
END - qs.statement_start_offset)/2)+1) SQLStatement,
qs.execution_count,
qs.total_logical_reads, qs.last_logical_reads,
qs.total_logical_writes, qs.last_logical_writes,
qs.total_worker_time,
qs.last_worker_time,
qs.total_elapsed_time/1000000 total_elapsed_time_in_S,
qs.last_elapsed_time/1000000 last_elapsed_time_in_S,
qs.last_execution_time,
qp.query_plan
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) qt
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) qp
ORDER BY qs.total_worker_time DESC

Queries Taking Longest Elapsed Time

SELECT TOP 10
qs.total_elapsed_time / qs.execution_count / 1000000.0 AS average_seconds,
qs.total_elapsed_time / 1000000.0 AS total_seconds,
qs.execution_count,
SUBSTRING (qt.text,qs.statement_start_offset/2,
(CASE WHEN qs.statement_end_offset = -1
THEN LEN(CONVERT(NVARCHAR(MAX), qt.text)) * 2
ELSE
qs.statement_end_offset
END
- qs.statement_start_offset)/2) AS individual_query,
o.name AS object_name,
DB_NAME(qt.dbid) AS database_name
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) as qt
LEFT OUTER JOIN sys.objects o ON qt.objectid = o.object_id
where qt.dbid = DB_ID()
ORDER BY average_seconds DESC;

Queries doing the Most I/O

SELECT TOP 10
(total_logical_reads + total_logical_writes) / qs.execution_count AS average_IO,
(total_logical_reads + total_logical_writes) AS total_IO,
qs.execution_count AS execution_count,
SUBSTRING (qt.text,qs.statement_start_offset/2,
(CASE WHEN qs.statement_end_offset = -1
THEN LEN(CONVERT(NVARCHAR(MAX), qt.text)) * 2
ELSE qs.statement_end_offset END - qs.statement_start_offset)/2) AS individual_query,
o.name AS object_name,
DB_NAME(qt.dbid) AS database_name
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) as qt
LEFT OUTER JOIN sys.objects o ON qt.objectid = o.object_id
where qt.dbid = DB_ID()
ORDER BY average_IO DESC;

Calculate the Database Size (MB)

SELECT SUM(reserved_page_count) * 8192 / 1024 / 1024 as DatabaseSize FROM sys.dm_db_partition_stats
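A variation of the same DMV query (my own sketch, not from the original post) breaks the size down per table, which is often the next question once you know the total. It relies on reserved_page_count counting 8 KB pages, so pages * 8 / 1024 gives MB:

```sql
SELECT o.name AS TableName,
       SUM(p.reserved_page_count) * 8 / 1024 AS SizeMB
FROM sys.dm_db_partition_stats p
INNER JOIN sys.objects o
    ON p.object_id = o.object_id
WHERE o.is_ms_shipped = 0       -- skip system objects
GROUP BY o.name
ORDER BY SizeMB DESC;
```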

Check out the new look of Cotega – SQL Azure Monitoring Service


It has been some time since I posted an update, but the past few months have been quite busy. One of the things I have been working on is a major update to the Cotega service. I heard a lot of feedback on the v1 version of Cotega, and the biggest theme was ease of use. There are a lot of things people want to monitor for their database, and I simply made it too hard to add all of them. So I decided to step back and come up with a new way to set up your database monitoring. Now, you simply point Cotega at your database, provide an email address for notifications, and you are ready to go.

In addition, there is now a "Free" subscription that does not require a credit card and allows you to try out the service before making a commitment. The key difference between the Free plan and a paid one is that monitoring runs only once every 2 hours (as opposed to every 10 minutes for a paid account). In addition, paid accounts send email notifications when database issues occur.

[Image: Cotega dashboard]


How to use PayPal with ASP.NET MVC

I talked previously about why I chose Stripe for Cotega to accept credit card payments. In that post I talked about how excellent Stripe's support was and how easy it was to take their samples and implement them in my service. I am still extremely pleased with my choice of Stripe and have no intention of moving away from it. However, I have always wanted to investigate PayPal as an alternative payment option, because occasionally international customers have credit cards that Stripe does not accept, or they just plain prefer to use PayPal. For this reason, I decided to take some time to understand what it would take to implement PayPal in my MVC-based service.
Unfortunately, I did not find this quite as simple to implement as Stripe. Although PayPal has a number of examples, none that I could find were targeted at MVC, and there did not seem to be any good tutorials on the subject to get me started. I hope this blog post will help anyone thinking of implementing PayPal with MVC. I do want to caveat this with the fact that I am far from a PayPal expert, and I do not claim that this is the "PayPal preferred" way to implement it; I have simply tried to convert other examples as best I can, and it has worked well for me so far. If you have any comments or suggestions, I am very interested in hearing your feedback.

Step 1 – Choosing PayPal IPN vs. PDT vs. Express

One of the first things I needed to decide on was the appropriate “PayPal Responder” to use. A responder is like a callback that PayPal does when the transaction is complete (successful, unsuccessful, invalid, etc.). It can also return custom variables that you set when the user first clicks the “Pay Now” button.
Since we are using MVC, we first need to create a View that initiates the transaction to PayPal. This is where the user clicks the “Pay Now” button and is re-directed to PayPal where they enter their credit card or PayPal account information. After this is complete, PayPal can initiate a responder which calls your service with the results of the transaction. The flow of the process looks something like this:

[Image: PayPal payment process flow]

You really only need a responder when you need to continue some process after the transaction completes. For example, in my case I want to enable a user's subscription once the transaction completes. If you are simply accepting payments, there may be no need to take on the complexity of responders.
There is lots of information on responders, but I found that Instant Payment Notification (IPN) was the best choice for me, and probably for you as well if you simply want a callback from PayPal to your MVC controllers.

Step 2 – Implementing the View and Form

The next step is to create a payment page, using a View, where the user can choose to start the PayPal process. Here is an example of an HTML form that I used.
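The original form did not survive the formatting of this post, so here is a minimal sketch of a PayPal Payments Standard form along the lines described in the notes below. The business address, amount, and URLs are placeholders I have filled in for illustration:

```html
<!-- Live PayPal URL; swap in the commented action below to target the sandbox instead -->
<form action="https://www.paypal.com/cgi-bin/webscr" method="post">
<!-- <form action="https://www.sandbox.paypal.com/cgi-bin/webscr" method="post"> -->
  <input type="hidden" name="cmd" value="_xclick" />
  <!-- Must match the email of your live (or sandbox) PayPal business account -->
  <input type="hidden" name="business" value="seller@example.com" />
  <input type="hidden" name="item_name" value="Cotega Subscription" />
  <input type="hidden" name="amount" value="2.95" />
  <input type="hidden" name="currency_code" value="USD" />
  <!-- Custom value echoed back to the IPN controller in the "custom" variable -->
  <input type="hidden" name="custom" id="device-id" value="" />
  <!-- Where the user is redirected after paying; don't rely on this firing -->
  <input type="hidden" name="return" value="https://www.example.com/Payment/Complete" />
  <!-- The IPN callback URL PayPal will call when the transaction completes -->
  <input type="hidden" name="notify_url" value="https://www.example.com/Payment/IPN" />
  <input type="submit" value="Pay Now" />
</form>
```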

There are a couple of things you should notice in this form:

  • The commented out section allows me to either submit the payment request to the PayPal sandbox (test) environment or to the actual live PayPal url.
  • Notice all the input type=”text” types. These are values that are returned to my controller after the transaction is complete. When I load the page, I set the “device-id” input type to a custom value that I want PayPal to return when the transaction completes. I can use this value to update the user’s status in my SQL database (assuming the transaction is successful).
  • The “return” input type URL defines where the user will be redirected to after the transaction is complete. It is important that you don’t rely on this as a method for completing a transaction because most users will never click on this link. Don’t forget to create this page.
  • The notify_url input is the IPN URL that PayPal will call after the transaction completes.
  • Remember to associate the business input type email address with the PayPal address you used for your sandbox or live account.
  • There are a lot more custom values you can use and learn more about here.

Step 3 – Creating an IPN Controller in MVC

Next we will want to create a new ActionResult controller that will receive the responder request from PayPal after the transaction completes. I want to give credit to this page where I was able to leverage a lot of code for this controller. You should also remember to actually create an IPN view (IPN.cshtml) page because PayPal will continue to call this controller until it gets a successful page returned.

public ActionResult IPN()
{
    // Receive the IPN request from PayPal and parse all the variables returned
    var formVals = new Dictionary<string, string>();
    formVals.Add("cmd", "_notify-validate");

    // If you want to use the PayPal sandbox, change this from false to true
    string response = GetPayPalResponse(formVals, false);

    if (response == "VERIFIED")
    {
        string transactionID = Request["txn_id"];
        string sAmountPaid = Request["mc_gross"];
        string deviceID = Request["custom"];

        // Validate the order
        Decimal amountPaid = 0;
        Decimal.TryParse(sAmountPaid, out amountPaid);

        if (sAmountPaid == "2.95")
        {
            // Take the information returned and store it in a subscription table.
            // This is where you would update your database with the details of the transaction.
            return View();
        }
        else
        {
            // Let it fail - this is the IPN, so there is no viewer.
            // You may want to log something here.
        }
    }
    return View();
}

string GetPayPalResponse(Dictionary<string, string> formVals, bool useSandbox)
{
    // Choose whether to use the sandbox or live environment
    string paypalUrl = useSandbox ? "https://www.sandbox.paypal.com/cgi-bin/webscr"
                                  : "https://www.paypal.com/cgi-bin/webscr";

    HttpWebRequest req = (HttpWebRequest)WebRequest.Create(paypalUrl);

    // Set values for the request back
    req.Method = "POST";
    req.ContentType = "application/x-www-form-urlencoded";

    // Echo back all the variables PayPal sent, plus the _notify-validate command
    byte[] param = Request.BinaryRead(Request.ContentLength);
    string strRequest = Encoding.ASCII.GetString(param);
    StringBuilder sb = new StringBuilder();
    sb.Append(strRequest);
    foreach (string key in formVals.Keys)
    {
        sb.AppendFormat("&{0}={1}", key, formVals[key]);
    }
    strRequest = sb.ToString();
    req.ContentLength = strRequest.Length;

    // For proxy:
    // WebProxy proxy = new WebProxy(new Uri("http://urlort#"));
    // req.Proxy = proxy;

    // Send the request to PayPal and get the response
    string response = "";
    using (StreamWriter streamOut = new StreamWriter(req.GetRequestStream(), System.Text.Encoding.ASCII))
    {
        streamOut.Write(strRequest);
        streamOut.Close();
        using (StreamReader streamIn = new StreamReader(req.GetResponse().GetResponseStream()))
        {
            response = streamIn.ReadToEnd();
        }
    }
    return response;
}

Step 4 – Enabling Sandbox and IPN

I highly recommend that you first test this with the PayPal sandbox. You need to sign up for a sandbox account, separate from your live PayPal account, on the developer page here.
For sandbox accounts, you do not need to enable IPN, but for live accounts you do. Once you are ready to switch to live, log in to your PayPal account and go to Profile | More Options | My Selling Tools | Choose Update for “Instant Payment Notifications” and then enable IPN and set the URL that you wish PayPal to call (which is the IPN controller we created above).

Summary

At this point you should be ready to go. One of the real pains of this system is that there is no easy way to debug your IPN controller if you are running the MVC app on a remote machine that does not have Visual Studio installed. For this reason, I chose to log each of the steps to a “Log” table that I created in my database. That way I could log all the variables that PayPal returned and still get an idea of what was happening or determine if there were any issues.
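A minimal version of that logging approach (the table name and columns here are my own sketch, not from the original post) might look like:

```sql
CREATE TABLE [dbo].[Log] (
    LogID   int IDENTITY(1,1) PRIMARY KEY,
    LogTime datetime NOT NULL DEFAULT GETUTCDATE(),
    Step    nvarchar(100),   -- e.g. 'IPN received', 'VERIFIED', 'Amount mismatch'
    Details nvarchar(max)    -- raw variables returned by PayPal
);

-- Inside the IPN controller, write one row per step, e.g.:
-- INSERT INTO dbo.Log (Step, Details) VALUES ('IPN received', @rawFormBody);
```

Querying this table after a sandbox transaction lets you see exactly which steps ran and what PayPal sent, without needing a debugger attached to the remote machine.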
I am still testing this and have not yet implemented this into my Cotega service for monitoring SQL Azure databases. For now, if you would like to see this working I am using it in a Windows Phone Baby Monitor app that I wrote.
If you have any suggestions or problems I look forward to hearing from you.