How to tell if your Azure SQL Stored Procedure is Running

One of the capabilities of the Cotega service is the ability to schedule execution of stored procedures in Azure SQL.  This is handy for many administrators because Azure SQL does not have a SQL Agent capability.  One interesting side effect I have seen is that long-running stored procedures can cause issues when they are scheduled to run frequently.  If the previous execution of the stored procedure has not completed before the next one starts, you can run into locking issues, or in extreme cases the executions pile up and eventually cause connection drops when the maximum request count is reached.

One solution that I have come up with is to add code to the start of a stored procedure that detects whether it is already running.  If it is, the procedure simply returns.  Interestingly, I could not find much information on the internet about how to detect whether an Azure SQL stored procedure is running, so I decided to build a solution myself.  This is what I came up with; it returns a count of the number of times a specific stored procedure is currently running:

CREATE FUNCTION [dbo].[CheckStoredProcActive] (@StoredProc nvarchar(255))
RETURNS int
AS
BEGIN
    DECLARE @spCount int;
    SET @spCount =
        (SELECT COUNT(*)
         FROM sys.dm_exec_sessions s
         INNER JOIN sys.dm_exec_requests r
             ON r.session_id = s.session_id
         CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS SQL
         INNER JOIN sys.objects o
             ON SQL.objectid = o.object_id
         WHERE s.is_user_process = 1
           AND r.database_id = DB_ID()
           AND o.name = @StoredProc);
    RETURN @spCount;
END;

Then you can simply add the following to the start of your procedure, replacing [ENTER_STORED_PROC_NAME] with the name of your stored procedure:

if (dbo.CheckStoredProcActive('[ENTER_STORED_PROC_NAME]') > 1) -- the count includes this execution
   return;
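Putting it together, a guarded procedure looks something like this (a sketch only; MyFrequentJob and its body are placeholder names):

-- Hypothetical scheduled procedure that skips its work
-- if a previous execution is still active.
CREATE PROCEDURE [dbo].[MyFrequentJob]
AS
BEGIN
    -- The count includes this execution, so anything above 1
    -- means an earlier run is still in progress.
    IF (dbo.CheckStoredProcActive('MyFrequentJob') > 1)
        RETURN;

    -- ... the actual work of the procedure goes here ...
END;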
I hope others find this helpful.
Liam

How to tell if your Azure SQL database performance issues are data center related

I am the founder of a startup called Cotega and also a Microsoft employee where I work as a Program Manager. This is a series of posts where I talk about my experience building a startup outside of Microsoft. I do my best to take my Microsoft hat off and tell both the good parts and the bad parts I experienced using Azure.

Working on a monitoring service for Azure SQL Database allows me to get a really good idea of the common issues DBAs have when working with their databases. If I had to choose the most common frustration I hear from DBAs, it is this: is a performance issue related only to their database, or to the entire data center where their database is hosted? It has been a goal of mine from the very early days of Cotega to help solve this problem. Today I am happy to announce that within Cotega you can now see your performance data overlaid with that of the data center. As you can see below, here is an example of connection latency measured from outside the Azure data centers. It shows the connection latency of my database (in blue) along with the average latency of all other customers in this data center (in orange).

[Chart: connection latency of my database (blue) vs. the data center average (orange)]

This has taken me some time to build because, to get a true picture of the data center you are in, I needed enough customers in each data center to compute an aggregated average of all the monitoring data gathered from these databases without exposing any one customer's performance data.

As of right now, I have a really good picture of all the major Azure data centers (with the exception of East Asia). In fact, even without a Cotega account, you can see the status of each of the data centers here. Please note that this data relates only to SQL Database and does not reflect other Azure services.

I hope that all of you will consider creating an account to help everyone build an even more complete picture of the health of the SQL Azure data centers. To learn more about how to create an account, please visit www.cotega.com.

Liam

How to Perform Performance & Diagnostics Analysis for SQL Azure


A fully managed service such as SQL Azure has a number of advantages, but there are also a few disadvantages. One of these is diagnosing performance issues. This can be tricky since you do not have access to the machine like you would with an on-premises SQL Server. To date, I have limited Cotega to monitoring current database issues and notifying users when issues occur. More recently, however, I have added diagnostics capabilities to allow DBAs to drill down and really understand what is causing an issue. To do this, I use many of the queries outlined below, several of which were gathered from the expertise of the SQL Server and SQL Azure MVPs. If you have others that you like to use, I would love to hear from you. For more information on how Cotega does this, please see the page on Cotega Performance & Diagnostics Analysis.

Performance and Diagnostics Queries for SQL Azure Databases

Top 10 Most CPU intensive queries

SELECT TOP 10 query_stats.query_hash AS "QueryHash",
SUM(query_stats.total_worker_time) / SUM(query_stats.execution_count) AS "AvgCPUTime",
MIN(query_stats.statement_text) AS "StatementText"
FROM
(SELECT QS.*,
SUBSTRING(ST.text, (QS.statement_start_offset/2) + 1,
((CASE statement_end_offset
WHEN -1 THEN DATALENGTH(ST.text)
ELSE QS.statement_end_offset END
- QS.statement_start_offset)/2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS QS
CROSS APPLY sys.dm_exec_sql_text(QS.sql_handle) AS ST) AS query_stats
GROUP BY query_stats.query_hash
ORDER BY 2 DESC

Top 10 Most Resource Intensive Queries

SELECT highest_cpu_queries.plan_handle, highest_cpu_queries.total_worker_time, q.dbid,
q.objectid, q.number, q.encrypted, q.[text]
FROM
(SELECT TOP 10 qs.plan_handle, qs.total_worker_time
FROM sys.dm_exec_query_stats qs
ORDER BY qs.total_worker_time desc) AS highest_cpu_queries
CROSS APPLY sys.dm_exec_sql_text(plan_handle) AS q
ORDER BY highest_cpu_queries.total_worker_time DESC

Current Connection Info

SELECT e.connection_id, s.session_id, s.login_name, s.last_request_end_time, s.cpu_time
FROM sys.dm_exec_sessions s INNER JOIN sys.dm_exec_connections e ON s.session_id = e.session_id

Top 10 Running Queries with the Longest Total Elapsed Time that are Blocking Other Queries

SELECT TOP 10 r.session_id, r.plan_handle, r.sql_handle, r.request_id, r.start_time, r.status,
r.command, r.database_id, r.user_id, r.wait_type, r.wait_time, r.last_wait_type,
r.wait_resource, r.total_elapsed_time, r.cpu_time, r.transaction_isolation_level, r.row_count, st.text
FROM sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) as st
WHERE r.blocking_session_id = 0
and r.session_id in
(SELECT distinct(blocking_session_id) FROM sys.dm_exec_requests)
GROUP BY r.session_id, r.plan_handle, r.sql_handle, r.request_id, r.start_time, r.status,
r.command, r.database_id, r.user_id, r.wait_type, r.wait_time, r.last_wait_type,
r.wait_resource, r.total_elapsed_time, r.cpu_time, r.transaction_isolation_level,r.row_count, st.text
ORDER BY r.total_elapsed_time desc

Row Count Aggregate Information (total rows, min rows, max rows and last rows) for Queries

SELECT qs.execution_count,
SUBSTRING(qt.text,qs.statement_start_offset/2 +1,
(CASE WHEN qs.statement_end_offset = -1
THEN LEN(CONVERT(nvarchar(max), qt.text)) * 2
ELSE qs.statement_end_offset END -
qs.statement_start_offset
)/2
) AS query_text,
qt.dbid, dbname= DB_NAME (qt.dbid), qt.objectid,
qs.total_rows, qs.last_rows, qs.min_rows, qs.max_rows
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt
WHERE qt.text LIKE '%SELECT%'
ORDER BY qs.execution_count DESC;

Top 10 Most Expensive Queries by Logical Reads

SELECT TOP 10 SUBSTRING(qt.TEXT, (qs.statement_start_offset/2)+1,
((CASE qs.statement_end_offset
WHEN -1 THEN DATALENGTH(qt.TEXT)
ELSE qs.statement_end_offset
END - qs.statement_start_offset)/2)+1) SQLStatement,
qs.execution_count,
qs.total_logical_reads, qs.last_logical_reads,
qs.total_logical_writes, qs.last_logical_writes,
qs.total_worker_time,
qs.last_worker_time,
qs.total_elapsed_time/1000000 total_elapsed_time_in_S,
qs.last_elapsed_time/1000000 last_elapsed_time_in_S,
qs.last_execution_time,
qp.query_plan
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) qt
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) qp
ORDER BY qs.total_logical_reads DESC

Top 10 Most Expensive Queries by Logical Writes

SELECT TOP 10 SUBSTRING(qt.TEXT, (qs.statement_start_offset/2)+1,
((CASE qs.statement_end_offset
WHEN -1 THEN DATALENGTH(qt.TEXT)
ELSE qs.statement_end_offset
END - qs.statement_start_offset)/2)+1) SQLStatement,
qs.execution_count,
qs.total_logical_reads, qs.last_logical_reads,
qs.total_logical_writes, qs.last_logical_writes,
qs.total_worker_time,
qs.last_worker_time,
qs.total_elapsed_time/1000000 total_elapsed_time_in_S,
qs.last_elapsed_time/1000000 last_elapsed_time_in_S,
qs.last_execution_time,
qp.query_plan
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) qt
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) qp
ORDER BY qs.total_logical_writes DESC

Top 10 Most Expensive Queries by Worker Time

SELECT TOP 10 SUBSTRING(qt.TEXT, (qs.statement_start_offset/2)+1,
((CASE qs.statement_end_offset
WHEN -1 THEN DATALENGTH(qt.TEXT)
ELSE qs.statement_end_offset
END - qs.statement_start_offset)/2)+1) SQLStatement,
qs.execution_count,
qs.total_logical_reads, qs.last_logical_reads,
qs.total_logical_writes, qs.last_logical_writes,
qs.total_worker_time,
qs.last_worker_time,
qs.total_elapsed_time/1000000 total_elapsed_time_in_S,
qs.last_elapsed_time/1000000 last_elapsed_time_in_S,
qs.last_execution_time,
qp.query_plan
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) qt
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) qp
ORDER BY qs.total_worker_time DESC

Queries Taking Longest Elapsed Time

SELECT TOP 10
qs.total_elapsed_time / qs.execution_count / 1000000.0 AS average_seconds,
qs.total_elapsed_time / 1000000.0 AS total_seconds,
qs.execution_count,
SUBSTRING (qt.text, qs.statement_start_offset/2 + 1,
(CASE WHEN qs.statement_end_offset = -1
THEN LEN(CONVERT(NVARCHAR(MAX), qt.text)) * 2
ELSE
qs.statement_end_offset
END
- qs.statement_start_offset)/2) AS individual_query,
o.name AS object_name,
DB_NAME(qt.dbid) AS database_name
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) as qt
LEFT OUTER JOIN sys.objects o ON qt.objectid = o.object_id
where qt.dbid = DB_ID()
ORDER BY average_seconds DESC;

Queries doing the Most I/O

SELECT TOP 10
(total_logical_reads + total_logical_writes) / qs.execution_count AS average_IO,
(total_logical_reads + total_logical_writes) AS total_IO,
qs.execution_count AS execution_count,
SUBSTRING (qt.text, qs.statement_start_offset/2 + 1,
(CASE WHEN qs.statement_end_offset = -1
THEN LEN(CONVERT(NVARCHAR(MAX), qt.text)) * 2
ELSE qs.statement_end_offset END - qs.statement_start_offset)/2) AS individual_query,
o.name AS object_name,
DB_NAME(qt.dbid) AS database_name
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) as qt
LEFT OUTER JOIN sys.objects o ON qt.objectid = o.object_id
where qt.dbid = DB_ID()
ORDER BY average_IO DESC;

Calculate the Database Size (MB)

SELECT SUM(reserved_page_count) * 8192 / 1024 / 1024 as DatabaseSize FROM sys.dm_db_partition_stats
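If the total looks larger than expected, the same DMV can break the number down by table (a sketch; it simply aggregates reserved_page_count per object, and 8.0 KB pages / 1024 gives MB):

-- Approximate reserved space per table, largest first.
SELECT o.name AS TableName,
       SUM(p.reserved_page_count) * 8.0 / 1024 AS ReservedMB
FROM sys.dm_db_partition_stats p
INNER JOIN sys.objects o ON p.object_id = o.object_id
WHERE o.is_ms_shipped = 0
GROUP BY o.name
ORDER BY ReservedMB DESC;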

Check out the new look of Cotega – SQL Azure Monitoring Service


It has been some time since I have posted an update, but the past few months have been quite busy. One of the things I have been working on is a major update to the Cotega service. I heard a lot of feedback on the v1 version of Cotega, and the biggest theme was ease of use. There are a lot of things that people want to monitor for their database, and I simply made it too hard to add all of them. So I decided to step back and come up with a new way to set up database monitoring. Now, you simply point Cotega at your database, provide an email address for notifications, and you are ready to go.

In addition, there is now a “Free” subscription that does not require a credit card and allows you to try out the service before making a commitment. The key difference between the Free plan and a paid one is that monitoring is only done once every 2 hours (as opposed to every 10 minutes for a paid account). In addition, paid accounts will send email notifications when database issues occur.

[Screenshot: the new Cotega dashboard]


How to Host a Website on Azure and Link it to a Domain like .IO

.COM addresses are becoming harder and harder to come by. It seems each time you look, it is harder to get the name you want. Many startups have become creative in naming their sites; for example, if you wanted to create a site called score.com, you might try scorable.com or scorability.com instead. Another option that seems to be getting more and more popular is to use non-.COM domains such as instant.ly or ordr.in or ginger.io. Many of these domains are easier to get, but the downside is that they are typically more expensive and the process of configuring them is often harder. I want to talk about how I managed to set up a URL on the .IO domain for a site hosted on Windows Azure (where all the HTML pages are hosted). I am going to assume you already know how to create a web site on Windows Azure.

Getting the Domain Name

In the past, I have used GoDaddy as the place where I set up most of my domains. For .COMs this is incredibly cheap (most of the time you can find a promo code for just about anything), and I have found them to be extremely supportive and easy to work with. However, in this case I wanted a .IO domain, which is actually an Indian Ocean domain, because the .COM was owned by someone else trying to sell it. To do this, I used http://nic.io, where it cost ~$93 for the year (as opposed to ~$8 for a .COM domain).

DNS Server Configuration

After I purchased the .io domain from nic.io, I learned that I needed to provide them with DNS settings, specifically a primary and a secondary DNS server. This was kind of frustrating, since GoDaddy handled this for me whenever I created a .COM address. After looking a little more, I learned that I could just use GoDaddy for this too. In my case, I could log into my existing GoDaddy account and configure it at no extra charge. You could either look at another DNS server provider or create an account with GoDaddy to do this.

GoDaddy DNS Configuration

Within my GoDaddy account, I launched the DNS Manager and chose to create a new Offsite Domain. This option was pretty hidden, which is why I am adding a screenshot of where I went.

[Screenshot: creating an Offsite Domain in the GoDaddy DNS Manager]

In the Domain Name text box I entered my .IO domain name, clicked Next, and copied the two DNS servers that were provided.  Keep these handy, as you will need them in a minute.

Create an A Record

The next step is to configure the GoDaddy DNS servers so that when a request for the .io domain comes in, they know where to send the user.  In my case that is a Windows Azure website with a .cloudapp.net name.  To do this, in the GoDaddy DNS Dashboard click “Edit Zone” under the domain you just added.  In the A (Host) section, choose “Add Record”; for “Host” use @, and for “Points To” add the IP address of your Azure-hosted site.

Choose Save

[Screenshot: adding an A record in the GoDaddy zone editor]

Add DNS Servers to NIC.io

The final step is to configure NIC.io so that it knows to use the GoDaddy domain servers you just configured. Log in to the admin panel for NIC.io and choose to Manage the DNS Settings. Under “DNS Servers or Mail & Web Forwarding Details”, enter the first DNS server you copied in the previous steps as the primary server and the second as the secondary server. If you only have one DNS server, that is probably OK. At the bottom, choose “Modify Domain”.

Give it an hour or two

It usually takes an hour or two for these configurations to fully propagate. After that, you should be all set.
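To check whether the records have propagated, you can query the name servers directly (a sketch; mydomain.io and the ns1.domaincontrol.com name server are placeholder values — use your own domain and the servers GoDaddy gave you):

# Ask the GoDaddy name server for the A record directly
nslookup mydomain.io ns1.domaincontrol.com

# Ask your default resolver to see what the rest of the world gets
nslookup mydomain.io

Once both return the IP address you entered in the A record, the domain is live.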

Using Cloud Services to send Notifications with Windows Phone

I am pretty excited because this week one of my other personal side projects, the “Windows Phone Baby Monitor”, just won the “2012 Appies People's Choice Award”, an internal contest for Microsoft colleagues. What I would like to do in this post is give you some background on how I managed to implement SMS, phone, and email notifications from Windows Phone for this app.

[Screenshot: the Windows Phone Baby Monitor app]

Background

This application basically turns your Windows Phone into a baby monitor. You leave the phone near your baby while they sleep, and when it detects crying it sends a notification by phone, email, or SMS. There are a number of things I do within the application, such as setting the decibel level and the length of time the crying must last, to avoid false notifications from things like doors slamming.

Sending Notifications with Windows Phone APIs

When I first started building the application, I figured this would be very simple to implement because I could just call some Windows Phone API to send the SMS, email, or phone notification. Although there is an API for doing this, the main issue I had was that the user needs to physically tap an approve button for the notification to complete. Since I figured it was somewhat unrealistic to expect a baby to sit up, move to the phone, and tap a button when it was done sleeping, I needed to come up with another solution.

Using Services to Send Notifications

From my work with Cotega (a monitoring service for SQL Azure databases), I had a little background in sending notifications. For that service, I used Amazon's SES to send email notifications. For the Baby Monitor, I chose a different service, SendGrid, to send email notifications. I picked it over Amazon SES because I wanted to compare the two and also because there was a great offer of 25,000 free emails per month for Azure services. Since SendGrid does not support sending SMS or making phone calls, for that part I chose Twilio.

SendGrid

SendGrid was incredibly easy to implement within my MVC service. The way it works with the Baby Monitor is that when the phone detects crying, it makes a WebClient request to my MVC service, passing details like the Windows Phone device ID, the decibel level, and the email address to be contacted. From that point the service makes a SendGrid request to send the email. As you can see, the controller code is pretty simple and looks like this.

using SendGridMail;
using SendGridMail.Transport;

// Create the email object first, then add the properties.
SendGrid myMessage = SendGrid.GenerateInstance();
myMessage.AddTo(emailAddress);
myMessage.From = new MailAddress("[your_from_address]", "Cotega Baby Monitor");
myMessage.Subject = "Baby Monitor Alert!";
myMessage.Text = "Crying was detected by the Baby Monitor.  Decibel Level: " + maxDecibels + ".  ";
 
// Create credentials, specifying your user name and password.
var credentials = new NetworkCredential("[myusername]", "[MyPassword]");
 
// Create a REST transport for sending email.
var transportREST = REST.GetInstance(credentials);
 
// Send the email.
transportREST.Deliver(myMessage);

Twilio for Sending SMS and Phone Notifications

This part was a little more complex because, although sending notifications through Twilio is pretty cheap within North America, I still needed a mechanism to keep people from sending unlimited notifications. For this, I implemented a token-based system linked to the device ID of the phone. When users download the app, I give them a certain number of tokens for sending SMS and making phone calls (emails are free); if they wish to purchase more, they can, and the tokens are applied to their account. There are some really cool things about Twilio, such as:

  • Text to voice for phone calls: This allows me to send some text to the Twilio service, such as “Your baby is crying”, and when Twilio calls the person’s phone, the person hears a computer-like voice saying “Your baby is crying”.
  • Attaching audio files to phone calls: When I call the Twilio service, I can pass it a link to a URL containing an audio file. When Twilio makes the call, the audio file can be played over the phone. This is really useful because I can attach a snippet of audio of the baby crying, letting the parent confirm that it is really their baby and not a jackhammer in the background or something else.

Just like with SendGrid, the code in the MVC controller is really very simple using the TwiML API. Here is what it looks like for sending SMS messages.

using Twilio;
using Twilio.TwiML;
//twilioAccountSID and twilioAuthToken values are stored in the web.config (hopefully encrypted)
public string twilioAccountSID = ConfigurationManager.AppSettings["twilioAccountSID"];
public string twilioAuthToken = ConfigurationManager.AppSettings["twilioAuthToken"];

//This is the controller code that sends the SMS message

var twilio = new TwilioRestClient(twilioAccountSID, twilioAuthToken);
string smsMessageBody = "Baby Monitor Alert! Crying detected at decibel Level: " + maxDecibels + ". ";
if (fileName != null)
smsMessageBody += "http://cotega.com/audio/" + fileName + ".wav";
// if phone number is 10 digits I need to add the +1
if (phoneNumber.Length == 10)
phoneNumber = "+1" + phoneNumber;
var msg = twilio.SendSmsMessage("+1[mytwilionumber]", phoneNumber, smsMessageBody);

There is a bit of code, not shown, before and after this that first checks a SQL Azure database to see whether the user has enough tokens to send a notification, and then updates their token count after the message is sent.
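That token check can be as simple as something like the following sketch (the Devices table, its columns, and the connection string are placeholder assumptions, not the actual Cotega schema):

// Sketch only: table and column names are assumptions.
int tokensLeft = 0;
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("SELECT Tokens FROM Devices WHERE DeviceId = @id", conn))
{
    cmd.Parameters.AddWithValue("@id", deviceId);
    conn.Open();
    object result = cmd.ExecuteScalar();
    tokensLeft = (result == null) ? 0 : (int)result;
}
if (tokensLeft <= 0)
    return;   // not enough tokens; skip the notification

// ... send the SMS or place the call, then decrement the token count ...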

The code for making the phone call is very similar.

// Create an instance of the Twilio client.
TwilioRestClient client;
client = new TwilioRestClient(twilioAccountSID, twilioAuthToken);

// Use the Twilio-provided site for the TwiML response.
String Url = "http://twimlets.com/message";
Url = Url + "?Message%5B0%5D=" + "Hello.%20Crying%20was%20detected%20by%20the%20baby%20monitor.";
if (fileName != null)
    Url += "&Message%5B1%5D=" + "http://www.cotega.com/audio/" + fileName + ".wav";

// Instantiate the call options that are passed to the outbound call
CallOptions options = new CallOptions();

// Set the call From, To, and URL values to use for the call.
// This sample uses the sandbox number provided by
// Twilio to make the call.
options.From = "+1[my twilio number]";
options.To = "+1" + phoneNumber;
options.Url = Url;

// Make the call.
Call call = client.InitiateOutboundCall(options);

Notice how I can attach a string of text along with a url to http://twimlets.com/message. For example, try clicking this link and see the XML that is created. If you passed this to Twilio it would make a phone call and say “Hello from Liam”. Notice also, how I attached a link to a WAV file. Twilio will take that audio and play it when the person answers the phone. Very cool right?

As I mentioned, I did not go into many details of how I linked the Windows Phone device ID to the service to allow me to verify they have enough tokens before making the call, but if you are interested in this, let me know and I can help you get going.

Liam

How to use PayPal with ASP.NET MVC

I talked previously about why I chose to use Stripe with Cotega to accept credit card payments. In that post I talked about how excellent their support was and how easy it was to take their samples and implement them in my service. I am still extremely pleased with my choice of Stripe and have no intention of moving away from it. However, I have always wanted to investigate PayPal as an alternative payment option, because occasionally international customers have credit cards that are not accepted by Stripe, or they just plain prefer to use PayPal. For this reason, I decided to take some time to understand what it would take to implement PayPal in my MVC-based service.
Unfortunately, I did not find this quite as simple to implement as Stripe. Although PayPal has a number of examples, none of the ones I could find were targeted at MVC, and there did not seem to be any good tutorials on the subject to get me started. I hope this blog post will help anyone thinking of implementing PayPal with MVC. I do want to caveat this with the fact that I am far from a PayPal expert, and I do not claim that this is the “PayPal preferred” way to implement it, other than to say that I have tried to convert other examples as best I can and it has worked well for me so far. If you have any comments or suggestions, I am very interested to hear your feedback.

Step 1 – Choosing PayPal IPN vs. PDT vs. Express

One of the first things I needed to decide on was the appropriate “PayPal Responder” to use. A responder is like a callback that PayPal does when the transaction is complete (successful, unsuccessful, invalid, etc.). It can also return custom variables that you set when the user first clicks the “Pay Now” button.
Since we are using MVC, we first need to create a View that initiates the transaction to PayPal. This is where the user clicks the “Pay Now” button and is redirected to PayPal, where they enter their credit card or PayPal account information. After this is complete, PayPal can initiate a responder, which calls your service with the results of the transaction. The flow of the process looks something like this:

[Diagram: the PayPal payment and responder (IPN) flow]

You really only need to use a responder in the cases where you need to continue with some process after that transaction completes. For example, in my case I want to be able to enable a user subscription once the transaction completes. If you are simply accepting payments, there may be no need to go to the complexity of using responders.
There is lots of information on responders, but I found that Instant Payment Notification (IPN) was the best choice for me, and probably for you as well if you simply want a callback from PayPal to your MVC controllers.

Step 2 – Implementing the View and Form

The next step is to create a payment page, using a View, where the user can choose to start the PayPal process. Here is an example of the HTML form that I used.
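A minimal version of such a form might look like this (a sketch reconstructed from the notes below; the email address, URLs, item name, and amount are all placeholders you must replace with your own values):

<!-- Sketch of the payment form; all values are placeholders. -->
<!-- Live URL; swap in https://www.sandbox.paypal.com/cgi-bin/webscr for testing. -->
<form action="https://www.paypal.com/cgi-bin/webscr" method="post">
  <input type="hidden" name="cmd" value="_xclick" />
  <!-- Must match the email on your PayPal (or sandbox) account -->
  <input type="hidden" name="business" value="you@example.com" />
  <input type="hidden" name="item_name" value="Subscription" />
  <input type="hidden" name="amount" value="2.95" />
  <input type="hidden" name="currency_code" value="USD" />
  <!-- Custom value returned to the IPN controller (set when the page loads) -->
  <input type="hidden" name="custom" id="device-id" value="" />
  <!-- Where the user is redirected after paying; don't rely on it -->
  <input type="hidden" name="return" value="https://example.com/Payment/Complete" />
  <!-- The IPN URL that PayPal calls after the transaction completes -->
  <input type="hidden" name="notify_url" value="https://example.com/Payment/IPN" />
  <input type="submit" value="Pay Now" />
</form>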

There are a couple of things you should notice in this form:

  • The commented out section allows me to either submit the payment request to the PayPal sandbox (test) environment or to the actual live PayPal url.
  • Notice all the input type=”text” types. These are values that are returned to my controller after the transaction is complete. When I load the page, I set the “device-id” input type to a custom value that I want PayPal to return when the transaction completes. I can use this value to update the user’s status in my SQL database (assuming the transaction is successful).
  • The “return” input type URL defines where the user will be redirected to after the transaction is complete. It is important that you don’t rely on this as a method for completing a transaction because most users will never click on this link. Don’t forget to create this page.
  • The notify_url input is the IPN URL that PayPal will call after the transaction completes.
  • Remember to associate the business input type email address with the PayPal address you used for your sandbox or live account
  • There are a lot more custom values you can use and learn more about here.

Step 3 – Creating an IPN Controller in MVC

Next we will want to create a new ActionResult controller that will receive the responder request from PayPal after the transaction completes. I want to give credit to this page, where I was able to leverage a lot of code for this controller. You should also remember to actually create an IPN view (IPN.cshtml), because PayPal will continue to call this controller until it gets a successful page returned.

public ActionResult IPN()
{
    // Receive the IPN request from PayPal and parse all the variables returned
    var formVals = new Dictionary<string, string>();
    formVals.Add("cmd", "_notify-validate");

    // if you want to use the PayPal sandbox change this from false to true
    string response = GetPayPalResponse(formVals, false);

    if (response == "VERIFIED")
    {
        string transactionID = Request["txn_id"];
        string sAmountPaid = Request["mc_gross"];
        string deviceID = Request["custom"];

        // validate the order
        Decimal amountPaid = 0;
        Decimal.TryParse(sAmountPaid, out amountPaid);
        if (sAmountPaid == "2.95")
        {
            // take the information returned and store this into a subscription table
            // this is where you would update your database with the details of the tran
            return View();
        }
        else
        {
            // let fail - this is the IPN so there is no viewer
            // you may want to log something here
        }
    }
    return View();
}

string GetPayPalResponse(Dictionary<string, string> formVals, bool useSandbox)
{
    // Choose whether to use the sandbox or the live environment
    string paypalUrl = useSandbox ? "https://www.sandbox.paypal.com/cgi-bin/webscr"
                                  : "https://www.paypal.com/cgi-bin/webscr";

    HttpWebRequest req = (HttpWebRequest)WebRequest.Create(paypalUrl);

    // Set values for the request back
    req.Method = "POST";
    req.ContentType = "application/x-www-form-urlencoded";

    byte[] param = Request.BinaryRead(Request.ContentLength);
    string strRequest = Encoding.ASCII.GetString(param);

    // Append the validation command (and any other values) to the original variables
    StringBuilder sb = new StringBuilder(strRequest);
    foreach (string key in formVals.Keys)
    {
        sb.AppendFormat("&{0}={1}", key, formVals[key]);
    }
    strRequest = sb.ToString();
    req.ContentLength = strRequest.Length;

    // for proxy
    // WebProxy proxy = new WebProxy(new Uri("http://urlort#"));
    // req.Proxy = proxy;

    // Send the request to PayPal and get the response
    string response = "";
    using (StreamWriter streamOut = new StreamWriter(req.GetRequestStream(), System.Text.Encoding.ASCII))
    {
        streamOut.Write(strRequest);
        streamOut.Close();
        using (StreamReader streamIn = new StreamReader(req.GetResponse().GetResponseStream()))
        {
            response = streamIn.ReadToEnd();
        }
    }
    return response;
}

Step 4 – Enabling Sandbox and IPN

I highly recommend that you first test this with the PayPal sandbox. You need to sign up for a sandbox account, separate from your live PayPal account, from the developer page here.
For sandbox accounts you do not need to enable IPN, but for live accounts you do. Once you are ready to switch to live, log in to your PayPal account, go to Profile | More Options | My Selling Tools, choose Update for “Instant Payment Notifications”, then enable IPN and set the URL that you wish PayPal to call (which is the IPN controller we created above).

Summary

At this point you should be ready to go. One of the real pains of this system is that there is no easy way to debug your IPN controller if you are running the MVC app on a remote machine that does not have Visual Studio installed. For this reason, I chose to log each of the steps to a “Log” table that I created in my database. That way I could log all the variables that PayPal returned and still get an idea of what was happening or determine if there were any issues.
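For reference, the logging I describe is nothing fancy; something along these lines works, assuming a Log table with Message and CreatedAt columns (the table, columns, and connection string name are placeholder assumptions):

// Sketch only: table name, columns, and connection string are placeholder assumptions.
void LogIPN(string message)
{
    using (var conn = new SqlConnection(
        ConfigurationManager.ConnectionStrings["DefaultConnection"].ConnectionString))
    using (var cmd = new SqlCommand(
        "INSERT INTO [Log] (Message, CreatedAt) VALUES (@msg, GETUTCDATE())", conn))
    {
        cmd.Parameters.AddWithValue("@msg", message);
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}

// Example: record every variable PayPal posted back to the IPN controller
// LogIPN("IPN received: " + Request.Form.ToString());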
I am still testing this and have not yet implemented it in my Cotega service for monitoring SQL Azure databases. For now, if you would like to see it working, I am using it in the Windows Phone Baby Monitor app that I wrote.
If you have any suggestions or problems I look forward to hearing from you.