Chris Umbel

ADO.Net Data Services with jQuery

Lately I've been spending quite a bit of time messing around with ADO.Net Data Services. Thus far I'm pleased with it. Considering it sits nicely on top of EF models, and I've been using EF for nearly all of my personal projects of late, I've been able to gain a service layer with almost no work.

If there's one thing I'm not passionate about, however, it's web interface work. I'm very glad technologies like jQuery exist, but I'm ecstatic that it's other people's jobs to work with them. Still, I have no choice but to do some web work on occasion. Whether it's for simple administrative tools at the office or even the development of this very website, sometimes I simply must write web interface code.

With both of those concepts on the brain I decided to take a stab at consuming an ADO.Net data service from jQuery for a personal project I'm working on. The result? Easy as falling off a log.

To illustrate with an example we have to start out with a web project containing an entity model. For those of you who are unfamiliar with generating models, it's outlined here. In this case I'll just create a simple, single-table model that will house the names of hockey players. Note that this will require downloading the jQuery and JSON JavaScript libraries and including them in the project.

There isn't too much to do to get our service to work with our model. Just change the parent class of our service to employ our entity model as the constructed type and set the access rules. Because this is just an example we'll allow the service to modify Player entities without restriction.

public class HockeyDataService : DataService<HockeyEntities>
{
    public static void InitializeService(
        IDataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("Players", EntitySetRights.All);
    }
}

That's it, our service is ready. Run your web project and it's good to go. The time has come to consume the service from client-side javascript on a web page. Here's a basic layout for a button to retrieve our list of players, provide a place to put the list, and a form to create a new player.

<button onclick="return getPlayers();">Get Players</button>
        
<div id="_players" style="width:400px;height:100px;">
</div>
        
<div id="_edit" style="position:absolute;width:400px;height:100px;
            left:200px;top:10px;">
        <input type="text" id="_firstNameTextBox" />
        <input type="text" id="_lastNameTextBox" />
        <button onclick="return addPlayer($('#_firstNameTextBox').val(), 
            $('#_lastNameTextBox').val());">Add</button>
</div>

Now for the magic. We're just going to use simple jQuery AJAX calls to our service, with the HTTP verb GET to retrieve our players and POST to save a player, as outlined in the following script. I'm not going to cover it here, but the PUT and MERGE verbs can be used for updating while DELETE is used for, you guessed it, deleting.

<script>
function getPlayers() {
    $("#_players").html("");

    $.ajax({
        type: "GET",
        url: "HockeyDataService.svc/Players",
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        success: function(data) {
            displayPlayers(data.d);
        },
        error: function(xhr) {
            alert(xhr.responseText);
        }
    });
    
    return false;
}

function addPlayer(firstName, lastName) {
    var player = { FirstName: firstName, LastName: lastName };

    $.ajax({
        type: "POST",
        url: "HockeyDataService.svc/Players",
        data: JSON.stringify(player),
        contentType: "application/json; charset=utf-8",
        dataType: "json",
        success: function(data) {
            displayPlayer(player);
        },
        error: function(xhr) {
            alert(xhr.responseText);
        }
    });

    return false;
}

function displayPlayer(player) {
    $("#_players").append("<div>" +
        player.FirstName + " " + player.LastName + "</div>");
}

function displayPlayers(players) {
    for (var i in players) {
        displayPlayer(players[i]);
    }
    return false;
}

</script>
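The same entity set URL also understands OData-style query options such as $filter, $orderby and $top, which you can tack onto the URL you hand to $.ajax. Here's a small sketch of assembling such a URL; the helper name buildQueryUrl is my own invention, though the option names are the standard query options the service understands.

```javascript
// Build a data service query URL from an entity set address and a
// map of query options (filter, orderby, top, ...). Each key gets a
// leading "$" and its value is URL-encoded. Helper name is mine.
function buildQueryUrl(entitySetUrl, options) {
    var parts = [];
    for (var key in options) {
        if (options.hasOwnProperty(key)) {
            parts.push("$" + key + "=" +
                encodeURIComponent(options[key]));
        }
    }
    return parts.length > 0 ?
        entitySetUrl + "?" + parts.join("&") : entitySetUrl;
}

// e.g. the first 10 players whose last name is 'Orr'
var url = buildQueryUrl("HockeyDataService.svc/Players", {
    filter: "LastName eq 'Orr'",
    top: "10"
});
```

The resulting string can be dropped straight into the url option of the $.ajax calls above in place of the plain entity set address.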

And that's pretty much all there is to it. Most of the data access and service layers were auto-generated. Not too shabby for such a small amount of work.

Fri May 29 2009 12:05:18 GMT+0000 (UTC)

Comments

Podcasts

These days there are so many useful sources of information for software developers: websites, user groups, message boards, Second Life events, conferences, not to mention good old print media such as magazines and books. In the last few years another interesting means of content delivery has come into play with the proliferation of portable mp3 players... podcasts.

I'd like to share a few that I've found useful not only for keeping me up to date on the latest and greatest software development news but for passing the time during my 60 miles of driving every day. Keep in mind that to play these you only need podcatching software, not necessarily an iPod or other portable mp3 player.

.Net Rocks! - This is a great podcast hosted by Richard Campbell and Carl Franklin centered around .Net development. They have awesome guests, typically program managers at Microsoft or MVPs. If you're a .Net developer and you're only going to subscribe to one podcast this is the one right here.

Hanselminutes - This podcast generally focuses on ASP .Net development and is hosted by Scott Hanselman, who is now a Microsoft employee. Much like .Net Rocks the format involves a guest. This podcast overflows with expertise in my opinion.

SSWUG Radio - This is the official podcast of the SQL Server WorldWide User Group. It's generally a monologue hosted by Stephen Wynkoop and useful both to DBAs and more casual database developers. The focus is clearly on SQL Server but Oracle, DB2 and MySQL topics are covered on occasion.

SQL Down Under - An Australian SQL Server podcast hosted by Greg Low. It has a typical interview format with knowledgeable guests.

Polymorphic Podcast - Here's another .Net podcast, this one hosted by Craig Shumaker. This guy works for Infragistics, a user interface company. The cast is very instructional and tutorial-esque in nature.

Going Deep - A Channel 9 MSDN podcast featuring interviews with the architects who work for Microsoft. They discuss the innards and design of the platform.

sd.rb - The San Diego Ruby user group's video podcast. Essentially it's recordings of presentations at their meetings. Topics are wide-ranging and quite informative.

Biznik - The official Ruby on Rails podcast.

Tue May 26 2009 23:38:52 GMT+0000 (UTC)


Exchange WebDAV Automation

Recently a situation arose where it would be very handy for me to write some code to automate some tasks dealing with Exchange Server that I'd have gone to CDO for in the past.

I was basically trying to pull attachments out of unread mails with specific subjects and mark the owning mail as read. My first move was straight to CDO... Lo and behold, Microsoft REALLY does not want you using CDO from within .Net (it's generally not supported except in circumstances incompatible with my needs), and accessing Exchange via ADO is supported only on the Exchange server itself, which was prohibited by circumstances beyond my control.

Since I wasn't prepared to use CDOEX in an unsupported fashion, the only real option I had available was to issue commands to Exchange via WebDAV. It seemed clunky at first, but turned out to be quite handy. You really can go far automating Exchange over HTTP, and I don't mean total hacks like faking browser requests to web Outlook.

Consider the following code:

/* send the WebDAV query */
XmlDocument inboxXml = new XmlDocument();
HttpWebRequest request = (HttpWebRequest)WebRequest.
    Create("http://mail.mydomain.com/exchange/myuser/inbox/");

/* impersonate the current process */
request.Credentials = CredentialCache.DefaultCredentials;
request.ContentType = "text/xml";

/* the actual name of our command */
request.Method = "PROPFIND";

/* read the response back from Exchange */
HttpWebResponse response = (HttpWebResponse)request.GetResponse();

using (StreamReader reader =
    new StreamReader(response.GetResponseStream()))
{
    inboxXml.LoadXml(reader.ReadToEnd());
}

/* namespace plumbing so our XPath queries will work */
XmlNamespaceManager namespaces =
    new XmlNamespaceManager(inboxXml.NameTable);
namespaces.AddNamespace("a", "DAV:");
namespaces.AddNamespace("d", "urn:schemas:httpmail:");

/* write out the URL of every item in the mailbox */
foreach (XmlNode mailNode in inboxXml.
    SelectNodes(@"/a:multistatus/a:response/a:href", namespaces))
{
    Console.WriteLine(mailNode.InnerText);
}

That code simply asked Exchange for a list of mails in a specific mailbox.

Look at where we set the request method, "PROPFIND" in this case. Think of it as a function name: it tells the server what action to perform, i.e. get the properties of a mailbox. Not listed above but also used in my program were "X-MS-ENUMATTS" and "PROPPATCH", which list attachments and set an email's state to "read" respectively.

You issue arguments to these WebDAV requests similarly to how the POST HTTP method works, by writing to the request stream before reading the response, like so:

/* encode the query body and write it to the request stream */
byte[] bytes = Encoding.UTF8.GetBytes((string)query);
request.ContentLength = bytes.Length;
System.IO.Stream requestStream = request.GetRequestStream();
requestStream.Write(bytes, 0, bytes.Length);
requestStream.Close();

where the object "query" is a string such as

<?xml version='1.0'?>
<D:propertyupdate xmlns:D='DAV:' xmlns:hm='urn:schemas:httpmail:'>
 <D:set>
  <D:prop>
   <hm:read>1</hm:read>
  </D:prop>
 </D:set>
</D:propertyupdate>

for a PROPPATCH request setting an item to "read".
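Bodies like this are just strings, so they're easy to generate from code. As a quick sketch (in JavaScript here, though the same string-building works in any language; the helper name is mine), assembling the propertyupdate document above might look like:

```javascript
// Assemble a WebDAV PROPPATCH body that marks an item read or
// unread. The XML mirrors the propertyupdate document shown above;
// the helper name buildReadPatch is my own.
function buildReadPatch(read) {
    return "<?xml version='1.0'?>" +
        "<D:propertyupdate xmlns:D='DAV:' " +
        "xmlns:hm='urn:schemas:httpmail:'>" +
        "<D:set><D:prop>" +
        "<hm:read>" + (read ? "1" : "0") + "</hm:read>" +
        "</D:prop></D:set>" +
        "</D:propertyupdate>";
}
```

The returned string is what you'd encode and write to the request stream, as in the C# snippet above.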

I suppose this may seem like an off-the-wall way of talking to Exchange, but it can be quite handy.

Tue May 26 2009 23:05:06 GMT+0000 (UTC)


Linq to Object Performance

Linq (specifically Linq to Objects) really has improved the "feel" and readability of .Net code by turning what we used to do with flow control, like loops and conditionals, into a query expression.

Few people love tight code during development more than I, but I also have to support applications as they age. As the user base and application data grows performance almost always becomes a concern.

In that vein I figured I'd whip up a simple test over a sizable amount of data and compare the execution times of a Linq and a traditional approach to filtering a list.

First I needed something to build a list of:

public class Movie
{
    public string Title { get; set; }
    public TimeSpan RunTime { get; set; }
}

Then I filled a List<Movie> named "movies" with a nice round number of records... oh, say 1048576. 262143 of them, roughly one quarter, had a RunTime greater than 2 hours.

Now that I had some data it was time to test. The following is a Linq query I used to retrieve a list of movies longer than two hours and put them into a list named longMovies:

DateTime start = DateTime.Now;

List<Movie> longMovies = (from m in movies
    where m.RunTime.Hours > 2
    select m).ToList();

Console.WriteLine(DateTime.Now.Subtract(start).TotalMilliseconds);

Three executions resulted in an average of 204 milliseconds on an AMD Turion X2, 2.1 Ghz (updated on 2009/11/15 to more modern hardware).

Now to compare it to a traditional iterative approach:

start = DateTime.Now;

longMovies = new List<Movie>();

foreach (Movie m in movies)
{
    if (m.RunTime.Hours > 2)
        longMovies.Add(m);
}

Console.WriteLine(DateTime.Now.Subtract(start).TotalMilliseconds);

The average result after 3 executions: 132 milliseconds, about 2/3 of the Linq equivalent.

The moral of the story is that you have to be careful with Linq to Objects, at least with large sets. It may read nicely and give you the "feel" of set-based operation, but under the hood the compiler's doing all sorts of expensive things, like creating and calling the delegates it uses for Linq "Where" clauses. Your own conditionals will be noticeably faster.
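The same trade-off shows up outside .Net wherever a declarative filter hides a per-element callback. As a rough JavaScript analogue (the data and variable names here are mine, not from the C# test above), Array's filter method invokes a callback per element much like Linq's Where invokes a delegate, while a plain loop does the comparison inline:

```javascript
// A tiny stand-in for the movie list from the C# example.
var movies = [
    { title: "Short", runTimeHours: 1.5 },
    { title: "Long", runTimeHours: 3.1 },
    { title: "Epic", runTimeHours: 4.0 }
];

// Declarative approach: filter calls the predicate once per element,
// analogous to the delegate behind a Linq Where clause.
var longMovies = movies.filter(function (m) {
    return m.runTimeHours > 2;
});

// Iterative approach: same result, with the conditional inlined
// in the loop body instead of dispatched through a callback.
var longMovies2 = [];
for (var i = 0; i < movies.length; i++) {
    if (movies[i].runTimeHours > 2) {
        longMovies2.push(movies[i]);
    }
}
```

Both produce the same filtered list; as with the Linq comparison, the difference is purely in what happens per element under the hood.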

Mon May 11 2009 01:28:00 GMT+0000 (UTC)
