
Debugging Sails Applications with WebStorm

Lately I’ve been experimenting with Sails, a Rails-like web framework built on Node and Express. While it’s still missing a few convenient features from Rails like view scaffolding, manual migrations, and model associations, it’s been fun to work with. I’ve been using the fantastic WebStorm editor from JetBrains, which has some great features for Node development, including an interactive debugger.

What’s not straightforward, though, is how to enable debugging for a Sails app. Normally, you start a Sails application by running the sails lift command at the command line. This presents a problem when debugging with WebStorm, as the IDE assumes that you’ll be running the node executable, passing in the name of the startup file of your app. This is easy to fix, though.

First, create a new Run/Debug configuration in WebStorm, based on the default “Node.js” configuration. The “Node interpreter” and “Working directory” fields should already be filled in for you; check to be sure the working directory is set to the root of your Sails app. In the “JavaScript file” field, type “app.js” or choose it from the file explorer dialog by clicking the “…” button to the right of the field. Sails generates this file when you create a new project, but you probably wouldn’t notice it unless you were looking for it.

sails debug 1

Next, you’ll need to install the Sails npm module directly into your application, rather than relying on the module you likely installed globally if you were following the instructions on the sailsjs.org website when getting started. To do this, type “npm install sails --save” at the command line while in the root directory of your app. Note the lack of the “-g” option; it stands for “global”, and it’s not what we want here. The “--save” option just adds a dependency to your package.json file, so that if you clone the project from source control later, “npm install” will automatically download the Sails package.

You should now be able to use this configuration to run and debug your Sails app from WebStorm.

That’s great, but what if you want to use CoffeeScript in your Sails project? Both Sails and WebStorm support CoffeeScript, but the above setup will result in a CoffeeScript parsing error when you try to start up your app with any *.coffee files present. If you run “sails lift” at the command line, everything works just fine, but you’re losing the ability to debug your application. Frankly, I’m not sure what the difference is between the state of the environment created by “sails lift” and the one created by running the node executable against app.js. Fortunately, I found a workaround.

By installing the Sails module locally, you actually ended up with a copy of the code that runs when you invoke “sails lift”. All we need to do is tell WebStorm to use that instead of app.js. To do this, open up your run configuration in WebStorm and change the “JavaScript file” field to “node_modules/sails/bin/sails.js”. In the “Application parameters” field, type “lift”.

sails debug 2

This will allow your app to run via the WebStorm runner. To enable automatic generation of JavaScript files from your CoffeeScript files and the mapping files that will enable CoffeeScript debugging, we need to create a CoffeeScript file watcher. To do this, go into your Project Settings -> File Watchers, click the plus sign, and choose “CoffeeScript”. The default settings should work.

sails debug 3

Now, when you debug with WebStorm by clicking on the little green beetle, you should be able to set breakpoints in your CoffeeScript code and debug it line-by-line.

Entity Framework Code-First Performance Issue with String Queries

TL;DR – This StackOverflow post explains the observed issue and the resolution

The Problem

I noticed today that a particular lookup in our application was performing quite poorly, so I dug into it with dotTrace and Entity Framework Profiler (we’re using EF Code-First on .NET Framework 4.0 in this particular app). When I zeroed in on the query that was causing the slowdown, I was surprised to find that it was a very simple one that looked something like this:

context.MyTables.Where(m => m.SomeStringProp == stringVar);

Ruling Things Out

Looking at EF Prof, the query it was generating also looked quite simple:

SELECT [Extent1].[ID], [Extent1].[SomeStringProp], [Extent1].[SomeOtherProp],
...
FROM [MyTable] as [Extent1]
WHERE [Extent1].[SomeStringProp] = '1234567890'

The SQL query itself ran almost instantaneously when run through SQL Server Management Studio. Since it wasn’t the query (I thought), I figured it must be the initialization of the EF context, the generation of the SQL query, or the materialization of the MyTable object that was taking so much time.

I ruled out context initialization by adding another EF query before the problem one. This new query, now the first one executed against the context, ran fairly quickly, and the slow one was still slow. Then, I ruled out object materialization by replacing the Where() call with a SqlQuery() call, like so:

context.MyTables.SqlQuery("SELECT [Extent1].[ID] ... WHERE [Extent1].[SomeStringProp] = @p0", stringVar);

Doing it this way completely eliminated the performance problem, but SqlQuery() returns fully-tracked entities, just like Where() would, so the bottleneck couldn’t be entity materialization.

The Real Culprit

At this point, I was pretty stumped. Surely the generation of such a simple query from that LINQ statement couldn’t be that slow?

Grasping at straws, I looked at the schema for the table in question, and I noticed something a bit out of the ordinary. The SomeStringProp column was defined as a varchar, not nvarchar like most of the other string columns in the database. Since .NET strings are Unicode, could that have been the problem? I tweaked the SQL query I was running with SSMS to the following to test out the theory:

SELECT [Extent1].[ID], [Extent1].[SomeStringProp], [Extent1].[SomeOtherProp],
...
FROM [MyTable] as [Extent1]
WHERE [Extent1].[SomeStringProp] = N'1234567890'

Bingo! This query ran just as slowly as the LINQ to Entities query. The N prefix makes the literal nvarchar, and because nvarchar has higher data type precedence than varchar, SQL Server has to convert the SomeStringProp column for the comparison, which prevents an index seek on that column and forces a scan instead.

The Solution

So, how do we tell EF to use the right kind of string when generating the query? It’s as simple as a single data annotation. I changed my entity class like this:

public class MyTable
{
    ...

    [Column(TypeName="varchar")]
    public string SomeStringProp { get; set; }

    ...
}

With that, the performance problem disappeared.
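If you’d rather keep the mapping out of the entity class, the same thing can be expressed with the Code-First fluent API in your context’s OnModelCreating override. Here’s a sketch (MyContext is just a stand-in for whatever your DbContext is actually called); IsUnicode(false) maps the property to varchar, so the generated parameters match the column type:

public class MyContext : DbContext
{
    public DbSet<MyTable> MyTables { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Map the string property as non-Unicode so EF sends varchar parameters
        modelBuilder.Entity<MyTable>()
            .Property(t => t.SomeStringProp)
            .IsUnicode(false);

        base.OnModelCreating(modelBuilder);
    }
}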

Hopefully this will help someone else who runs into the same issue!

Displaying Database-stored Images in ASP.NET MVC with Data URLs

I’ve worked on quite a few websites which featured user-uploaded images as part of the content. To implement a feature like this, we obviously have to store the image somewhere. One way to do that is to store all the uploaded images directly in the filesystem and store the file names on the corresponding records in the database. This has always struck me as a bit clunky, since we’re storing part of the data for a Contact (say) in the database, and part of it on the filesystem. I’d much rather store the image in the database and have everything all in one place.

The problem in a web context is that the normal way of displaying images is to render an <img> tag in the HTML and have the browser make a subsequent request to the server at the URL contained in the tag. If you stored your images in the database, this would mean that you’d need a separate action method that would query the database again and write the image content directly to the response stream. But using data URLs (a URI scheme that embeds a resource’s content directly in the URL), we can render the content of the image itself into the markup on the page and avoid making another server request.

The way to do this is to transform the bytes of your image into a base 64-encoded string, put a special prefix on it, and set that as the value of the src attribute of your <img> element.

So, assuming you had an Entity Framework Contact object that looked like this:

public class Contact
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public byte[] Photo { get; set; }
}

And a view-model like this:

public class ContactViewModel
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string PhotoString { get; set; }
}

You could grab it from the database and send it down to the view like this:

public ActionResult Index(int id)
{
    var c = _contactEntities.Find(id);
    var vm = new ContactViewModel 
                 { 
                     Id = c.Id, 
                     FirstName = c.FirstName, 
                     LastName = c.LastName, 
                     PhotoString = "data:image/png;base64," + Convert.ToBase64String(c.Photo)
                 };
    return View(vm);
}

And render the view:

@model ContactViewModel

...

<ul>
    <li>
        <span>First Name:</span> @Model.FirstName
    </li>
    <li>
        <span>Last Name:</span> @Model.LastName
    </li>
    <li>
        <span>Photo:</span> <img src="@Model.PhotoString" />
    </li>
</ul>
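Of course, to have a Photo to display in the first place, you need an upload action somewhere. Here’s a rough sketch of what that might look like, assuming a form that posts a file input named "photo" (UploadPhoto and _context are just illustrative names):

[HttpPost]
public ActionResult UploadPhoto(int id, HttpPostedFileBase photo)
{
    var c = _contactEntities.Find(id);

    if (photo != null && photo.ContentLength > 0)
    {
        // Read the uploaded file into the byte[] column on the entity
        using (var reader = new BinaryReader(photo.InputStream))
        {
            c.Photo = reader.ReadBytes(photo.ContentLength);
        }
        _context.SaveChanges();
    }

    return RedirectToAction("Index", new { id });
}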

Keep in mind that this will increase the download size of your page, so you’ll have to weigh that against the convenience of storing images in the database. Also be aware that this won’t work in IE 7 and below, and data URLs are limited to 32 KB in IE 8. For more info, see the Wikipedia article on Data URIs.

An Update After a Long Hiatus

It’s been almost two years (!) since my last post, and a lot of things have happened since then. Rather than dwell on the reasons, I’m just going to jump right in and give an update on what I’m up to. I want to get back into posting technical content, but I feel I’d be remiss if I didn’t mention the big changes in my life here.

Alice

This past April we welcomed our third daughter, Alice Josephine, into our family. She’s a wonderfully laid-back baby for the most part, but a dependable sleep schedule has still eluded us, which makes doing after-hours work like blogging a bit difficult.

Why is a raven like a writing desk?

 

MVP

A bit further back, about a year ago, I was awarded the Most Valuable Professional (MVP) award from Microsoft in the Data Platform Development area. At the beginning of 2011, I had started doing a lot of presentations at user groups and conferences about Entity Framework. Speaking was my main community contribution, along with leading the Shreveport .NET User Group.

This summer, after I’d turned in my contributions for the year for my potential MVP renewal in October, I received word from my MVP lead that the Data Platform Development expertise was being retired. Thankfully, they were willing to count my DPD contributions toward another area that I’d like to move into. I ended up choosing ASP.NET, and crossed my fingers; the bar for MVP is pretty high for that expertise. After a very nervous October 1st morning, I got my renewal email, so I’ll be an ASP.NET MVP for the next year. That means that I’ll probably shift the focus of my conference presentations to more web stuff.

 

Improving

At the MVP summit last year, Devlin Liles and Tim Rayburn cornered me and gave me the hard sell on coming to work with them at Improving Enterprises. They were very open and candid, and made me think seriously enough about the idea to talk to my wife about it. I thought she’d be pretty resistant to the idea of moving to the Dallas area, but that turned out not to be the case. Long story short, with her support, I successfully went through the interview process at Improving and took a position as a Senior Consultant at the Addison, TX office this past July. We’ve moved to Frisco, which puts us only about a five-minute drive from my mom’s house (which she is ecstatic about).

It’s been an awesome experience so far, and I’m getting to work on some interesting projects with some very smart people. I’m really looking forward to the Las Vegas retreat that’s happening in a couple of weeks!

 

Moving Forward

I realize that this post has been pretty terse, but I felt like I had to do it that way, or I’d never get through it! This was mainly a speed bump that I had to get over in order to move on to what I really want to talk about.

As a new ASP.NET MVP, I’m going to need to step up my contributions in that area in order to have a good chance of being renewed next year. So, I’m going to try to start posting some ASP.NET-related content here in the coming weeks and months.

Besides that, in order to simply exercise my writing muscle, I may post some non-tech-related content as well.  Expect things about coffee, parenthood, and World of Warcraft, most likely. ;-)

 

Using PowerShell to Ease the Pain of Branch-per-feature in Web Applications

I’m currently using Mercurial for source control at work, and I absolutely love it.  I love the cheap branching, fast operations, and merging that actually works.  One of the side effects of using a branch-per-feature workflow in Mercurial is that you’re constantly creating new copies of your project structure in the file system.  Unlike Git, where the guidance is to create branches within one working copy of the repository and switch between them, the Mercurial community recommends creating full clones instead.

Even when doing development work, I like to use IIS for serving my web applications rather than the Visual Studio web server (Cassini), so my development environment is as close to production as possible.  I’ve gotten bitten a couple of times when transitioning from Cassini during development to IIS in production, so I decided to just use IIS from the start.

Using these technologies in combination, I started to run into a problem.  Every time I created a clone of my web app’s repository, I had to set up the directory as an IIS application, plus add the permissions required for IIS to read static files (and in my case, write to a temp images directory, since I’m using the Microsoft charting tool).  To make this process easier, I whipped up a couple of PowerShell scripts to take care of all those tasks in one fell swoop.

# New-App.ps1
# usage: New-App "VirtualDirectoryName"
param([string]$appName = "appName")

$path = $pwd.Path
$fullAppName = 'IIS:\Sites\Default Web Site\' + $appName

# Create the IIS application pointing at the current directory
pushd
cd iis:
ni $fullAppName -physicalPath $path -type Application
cd c:
popd

# Give the IIS accounts access to the directory and everything under it
$acl = Get-Acl $pwd.Path
$inherit = [System.Security.AccessControl.InheritanceFlags]"ContainerInherit, ObjectInherit"
$propagation = [System.Security.AccessControl.PropagationFlags]"None"
$arIUSR = New-Object System.Security.AccessControl.FileSystemAccessRule("IUSR", "FullControl", $inherit, $propagation, "Allow")
$arIISIUSRS = New-Object System.Security.AccessControl.FileSystemAccessRule("IIS_IUSRS", "FullControl", $inherit, $propagation, "Allow")
$acl.SetAccessRule($arIUSR)
$acl.SetAccessRule($arIISIUSRS)
Set-Acl $pwd.Path $acl

A few notes about this one. A couple of things are hard-coded, like the site name (this will probably be “Default Web Site” on your machine too, unless you’re running a server OS) and the “FullControl” access level, which can be changed to whatever minimum level of access you need the IIS accounts to have, like “Read” or “ReadAndExecute”.

I wish there was an easier way to set the permissions on the directory, but the System.Security .NET API was the only way that I found.  I’ve always felt that calling .NET code from PowerShell was a little bit kludgey, but I’m glad it’s at least possible to fill in the gaps in functionality.

In order to not leave an orphaned IIS virtual directory when I’m done with a branch, I use this script, which will search for the app by the physical path. This one could be fleshed out a little more.  It assumes that the virtual directory exists at the root of the default web site, and that you’re executing the script from the directory itself.

# Remove-App.ps1
# usage: run from the root of the working copy whose IIS application you want to remove
$path = $pwd.Path

pushd
cd iis:
cd 'IIS:\Sites\Default Web Site'
# Find the application whose physical path matches the current directory and remove it
$site = ls | Where-Object {$_.PhysicalPath -eq $path}
ri $site.Name
cd c:
popd

One more note: you’ll need to import the “WebAdministration” PowerShell module to get this to work. If you’re on Windows 7 and you’ve got PowerShell docked on your taskbar, you can just right-click and choose “Import System Modules”, and the web admin module (along with a few others) will be imported into your PS session.  Otherwise, you can execute “Import-Module WebAdministration” at the PowerShell prompt or in your profile script.

Hope this helps somebody!

I Love Lucy

This past Friday, the Sullivan family got a little bit bigger.

Lucy Sullivan

Mom and baby are both doing fine.  I have to say, at least from the daddy perspective, the prior experience definitely helps.  I feel like things are going much easier than they did with our first daughter, Molly.  We’re still getting quite a bit less sleep than normal, but there aren’t as many unknowns, and we don’t get stressed out about everything the way we did the first time.

And Lucy herself has been making it pretty easy; she’s a champion sleeper, just like her daddy!

Needless to say, blogging has taken kind of a back seat, but I hope to start back up again soon.

Upgrading to Windows 7 RC

I’ve had Windows 7 Beta installed on my personal laptop since it became available for download back in January, and I’ve really enjoyed using it. However, since I found out that, starting July 1st, copies of Windows 7 Beta will shut down every 2 hours, I thought it might be time to go ahead and install the RC.  The thing that was holding me back was that there’s not an upgrade path from the Beta to the RC, so I would have to do a clean install of the OS.  My machine was probably not due for a Windows reinstall quite yet, but it never hurts.

I also decided to take the plunge into 64-bit for the first time, and it hasn’t been too painful.  The only thing that I ran into was that shell extensions for 32-bit programs don’t show up.  Fortunately, the three programs that I want the shell extensions for (Tortoise SVN, 7-Zip, and Vim) all have some form of 64-bit version available, so I’m all good.

The most painful part of the rebuild process by far was installing SQL Server 2008.  There are several downloads on the download site, and the descriptions of each have been so marketing-ized that you’re not even sure what the difference is between them all.  Once you’ve got the right one (maybe, who can tell?) downloaded, the installer is so cluttered with options and “upgrade analyzers” that it’s difficult to tell how to actually install the product.  The install experience needs some serious rework.

The other thing I’m going to have to do is reinstall GRUB, because right now I can’t get to my Ubuntu install, but I think I’ve found a good guide to doing that here.  Wish me luck!

NHibernate ICriteria Queries – Way Too Many Strings

I hoped to have a lengthier blog post ready for tonight, but I just didn’t end up with enough time.  Instead, I’m just going to use this opportunity to say that I’m really looking forward to NHibernate.Linq being production ready.  The following block of code is more or less the query I have to build up to execute the search I talked about in my last post, not including the extra filters that are appended for each user-input value:

ICriteria query = session.CreateCriteria(typeof(Campaign));
query.CreateCriteria("CampaignStatus", "s");
query.CreateCriteria("CampaignType", "ct");
query.CreateCriteria("CampaignLocations", "cl")
    .CreateCriteria("Location", "l")
    .CreateCriteria("County", "c")
    .CreateCriteria("Territories", "t")
    .CreateCriteria("Company", "com")
        .Add(Restrictions.Eq("com.CompanyID", 1));
IList<IDictionary> cams = query.SetProjection(Projections.ProjectionList()
                  .Add(Projections.Count("cl.DateResponseReceived"), "ProspectCount")
                  .Add(Projections.CountDistinct("l.LocationID"), "LocationCount")
                  .Add(Projections.GroupProperty("CampaignName"), "CampaignName")
                  .Add(Projections.GroupProperty("ct.Name"), "CampaignType")
                  .Add(Projections.GroupProperty("s.Message"), "CampaignStatus")
                  .Add(Projections.GroupProperty("StartDate"), "StartDate")
                  )
                  .SetResultTransformer(Transformers.AliasToEntityMap)
                  .AddOrder(new Order("StartDate", false))
                  .List<IDictionary>();

Waaaay too many strings for my taste.  Looking forward to completely strongly-typed queries.
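For comparison, here’s a rough sketch of how the same query might read once strongly-typed queries are an option, assuming entity classes that navigate roughly the way the aliases above suggest (session.Linq<Campaign>() is the extension method from the NHibernate.Linq contrib project; whether it could actually translate a grouping like this today is another question entirely):

var cams =
    (from campaign in session.Linq<Campaign>()
     from cl in campaign.CampaignLocations
     from territory in cl.Location.County.Territories
     where territory.Company.CompanyID == 1
     group cl by new
     {
         campaign.CampaignName,
         CampaignType = campaign.CampaignType.Name,
         CampaignStatus = campaign.CampaignStatus.Message,
         campaign.StartDate
     } into g
     orderby g.Key.StartDate descending
     select new
     {
         // Counts non-null response dates, like Projections.Count("cl.DateResponseReceived")
         ProspectCount = g.Count(x => x.DateResponseReceived != null),
         LocationCount = g.Select(x => x.Location.LocationID).Distinct().Count(),
         g.Key.CampaignName,
         g.Key.CampaignType,
         g.Key.CampaignStatus,
         g.Key.StartDate
     }).ToList();

No more magic strings, and the compiler gets to check the property names.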

One Year at Praeses

May 7 marked one year since I started my current job here at Praeses, back home in Louisiana.  It’s been a pretty good year, and I’ve come away with some valuable experiences.

Apart from some playing around (and I realize now that that’s what it was) with ASP.NET at Data-Tronics, my first real-world experience with the platform came when I started here.  At my previous job, all web work was done with classic ASP, and I was anxious to start using a more modern technology.  I still have nightmares about the awful spaghetti-code reporting system that I worked on when I started at DTC.  When I got to Praeses, I discovered that ASP.NET had its own set of anti-patterns and pitfalls; no technology is a panacea (no matter what the Rails guys try to tell you).  That said, I do definitely enjoy developing on this platform more.  WebForms may be full of cruft, but writing in C# beats the pants off of writing in VBScript any day.

I also got my first three Microsoft certifications during my first year here.  I wasn’t sure what to think about the prospect of getting these certs before I started, but now my opinion is pretty well formed.  Like a BS in computer science, certifications prove that you can take tests, mostly.  You may learn a thing or two during the process, but most of your learning is really going to take place on the job and through personal study.  At the end of the day, it’s mostly just a line on your resume, but a line that may open up opportunities for you, and so one that may be worth pursuing.  I had a very interesting discussion with our CEO via email when he asked me if I thought that studying for the exam I had just passed had made me a better developer.  That entire story is probably worth a post of its own, but suffice it to say that I’ve done my best to influence the way our company thinks about developer education.

A couple of months after my arrival in Louisiana, I decided to take advantage of an interesting opportunity.  I had just started attending the Fort Smith DNUG when I left, and I had greatly enjoyed the additional learning and networking opportunities it provided.  However, there was no .NET user group closer to Shreveport than Dallas, so I decided to start one.  It has been an interesting experience, but one which I in no way regret undertaking.  I think the Shreveport .NET community needs this resource, whether I was the one who got the ball rolling or not, but I’m glad that it was me.  It’s been a great way to improve my organizational skills, and I think it will serve me well in the future.

In addition to all the professional stuff that I accomplished this year, my personal life has also been quite busy.  We sold a house, bought a house, and lived through staying at my in-laws’ for 5 months.  Our baby went from a cooing, screaming infant to a walking, talking, singing, dancing little person.  My wife Rachel became a tutor with a substantial client base, and I became much more adept at taking care of Molly by myself in the evening, at least for a couple of hours at a time.  We found a good church home in River Valley Church, and a pastor that we can really look up to as a spiritual guide and mentor in Lowell Kenyan.

All in all, it’s been a pretty good year.  We still miss our friends from Fort Smith (Josh and Jen, our LTD friends, all my buddies from work), but Bossier City is really starting to feel like home again.  And it’s good to be home.

 

P.S. - Yeah, I know, I forgot to post yesterday.  Two days into it and I already goofed.  Not exactly the picture of good follow-through, am I?  Well, it was the spirit of the exercise, anyway, so I’ll just extend the timeline by one day.

Back Up and Running

Okay, so here’s what happened.  Kevin Dente twittered about the hosting deal he got a couple of weeks ago at Dreamhost.com, which was much better than what I was paying at GoDaddy.  Since my renewal was coming up soon anyway, I thought it was a great opportunity to save some money.  So I filled out all the forms, paid my $20, and was quite happy with myself.  I quickly realized my mistake, however.  The one choice the forms at Dreamhost didn’t offer me was which OS to run on.  That’s right, Dreamhost is Linux only, and I was running on BlogEngine.NET.  Fail.

It turned out not to be that bad, though.  One of the auto-installable applications Dreamhost offers is WordPress, so I just had to figure out how to import my blog entries to WordPress from BlogML, the export format used by BlogEngine.NET.  Fortunately, Aaron Lerch had developed a module to do just that.  After following his and Nate Irwin’s instructions, all my old posts were imported, albeit with some weird question-mark-looking characters in the posts.  I’ve cleaned up a lot of those, but if you look at some of my older posts that I was too lazy to clean up, you’ll see what I’m talking about.

I have to say, after using dasBlog and BlogEngine.NET, the WordPress management interface is like a breath of fresh air.  People talk about the codebase being crappy, but the user experience is excellent.  Lots of features, a pleasant UI, and stuff just works.  I’m quite happy I made the move, even if it was by accident.

A few more pluses from the Dreamhost move:  unlimited storage space, ssh access, and Rails.  I’ve really been wanting to dig more into Rails lately, and having the opportunity to put stuff out on the intarwebs if I happen to create something cool is great.

Thanks for bearing with me as I transitioned!