Tuesday, September 28, 2010

MVC Abstract Controller

It’s been a while since I’ve posted anything here.  I’ve been busy, and most of the stuff I am working on I either can’t blog about or isn’t worth blogging about, as it’s pretty much already covered in a dozen other blogs.  But I have been doing a lot of very basic ASP.NET MVC coding lately on a couple of sites.  I’ve stepped back from my architect role to do some heads-down coding to fill a personnel gap.  I don’t get to do that a lot, and I rather enjoy it when I get the chance. 

But I did come across a very interesting ASP.NET MVC situation this week that required more than a little thought.  And since my monthly blog entry was due for work, I put a long entry about an ASP.NET MVC Abstract Controller on the Palador web site blog.

Feel free to leave me a note here or there.  I’m always afraid of putting a big ‘code’ blog up because it’s just asking for someone to come along and rip the snot out of it and show me a better way to do it. But in this case, I’ll take the chance, as the solution was simple and saved me a lot of time and frustration.

Thursday, August 5, 2010

Book Review: Microsoft SharePoint 2010 – Building Solutions

The full title of this book by Sahil Malik is Microsoft SharePoint 2010 – Building Solutions for SharePoint 2010, and it was recently published by Apress.  I love Apress books, and have 6 others on various .NET related topics on my shelf.  SharePoint 2010 is something I needed to get up to speed on really quickly for a new project at work, as in I had less than a week to make sure I knew what I was talking about.

The book is 367 pages, and covers a wide variety of topics in SharePoint 2010.  It demos a few techniques and tools, and identifies some of the differences between SharePoint 2007 and SharePoint 2010, so even if you know SharePoint 2007, it will really help you to identify what is different about 2010. 

I’ve barely worked in SharePoint in the past.  I tried a little bit of SharePoint 2003, and dabbled in 2007, but frankly, after seeing how painful it was to work in either of those environments, I wanted nothing to do with it.  SharePoint was where good developers went to watch their career die.  Once you knew it, you were guaranteed to get every crappy job out there, and because you were so valuable to the execs and their TPS reports, there was no way you were ever going to get out without quitting your job.  And then, of course, you were so pigeonholed as a SharePoint developer, you could never get out from under it.  Even if I knew SharePoint 2007, I would never put it on my resume.  Career suicide.

SharePoint 2010 seems to be a little different.  With the advent of Windows Identity Foundation and the advances made in PowerPivot Services, there are definitely things that can be done in 2010 that would take forever in a typical ASP.NET application.  And Linq to SharePoint is a godsend.

Malik covers all these topics with a few quick examples so that at least you know what is there, and how to use it.  Each topic is probably worthy of a book of its own.  I don’t have that kind of spare time, so I needed something at this level so I at least can speak the language and help to architect where these features can be used.  Some features I will deep dive into in the future.  Some features I will leave for others on my team to get familiar with.  You can’t know them all.

I do have a very big gripe with this book.  It does not appear as if it was ever professionally edited by a real copy editor.  There are grammar and spelling mistakes on just about every page.  As a fellow author, I know how hard it is to produce clean copy by yourself.  A second or third set of eyes is always necessary.  So I don’t blame the author for this, I blame the publishing company, Apress.  I really hope this is not a sign of things to come.  I know things are changing in the publishing world, and time to market on these books is critical and compressed.  But in the two or three dozen technical books I’ve read in the last few years, and the dozens of fiction books I’ve read, I can only remember a few typos.  This book had hundreds, and it drove me batty.  I had to reread entire sentences to try to make sense out of them because words were transposed, missing or extras added in and never removed.

Also, the print in this book is small, and so is the line spacing.  It really felt like they were trying to save paper by eliminating paragraph breaks.  If the book were a fast-action thriller, that might not be a problem, but in a technical book, I expect proper formatting.  It helps to convey information and give your brain a break; an indication that the topic or idea is changing.  Without a break, it gets really hard to read.  A lot of the graphics are blurry or too dark to read, though that could have been my eyes as I was reading late in the evening.  At one point, it was so bad, I wanted to hurl the book across the room.

I’m going to recommend this book, but I really hope they do another edit on it.  Architects should read it.  SharePoint 2007 devs should read it to get up to speed on the changes.  New devs for SharePoint 2010?  Not sure.  It does give a good introduction to SharePoint Development, but you’ll probably end up buying another book as well to get into the nitty-gritty.

SharePoint 2010 seems to have come a very long way from its predecessors, but it still feels like it is behind the times, and a little kludgy.  I’m worried that there are difficulties hidden in the practicalities of working with it that are glossed over by this book – things that you could spend a week trying to figure out, only to be frustrated by some weird case where you have to use I before E except after C when the word is in French and followed by an apostrophe.  SharePoint 2010 is to SharePoint 2007 as .NET 2.0 was to .NET 1.1.  The next version should be like .NET 3.5, infinitely better and more stable.  2010 is usable, but just not quite there yet.

But it no longer seems like career suicide to know it.  You’ll probably just need to make sure it is not the first item you list on your resume.

Wednesday, August 4, 2010

Book Review: Programming Windows Azure

I meant to review Sriram Krishnan’s Programming Windows Azure book a few weeks ago, but I have been slacking.  It wasn’t until last night, when I was halfway through my next technical book (which I will review soon), that I realized that I had neglected to do the obligatory blog post.

I’ve been working with Windows Azure since the day after the CTP Launch.  I was in the Early Adopter program.  I corresponded regularly via email or via the forums with various members of the Azure Team for about 8 months between December 2008 and July 2009.  I’ve led three major projects in the Azure world, and at PDC last year, members of the Azure team were finally able to put a face to the name of the guy that kept asking the questions that kept them working a lot of hours.  Most of them didn’t hold that against me.

I give that background because when I bought this book I didn’t think I, of all people, would get much out of it.  It was going to be another trophy on my shelf, a book read, a subject I thought I knew well enough to make this what I would consider a ‘light read’.

I was in for a shock.  A refreshing one.  This book taught me something new, or closed a gap in my knowledge base, on just about every page.  While I had worked with Windows Azure, it was in the context of delivering a specific product, and my focus was on using Windows Azure as a tool to allow the product to launch, so a lot of the details had been bypassed.

This is a great book – one of the top 3 or 4 technical books I’ve read in the past 3 years.  Every developer working on Azure must read this book.  Must.  Nothing beats the experience of working on a new platform, but this book will sure make it easier, and teach you the benefits and the pitfalls of this environment.

The biggest downfall of this book is that Chapters 1 (Cloud Computing) and 2 (Under The Hood) will lose a lot of developers who have no interest in the internals of Azure or the history of cloud computing.  These are both necessary and interesting chapters, but anyone who isn’t an architect or a truly technical geek will skim or skip them.

Also, the scope of the book is limited to the Azure Platform itself, and it does not cover things like the AppFabric or the role of Windows Identity Foundation in Azure.  I’m looking for a good book on those as well, since I have yet to deep dive into the Service Bus, and haven’t worked with WIF since it was still called Geneva.

The rest of the book is superb, and like I said, I learned something on every page.  Every Azure developer should have this trophy on their shelf.

Microsoft Word 2010 Bug

I spend a lot of time in Microsoft Word 2010 writing my novels.  I’m most often working on my ACER 1420P laptop when I do, and honestly, this is the only place I have seen this bug, but it annoys the crap out of me.

Here’s what happens:

  1. Open a new or existing document
  2. Start typing
  3. Let your mouse cursor drift up over the toolbar (like the one for text formatting).
  4. Keep typing.
  5. Eventually, characters will start repeating.  First you’ll see three ‘c’ characters (or whatever character falls victim to this bug on that given day).  If you see this start to happen, sometimes moving the mouse will stop the next phase from happening, but if you miss it because you watch your fingers and not the screen (which I am prone to do), phase 2 of the bug will soon kick in.
  6. Phase 2 involves Satan taking over your computer.  If you just happen to hit the backspace key or a cursor key, it will start repeating indefinitely.  Next thing you know, your whole document is gone or you are sitting at the top or bottom of your document, and you still can’t get control back.

The only thing that I have ever found that works is to pull up task manager, and kill Word.  But at least it will ask you if you want to save your changes before it closes.  If the repeating character is a cursor movement, no problem.  If it was the backspace key, you are SOL.

I find that the bug shows up most often if the cursor seems caught between toolbar buttons.  It looks like a Windows event has gotten into an infinite loop, and as a developer, I would not want to be the one who has to try to figure this one out.

That said, I really hope this gets fixed soon.  It happens to me at least once a week, and more than once has cost me work.  I save early and save often because this one could strike at any moment.

Monday, July 5, 2010

Gradients, CSS and Overflow

Last year, I built a website (http://www.joebeernink.com) to handle aspects of my burgeoning writing career.  I also used it to learn ASP.NET MVC.

I had a designer do the styles and the graphics, but I did all the HTML work.  When I originally laid everything out, the content all fit nicely on a single 800px long div which exactly matched the length of the repeating vertical gradient the designer put together for me for the outer background. 

Once the content got too long, the repeating style left a lot to be desired.  It not only repeated across the X axis, but the Y axis as well, which meant starting at pixel 801, I had this nasty black line across the bottom of the screen, and then the gradient began again.  Yuck.

So instead of that, I set overflow: auto on all my inner divs, which for a while meant that the one page that had too much content had an inner scroll bar.  I was kind of okay with that (I shouldn’t have been), but CSS has never been something I’ve worked with a lot, so I ignored it.

Lately, as I’ve been upgrading my site for an upcoming conference, every page began to have the scroll bar, and it was killing me.  I talked to my designer at work and they shook their heads and said, “Dude, that’s like one line of CSS to fix.  Don’t be such a loser.”  Okay, they didn’t say that, but they should have.  They said it was pretty easy, and could be done in a single line of CSS.  They were going to send me that line of CSS, but must have forgotten.  So yesterday, knowing that a single-line solution existed, I got ready to do some CSS spelunking, and eventually came up with the solution.

Out with the old CSS Classes

    background-image: url("../../Content/beernink_pagebkgd.jpg");
    margin-left: auto;
    margin-right: auto;
    background-repeat: repeat;

    margin-left: auto;
    margin-right: auto;
    padding: 20px 30px 20px 30px;
    background-image: url("../../Content/beernink_insideBkgd.jpg");
    background-repeat: repeat;
    margin-bottom: 5px;
    width: 740px;
    _height: 1px; /* only IE6 applies CSS properties starting with an underscore */
    overflow: auto;
    height: 500px;

And in with the new

    background: #252E33 url("../../Content/beernink_pagebkgd.jpg");
    margin-left: auto;
    margin-right: auto;
    background-repeat: repeat-x;

    margin-left: auto;
    margin-right: auto;
    padding: 20px 30px 20px 30px;
    background-image: url("../../Content/beernink_insideBkgd.jpg");
    background-repeat: repeat;
    margin-bottom: 5px;
    width: 740px;
    _height: 1px; /* only IE6 applies CSS properties starting with an underscore */
    /* overflow: auto;
    height: 500px; */

The #252E33 is the color at the bottom of the gradient. The color nicely fills the page beyond the 800px line, and looks great. 
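Condensed down, the fix really is about as close to the promised one line of CSS as it gets.  A sketch of the essential declaration using the background shorthand (the selector name here is made up for illustration):

```css
/* repeat-x stops the gradient image from tiling down the Y axis;
   the fallback color fills everything below the 800px image. */
.outer-background {
    background: #252E33 url("../../Content/beernink_pagebkgd.jpg") repeat-x;
}
```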

There are probably better ways to do this, but this was simple and allowed me to say the site was ‘done-done’ for the conference later this month.

Of course, now I want to find a real blogging engine in ASP.NET MVC that I can deploy with my site.  Any recommendations?

Friday, May 7, 2010

Silverlight + Azure Shared Access Policy Issue

We’ve been working with Shared Access Policies in Azure for the last week or so, and for the most part it was working.  But it would only work for a while, then it would stop.  Then it would start working again.   It took some help from Steve Marx and Jai Haridas on the Azure Team to figure out what was going on.

This first piece of code is part of our storage manager that retrieves the Uri for the blob from Azure Blob Storage, and creates a Shared Access Policy for that blob so that our Silverlight Video Player can directly access the blob without going through a very slow web service call.

    /// <summary>
    /// Get the Uri for a specified video blob
    /// </summary>
    /// <param name="videoId">The unique identifier of the video</param>
    /// <returns>A Uri pointing to the video</returns>
    public Uri GetVideoBlobUri(Guid videoId)
    {
        var log = Log4NetHelper.GetLogger();
        log.DebugFormat("Getting Video for video id {0}", videoId);

        try
        {
            CloudBlobContainer container = GetContainer("VideoContainerName");

            CloudBlockBlob cloudBlockBlob = container.GetBlockBlobReference(
                string.Format(CultureInfo.InvariantCulture, "{0}", videoId));

            var readPolicy = new SharedAccessPolicy
            {
                Permissions = SharedAccessPermissions.Read,
                SharedAccessExpiryTime = DateTime.UtcNow + TimeSpan.FromMinutes(10)
            };

            var blobUri = new Uri(cloudBlockBlob.Uri.AbsoluteUri +
                cloudBlockBlob.GetSharedAccessSignature(readPolicy));

            log.DebugFormat("GetVideoBlobUri successfully retrieved for video id {0}", videoId);

            return blobUri;
        }
        catch (StorageClientException ex)
        {
            // If the blob was not found, return null
            if (ex.ErrorCode == StorageErrorCode.BlobNotFound)
                return null;

            // Rethrow the exception in all other cases
            throw;
        }
        catch (Exception ex)
        {
            log.ErrorFormat("Error while retrieving video blob {0}", ex.Message);
            throw;
        }
    }

In our Silverlight Web Service, we call this manager class and return the Uri embedded in an XElement.  Originally we did this to allow us to return other information with the Uri, but at this point, the Uri is all we are passing through.

    /// <summary>
    /// Get video URI.
    /// </summary>
    /// <param name="videoId">ID of the given video.</param>
    /// <returns>An XElement containing the video Uri</returns>
    public XElement DemoVideoUri(string videoId)
    {
        var blobManager = UnityFactory.Current.Resolve<IBlobManager>();
        var blobUri = blobManager.GetVideoBlobUri(new Guid(videoId));

        string xml = string.Format("<VideoUri>{0}</VideoUri>", HttpUtility.HtmlEncode(blobUri.AbsoluteUri));
        var sr = new StringReader(xml);
        return XElement.Load(sr);
    }

On the Silverlight side, we make a call to this web service, extract the Url from the XElement, and pass the Url into the MediaPlayer like so:

    var client = new WebClient();

    var videoUri = GetDemoVideoUri();

    client.DownloadStringCompleted += (x, y) =>
        Dispatcher.BeginInvoke(
            () =>
            {
                var xdoc = XDocument.Parse(y.Result);
                var query = from b in xdoc.Descendants()
                            select b.Value;
                HostedVideoUri = new Uri(HttpUtility.HtmlDecode(query.First()), UriKind.Absolute);
                mediaPlayer.Source = HostedVideoUri;
                Log(string.Format("HostedVideoUrl = {0}", HostedVideoUri.AbsoluteUri));
            });

    client.DownloadStringAsync(new Uri(videoUri, UriKind.Absolute));

    private static string GetDemoVideoUri()
    {
        var host = HtmlPage.Window.Eval("window.location.hostname;") as string;
        var path = HtmlPage.Window.Eval("window.location.pathname;") as string;
        var refererUri = host + path;
        var url = string.Format("http://{0}/xxxx.Svc/DemoVideoUri/{1}", GetHost(), VideoId);
        return url;
    }

This is all pretty simple code (and the above code works, by the way).

Where we ran into problems was in the Web Service, when populating the xml string.  Originally, we used blobUri.ToString() instead of blobUri.AbsoluteUri.  This caused big issues (403 errors returned from the Azure Storage Service) when the Video Player tried to retrieve the blob, because the Url returned from the Shared Access Policy generator can have spaces in it.  And Uri.ToString and Uri.AbsoluteUri handle spaces very differently.  I did not know this until last night.  Uri.ToString unescapes the Uri before returning it, while Uri.AbsoluteUri keeps it escaped.

So why did it work sometimes, and not all the time?  Simple.  Sometimes the Uri from the Shared Access Policy Generator has spaces, and sometimes it does not.  The former did not work, while the latter did.  We were looking for a pattern in the number of times we called the web service, or the interval between calls, and the size of the video blob.  But it was as simple as a single space.
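The difference is easy to see in isolation.  Here’s a minimal standalone sketch (not from our codebase; the account and blob names are made up) showing the two properties disagreeing on an escaped space:

```csharp
using System;

class UriEscapingDemo
{
    static void Main()
    {
        // A blob Url with an escaped space, like the ones the Shared
        // Access Policy generator can hand back (hypothetical names)
        var uri = new Uri("http://myaccount.blob.core.windows.net/videos/my%20video.wmv");

        // ToString() returns the canonically unescaped form --
        // the %20 comes back as a literal space
        Console.WriteLine(uri.ToString());

        // AbsoluteUri keeps the escaping intact -- this is the form
        // to hand to a downstream client
        Console.WriteLine(uri.AbsoluteUri);
    }
}
```

The first line prints the Url with a bare space in it; the second preserves the %20, which is why only AbsoluteUri produced a Url the storage service would accept.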

I think we spent at least 8 hours over the last two days trying to track down this bug.  Hopefully this saves someone else some grief.

This post is cross posted on http://www.palador.com/blog/Blog/Silverlight--Azure-Shared-Access-Policy-Issue intentionally.

Monday, April 26, 2010

Book Review: Professional Application Life Cycle Management with Visual Studio 2010

We’re jumping into Visual Studio 2010 this week, and beginning the migration to TFS 2010 as well.  I wanted to get familiar with TFS 2010 and try to standardize our practices a little more before we mis-configured everything, so I bought the first book I saw that seemed to cover TFS 2010.  Professional Application Lifecycle Management with Visual Studio 2010 by Mickey Gousset, Brian Keller, Ajoy Krishnamoorthy and Martin Woodward fit the bill for what I was looking for and more.

The book is broken into 5 Parts:

  • Architect
    • This was a little basic, and highlighted a few new features in VS 2010 that I once used Visio to accomplish.  But it also clarified for me a few basic UML constructs that I rarely used or may have been using incorrectly.  I’m not sure I’ll use all of the available UML diagrams very frequently, mainly because not everyone on the team will be able to use them or understand them, but I’ll see how it goes.  Ironically, this may be the last section that is truly useful for me.
  • Developer
    • I’m going to make sure all my developers are familiar with the new capabilities of VS2010, and this is a fantastic guide to what is possible.  It may take a little effort to add these tools to the dev tool belt, but I think these time savers make the difference between a professional development shop and a place that is just throwing code over the fence.
  • Tester
    • A great section about brand new functionality in 2010, and one both testers and devs should be reading.  I’m hoping that I can really alter our expectations of the relationship between dev and QA through the use of these tools, and that by the end of the year, our QA process can be much more thorough than it is now, and not cost any more time than it already does.  It’s definitely been the neglected child of the development process, and my goal for the rest of 2010 is to bring it up to speed.
  • Team Foundation Server
    • A solid primer on TFS and some good guidance on branching and merging.  A must-read for dev leads, architects and build engineers.
  • Project / Process Management
    • This is the section that really got my attention at first, and the one I would like everyone here to read, even the execs, and especially the solution managers.  Half the battle of good project execution is getting everyone on the same page of terminology and process.  This is definitely worth a read for a shop that is willing to make changes, not for the sake of making changes, but to fix issues and bring themselves into compliance with the rest of the development world to eliminate the vocabulary barrier.

I found all the sections useful as a starting point to either make slight improvements to the processes we use, or as a guideline to make wholesale changes to how we are working to improve our output. 

What I really liked about this book was that while I could have figured this stuff out on my own over the next year or so, it gave me a good primer to get started, so hopefully that will result in fewer wrong steps, and let me know about hidden features that I may have never discovered. 

I powered through this book in about a week, and I’m sure I remember less than 50% of it.  But at least I know where to go to look things up, and can begin to plot out a direction for the configuration of our TFS servers and our processes.

Tuesday, March 2, 2010

Book Review: Introducing .NET 4.0

I recently finished reading Alex Mackey’s book, “Introducing .NET 4.0 With Visual Studio 2010”.  VS2010 won’t be launched until April 12, 2010, which is still just over a month away, but I’ve been champing at the bit, ready to dive into .NET 4.0 for months now.  Mackey does a great job of whetting my appetite without drowning me in the minutiae of each new or improved feature. 

.NET 4.0 is all about improvements to pieces that have been slowly coming together over the past few years.  There have been a number of out of band releases to the .NET Framework since .NET 3.5 launched.  .NET 4.0 enhances a lot of those releases, and, in some cases, completely goes back to the drawing board.  There are a few new features that will make developers happy, but for me, the money is in the things Microsoft made better. 

Mackey assumes you have a good knowledge of .NET, as this book targets what is new since 3.5.  He includes things that shipped in .NET 3.5 SP1, since not everyone may have gotten into that yet.  I was pleasantly surprised at how much of the SP1 functionality we have actually put to use here, and was able to skim those chapters.

There are parts of the book that were finished before the actual functionality of VS2010 or .NET 4 was finalized, but that’s what you get when you buy a book on a product before the product is released.  Caveat emptor.

By the end of the book, I was able to make some decisions on what functionality to target for further research.  My bets are on three areas that will have the biggest payoffs for my team and my customers:

  • MEF (Managed Extensibility Framework)
  • Entity Framework 4.0
  • Windows Workflow 4

With MEF being such an integral part of Office 2010, a good working knowledge of it is essential to provide process enhancements for our customers.

We use Entity Framework 1.0 in just about every project we work on, and the leap to 4.0 should resolve a number of significant issues and code costs that we have encountered in the past year.  I expect to see a large effort reduction in application development costs due to this upgrade.

If the improvements to Windows Workflow are as good as advertised, then I believe WF will go from shunned cousin to an accepted member of the development family.  There’s a lot of power in WF, but the previous implementation was lacking.  I stopped considering the previous releases as viable last year after a few false starts, but I have much higher hopes for it going forward with V4.

I recommended Introducing .NET 4 to my colleagues as a good getting-started guide, and directed some to target certain chapters to match up with the type of work they typically do, or areas where they have yet to become involved, to give them a 40,000-foot view. 

I will probably go out and buy other .NET 4 books as they become available, but will focus on the three areas above as they will have the biggest impact to my architecture choices in the coming months. 

Mackey helps to give that guidance, and gets me excited for the new features.  That’s what I want in an introductory level book.

Cross Site Blogging

Today, I’m proud to announce the launch of a revamped Palador.com web site.  Palador is the consulting company that I work for, and as the Senior Application Architect there, I will be doing my part to blog there, as well as here and on my personal blog.  That’s a whole lot of blogging, but I actually get paid to blog there, so hey, I’m okay with taking that on.

Occasionally, I will cross post blogs here and there.  Sometimes things will be posted there and not here, depending on the topic. We’ll see how it goes.

We’ll have five or six of us posting entries on a variety of topics, which is cool as it just goes to show how diverse our skill set is here. 

Anyway, take a spin over to the new site and let us know what you think.  I wrote or edited a lot of the copy there, so if you find issues, let me know.

Monday, February 1, 2010

Living the Evolved Digital Life

Okay, it’s been a couple of months since I evolved my digital life, cut the cord to Premium Digital TV, and got my house in order.  So where am I now?

1.  Lesson Learned:  You cannot run digital video over a wireless router, especially if you have more than 1 wall to go through.  Bite the bullet and pull the ethernet cable if you can.  I was always at 4 bars with the XBox 360, but the router would shut down every hour or so (sometimes 3 times in a single episode of CSI).  Pulling the cable through the floor was a PITA, but worth it now.

2.  Spend some time and tune your Harmony remote.  It’ll take a lot of time to get it right.  I still have work to do on mine, but each improvement is noticeable, and after a while, you can’t believe you used to use 3 (or 4) remotes.  That said, sometimes it still is a big pain, especially when the kids are sleeping and you switch video inputs, and the sound is cranked up and you can’t get back to turn the volume down fast enough.  I find myself planning my moves with the remote.

3.  Make sure you have enough disk (at least 1TB) on your Media PC.  I still haven’t hooked up my HD tuner because I haven’t wanted to spend any more money on the setup right now to add enough disk to record HD shows.  I’d also like to have a dedicated Media PC with faster processors that can also host PlayOn, but since we have multiple laptops floating around the house, most of the time the PC is a dedicated Media machine.

4.  2010 will be the year Internet TV really takes off.  My wife says we were probably a year or two early jumping online, and maybe we were.  I’m looking at things like Boxee Box and thinking that was probably what I wanted.

5.  We miss having a channel guide on our TV.  We currently have to switch over to the Media center to get it, and because we get more channels on the cable than what the media center allows us to see, we don’t always know what is on at any given time.  This might be worth a little more research.

6.  I don’t use Hulu nearly as much as I thought I would.  I just don’t want to go back and watch old shows that often.

7.  I watch a lot more Netflix and am willing to break movies up over multiple nights if I need to.

8.  The kids don’t miss on demand that much.  We have a few videos lying around at home, and a few lined up on Netflix.  That seems to get us through rainy Saturday afternoons.

9.  Maybe I’ve just been busy the last couple of months, but I am watching less TV.  That was a side effect I hoped for.  I’m reading more, and watching better TV when I am watching it.  When there is a little resistance to the inertia of just keeping the TV on all evening, you find other things to do pretty quickly.

10.  I haven’t sprung for the XBox Live Gold Membership since I get everything I want through PlayOn, but as I was watching a movie last night, I definitely noticed that the hop across my HomeServer through PlayOn left dark scenes a little blocky.  For $40 a year, it might be worth it to get the membership.

These things always turn out to be a little more complicated than you think they will.  I feel pretty good about it now, but it’ll be a few months before I really forget the pain of the conversion.

Thursday, January 14, 2010

.Net 4.0 and Azure

We were mapping out a release schedule for one of our Azure based applications today, and a major part of the application needs to be completely refactored to eliminate tight coupling between our WPF client and our server application.  This coupling is exacerbated by the inability to properly XML serialize some of our Entity Framework 1.0 objects due to the recursive traversal capabilities of EF 1.0.  We’d like to push the refactor back until EF 4.0 is available, but that brought up the question of when .NET 4.0 would be available on Azure. 

There is no set release date for Azure with .NET 4.0 support at this time.  However, Scott Guthrie mentioned on his blog on Dec 17th that

“We are working with the Azure folks right now to try and get .NET 4 installed on it as soon as possible.  Unfortunately I don't have an exact ETA yet.”

However, the Azure team this week (today, since Thursday is their deployment day) did a release to include an ‘OS Version’ attribute for roles, so you can specify a particular Azure build level when deploying.  It will default to the most current version if you don’t set it, so it is a way to ensure that if you don’t want to be upgraded, you won’t be.  Right now, they only support one version of the Azure OS.  This has to be a precursor to the .NET 4 rollout, and something we have been trying to get them to include since our very first meeting with the Azure Team back in November 2008.  I haven’t looked at the feature in detail, but I’m glad they’ve addressed the concern.
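As I said, I haven’t dug into the feature yet, but the attribute lives in the ServiceConfiguration file.  A rough sketch of what it should look like, with the service name, role name, and OS version string all as placeholders (the exact version string format may differ):

```xml
<!-- osVersion pins the deployment to a specific guest OS build.
     Omit it to default to the most current version.
     All values below are placeholders for illustration. -->
<ServiceConfiguration serviceName="MyAzureService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration"
    osVersion="WA-GUEST-OS-1.1_201001-01">
  <Role name="WebRole1">
    <Instances count="2" />
  </Role>
</ServiceConfiguration>
```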

My guess is that they will have to spin up .NET 4.0 support well before the commercial launch of .NET 4 because of the integral role of Azure in VS2010, and that in order for final testing to happen, they’ll have to allow full .NET 4.0 Azure deployments. Kind of a chicken and egg thing.  Stay tuned.

Setting up VS2008 For Windows Mobile 6.1 Development

There are a few tricks to setting up your PC for Windows Mobile 6.1 development before you can get moving.  Of course, some of this will depend on exactly what you want to do in Windows Mobile.

My project was pretty simple.

  1. Create a Windows Mobile Forms app for a Motorola MC9090-G scanning device that lets a user scan their employee badge barcode, a barcode for a shipping document, and a barcode for a series of packages.
  2. The user will scan a large number of packages, and then send their scan records in a batch to a central database for further processing.  The user may or may not be close to a wireless access point at the time of the scan.
  3. The app has to be fast.   The folks using this device will fly through dozens of packages a minute, and there will be multiple scanners working to unload a truck, but there are logical gaps in the loading and unloading where the app can upload to the database.
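The second and third requirements above boil down to a store-and-forward shape: scans land in a local queue instantly, and the queue drains in a batch during the logical gaps. A minimal sketch of that shape — all type and method names here are my own placeholders, not from the actual project:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch: queue scans locally, flush in a batch when there's a gap.
public class ScanRecord
{
    public string EmployeeBadge { get; set; }
    public string DocumentBarcode { get; set; }
    public string PackageBarcode { get; set; }
    public DateTime ScannedAt { get; set; }
}

public interface IScanUploadService
{
    // Stand-in for the WCF call to the central database.
    void UploadBatch(ScanRecord[] batch);
}

public class ScanQueue
{
    private readonly List<ScanRecord> _pending = new List<ScanRecord>();

    // Called on every trigger pull; returns immediately so scanning stays fast.
    public void Enqueue(ScanRecord record)
    {
        lock (_pending) { _pending.Add(record); }
    }

    // Called during the logical gaps in loading/unloading.
    public void Flush(IScanUploadService service)
    {
        ScanRecord[] batch;
        lock (_pending)
        {
            batch = _pending.ToArray();
            _pending.Clear();
        }
        if (batch.Length > 0)
            service.UploadBatch(batch);
    }
}
```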

It’s pretty obvious I needed a client cache for the data, so I chose Microsoft SQL Server CE.  For the backend data store, we’re using SQL Server 2008 (with Change Tracking turned on).
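For reference, turning Change Tracking on for the SQL Server 2008 side is a two-step DDL change; the database, table, and retention values below are placeholders, not from the actual project:

```sql
-- Enable change tracking at the database level (retention settings are illustrative)
ALTER DATABASE PackageScans
SET CHANGE_TRACKING = ON
    (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

-- Then enable it per table that Sync Services will synchronize
ALTER TABLE dbo.ScanRecords
ENABLE CHANGE_TRACKING
WITH (TRACK_COLUMNS_UPDATED = OFF);
```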

I didn’t want to custom build a synchronization methodology, and since I played with Microsoft Sync Services a bit last spring on another project, it seemed like a good place to start.

First off, VS2008 SP1 comes with a number of emulators built in, but no Windows Mobile 6 emulators.  To get the right emulators installed, download the following and install them in this order.  You’ll need to shut down VS2008 to complete the install.

  1. Windows Mobile 6 Professional and Standard Software Development Kits Refresh
  2. Windows Mobile 6 Professional Images (USA).msi
  3. Microsoft Windows Mobile Device Center Driver
  4. Microsoft SQL Server CE for Devices
  5. Microsoft Synchronization Services for ADO.NET for Devices – note that you cannot use Sync Services 2.0, as it is not device-ready yet.

You should now be able to fire up VS2008 and create a new Smart Device Project.  Make sure you set it up for Windows Mobile 6, or you won’t have all the options you need and will have to start over.

One mistake I made was not realizing that there is a different version of Microsoft SQL Server CE for devices than for PCs.  You will need to download the correct version to get everything to work.

I strongly suggest creating two solutions for this type of an application.  One for the Mobile client, and one for the WCF Service, whether it be a Windows Service or a Web Service.  It makes it a lot easier to debug, and helps to ensure that you don’t try to deploy Mobile targeted assemblies to the server and vice versa.  The IDE should prevent you from doing it, but it doesn’t hurt to take this approach anyway.

I like writing code, but I like getting projects done even more.  So if I find code out there that works, I’m not afraid to put it to use.  There were a couple of projects I found that really help to do some of the heavy lifting:

  1. SyncComm on Codeplex.  This provides you with all the plumbing you need to get Windows Mobile sync working in your project.  If there is one thing I would change (and did) in the project, it was to break ClientService.cs up into another partial class to separate out the customizations that were done to it.  I found three methods that I moved into a separate file; otherwise the code gets wiped out when you regenerate it.  That cost me an hour of work.  See the code below.
  2. Custom Message Encoder: Compression Encoder on MSDN.  Download the sample there (the download link is hidden, somewhat trickily, under the title of the article).  It provides all kinds of samples; the one you want is under <installroot>\WCFSamples\WCF\Extensibility\MessageEncoder\Compression\CS.  Take the GZipEncoder project and add it to your server solution.
public ServerClient(System.ServiceModel.EndpointAddress endPointAddress, BindingType bindingType)
    : this(GetBinding(bindingType), endPointAddress)
{ }

static Binding GetBinding(BindingType bindingType)
{
    Binding binding;

    switch (bindingType)
    {
        case BindingType.Basic:
            binding = CreateDefaultBinding();
            break;

        case BindingType.Compressed:
            binding = CreateCompressionBinding();
            break;

        default:
            throw new ArgumentException("BindingType value not accepted");
    }

    return binding;
}

// Set compressed endpoint binding custom properties here
public static Binding CreateCompressionBinding()
{
    // Create a CustomBinding
    var customBinding = new CustomBinding();

    // Create a compression binding element...
    var compressionBindingElmnt = new CompressionMessageEncodingBindingElement();

    // ...and add it to the custom binding
    customBinding.Elements.Add(compressionBindingElmnt);

    // Create an HttpTransportBindingElement and add that as well
    var httpBindingElement = new HttpTransportBindingElement();

    // Set desired values here.  Take care to match these values
    // in app.config in the SyncComm host project.
    //httpBindingElement.MaxBufferSize = int.MaxValue;            // max buffer size
    //httpBindingElement.MaxReceivedMessageSize = long.MaxValue;  // max received message size
    //httpBindingElement.MaxBufferPoolSize = long.MaxValue;       // max buffer pool size

    customBinding.Elements.Add(httpBindingElement);

    return customBinding;
}
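On the partial-class point from the SyncComm item above, the fix is just to carve the hand-written members out of the generated file so regeneration can’t wipe them.  A rough sketch of the shape — the method names here are placeholders for the three methods I actually moved:

```csharp
// ClientService.cs -- regenerated by the tooling; never edit by hand.
public partial class ClientService
{
    // ... generated sync plumbing lives here and gets overwritten ...
}

// ClientService.Custom.cs -- hand-written members live here and survive regeneration.
public partial class ClientService
{
    // Placeholder names for the customized methods moved out of the generated file.
    private void ConfigureProxy()
    {
        // custom endpoint/binding setup
    }

    private void PrepareSyncSession()
    {
        // custom pre-sync work
    }

    private void OnSyncCompleted()
    {
        // custom post-sync work
    }
}
```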
In order to get WCF to connect from the Windows Mobile 6 Emulator to a Web Service on the local host, you’ll need to follow the steps listed by Chris Brandsma on StackOverflow.  This is critical and can cause a lot of frustration if you don’t do it.

So as of today, I have a client running on the Windows Mobile 6 Emulator, a Web Service, and the basic data flowing, though I have a lot of work left to do to test and refine the processing, and to test on an actual device.  I’m sure I’ll find a few more issues, but I wanted to note what I had done to this point, just in case I need to replicate the process on another PC or build server in the near future.

Let me know if this doesn’t work for you.