Thursday, December 13, 2007

Icon Restore

This is a nifty thing for those of you who change resolutions from time to time (or have them changed for you without your permission, if you know what I mean).

Icon Restore is free and does just what it says it will do.  Note that if you move things around after saving your icon locations, you will have to remember to save the new positions.

If you like your desktop the way you like it, this is for you.

Tuesday, December 11, 2007

Two, Two, Two Mints In One

My employer, Alogent Corporation, is developing an API to interface to an existing piece of code I'll call Deposit Server.  It normally runs as a Windows service, which I'll call DepositServer.exe.  The API (WebApi) is designed to be called via the web, so it runs under IIS within the aspnet_wp.exe process.  Each of these has its own Visual Studio solution file, each with numerous projects.

If you happen to be coding and debugging on both sides of the fence (the Windows service and the IIS process), you can use one Visual Studio instance to attach to both processes at once in order to trace between the web API and the Deposit Server as traffic flows back and forth. 

What I did was create a single solution, let's call it DualSolution, by merging the contents of the existing WebApi and DepositServer .sln files.  Rather than try to figure out which Project entries were common to both solutions, and there are quite a number, I just copied the entire contents of each .sln file into the new one, back to back. 

When you open the resulting solution in VS, you will get a bunch of notifications that 'project xxxx cannot be added because it already exists in the solution'.  You just click OK, and VS will take care of the cleanup when you save the solution the first time.

(At first I thought I needed to merge the contents of the Global sections and have only one Global, but VS appears to take care of that as well.  Just append one .sln file after the other, open up the result, and you're good to go.)
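As a sketch of what the concatenation looks like (project names and GUIDs here are made up, and real files have many more entries), the merged file is literally one .sln body after the other:

```
Microsoft Visual Studio Solution File, Format Version 9.00
# Visual Studio 2005
Project("{GUID}") = "CommonLib", "CommonLib\CommonLib.csproj", "{GUID-A}"
EndProject
Project("{GUID}") = "WebApi", "WebApi\WebApi.csproj", "{GUID-B}"
EndProject
Global
	GlobalSection(SolutionConfigurationPlatforms) = preSolution
	EndGlobalSection
EndGlobal
Microsoft Visual Studio Solution File, Format Version 9.00
# Visual Studio 2005
Project("{GUID}") = "CommonLib", "CommonLib\CommonLib.csproj", "{GUID-A}"
EndProject
Project("{GUID}") = "DepositServer", "DepositServer\DepositServer.csproj", "{GUID-C}"
EndProject
Global
	GlobalSection(SolutionConfigurationPlatforms) = preSolution
	EndGlobalSection
EndGlobal
```

On open, VS flags CommonLib as a duplicate; click OK, save, and the extra Project entry and the second Global section disappear from the file.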

When you want to debug, attach to process DepositServer.exe, then turn right around and attach to aspnet_wp.exe.  Now, you can set breakpoints from end to end. 

Of course you can do something similar by having two copies of VS running, one with the WebApi solution, the other handling DepositServer.  But it seems to me there are two upsides of the combined solution method:

  1. You save memory by having only one copy of VS running rather than two
  2. You save memory by not having the projects common to both solutions open twice

Devenv.exe (the guts of the Visual Studio IDE) tends to get pretty hefty as you ask it to do things;  here are the Working Set sizes on my machine, as reported by Process Explorer.  Your mileage will vary, based on what's open in the solution, the add-ins you've got installed, moon phase, how VS feels at the moment, etc. 

Each of these measurements was taken with all projects collapsed, no source files open, no Start Page open.  I let things "settle" for a while, as I notice the WS size creeps up a bit for a few minutes after the solution is open for business in the IDE.  The solutions are (for the moment) VS2005.

Solution opened and its state                          Working Set size (KB)
Visual Studio with no solution open                           57,016
DualSolution, after opening                                  400,256
DualSolution, after attaching to both processes              421,404
WebApi, after opening                                        196,384
WebApi, after attaching to aspnet_wp.exe                     212,228
DepositServer, after opening                                 403,508
DepositServer, after attaching to DepositServer.exe          427,004


Memory to debug with combined solution: 421,404 KB
Memory to debug with two VS instances: 639,232 KB (212,228 + 427,004)
Not having to swivel your head between two sets of breakpoints in two different windows:  priceless

Tuesday, December 04, 2007

Hurry! Act now!!

Repent!  The End is Near!

It has been announced that Virtual Server 2005 will no longer be supported after 2014.  Plan now to migrate your virtual images, before it's too late. 


You know your company's getting big when it announces that an already outmoded product will see its sunset seven years from now.

Thursday, November 15, 2007

Sysinternals Suite - New and Improved Packaging!

All of the good stuff in one package.  This is a great idea for a newly-paved machine. 

Windows cannot open this file

This has been an annoyance for a long time.  When you open a file with an "unknown" extension, you get the dialog asking you if you want to search the Interweb to find an appropriate program.  Like the author of this post, I have never, ever found anything through this.  I automatically choose "Select the program from a list".

Now you can perform some registry hackery to bypass this dialog and always go to the "Select from a list" step.

Tuesday, October 30, 2007

Buy 1, Donate 1 Laptop for $399 - The One Laptop Per Child Project

A while back, I saw a 60 Minutes interview with Nicholas Negroponte about the One Laptop Per Child Project, and it was very interesting.  These things really are useful in an African village.  Kids, as kids do, were taking to them like ducks to water.  It's a cool box, too.

Now we have a chance both to satisfy our curiosity about the technology and to help a child.  Come November 12th, there is a two-week program where you can buy one of these babies and have one donated.  Sign up here to be notified when it officially starts.

Rob Walling has blogged about it.

Wednesday, October 24, 2007

WCF Performance Comparisons

Ken Brubaker points to a summary by Clemens Vasters of a white paper comparing various existing distributed communication technologies with WCF.  The bottom line - WCF is

  • 25% - 50% faster than ASMX
  • 25% faster than .NET Remoting
  • 100% faster to 25% slower than .NET Enterprise Services (DCOM)

Also in his summary, Clemens remarks (italics his):

For WSE 2.0/3.0 implementations, migrating them to WCF will obviously provide the most significant performance gains of almost 4x.

The performance differences alone make WCF a technology worth exploring.

Team Foundation Server Version Control

Jeff Levinson provides a brief overview of how you could set up and do change management for development using TFS.  The approach is a bit different than I've seen in the past.

Thursday, October 18, 2007

Integrating TortoiseSVN with Visual Studio.NET

Here's a nice CodeProject article on both using TortoiseSVN (the primo Subversion client) and integrating it with Visual Studio.  It discusses Ankh, VisualSVN, and an add-in developed by Garry Bodsworth (described here).  There are some useful tips on Subversion in general, and some of the comments have interesting questions and answers as well.

Wednesday, October 17, 2007

Beginning the WCF Journey

I'm going to dip my foot into WCF.  That's more than a toe but less than a headlong dive.  Of the four books listed in our library, I have found two.  Here are some quotes from the intros:

You might instinctively treat WCF as just another API.  Resist this temptation... WCF is not just a wrapper around existing functionality or just another whiz-bang API.  WCF is the evidence that a tectonic shift has occurred in distributed software development.

Justin Smith, Inside Windows Communication Foundation

Tectonic shift!?  Wow!  Equally eye-opening is this one:

To me, WCF is simply the next development platform, which to a large extent subsumes raw .NET programming. 

Juval Lowy, Programming WCF Services

This could be fun...

F# About to go Mainstream

I've been noodling around with F# for a while, partly to get used to the upcoming functional programming features of C# 3.0, partly to expand the way I approach programming problems.  Now, Somasegar has announced that Microsoft will be making F# a first-class language, along with C#, VB.NET, etc.  This is very good news, not only for devotees of ML-type languages, but also for FP in general.  When something is readily available in Visual Studio, there's more of a likelihood that someone will give it a try*. 

And while it's true that other functional languages have had .NET implementations for a while, such as Nemerle, SML, Caml/OCaml and Haskell, the polish that Microsoft will put on F# will make it much easier to use. 

Also, unlike some of the above examples, F# is under active development by Microsoft Research in the UK.  Don Syme is the leader of the team.  F# can be either compiled or scripted like Python, but it has ML's type inference and is type-safe.  It generally performs as well as (or better than!) compiled C# code.  For an intro and overview of features, check out the project's main page.







* Except for J#.  Don't ever try J#.

Monday, October 15, 2007

REMs don't sleep

Let's say you have a .BAT file with a command like

c:\bin\dcc32.exe   [parms to compile a Delphi program]

It will return an %errorlevel% of greater than zero if the program does not compile, zero if it does.

Since I was testing this by double-clicking, I put a PAUSE statement right after it to see the output of the compile step. 

c:\bin\dcc32.exe   [parms to compile a Delphi program]
pause

Once the whole build project started working, I left the PAUSE in but REM'ed it out so I could remember what I had done in case I needed to go back and debug later.

c:\bin\dcc32.exe   [parms to compile a Delphi program]
rem pause

It turns out that REM is an "active" statement in a .BAT file and sets %errorlevel% to zero!  When the compile line really did fail, the error was being masked by the REM right after it, so the MSBuild project kept going as if nothing was wrong.  Once the REM line was removed, the MSBuild project failed as it should.
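The same masking pitfall can be sketched in POSIX shell, where `$?` plays the role of `%errorlevel%`: a true comment leaves the status alone, but any command that actually executes (here the no-op `:`) resets it, just as the REM line did in the .BAT file.  The failing compile step is simulated with `false`.

```shell
#!/bin/sh
# Simulate the failing compile step, then check the status immediately.
false
s1=$?          # 1 -- the failure is still visible

# Simulate it again, but slip an "active" no-op in between,
# the way the rem line slipped in after dcc32.exe.
false
:              # ':' is a real command; it succeeds and resets $?
s2=$?          # 0 -- the failure has been masked

echo "immediately after failure: $s1"
echo "after the no-op: $s2"
```

Whatever cmd.exe's exact rules for REM are, the lesson carries over: check the exit status on the very next line, before anything else runs.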

Wednesday, October 10, 2007

A gift idea for that special someone

Got somebody who's hard to buy for?  Here's something that's appropriate for either a nerd or a geek.

Database Humor

From xkcd.

Thursday, October 04, 2007

Software Branching and Parallel Universes

There's a nice description of version control branching and merging on Jeff Atwood's blog.  He points out some patterns and anti-patterns taken from a Microsoft article.  We are facing some complex issues with branches and customer-specific features right now at work.  I don't know the best answer (yet), but these two articles give a good framework for discussing the technical as well as organizational ramifications.

Wednesday, October 03, 2007


If your code is running in a Sql Server context (sproc, trigger, CLR assembly, etc.) and you wish to know who initiated the execution, you can try the T-SQL command SELECT SYSTEM_USER.  This will give you one of three answers:

  1. The Sql Server userid, if Sql Server Authentication was used to log in
  2. The Windows user in the form Domain\UserLoginName, if Windows Authentication was used to log in
  3. The name of the currently executing context

Number three is interesting, because it masks the "real" user identifier behind a persona.  To change your context, issue the command

EXECUTE AS USER = 'Gsl\SierraServer'

Then no matter who you are or how you logged in to the server, SELECT SYSTEM_USER will return "Gsl\SierraServer".  The BOL even calls it the "impersonated" context.
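Putting the pieces together (the principal name is the one from above; REVERT, new in SQL Server 2005, switches you back to the previous context):

```sql
-- Who am I right now?
SELECT SYSTEM_USER;        -- e.g. Domain\UserLoginName under Windows Authentication

-- Impersonate another principal
EXECUTE AS USER = 'Gsl\SierraServer';
SELECT SYSTEM_USER;        -- now returns 'Gsl\SierraServer'

-- Drop back to the original context
REVERT;
SELECT SYSTEM_USER;        -- back to the real login
```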

Not that customers would ever fiddle with sprocs or table schema or permissions that you carefully craft and install...

Monday, October 01, 2007

Helper DLL for MSBuild

In creating a new build, I found I needed a way to set an environment variable within the MSBuild script for subsequent use by a .BAT file.  On CodePlex, I found a project called SDC Tasks Library that filled the bill and then some.  From the site:

This is the latest version of the SDC Tasks for .NET 2.0. The SDC Tasks are a collection of MSBuild tasks designed to make your life easier. You can use these tasks in your own MSBuild projects. You can use them stand alone and, if all else fails, you can use them as sample code.

There are over 300 tasks included in this library including tasks for: creating websites, creating application pools, creating ActiveDirectory users, running FxCop, configuring virtual servers, creating zip files, configuring COM+, creating folder shares, installing into the GAC, configuring SQL Server, configuring BizTalk 2004 and BizTalk 2006 etc.

This used to be on GotDotNet (RIP) but is alive and well now on CodePlex.  To use this in your MSBuild script, put something like this at the top of the script:

<UsingTask AssemblyFile="Microsoft.Sdc.Tasks.dll" TaskName = "Microsoft.Sdc.Tasks.SetEnvironmentVariable" />

When you need to use it...

<SetEnvironmentVariable Variable="MyEnvVar" Value = "$(BuildPathVar)\Bin" Target="Process"/>

The help file that comes with the .DLL is only a reference and seems to have no examples, but for some tasks, the usage is pretty clear.  As always, Google is your friend for real-life examples.
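For what it's worth, here's a minimal sketch of how the two lines above might sit together in a project file (the target name, the BuildPathVar value, and build.bat are made up for illustration):

```xml
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <UsingTask AssemblyFile="Microsoft.Sdc.Tasks.dll"
             TaskName="Microsoft.Sdc.Tasks.SetEnvironmentVariable" />

  <PropertyGroup>
    <BuildPathVar>C:\Builds\MyApp</BuildPathVar>  <!-- assumed value -->
  </PropertyGroup>

  <Target Name="SetVarThenCompile">
    <!-- Target="Process" scopes the variable to this MSBuild process
         and its children, which is what the .BAT file runs as -->
    <SetEnvironmentVariable Variable="MyEnvVar" Value="$(BuildPathVar)\Bin" Target="Process" />
    <Exec Command="build.bat" />  <!-- the .BAT can now read %MyEnvVar% -->
  </Target>
</Project>
```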

Tuesday, September 25, 2007

Using Subversion from .NET

DotSVN is an Open Source .NET library that allows you to programmatically access a Subversion library.  While this is an interesting idea, what I found even more interesting was the article's graph showing the growth in usage of Subversion over the last 3.5 years.  The rate of adoption is dramatic, and this is just for public Apache servers!

A "weak" reason to invoke garbage collection

Calling GC.Collect() is generally considered unnecessary and can actually disrupt the smooth functioning of the .NET garbage collector.  Dennis Dietrich gives a scenario where it is reasonable to call the garbage collector in your code.  His example outlines a situation where you want to test that all references to an object have been deleted at some point in the program so that the object can be garbage collected.  The catch-22 is if you maintain a reference to the object to see if it still exists, it cannot be collected.

Enter WeakReference.  This class allows you to maintain a reference to an object while letting it be collected if need be.  In the time between the object's destruction and its pickup by GC, a WeakReference allows you to continue to use the object.  Now continuing to use an object that's in limbo may seem like a bad idea, as the garbage man could drive by anytime.  But here's an interesting scenario where you may wish to keep a WeakReference so that you could have the possibility of turning it back into a strong reference (and thus keep the object from being collected).

In our testing case, we wish to do the opposite, make sure we cannot continue to reference the object.  After GC.Collect() is called, we check the IsAlive property of our WeakReference. 
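A sketch of that test pattern in C# (Widget, ConsumeAndRelease, and the Assert flavor are hypothetical stand-ins):

```csharp
var widget = new Widget();
var weak = new WeakReference(widget);

// Exercise the code under test, which should release every strong reference.
ConsumeAndRelease(widget);
widget = null;   // drop our own strong reference as well

// Force a collection; only now is it meaningful to inspect the WeakReference.
GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();

Assert.IsFalse(weak.IsAlive, "something is still holding a reference to the Widget");
```

One wrinkle: the JIT can extend or shorten the lifetime of locals, so tests like this can behave differently between Debug and Release builds.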

On a side note, it's interesting that something so fragile implements ISerializable.

Saturday, September 01, 2007

Windows XP Commands

In the process of looking up some cheat sheets, I found this site.  The SQL 2005 cheat sheet was what led me there, but the Windows XP commands page caught my eye, since a lot of guys at work use the command line.  There are things in here I had never heard of, much less used.

For example:  CSVDE - Import or Export Active Directory data.  Wow.  I've used TRACERT quite a bit, but now there's also PATHPING.  If you are a cmd junkie, check it out.

Windows XP Commands

TDD? .NET 2.0 SP1??

I recently started a new position where they use VS2005 and Test-Driven Development, which is something new to me.  At home I've been using VS2008 beta2, and it has been very solid.  So, I installed it side-by-side on my company laptop.  All of a sudden, one of the unit tests I was working with failed.  After a bit of digging, it turned out that it was related to the order of List<T>.Sort().

This particular unit test was for a custom IComparer that cared only about certain high-priority filename extensions.  The rest of the extensions were left essentially unsorted, and the filename portions were ignored.  The thing was, the unit test was expecting all of the items sorted to be in a particular order.  When I put some debugging in, I found that the order of two of the four "non-high-priority" extensions was swapped when I ran the test.

After some angst and some mild reproaches from my boss ("We usually use VPCs for installing beta software..."), I decided the best course of action was to repave the machine.  Later on, buried in a post about how garbage collection is changed in 3.5, I found a reference to "SP1 for the 2.0 framework"!

It kind of makes sense that something would need to be done to allow the new multi-targeting feature, but I had always thought everything was going to be pretty much the same in the 2.0 framework.  Turns out that's not exactly the case.  I didn't come to any conclusive evidence in the short time I spent Reflectoring, but it's clear that something's different, since the size and date of mscorlib are different in 2.0 after 3.5 is installed.

In any event, I pointed out that the framework documentation for 2.0 (and all subsequent versions) says that a Quicksort algorithm is used, which does not guarantee that equals will remain in their original order.  I finally got a concession that perhaps the tests could be revised.  In doing so, I found out that the version control history showed that someone else had found the same problem and had simply changed the expected sort order to match what came up on that test machine!

As my first taste of Test-Driven Development, this was a bit sour.  Shouldn't we be thinking about what needs to be tested instead of just writing any old thing?  In this case, it was testing that items inserted into the List<T> after an initial sort would also be sorted when you once again issued List<T>.Sort().  WTF?!?  If List<T>.Sort is broken and it's up to my puny unit test to find it, let's all just pack up and go home.

If a test produces a "false failure", shouldn't we see if something needs to be revised instead of changing the test to match the new result??  If it's too much trouble to make sure the tests test what should be tested and have them run correctly, why do we do TDD?

Friday, July 27, 2007

Goodbye, J# - We hardly knew ye.

I will shed tears not of sadness but of joy at the apparent disappearance of Visual J# from the list of languages to install in Visual Studio 2008 beta 2. I always use the Customize option of installations, even if only to see what's being installed. If I see optional languages I will not need, I get rid of them. If I see an environment I will not be using, I get rid of it. I have no need of Visual J#, so why should I have to install it? Yet VS2005 had complaints along the way when I tried to rid myself of it. Now, perhaps, we are finally getting rid of a tool that only supports a now ancient version of Java and is thus likely to be useless in "converting" a Java programmer.

Don't miss the samples page after you install. There are some new and interesting items there.

Thursday, July 05, 2007

I'm still struggling with how to keep up and in which subject areas. In the past couple of days I've encountered a number of sources of information related to this. When a number of related things cross my path in short order, I pay attention.

One was a blog from Dan Moran, a SQL Server specialist (or at least he stays right on top of SQL Server tech info). His recent blog post The More Things Change references a much earlier one with a provocative title (Can Generalists Handle Complex IT?).

Here's the thing that caught my eye:
Back in the days of "Little House on the Prairie," Doc Baker did a good job of handling the medical needs of Walnut Grove. He was a great doctor for the time and knowledge available, but today I'd want to see a top specialist if I needed brain surgery. More and more, what’s considered a commonplace solution is the IT equivalent of brain surgery, but more often than not, the IT equivalent of general practitioners still do the work.
As a confirmed generalist by circumstance (almost all my jobs have been with very small IT departments or software companies) and temperament (high degree of curiosity about new tech, esp. software), I have to wonder...

Am I doing a service or disservice to my employer by trying to keep up?

On the one hand, if there is literally nobody else who can handle, for example, keeping up with service packs and which ones to apply and when, should I take on this area? Hiring a consultant at my current job (a non-profit) would be out of the question. Who else will keep the wolves from the security door?

On the other hand, if you become too much of a generalist, do you start to become unemployable? If a company wants a consultant, will they look at someone who has good problem solving skills because of broad experience but not a great deal of in-depth experience in the job's skillset?

A generalist tends to know how to "get things done" but not all of the ins and outs of the best practices in the area. Is it more important to focus on, let's say, ASP.NET and have little or no practical experience in WinForms? In a bigger outfit, sure, that works well. They can hire specialists. But even so, isn't there room for someone who has a larger vision of how something can be architected because of experience with mainframes, minis, and VTAM as well as PCs, LANs and HTTP?

Another source was a Hanselminutes podcast, an interview with Tim Ferriss, author of The 4-Hour Work Week. One of his main thrusts is Simplify. He has an assistant respond to much of his email, for instance. When you really need to concentrate, reduce distractions - close your door, redirect your phone, turn off IM & email. The podcast is worth a listen, though if you turn it up enough to hear Ferriss, you will be blown away by Scott. (A little dB metering before recording would be welcome...)

Web Worker Daily has a few items, here, there, and over here, which have a few tidbits of useful info.

One of the more interesting writers on the subject in my brief confluence of influences is Christopher Hawkins, who has written several times on the topic. One of the quotes:
Basically, if it's not helping me to secure or complete projects for my company, if it's not helping me to make money, if it's not improving my life in some way, it's mental clutter and it's out.
I can agree with this, but what if keeping up is what helps your company make money? And I think any techie will agree that lifetime learning is essential, whether or not you try to learn all or just some of what's going on.

Finally, I stumbled across some encouraging items. Perhaps I'm just adjusting to The New Normal, which is where overload is a Good Thing and Continuous Partial Attention is the order of the day.

The one thing that is clear is that any tendency toward ADD, which many if not most techgeeks have, is exacerbated by infoloading (vs. carboloading, which we also do). The danger is that you will flit from thing to thing without really absorbing any of it.

My idea is to make a list of topics, turn off everything else, learn a significant amount about the first one. Repeat as needed.

Saturday, June 16, 2007

Measuring the impact of the Sales Force

K. Scott Allen solves a common problem for developers - how to define the impact of what salespeople do.

I first became familiar with this phenomenon when working for a software development outfit. I really liked the salesman (singular), who taught me the term "Salesman's License". This refers to the fact that the customer is always right and can only be sold if the requested feature is "in the pipeline". He took some license with the current state of our products, based on what he thought we could do or were actually thinking about doing.

Thanks for helping me find a vocabulary to talk about this issue! ;^)

How do you do it?

I'm beginning to crack. I'm skimming more and more, compiling a few quick samples rather than building a useful app, adding more blogs than I drop on more and more subjects. I know other people have expressed this, so....

How do you handle it? It must be easier when, say, WCF development is your job. You are surfing one of the first waves of a new technology. I'm more of a generalist in my job. I may have to handle "my system doesn't work" questions as well as "I need to know how many angels danced on the head of a pin between June and December of last year". In between, I try to keep my mind alive by enhancing our main web app, writing a utility, or exploring new technologies.

But my natural curiosity takes me too far, into too many areas. It's just all so fascinating! So, what do you do when it's not your job to keep up with technology but you crave doing it? Do you use will-power and make yourself stick with a few things at a time? Explore F# just up to the point that you know you could rewrite your query app in it, even if you don't actually do it?

I've heard you talk about the issue - what do you actually do to handle information overload?

Netscape 9.0 beta

I started with Netscape long ago, before IE was actually out. As IE started to compete, I stuck with NS (obviously at first, since IE was nothing much). Even after IE became a browser and our company decided to become a "Microsoft shop" by entering into the Solution Provider program, I volunteered to test our apps with the "other browser".

I stuck with it until the HR page of the new company I started with failed to register one of my kids for insurance, because a Microsoft-only feature did not show up when I used NS. When I found out about Avant, I dropped NS and have not looked back until now.

I've known that there was underground development of NS going on, but as IE7, Firefox 1.x and 2.x came out, it meant less and less. Finally, I decided to try 9.0 beta, partly for nostalgia.

Right away I was put off by the tab text refreshing at the same rate that .gifs and other elements were being downloaded. This began to bother me so much visually that I uninstalled it. That's rare for me, as I tend to the packrat end of the spectrum.

I will probably try another version someday, but I'm not sure why NS is still going. In skimming the features, I failed to see what they have that isn't being done well by someone else. It's sad from a "good old days" point of view. But only for a moment, as the firehose continues to pump. Acropolis, Silverlight and DLR are adding to my already-crippling burden of things to track.

Wednesday, May 30, 2007


Here's something I've been using for some time now, and I think it's great - FolderShare. It's a way to selectively exchange files with users that you invite (versus using a global P-to-P type of sharing where anyone can see your shared directory).

It is owned by Microsoft now and is still in "beta" even after a couple of years, but it works without a hitch. Almost. (see below)

I have a directory on both my home and work machines that are connected in a "library". When I find an interesting utility that would be useful at home, I put a copy in that folder and the transfer happens automatically, in the background. Naturally, the sending and receiving machines have to be running the FolderShare program, but I rarely turn off either machine.

Not only that, but you can have multiple users sharing the same folder. It's kind of like RSS for files, only you control who can 'subscribe' to the directory.

One of the really nice things is that it works behind just about any firewall because it's port 80 based!

Now, the hitch. In theory, you can visit the web site and browse another machine's shared directories (the site is password protected, of course). If you click on a particular file, it will initiate a file transfer. However, I have had only mixed success with this. Sometimes it will just sit forever, sometimes it will timeout, sometimes you get the file. Size (in this case) doesn't matter.

And in my case, the chums who control the firewall have added this site to the blocked list under the category "Peer to peer sharing sites". I've put in a request to have it unblocked, but it looks like they won't do it. But the folder sharing auto-transfer feature is just great and works flawlessly.

Tuesday, May 08, 2007

What's so good about NHibernate?

I work with data quite a bit and am always on the lookout for new ORM/code generation tools to help automate CRUD operations and queries. For a while now it has been MyGeneration d00dads. Free (always a good thing, since I work for a non-profit), fairly shallow learning curve, not too many quirks. But you never know when something better might be waiting for you Out There.

I have seen several items lately, particularly from Ayende of course (I love the idea of Bumbler!) Sam considers it Noteable. In the past, he has also mentioned it in contrast to Microsoft's flagging ORM efforts as a Good Thing.

Time to check it out. I go to the NHibernate site (or rather a page off of it) and check out the NHibernate Quick Start Guide. In five steps, the first of which you may not need (create a database), the last of which is really several steps to get the configuration and connection objects lashed up, you can begin firing at your database.

The first few examples look good, such as:
User newUser = new User();
newUser.Id = "joe_cool";
newUser.UserName = "Joseph Cool";
newUser.Password = "abc123";
newUser.EmailAddress = "";
newUser.LastLogon = DateTime.Now;
session.Save(newUser); // commit all changes to the DB

Seems natural enough. Then we go down a bit to find

Let's say you want to retrieve an object when you know the user ID (e.g. during a login process to your site). Once a session is opened it's a one-liner; pass in the key and you're done:

// open another session to get the new user
session = factory.OpenSession();
User joeCool = (User)session.Load(typeof(User), "joe_cool");

Casting? Why are we casting?? Why not create an object of the type you want and get IntelliSense for its columns, properties, and methods? That's what the MyGeneration d00dads architecture lets you do. Here is their canonical example.

And what's up with all the code you have to write? One would assume that NH devotees use code generation for the class with its properties and the XML mapping file that has to be embedded in your assembly. But no mention of it in the quickstart. If I thought I would have to do all that, I would give up right away. (As it happens, MyGeneration has a template that does the class and .hbm.xml file. If it didn't, I would surely be writing my own.)

Now a lot of any ORM or DAL tool is what you get used to doing. If NH floats your boat, I won't try to argue you out of it. But I'd love to know if anyone using NH has also tried something a bit more automated, perhaps even SubSonic, which is starting to look very nice in v2.0.

If you have tried something else and still use NH, I'd love to know why.