Friday, December 3, 2010

DRY vs KISS

Two developers are playing poker. The chips are all on the table and it's time to show their hands. One throws down the DRY hand. The other throws down KISS. Which hand wins? Who gets to take the pot home?

There are many guidelines in software development. These are two prominent ones: DRY and KISS. For those who haven't heard of them, DRY stands for Don't Repeat Yourself and KISS stands for Keep It Simple, Stupid.

The underlying principle behind DRY is that there should be one authoritative source for any artifact in a piece of software. This can take different forms depending on the context. One example is a constant literal. If a literal, such as a numeric or string value, exists in more than one place in a program, that's a bad thing. The duplication should be removed and replaced with a named constant. The constant becomes the authoritative source for the value. If a good name is chosen, this also has the side effect of making the software more readable, because the meaning is conveyed, not just the value. Another example is duplicated code. If lines of code are copied and pasted, there is more than one authority for the code's behavior. Instead, those duplicated lines should be put in a function that can be called from the various places the behavior is needed.
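To make both moves concrete, here's a minimal sketch in shell, the language of the scripts later in this blog. Everything in it (the retry limit, the logging function, the messages) is hypothetical:

```shell
#!/bin/sh
# One authoritative source for the value, with a name that conveys meaning.
MAX_RETRIES=3

# One authoritative source for the behavior: the timestamped-log format
# lives here instead of being copied and pasted at every call site.
log_message() {
    echo "$(date '+%Y-%m-%d %H:%M:%S') $1"
}

log_message "starting backup (up to $MAX_RETRIES attempts)"
log_message "backup complete"
```

Change the log format or the retry limit, and there is exactly one place to do it.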

The KISS principle is based on the idea that complexity produces code that is hard to understand, maintain, and debug. Code that is hard to understand will have more bugs in it to begin with and take longer to get working properly. It will also be harder for the person maintaining it to make changes, and hence will increase the cost of ownership over the lifetime of the product.

On the surface, I think most developers will agree that both of these principles are good. However, I think there is disagreement about which one trumps the other.

An argument could be made that removing duplication increases complexity. In the DRY examples above, adding a function or constant declaration adds indirection. As you read the code, the value is not immediately obvious and the behavior is not readily apparent. You have to go somewhere, or use some feature of the IDE, to find the value or determine what the routine does. Beyond readability, there is extra work as you write: making a separate function takes effort, and it is much faster to simply copy what you need and paste it somewhere else.

I understand these arguments. I know first hand the temptation to grab a section of code and paste it somewhere else. I know the pain of slowing down and having to think deeply about how to not make the duplicate copy of the code. Given all that...

DRY trumps KISS every time.

When I look at the long term maintainability, having one place where behavior or values or anything else is defined is always the better thing. I've experienced first hand maintenance of code with high levels of duplication. I have made changes in one place without knowing about the duplication and not also fixing the same code elsewhere. I have seen others do the same thing. The testing and validation in these scenarios can get very painful. When maintaining software and making changes, dealing with a single authority is much more accurate and faster, even with increased indirection.

In addition to long term maintainability when behavior should be changed, not repeating yourself reduces the chance of copy and paste errors. More times than I care to count, I've found bugs where a block of code was copied and minor changes made throughout it. Missing even one thing that should be changed will result in insidious bugs setting up residence in your application. Many times these won't be caught until much later in the product's life-cycle, increasing the cost of fixing them.

Just today I violated this principle... even after thinking "do I really want to do this?"... even with the draft of this article fresh in my mind... and I introduced a bug that took me a minute to figure out. I had these two lines:
const string name1 = "value one";
var value1 = !computeValueForName(name1);
That I duplicated and changed to this:
const string name2 = "value two";
var value2 = !computeValueForName(name1);
See the error? When I did, I promptly refactored this to:
var names = new[] { "value one", "value two" };
var values = from n in names select !computeValueForName(n);
So, yeah, when I see violations of DRY I have an almost compulsive need to fix them. How about you?

Monday, November 15, 2010

Apple Raves

About two months ago I ranted about some issues I had with my new Apple MacBook Pro. For the most part, after additional time with the machine, they are all pretty much still valid. I'm still not used to the keyboard layout; I keep reaching for keys that don't exist. The menus at the top of the screen still escape me.[1] As for finding other machines on the network, that could be a bigger issue than it is for me. Once things are mapped, the O/S seems to do a decent job of finding them. As long as the network topology is stable, it's not a huge issue. It's simply an occasional annoyance rather than a day-in/day-out frustration. And finally, the stability isn't as bad as it first appeared. Now that things are set up and working, I haven't had any crashes or freezes. I don't think it's as good as Windows or Linux, but it's not as bad as it first seemed.

With that recap and update of what I don't like, this article is about what I do like. And there is plenty.

Battery life

The battery life on this thing is superb. I haven't actually timed it and don't have any hard data, but the seat-of-the-pants performance feels as good as any laptop I've had with much bigger and heavier batteries. If I were a real road warrior, not being able to swap batteries might be a problem, but in the last 10-plus years of using a laptop, I've really only needed a backup set of batteries a couple of times. For my uses, if it lasts a day of off-and-on use, it's good. And this has done that and then several hours of continuous use in the evenings. Perhaps one of the reasons the battery lasts so long is the next item...

Sleep/hibernate

I've never had a laptop that is so consistent in its ability to go to sleep and wake up when needed. This has been one of the biggest ongoing technical flaws in all the other laptops I've had in the past. They were either slow enough at it, or crashed regularly enough, that I didn't use the feature very much. The Mac does it so smoothly and seamlessly that I use it quite frequently. This might be part of the reason my battery life is better: I'll close the lid and put it to sleep in cases where in the past I'd leave it on. For example, those times when I know I won't need it for five minutes but don't want to shut down. I guess those periods of inactivity were using more battery than I gave them credit for, and they added up enough that the difference is noteworthy.

Networking

Wow, what can I say about the network connectivity? It just works. Flawlessly. Every time. It finds and connects to networks easily and intuitively. Once I've connected to a network, it remembers it and reconnects again without hassle. I can be connected to one network, sleep the machine, go somewhere else with an already-known network and begin working again as if nothing has changed. I never knew how much of a hassle network connectivity was until the noise wasn't there anymore.

Delightful surprises

I just noticed this the other day and it's an example of the subtle attention to detail that is throughout much of the system: the Finder's date column adjusts the format based on the width of the column. If the column is narrow, it shows mm/dd/yy. Widen it a bit and it shows mm/dd/yy hh:mm. Some more space and it starts using abbreviations. It continues giving more detail like this until everything is spelled out completely.

Summary

So, all in all, it's a really nice machine. It's not so much nicer that I'd never own anything else and, as I mentioned, there are some really annoying "features" about it. On a scale of 1 to 10, I'd give it a good, solid 8. Fairly comparable to the other high-end Dells I've had in the past.

1. Literally, they escape my notice; I simply don't see the menus at the top of the main screen. I'll be working with a program, wondering how to do something, looking around frustrated. And then, when I'm about to give up thinking the s/w is brain dead, I remember to look at the top. This is particularly pronounced when working with multiple monitors: the menu isn't even on the same screen as the application.

Thursday, September 9, 2010

Apple Rants

I first played with an Apple computer in the early '80s at a local computer store. My next experience was with a Macintosh an employer purchased to play on. Over the years, I've worked on mini-computers, S-100 based computers, and PCs with various versions of DOS, Unix and Windows. Despite friendly jabs I may give friends, I don't really consider myself a computer or operating system bigot. I've watched the maturing of Apple computers from the sidelines, fairly impressed by the changes I've seen over the last 10 or so years. I have a number of friends who are Mac fans and have observed the improvements over their shoulders. I saw a lot of Apples running Windows at a recent Microsoft conference. More than once I've heard that the MacBook was the best Windows laptop around.

Given all this, when I recently needed to get a new computer I decided to take the plunge and get a MacBook Pro. I was pretty excited to get something so different from my usual fare. It arrived last week, and I've spent the better part of the weekend and the last couple of days configuring it and a new Windows server I got at the same time. Overall, I really like it. The hardware design, the look and feel, the fit and finish are superb. It feels really solid. It feels like a BMW or Mercedes compared to a Dell's Chevy or Ford feel. As good as the hardware is, the operating system software doesn't seem to match up. The following paragraphs rant about some of the issues I have with it.

The biggest issue has to be the Copy/Cut/Paste key mappings. Why are they different from Windows and Linux? What reasoning said they should do something different from the rest of the industry, for no real benefit? To be different? In my opinion, this is a major impediment to people new to OS X feeling comfortable with it. On SuperUser.com I did find a hint on how to remap the keys, so they now work as I expect. I wonder, though, how many people just put up with it as frustrated users.[1]

The next issue seems like a huge anachronism. Circa 1985, before windowing PC operating systems had multi-tasking, each application took over the screen. In those days, putting the menu bar at the top of the screen made sense. However, as soon as more than one application could be open and visible at a time, basic user experience guidelines dictated that the menu bar belongs with the window it controls, not at the top of the screen. Given Apple's big emphasis on design in some areas, why it thinks violating the basic principle of keeping functions close to where they're used is acceptable for the menu bar completely escapes me.[2]

Microsoft has used the SMB protocol for computer discovery and file sharing for a long time. It has become the de facto standard. I find it incredible that there is no way to browse the network and find computers dynamically on the Mac. It seems one can only connect to a networked computer if you already know its name. And to connect to it you have to use a fairly cryptic "smb://computer_name:139" syntax. Really? ... Really? Linux has "just worked" in this regard for many years. And it's worked better overall than Windows itself has. It's about time Apple caught up to the real world.

Finally, Snow Leopard just seems less stable than Linux and Windows have been in a long time. Its stability feels like Windows did around the Windows 95 release. Several times in the last couple of days things have gotten wonky in ways that were fixed by rebooting. A number of times when this happened, shutdown didn't work. It just hung after clearing the screen; I had to power down by holding the power button until the machine turned off. I can't remember the last time that happened on a Windows or Linux box.

Given all these rants, I don't want to give the impression I don't like my new machine. It is speedy. It is solid physically. And really, overall, I haven't had a ton of problems. Perhaps the problems I have had are more remarkable and obvious given the cleanness of the rest of the system. I'm not yet ready to do as some have and install something else as the base operating system and run OS X in a virtual machine.

2. See "The Structure Principle" at Principles of User Interface Design. Many people write about this principle.

Tuesday, September 7, 2010

Synergy setup between a MacBook and Linux machine

I just got a new MacBook Pro a couple of days ago. Wow, it's a nice machine. For my first Apple product, I'm pretty impressed. There are a couple of things I don't like about it that I'll probably rant about some other time. But for today's post, I wanted to chronicle my adventure in getting it to be a Synergy server for an old Dell Dimension running Ubuntu. All said, it was the most problematic software configuration I've had in a really long time, but in the end I got it working pretty well. It kind of reminded me of trying to sort out IRQ and DMA settings way back in the dark ages.

I first heard about Synergy a couple of years ago when a friend of mine connected his iMac and Windows boxes together. For those who don't know, Synergy is an open source project that allows one machine's keyboard and mouse to be used on another machine in a seamless manner. The machine that has the keyboard and mouse physically connected is the server, and a machine that only has a screen and uses the input devices from the other is a client. The two machines are connected through the network, so this solution doesn't require any hardware beyond what the computers probably already have. Once configured, the mouse pointer can be moved off the side of one machine's screen onto another, and the keyboard focus follows. It also works with more than two machines: any number of boxes can be set up as clients of the common server.

Official installation instructions can be found on the Synergy wiki. I mostly followed them. Below are some issues I ran into and how I worked around them.

Step one was to download the software from the repository. For the Mac it came as a dmg file that installed without issue. For the Linux machine, it was available in both deb and rpm formats. Since I'm running Ubuntu, I loaded the deb file. It installed just fine too. In both cases, it put two applications in /usr/bin: the client, synergyc, and the server, synergys.

Step two was to configure the server. This is done through a text file. The instructions seem to imply that this is optional, but it isn't; it's the only way to configure the server. Another point not clear in the instructions is that the configuration file is only needed on the server, and it can live anywhere and be called anything. You specify the name, including the path, on the command line when starting the server.
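For what it's worth, a minimal two-machine configuration can look something like the sketch below. The screen names macbook and ubuntu are placeholders of mine; the section/end layout is the Synergy 1.x configuration syntax as I understand it, declaring each screen and then how their edges link together:

```
section: screens
    macbook:
    ubuntu:
end

section: links
    macbook:
        right = ubuntu
    ubuntu:
        left = macbook
end
```

The server is then started with something along the lines of synergys --config /etc/synergy (check synergys --help for the exact option spelling on your version).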

Somewhere I heard about GUI configuration tools; I thought it was on the setup page, but when I went to find it, I couldn't. Wherever I heard about them, I tried SynergyKM on the Mac. It was a dismal failure. It seemed to get confused and couldn't maintain its state very well, so I eventually abandoned it, though not before wasting some time trying to get it to work. In the end, I just wrote the configuration file myself. It wasn't as hard as it first seemed. I called it /etc/synergy.

I also found a GUI tool for the Linux client and tried using it. However, it used a different version of Synergy, and when the Ubuntu Software Center tried to install it, it somehow messed up the apt state files. Another weird problem; I haven't had apt files get messed up in quite a while. I had to go into aptitude and do some clean-up to repair things so the GUI tools would work properly again. In the clean-up I also removed the GUI tool and just handled setup manually in a terminal window.

After installation and configuration, the next step is to run both the server and client in a terminal to test connectivity. My suggestion is to not skip this step. I thought I could, and ran into some opaque problems that became obvious once I ran the software in a terminal. They all related to the machine names used and my getting confused about which names to use where. By default, the instructions say, the hostname is used, but that didn't work too well for me on the Mac, so I ended up specifying my own names for the Synergy connections. It took a bit to get this working on both sides, mainly due to my own wrong assumptions about how things worked. When I ran things in a terminal, the error messages pointing out my misconfiguration made short work of getting things sorted.

So, once things ran in two terminal windows, the next step was to start the software automatically at startup. On the old, deprecated site there's an Autostart Guide, and it is linked from the code.google.com wiki setup page with some additional information. I didn't see either on the new, official site and had to do some experimentation using the information from these two pages to get everything working properly. I ran into several time-consuming problems doing this.

The first problem was how to connect to the MacBook by name. The Synergy client needs either the IP address or the DNS name of the server on its command line. This makes sense from a client programming perspective; from a system administration perspective, not so much. I'm using DHCP to assign IP addresses, and as I have a small home network, I don't have a DNS server set up to handle local requests like this. The standard name resolution mechanism didn't work for me. I wanted to use the NetBIOS name since it was available, but it took a bit to figure out how to get it. In the end, I found the nmblookup command which, given the name, returns some information, including the IP address. But it's not in an incredibly useful format for scripts. I had to pipe the output to grep to get just the line I wanted, and then pipe that to cut to get just the IP address. Specifically, the command looks like this:
nmblookup macbook_name | grep "<" | cut -f 1 -d" "
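The pipeline can be sanity-checked against canned output without touching the network. In this sketch the IP address is made up and the answer line is shaped like typical nmblookup output (the grep keeps only the line containing "<", and cut takes the first space-separated field):

```shell
#!/bin/sh
# Simulated nmblookup output; the real command queries the network.
sample_output="querying macbook_name on 192.168.1.255
192.168.1.5 macbook_name<00>"

# Keep the answer line and extract the IP address from it.
svr_ip=$(printf '%s\n' "$sample_output" | grep "<" | cut -f 1 -d" ")
echo "$svr_ip"    # prints 192.168.1.5
```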

With this in hand, I wrote a short bash script to look for a server and then connect to it once found. This is where I ran into my second problem. It's been a while since I've done any shell programming, and I had to do a bunch of web searching to remind myself of the details. Everything indicated string equality could be done with either "==" or "=". As a C programmer, I defaulted to using "==". My script kept giving me "unexpected operator" errors on odd lines without operators. After commenting some lines out, I found the if statement at fault. On a whim, I changed the operator to "=" and the problem went away.
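For the record, the single "=" form is the one guaranteed by POSIX, and on Ubuntu /bin/sh is dash, which rejects "==" with exactly that "unexpected operator" complaint. A minimal sketch of the portable comparison:

```shell
#!/bin/sh
a="hello"

# POSIX-portable string comparison uses a single "=".
# Under dash, [ "$a" == "hello" ] fails with "unexpected operator".
if [ "$a" = "hello" ]; then
    result="matched"
else
    result="no match"
fi
echo "$result"    # prints matched
```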

The final version of the script became this:
#!/bin/sh
svr_ip=""
until [ "$svr_ip" != "" ]
do
    svr_ip=`nmblookup macbook_name | grep "<" | cut -f 1 -d" "`
    if [ "$svr_ip" = "" ]; then
        sleep 2
    fi
done
synergyc $svr_ip

(Note: right now this will fail if the server reboots and gets a different IP address, since once synergyc starts, it continues running even when the server goes offline.)

Once the script was done, I then followed the instructions to modify the gdm files, substituting my shell script for the synergyc call.

When this was all working, my next step was to get the autostart working on the Mac. By and large, this went faster than the Linux side but not without its own problems.

The first one was that the install location of the synergy software was different from the examples. So, the first time I tried it, it simply didn't do anything. There were no errors anywhere that I could find pointing to the problem. It was a matter of going through the script line by line until I found it.

With the software's path corrected, I ran into the second problem. When I logged on, the screen went blank and I never got anything else. The menu bar never appeared. The dock never appeared. The desktop icons never appeared. I couldn't do anything. However, the mouse moved the cursor and it would move onto the Linux desktop. Ok, I knew it was running, but how to get my desktop back? As a new Mac user I had no clue.

After a bit of searching, I found I could boot the installation CD and fire up Terminal to get to my system. I inserted the CD, held down the power button to turn the machine off, and then held down the "C" key as I turned it back on. As advertised, it booted the CD and I found Terminal in the Utilities menu. I edited the script and added an ampersand to the end of the call that starts the server, so the shell would fork it off as a background process instead of blocking. After rebooting, everything finally worked as expected.
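The ampersand simply tells the shell to fork the command into a separate background process (a process, not a thread) and carry on without waiting for it. A tiny sketch, using sleep as a stand-in for the long-running server:

```shell
#!/bin/sh
# Without "&" the script would block here for five seconds.
# With it, the shell forks the job off and continues immediately.
sleep 5 &
bg_pid=$!            # PID of the backgrounded process
echo "script continues; background job is $bg_pid"
kill "$bg_pid"       # tidy up the example job
```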

As I said up front, once I got it working, it works really well in spite of the problematic configuration. The only issues I've found are relatively minor: 1) copy/paste doesn't work between machines, 2) the screen saver on the server doesn't trigger the screen saver on the client, 3) I haven't gotten it to work prior to logging into the Mac, and 4) the above-noted problem if the server reboots and gets a different IP address. While it'd be nice if these worked, in the greater scheme of things they are pretty minor failures and I'll live with them for now. I think with more time the last two can be overcome with simple configuration changes that I haven't figured out yet.

Tuesday, June 15, 2010

Message Compiler errors and how I solved them

As part of my day job, I have spent the last several weeks writing a Windows service in .Net 3.5. Today I did some clean-up of messages written to the EventLog. Because internally there are several parts to this service, I wanted to separate the messages using the Category field. After some searching of the web, it seems this isn't drop-dead trivial. But, as it didn't seem terribly complex either, I decided to dive into the task.

One of the requirements is that the text for the categories be in a resource DLL. Unfortunately, these aren't natively supported by the .Net framework. You have to create the files manually, compile them with two compilers, and then link the result into a DLL. Even though it consists of several steps, it still seemed pretty straightforward based on the details on MSDN. Following the instructions, I created a simple .mc file and tried compiling it with mc.
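For context, a category-only message text file can look roughly like the sketch below. The symbolic names and category strings are mine, not from the actual service; the MessageId/SymbolicName/Language layout, with each message body ended by a lone period, is the Message Compiler's syntax as documented on MSDN:

```
MessageIdTypedef=WORD

MessageId=0x1
SymbolicName=CATEGORY_NETWORK
Language=English
Network
.

MessageId=0x2
SymbolicName=CATEGORY_DATABASE
Language=English
Database
.
```

The MessageId values become the category numbers you pass to the EventLog, and the strings are what appear in the Category column of the Event Viewer.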

The Message Compiler emitted a bunch of errors:
  • invalid character (0x29)
  • invalid character (0x57)
  • invalid character (0x10)
  • Invalid message file token
Obviously something was wrong. Another web search revealed nothing. I made minor changes to the file and reduced things to the bare minimum, but continued to get these errors. It seemed I had some odd characters in the file that none of the editors I used revealed. Finally I used the command-line "type" command and there they were: the byte order marks. Arg. After some head scratching, I used the old text-mode "edit" command; it showed the characters too, and there I could easily delete them. I tried to compile. It worked! No errors!

Of course I really wanted to edit the file in Visual Studio, so I made a change to the newly edited file and saved it. VS kindly put the BOM right back in. It can be so annoyingly helpful. Yet more searching for a means of eliminating the BOM didn't reveal anything I wanted to use. I found a script or two that I could install in the IDE, but I didn't want to force the rest of the team to do this; the more environmental things that have to be configured and maintained, the harder it is to set up a new development machine. After a bit of poring through VS's Tools | Options, I found Text Editor | File Extensions. This lets you define which editor to use for a given file type. I added the .mc extension and told it to use the User Control Editor. Sure enough, this editor doesn't put the BOM in the file. While not ideal for the team environment, it is better than a script.
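If a file already has the BOM, anything that drops its first three bytes (0xEF 0xBB 0xBF) will scrub it. This sketch uses Unix tools purely for illustration, with made-up filenames; on Windows the same idea applies with a small script:

```shell
#!/bin/sh
# Write a sample file that begins with a UTF-8 byte order mark,
# the three bytes that trip up mc.exe.
printf '\357\273\277MessageIdTypedef=WORD\n' > with_bom.mc

# tail -c +4 outputs the file starting at byte 4, dropping the BOM.
tail -c +4 with_bom.mc > clean.mc

first_bytes=$(head -c 3 clean.mc)
echo "$first_bytes"    # prints Mes, not the invisible BOM
```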

So, these were some of the lessons I learned today working with Message Text Files and the Message Compiler. I hope it helps someone else.

Wednesday, March 24, 2010

Utah Code Camp: Spring 2010

Utah Code Camp is a free, day-long software conference run by volunteers, with workshops presented by local developers. This spring, the day is loaded with 47 workshops in 11 different tracks. The topics range from various Microsoft .Net technologies to open source projects, and from architecture to career development. The next camp is this Saturday.

Several camps have been held, but the first one I was able to attend was last fall. The opening keynote was by Alistair Cockburn, who talked about Hexagonal Architecture. It was followed by six workshop time slots with multiple options in each. I chose to attend sessions on WCF, PowerShell, Microsoft's multitouch API, new features of C# 4.0, usability, the WiX installer and, finally, Windows Azure.

I'm looking forward to this event and encourage any developer in the Salt Lake City area to check it out. It's still up in the air whether I'll make it; I hope I can.

For those looking for more information, check out the Utah Code Camp website or follow them on Twitter.