Tech in the 603, The Granite State Hacker

The Great Commandment

While I was writing a post the other day, I noticed that I had neglected a topic I find very important in software development: risk management.

There are only a few guarantees in life. One of them is risk. Companies profit by seizing the opportunities that risks afford. Of course, they also suffer losses when risks go unmitigated. All our government and social systems are devices of risk management. In business, risk management is (now, and ever shall be) the great commandment.

Many software engineers forget that risk management is not just for PMs. In fact, software development is fundamentally a tool of business and, by extension, of risk management. The practice of risk management in software extends into every expression in every line of source code.

Don’t believe me? Think of it this way… If it weren’t a risk, it would be implemented as hardware. I’ve often heard hardware engineers say that anything that can be done in software can be done in hardware, and it will run faster. Usually, if a solution has some of the following properties…
· mature,
· ubiquitous,
· standard,
· well-known,
· fundamentally integral to its working environment

…it is probably low risk, particularly for change. It can likely be cost-effectively cast in stone (or silicon). (And there are plenty of examples of that… It’s what ASICs are all about.)

Software, on the other hand, is not usually so much of any of those things. Typically, it involves solutions which are…
· proprietary,
· highly customized,
· integration points,
· inconsistently deployed,
· relatively complex / error-prone,
· immature or still evolving

These are all risk indicators for change. I don’t care what IT guys say… software is much easier to change than logic gates on silicon.

I’ve dug into this in the past, and will dig in more in future posts, but when I refer to the “great commandment”, this is what I mean.


Application Platform Infrastructure Optimization

In doing some research for a client on workflow in SharePoint, I came across this interesting article about the differences between BizTalk 2006 and Windows Workflow Foundation (WF).

The article itself was worth the read for its main point, but I was also interested in Microsoft’s Application Platform Infrastructure Optimization (“APIO”) model.

The “dynamic” level of the APIO model describes the kind of system that I believe the .NET platform has been aiming at since 3.0.

I’ve been eyeing the tools… between MS’s initiatives, my co-workers’ project abstracts, and the types of work coming down the pike in consulting. Given the timing and feature sets of MS’s releases, I shouldn’t have been surprised to find that the webinars they’ve released on the topic have been around for just over a year.

This also plays into Microsoft Oslo. I have suspected that Windows Workflow Foundation, or some derivative thereof, is at the heart of the modeling paradigm that Oslo is based on.

All this stuff feeds into a hypothesis I’ve mentioned before that I call “metaware”, a metadata layer on top of software. I think it’s a different shade of good old CASE… because, as we all know… “CASE is dead… Long live CASE!”


Compact and Full .NET Frameworks

One of the things I’ve been intrigued by for a while now is the fact that code compiled for the .NET Compact Framework (all versions) executes very nicely on the full .NET Framework.

For example, my personal hobby project, “Jimmy Sudoku”, is written in C# for the .NET Compact Framework 2.0. There are actually two install kits. The first is a .CAB file for Windows Mobile devices. The second is an .MSI for Windows 9x, XP, and Vista. The desktop install kit even serves two purposes. First, it installs the program on the desktop. Second, it leverages ActiveSync to push the .CAB up to the Windows Mobile device.

It’s a .NET Compact Framework app especially for Windows Mobile devices, but many ‘Jimmy’ fans don’t have a Windows Mobile device to run it on.

The coolest part is the ease with which all of the components inter-operate. The .EXE and .DLLs that are delivered to the mobile device are the very same ones delivered to the desktop. Like Silverlight to WPF, the Compact Framework is a compatible subset of the full framework, so interoperability is a given.

Even better, you can reference CF assemblies from full-framework assemblies. One immediate offshoot of this in my hobby project… the web service I built to serve “Game of the Day” requests actually references the CF assembly that implements the game state model and the game generator code. The assembly that generates games on Windows Mobile PDAs and cell phones is the very same assembly that generates games in the ASP.NET web service.
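The pattern can be sketched like this (the actual Jimmy SuDoku type names aren’t shown here, so these names are hypothetical stand-ins): a type compiled into a CF 2.0 assembly is consumed by full-framework code simply by referencing that same DLL.

```csharp
using System;

// Hypothetical stand-in for a type compiled into the CF 2.0 assembly
// (the real Jimmy SuDoku model types aren't shown in the post).
namespace JimmySudoku.Model
{
    public class GameGenerator
    {
        // Returns a 9x9 board; the real generation logic is omitted.
        public int[,] Generate(int seed)
        {
            var board = new int[9, 9];
            // ... puzzle generation shared by PDA, desktop, and web ...
            return board;
        }
    }
}

// A full-framework caller, such as the "Game of the Day" web service,
// references the very same DLL -- no recompile, no conditional code.
public static class GameOfTheDay
{
    public static int[,] GetDailyGame()
    {
        var generator = new JimmySudoku.Model.GameGenerator();
        return generator.Generate(DateTime.Today.DayOfYear);
    }
}
```

The key point is that there’s only one binary: the CF-targeted assembly runs as-is under the full framework, so the full-framework project just adds a reference to it.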

Admittedly, there are some bothersome differences between the CF and the full .NET Framework. The CF does not support WPF. The CF has no facilities for printing. Also, while the CF does support some of the common Windows Forms dialogs, it does not support the File Save and File Open dialogs on Windows Mobile Standard Edition (Smart Phone / non-touchscreen) devices.

These differences can be overlooked to some extent, though, given that one compiled assembly can execute on so many very different machine types. Further, with interoperability, one can extend a CF-based core with full-framework support. For example, I’m currently playing with desktop print functionality for my hobby project.

Something that I’d really love to see, some day, is a good excuse to develop a Windows Forms app for a client with components shared between the desktop and a mobile device.

I can imagine that this model would be superb for a huge variety of applications, allowing a fully featured UI for the desktop version, and an excellent, 100% compatible, very low risk (almost “free”) portable version.

I’ve often thought this would work great for apps that interface with hardware, like:
· field equipment,
· mobile equipment,
· vehicles of all sorts,

…simply plug in your PDA (via USB or Bluetooth), and it becomes a smart management device for the equipment, using the very same code that also runs on the desktop.


Semi-IT / Semi-Agile

While working on-site for a client, I noticed something interesting. On the walls of some of my client’s “users” offices, along with other more classic credentials, are certifications from Microsoft… SQL Server 2005 query language certifications.

I’ve heard a lot about the lines between IT and business blurring. We talk a fair amount about it back at HQ.

Interestingly, this case is a clear mid-tier layer between classic IT (app development, data management, advanced reporting) and business in the form of ad hoc SQL querying and cube analysis. In many ways, it’s simply a “power-user” layer.

The most interesting part about it is the certification, itself. The credentials that used to qualify an IT role are now being used to qualify non-IT roles.

Another trend I’m seeing is that development ceremony expectations vary with the risk of the project. Higher-risk projects are expected to proceed with more waterfall-like ceremony. Lower-risk projects proceed with more neo-“agility”.

The project I was on was apparently considered “medium” risk. The way I saw this play out was that all of the documentation of a classic waterfall methodology was expected, but the implementation was expected to develop along with the documentation.

In many ways, it was prototyping into production. Interestingly, this project required this approach: the business users simply did not have time to approach it in a full waterfall fashion. Had we been forced into a full-fledged classic waterfall methodology, we might still be waiting to begin implementation, rather than finishing UAT.


Economic Detox

While contemporary headlines bode poorly for the U.S. economy, I see them as signs of hope…

I keep hearing high-pitched alarms about the weakening U.S. dollar, inflation, energy prices, and the housing-bubble burst. We all see the ugly face of these conditions.

Global trade has been a bitter (but necessary) pill for the U.S. Perhaps the Clinton-detonated U.S. economic nuclear winter (of global trade, NAFTA, etc.) is finally starting to give way to a new economic springtime in the States.

In the late 90’s U.S. market, there were a lot of excesses in the technology sector. Then the bubble burst. When the dust settled, we (the U.S. IT industry) found ourselves disenfranchised by our sponsors… corporate America beat us with our own job hopping. U.S. engineers hopped off to the coolest new startup and rode their high salaries into the dirt, while enduring companies went lean, mean, and foreign. We had become so expensive that we were sucking our own project ROIs completely out of sight. By tapping foreign talent pools, companies made those ROIs visible again.

Nearly a decade later, look what’s happening around the world… Many foreign IT job markets are falling into the same salary-inflation trap that the U.S. market fell into. Their prices are rising.

Combine their salary inflation with our salary stagnation and a weakening dollar, and what do you get?

A leaner, meaner domestic competitor.

In a sense, it’s like that in many sectors of the U.S. economy.

So let the U.S. dollar weaken… It means that America can go back to being product producers (rather than mindless consumers) in the global market!


If It Looks Like Crap…

It never ceases to amaze me what a difference “presentation” makes.

Pizza Hut is airing a commercial around here about their “Tuscani” menu. In the commercial, they show people doing the old “Surprise! Your coffee is Folgers Crystals!” trick in a fancy restaurant, except they’re serving Pizza Hut food in an “Olive Garden”-style venue.

It clearly illustrates my point, and that the point applies to anything… books, food, appliances, vehicles, and software, just to name the first few things that pop to mind. You can have the greatest product in the world… it exceeds expectations in every functional way… but any adjective that is instantly applied to the visual presentation (including the environment it’s presented in) will be applied to the content.

If it looks like crap, that’s what people will think of it.

(Of course, there are two sides to the coin… What really kills me are the times when a really polished application really IS crap… its UI is very appealing, but not thought out. It crashes at every click. But it looks BEAUTIFUL. And so people love it, at least enough to be sucked into buying it.)

Good engineers don’t go for the adage “It’s better to look good than to be good.” We know far better than that. You can’t judge the power of a car by its steering wheel. Granite countertops look great, but they’re typically hard to keep sanitary.

When it comes to application user interfaces, engineers tend to make it function great… it gives you the ability to control every nuance of the solution without allowing invalid input… but if it looks kludgy, cheap, complex, or gives hard-to-resolve error messages, you get those adjectives applied to the whole system.

So what I’m talking about, really, is a risk… and a significant one for any project. Appearance literally becomes a business risk.

For any non-trivial application, a significant risk is end-user rejection. The application can do exactly what it’s designed to do, but if it is not presented well in the UI, users will tend to reject it summarily.

That’s one thing that I was always happy about with the ISIS project. (I’ve blogged about our use of XAML and WPF tools in it before.) The project was solid, AND it presented well… the users loved the interface. Using Windows Presentation Foundation, it was easy to add just enough chrome to impress the customers without adding undue complexity.


Compromise & Capitulation

There are three flavors of Windows Mobile in the 6.x line: Standard, Classic, and Professional.

Standard = Smart Phone, no touchscreen
Classic = PDA w/touchscreen
Professional = PDA / phone w/touchscreen

One of the other interesting little gotchas is that the .NET Compact Framework 2.0 compiles the same for all three editions. Unfortunately, once in a while, you get a “NotSupportedException” out of the Standard edition.

A few days ago, in order to get my sudoku program published, I decided to simply avoid a problem I had with the Standard edition’s lack of a SaveFileDialog and OpenFileDialog. My avoidance manifested in a “not supported” message of my own, if the user tried to save / load a file in that environment.

Today, I capitulated… I implemented an alternative file save/load functionality which kicks in automatically when the program gets a “NotSupportedException” on the common dialogs.
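A minimal sketch of that fallback (method and file names here are illustrative, not the actual Jimmy SuDoku code): attempt the common dialog first, and when the platform throws NotSupportedException, silently substitute a fixed save location.

```csharp
using System;
using System.IO;
using System.Windows.Forms;

public static class SaveLocation
{
    // The well-known path used when no file dialog is available.
    public static string Fallback(string defaultDir)
    {
        return Path.Combine(defaultDir, "game.jsd");
    }

    // Try the common dialog; on Standard-edition devices the dialog
    // throws NotSupportedException, so fall back automatically
    // instead of surfacing a "not supported" message to the user.
    public static string Choose(string defaultDir)
    {
        try
        {
            using (var dlg = new SaveFileDialog())
            {
                dlg.FileName = "game.jsd";
                return dlg.ShowDialog() == DialogResult.OK
                    ? dlg.FileName
                    : null; // user cancelled
            }
        }
        catch (NotSupportedException)
        {
            return Fallback(defaultDir);
        }
    }
}
```

The nice property of this shape is that one binary serves all three editions: the dialog path runs wherever it’s supported, and the exception handler quietly takes over on Smart Phone devices.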

It’s in 3.0.3, which I’ve re-published on PocketGear.


Jimmy SuDoku 3.0 Released

Those of you who have worked with me on a project in the past few years probably know of my hobby project. It’s an implementation of SuDoku. It’s made for Windows Mobile devices (cell phones, etc.), but it also runs on Windows XP (et al).

The old version, 2.5, had been published on PocketGear. That update was published in January 2007, just before I started with Edgewater.

I’ve been hacking at it here and there since then, but the project suffered from lots of maladies… most significantly, lack of time.

So after more than a year and a half, I’m happy to finally announce Jimmy SuDoku 3.0!

3.0 has a whole new game state model, based on CLR classes rather than an XML DOM. This means the puzzle generator’s fast enough on hand-held devices that it doesn’t need a web service to do the work for it. Another side-effect of this change is a smaller run-time memory footprint, though I’m not sure by exactly how much.
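The post doesn’t show the 3.0 classes, but the idea can be sketched with hypothetical names: plain CLR objects hold the game state in compact typed fields, where an XML DOM pays for string-keyed node lookups and text-to-number conversions on every access.

```csharp
// Hypothetical sketch of a CLR-class game state model, in the spirit
// of the change described above (not the actual Jimmy SuDoku code).
public class Cell
{
    public byte Value;   // 0 = empty, 1..9 = filled
    public bool IsClue;  // true for cells fixed by the generator
}

public class GameState
{
    private readonly Cell[,] _cells = new Cell[9, 9];

    public GameState()
    {
        for (int r = 0; r < 9; r++)
            for (int c = 0; c < 9; c++)
                _cells[r, c] = new Cell();
    }

    // Direct, typed access -- no XPath, no string parsing.
    public Cell this[int row, int col]
    {
        get { return _cells[row, col]; }
    }
}
```

Each cell costs a couple of bytes of fields instead of an XML element node with attribute strings, which is consistent with both the speedup on hand-held hardware and the smaller runtime footprint.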

I also figured out how to leverage the hardware controls on WM6.0 & 6.1 devices so that non-touchscreen devices can play, too.


WALL•E and Enterprise Data Landfills

“Life is nothing but imperfection and the computer likes perfection, so we spent probably 90% of our time putting in all of the imperfections, whether it’s in the design of something or just the unconscious stuff.”
-Andrew Stanton, director of Disney/Pixar’s WALL•E, in an interview on the topic of graphic detailing.

I’m enough of a sci-fi geek that I had to take my kids to see WALL•E the day it opened. I found it so entertaining that, while on vacation, I started browsing around the internet… digging for additional tidbits about the backstory.

I came across the quote, above, initially on Wikipedia’s Wall-E page.

The simple truth carries across all applications of contemporary computer technology. Technology tools are designed for the “general” cases, and yet, more and more often, we’re running into the imperfect, inconsistent, outlying, and exceptional cases.

To follow the thought along, perhaps 90% of what we do as software developers is about trying to get a grip on the complexities of… everything we get to touch on. I guess the remaining 10% would be akin to the root classes… the “Object” class, and the first few subordinates, for example.

Andrew Stanton’s quote reminds me of the 90-10 rule of software engineering… 90% of the code is implemented in 10% of the time (conversely, the remaining 10% of the code is implemented in the remaining 90% of the time). I tend to think of this one as a myth, but it’s a fun thought.

Dealing with the rough fringes of our data is among the industry’s current challenges, and it’s not just in corporate data landfills.

I recently heard a report that suggested that technology will get to the point that commercially available vehicles with an auto-pilot will be available within the next 20 or so years. What’s really the difference, to a computer, between financial data, and, say, navigational sensor data?

Flip that idea on its head again, and you could have more intelligent artificial agents spelunking through data warehouses… WALL-DW? (Data Warehouse edition)

Then again, I wonder if the 90-10 rule isn’t what gets us into our binds to begin with.


Real Software Heroes

While scanning the channels looking for an interesting show to watch, I came across a show on the Science Channel… “Moon Machines”. I couldn’t have been luckier than to see the chapter “Navigation”. (Update: Full video online here: http://www.dailymotion.com/video/xxxiij )

I’d heard bits about the technology that went into the Apollo missions, and how some of the first “modern” IC-based computers were on board, but I never really thought about the implications of what they were doing. Having these computers aboard meant they had software. There weren’t exactly COTS systems for navigating to the Moon.

The episode focused quite a bit on the experience of the software development team, including some at the personal level. There were quotes like “Honey, I’m going to be in charge of developing something called ‘software’.”… (and the response: “Please don’t tell the neighbors.”)

I’ve felt pressure on projects before… stuff I was working on that represented millions of dollars in its success, and presumably millions lost in its failure. I’ve even worked on software projects where runtime production logic errors could send people to jail. I’ve never written software that human life directly depended on.

My hat is off to the folks who took up this monumental challenge for the first time in software history, and made it work. To me, that’s every championship sports victory… ever… combined.

All I can say is… wow.

They knew what a monumental victory it was, too… forty-odd years later, the engineers they interviewed were still moved by the awe of their own accomplishment, and by the personal sacrifices they made to pull it off.

As well they should be. Fantastic!!