Never-Ending Bill for President 'Dilbert' Saga

As my faithful readers know, Redmond magazine was the first publication to seriously suggest that Bill Gates run for president. As you might also know, overexposed cartoonist Scott Adams had the same brilliant idea six weeks after you all heard about it -- and proceeded to take full credit.

One "Dilbert"-loving columnist, NetworkWorld's Paul McNamara, took the bait, and guessed that "Cartoonist Scott Adams started this flapdoodle with a Nov. 19 post on The Dilbert Blog that suggested there isn't anything wrong with this country that President Bill Gates couldn't cure in less time than it takes to get a new operating system out the door. Hey, everyone enjoys a good chuckle...and don't you just love that Dogbert?" Uh, no, and as for Dogbert, definitely no!

The otherwise upstanding Mr. McNamara continued his speciousness by again giving Adams full credit, even after the Barney/Adams idea seemed to die on the vine.

I've got to admit to being a little miffed, and whipped off this heated little message to my old pal:

Paul,

As a columnist, imagine that you wrote a piece suggesting that Bill Gates run for president.

Before publishing, you do a thorough search to make sure the idea is not derivative. You find it's not.

You publish the column to 135,000 subscribers and many more pass-along readers.

You also post it on the Web where you have a substantial audience.

There is a quick and passionate reader reaction.

Let's say you did this in October of last year.

Six weeks later, Scott Adams blogs about the same topic, and a different columnist from a different publication gives Adams total credit.

Now, let's say the person who wrote the first column about Gates explained to the second columnist who it was that actually published the idea first, and did so to a broad audience.

You would think the second columnist, let's call him Net Buzz, would give proper credit the next time around.

Instead, this columnist, who purports to stand for honesty, once again gives credit to Scott Adams and then fawns over the fact that Adams sent him a seven-word reply.

If that happened to you, would you be irritated? I thought so.

Paul did apologize for his oversight, but somehow never bothered to correct it.

http://www.billgatesforpresident.net

Posted by Doug Barney on 03/20/2007 | 0 comments


The Passing of Another Great

Because computer science is such a young field, it is easy to forget that it has a history. And a foundational figure from the beginning of that history passed away last week. John Backus, whose development of the Fortran programming language in the 1950s changed how people interacted with computers and paved the way for modern software, died Saturday in Ashland, Oregon. Fortran was the first widely used high-level computer language; prior to its development, programmers wrote code in machine language or assembly. The development of Fortran earned Backus the 1977 Turing Award from the Association for Computing Machinery, one of the industry's highest accolades. The citation praised Backus' "profound, influential, and lasting contributions."

Backus also won a National Medal of Science in 1975 and received the 1993 Charles Stark Draper Prize, the top honor from the National Academy of Engineering. Among his other important contributions was a formal notation for describing the grammar of computer languages. The system is known as Backus-Naur Form, or BNF, to those of us with training in formal languages.
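
To give a flavor of the notation, here is a small, hypothetical BNF grammar for arithmetic expressions (my own illustration, not drawn from any language standard):

    <expr>   ::= <term> | <expr> "+" <term>
    <term>   ::= <factor> | <term> "*" <factor>
    <factor> ::= <number> | "(" <expr> ")"

Each rule defines the syntactic category on the left as a set of alternatives on the right, and rules can refer to themselves and to each other. Language specifications from Algol 60 onward have described their syntax in variants of this form.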

As a sophomore at a liberal arts college in the late 1970s, my choices in learning computers were very limited – Basic and Fortran. Basic was interactive, while Fortran was batch, done on punch cards. It was not readily apparent to me at the time that programming languages had different design tradeoffs; it was only later, working with pointer-oriented languages, that I learned that languages had deliberate and specific strengths and weaknesses that made them more appropriate in given domains.

Fortran, of course, was best known as a number-crunching language. Its early binding and fixed-size, rigid data structures meant that the compiler could tailor the resulting executable to the processor it ran on, making its programs extremely fast. It was essential for scientific and engineering applications.

C, a pointer-oriented language, could never be as fast, but work on C compilers eventually made it good enough for all but the most demanding number-crunching applications. Microsoft at one point sold Microsoft Fortran, but it disappeared as Visual Studio became ascendant.

Even if Fortran is no longer a common language for mainstream computing, to me it epitomized the concept of TANSTAAFL, from Heinlein's classic The Moon Is a Harsh Mistress. There Ain't No Such Thing As A Free Lunch defines the tradeoffs anyone working in computing makes on a daily basis.

Posted by Peter Varhol on 03/20/2007 | 1 comment


Eclipse Delivers for .NET Developers

Most .NET developers are used to working almost entirely in Visual Studio. There are a variety of additional tools available, like commercial debugging and quality tools from the likes of IBM Rational and Compuware. In addition, there are freely available tools that many believe essential to rigorous development processes, such as NUnit. But all of these tools either plug into Visual Studio or have stand-alone user interfaces. There would seem to be no place for yet another, separate IDE that serves as a tools platform. So why would I promote Eclipse (www.eclipse.org), best known as a Java IDE, for that role?

As last week's EclipseCon conference demonstrated, there is a great deal of activity surrounding the Eclipse platform. Its unique model of open source promotes collaboration on the development of various projects, yet encourages competition for value-added features on top of the open source foundation.

Eclipse has over 60 active projects and over 160 corporate members, as well as over a dozen associate members (1105 Media is one such associate member). There are hundreds of committers, individuals who have earned the right to commit code to the shared code base of one or more of the projects.

Many of these projects support Java development. However, many others support generic application lifecycle processes. For example, Telelogic has an offering that provides change management capabilities. Serena leads the Application Lifecycle Framework (ALF) project, which provides a means of uniting the stages of the application development process. Compuware's Corona project provides a way to share software assets across the application lifecycle.

These are not software products in and of themselves, although they can be useful in open source form. But other vendors will take these frameworks and add feature sets that will make them compelling, even to .NET developers.

As these Eclipse projects come together, they will form a set of lifecycle tools that may be superior to anything that Microsoft has to offer. That does not in any way denigrate Microsoft; rather, it is a testament to the power of this model of cooperation and competition. By collaborating on the underlying technology, vendors can share the underlying platform and provide a common foundation for all tools of that category. They can expend individual R&D efforts in adding features to serve specific classes of users, or building an advanced feature set.

Perhaps the best part of this model from the standpoint of the developer is that there is much less chance of vendor lock-in to a particular tool or set of tools. Because of the common underlying platform, it is likely easier to switch to new tools that may offer features more targeted to your needs.

If that is not enough of a reason to start looking at the Eclipse platform, I'll offer another. Someone will offer a .NET IDE on Eclipse. They will either own a license for the .NET Framework SDK (are you listening, Borland?) or they will use the Mono open source framework. Such an offering would not compete head to head with Microsoft (who would want to do that?), but rather would offer a .NET development kit targeted toward specific types of applications or industries.

Either way, .NET developers can benefit from Eclipse, now and in the future.

Posted by Peter Varhol on 03/12/2007 | 0 comments


Do It Yourself Mashups

No innovation has struck the Web development community quite like mashups. The complete freedom that they offer proves vexing to some application designers and developers (What database should I use? How can I use SQL with this?), yet liberating to others (There is no limit to the applications I can build.).

I confess that I hardly know where to begin. I became computer savvy in the era when the connection between data and application was pretty straightforward. Even for data that was used in multiple applications, the requirements were pretty clear, and the location and format of the data were well known. As a college professor in the early 1990s, I described every application as fundamentally a database. The purpose of the application was to make the data in that database useful. For example, a word processing document was, fundamentally, a database of words, with an interface that organized and presented that data in a way that made sense to the user in that context.

Mashups are radically changing the data-application equation. If I can get at data, chances are I can use it in just about any application, whether or not that application already exists. And chances are that application will be unlike anything built before. At worst, it will have a unique twist to an already-established concept.

I've already written geographic mashups using Microsoft's Virtual Earth as a part of an ASP.NET application. Virtual Earth has an API that makes it exceptionally easy to call maps in just about any configuration, and overlay data on those maps from other Web sites or from local databases.
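
To give a sense of how little page script is involved, here is a minimal sketch. The class names (VEMap, VELatLong, VEShape) come from the Virtual Earth map control of that era, but treat the details, and the coordinates, as illustrative rather than a recipe:

    // Assumes the page references the Virtual Earth map control script
    // and contains a <div id="mapDiv">. Coordinates and titles are made up.
    declare const VEMap: any, VELatLong: any, VEShape: any, VEShapeType: any;

    const map = new VEMap('mapDiv');                 // bind the control to the div
    map.LoadMap(new VELatLong(42.36, -71.06), 10);   // center point and zoom level

    // Overlay a point pulled from a local database or another site.
    const pin = new VEShape(VEShapeType.Pushpin, new VELatLong(42.36, -71.06));
    pin.SetTitle('A point from my data');
    map.AddShape(pin);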

The next step is user-created mashups, those that enable individuals to build or customize data-logic combinations for their own unique needs. Users determine what data they need, what operations have to be performed on it, and how it is presented, then go ahead and do the implementation.

Sound far-fetched? Not as far-fetched as you might think. As an example, Yahoo's Pipes (pipes.yahoo.com) lets anyone combine, manipulate, and filter feeds into a unique feed. It provides a visual editor that allows you to represent feeds and actions as boxes, and connect them with wires. (A technically inclined person has to appreciate the Pipes name, which also refers to one particular method of moving data from one process to another on a computer.)

Once you've built a Pipe, you can save it on Yahoo's server and then call it like you would any other feed. Further, you can use the output from a Pipe as input to yet another application, similar to the way you would use a Web service.
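
To make "call it like a feed" concrete, here is a hedged sketch in TypeScript. Pipes could render a pipe's output as JSON as well as RSS; the pipe ID below is a placeholder, and the value.items wrapper reflects my reading of the JSON format:

    // Fetch a published Pipe's output as JSON and list the item titles.
    // _render=json asks for JSON instead of the default RSS rendering.
    async function readPipe(pipeId: string): Promise<void> {
      const url = `http://pipes.yahoo.com/pipes/pipe.run?_id=${pipeId}&_render=json`;
      const resp = await fetch(url);
      const data = await resp.json();
      for (const item of data.value.items) {  // items mirror an RSS item list
        console.log(item.title);
      }
    }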

The trend is clearly lurching toward individual user control of both data and logic. IT professionals may cringe at the concept, but perhaps business is coming to the conclusion that giving line and staff employees better information is superior to IT control of computing resources and applications. We know how this story usually ends – workers end up with bad data or unsupportable applications, IT costs rise, and the pendulum swings back to centralized IT control. Mashups may change that equation, and may put users permanently in charge.

Posted by Peter Varhol on 02/16/2007 | 0 comments


More to Vista Than an OS

I confess that I can't get very excited about the arrival of Windows Vista. I did buy a new computer, partly in preparation for running Vista (2GB of RAM, and wishing I could get more), but will likely wait a while before I call it my OS of choice.

But the developer technologies associated with Vista, now collectively labeled .NET 3.0, promise to be much more interesting and more immediately worthwhile to the developer community. These include the Windows Communications Foundation (WCF), the Windows Presentation Foundation (WPF), and the Windows Workflow Foundation (WWF).

Thanks to Microsoft's retrofitting these components, albeit imperfectly, to run on Windows XP, developers have been able to use them for some time. The last two VSLive! conferences in San Francisco have featured keynote talks focused on WCF and the ease of building Web services with this technology. Two years ago, the demonstration centered on writing a reliable, secure transaction in three lines of code, down from the 57,000 lines said to be required without the libraries.

As a sometime practitioner of BPM strategies, I have a special appreciation of WWF. The state machine behind the visual design palette provides an easy way to build simple workflows. It lacks the power and compatibility of some of the broadly accepted standards in the business, but it represents an easy way for Visual Studio developers to implement business processes.

WPF is the most interesting, yet most long-term, of the .NET 3.0 technologies. Developers love using new tools to build new user interfaces, and really don't need an excuse to do so. WPF interfaces don't appear to run well on XP (although they can be developed on that OS), so any developer who has a captive Vista audience will likely use that opportunity to give applications a new look with WPF.

Because WPF has been available in preview form for quite a while, some developers have already gained some practice in it. I've had the WPF tools and runtime components running under Windows XP for almost a year. I can't say I've done anything particularly constructive, but I have designed a few new-looking user interfaces. Others have done quite a bit more with the preview, and are ready for more when the opportunity presents itself.

There won't be a killer application for Vista. No 'gotta have it' gamebreaker. But there are developers building new user interfaces, new Web services, and new business processes, waiting for the opportunity to use them. When Vista reaches critical mass, there will be some great new applications. Count on it.

Posted by Peter Varhol on 02/04/2007 | 0 comments


Countdown to Vista

My new laptop computer arrived today, and I'm pretty happy with what I purchased: 2GB RAM, 80GB 7200rpm SATA hard disk, 1.83GHz dual-core Intel processor.

Windows XP Professional. No Windows Vista. It's not available until next week. I suppose I could have waited, but I had a significant need that made it inconvenient and unproductive to wait. My only option was to obtain an upgrade coupon with my system. That plus $10 will get me Vista using the Express Upgrade.

Besides, who said Vista's not available? I received it in my MSDN Professional subscription over two weeks ago. Many enterprise customers received it before the end of last year. I doubt that I would have installed it anywhere last year, but I rather like the idea of being able to if I wanted to (and had the time).

Microsoft's distribution strategy reminds me a great deal of our air travel system. I am a frequent flyer on one of the major airlines, with enough status that I occasionally get upgraded to first class. Now, first class is not about comfort. Certainly the seats are wider, and there's a bit more legroom. However, I typically manage to snare exit-row seating, and the legroom there is really just as good.

It's not about service. Certainly the free liquor is welcome, but food is hard to come by even on cross-country flights. It's a nice gesture to take jackets and hang them up, but I rarely wear one when traveling.

No, it's about status. You get to board the airplane first, when the gate attendant explicitly calls for the first class passengers. You get your choice of overhead storage. You get to act occupied while casually watching the coach passengers file in behind you, and occasionally see a child point to an empty first class row and ask his parents, "Can we sit here?"

Microsoft is creating the same kind of status with Vista. The customers it considers most important, the enterprise IT shops, get Vista first. The developers, with some status but not the most important, get it next. Small business (which is the role I use for online purchases) and consumers go last.

Now, I understand that it takes time to get product to the PC makers and have them prep it for mass market delivery. But there is no good reason not to make it available for download, or on DVD, simultaneously to all interested purchasers. That used to be the way Microsoft brought its products to market.

It's scary to think that Microsoft may have learned something about marketing from our broken air travel system. Were I not receiving an MSDN subscription, I would still be waiting. So on behalf of small business owners and consumers alike, I want to ask Microsoft, "Can I sit here?"

Posted by Peter Varhol on 01/22/2007 | 0 comments


Skills, Talent, and Tech Employment

Whether in good economic times or bad, it is not difficult to find something in the trade press or the blogs about getting a new job (for example, http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=277393&intsrc=hm_ts_head). We are all highly valuable, or have obsolete skills, or are about to be outsourced, or need business smarts to go along with our technical training.

They make it seem very formulaic. Do this, don't do that. Get certified. Learn this language, or that platform. Specialize. Generalize. Know the business side. Be flexible.

I call them broad overgeneralizations, if not just plain wrong. A job search is a very personal thing. Even if your skill set is exactly the same as mine, our mental model, outlook and attitude, personality, and professional goals are almost certainly substantially different.

And it's a largely random process. Don't even get me started on automated resume screening systems. Even in the best of circumstances, whether or not I get an interview may depend on who looks at my resume, and how they are feeling that day. And the dynamics of that interview have as much or even more to do with personal chemistry and nonverbal interactions as with my qualifications for the job.

That wasn't the way it was supposed to be, was it? Get a tech education and some experience, and the jobs will find you. Even if you have to do some looking, there are plenty of jobs available. But follow the formula, everyone said.

So I'll contradict the published wisdom. There is no formula. And there is a lot of luck involved, both good and bad. There are things you can do to improve your chances, but there is no sure thing. And even the things that you can do to improve your chances are probably things you don't want to do, or don't think you're good at.

First, you have to start with a premise. In technology, you will almost certainly find yourself unemployed on more than one occasion (my number so far is three times). Your company has a layoff, moves locations, or closes completely. You don't fit well into the culture or the required job activities, or have a personality conflict with a new manager. You decide you want some time off.

These events shouldn't take you by surprise, and you should be prepared for them. Certainly having the money available to tide you over is a big part of the preparation, but even more important is the mental side. You should always have a good idea of what you want your next step to be, and what is a realistic path to that next step. And if that path means developing new skills or meeting new people, you should do that now. And you should update your resume on a regular basis. Not after the axe has fallen.

The new skills you need are probably not technical ones. And they may not even be the business skills that everyone says you should be working on. Rather, for lack of a better word, they are personality skills. How to meet new people and have them remember you, how to present new ideas, and how to feel comfortable interacting in a group and working a room. How to build and maintain the electronic equivalent of a Rolodex, and stay in touch with everyone in it.

Does it feel too much like sales? Do you think your skills and experience should speak for themselves? They may, but probably not. If you want the edge, you need to place yourself in uncomfortable situations, and learn how to perform in them.

Don't know where to put yourself in these situations? They are all around you, but they almost certainly have nothing to do with sitting at your desk and coding away for five days a week. There is a Toastmasters session in your cafeteria after hours, a standards committee meeting at your competitor down the street, and a study group in a conference room at lunchtime. Open source projects are begging for help, in testing and documentation writing, if not coding. You can be doing all of these things with an e-mail or two. Yes, work is demanding (I worked twelve hours last weekend, and Nancy Grace is on CNN as I write this), but just doing your job in tech is the surest way to a lengthy stay in the unemployment line.

Don't feel like it? You may skate through your career anyway. I still know programmers at the large defense contractor downtown with 25 years of seniority. But you're not doing yourself any favors. Remember the choices you made when the hammer drops.

Bad management and outsourcing made your career more difficult, but not impossible. I'll tell you a secret: I don't like doing any of this stuff either. I get lazy, and don't do these things as well as I should. But I do them, because in each of my periods of unemployment I was back at another job within three weeks. Your call.

Posted by Peter Varhol on 01/17/2007 | 0 comments


Ten Things for the New Year, Part 2

In my last post, I noted that I had no special skills in prognostication, and no desire to fall into the trap of making uninformed predictions. Instead, for 2007 I am listing ten things that I would like to see happen in the new year. Whether or not they do is another issue entirely.

Here is the second part of the list. You can find the first part at http://www.ftponline.com/weblogger/forum.aspx?ID=11&DATE=01/04/2007#713.

6. Mobile applications become ubiquitous. And this means mobile connections, too. I almost always travel with my laptop, and feel out of touch when I can't find a wireless connection. In some hotels, the wired connections can still be unreliable and unrealistically expensive. I would kill for a national wireless Internet plan that is similar to my cell phone plan.

7. The user interface becomes the most important part of an application. Many applications still have poor user interfaces. And that goes for our new client operating system, Windows Vista. Joel Spolsky notes (http://www.joelonsoftware.com/items/2006/11/21.html) that Vista provides a large array of confusing and duplicate choices just to turn off the computer. Moishe Lettvin replies (http://www.joelonsoftware.com/items/2006/11/24.html) with his estimate (from personal involvement) that it took 24 people a year to design that system. This only goes to demonstrate that effort alone does not cut it. What does do so is a concise definition of the goal of the interface, and a focus on the minimum complexity needed to accomplish that goal.

But we rarely build software in general like that; why should the UI be any different? Because more than anything else, the UI both defines the product and determines its success.

8. Java and .NET learn to work together. Java has been with us for almost 12 years; .NET for almost six (and older but similar Microsoft technologies have been used for years longer). CORBA was a complex failure (though a decent technical achievement), and Web Services are not being widely used to connect the platforms. Those who are responsible for building applications and integrating them into existing application infrastructures shouldn't have to choose one or the other because of the limitations imposed by past choices. They just have a job to do. Let's see the Microsoft and Java communities band together to make that possible.

9. My cell phone remains only my cell phone for a while longer. Don't get me wrong; I like music, but I prefer my quiet. The iPhone does nothing for me, although I concede Apple's ability to generate buzz among those for whom buzz is a way of life. I wouldn't mind a Web-enabled phone, except that the user interface problems of the small form factor are intractable (see #7 above).

Further, I want my cell phone off on the plane. I once had to tolerate a woman sitting next to me on her cell phone in the process of firing the person at the other end, right up until the time the cabin door was closed. Upon landing three hours later, she turned on the phone once again and continued the process. Spare me this in flight.

10. The software industry continues to surprise and amaze me. I have been a part of the software community for two decades, and cannot imagine doing anything else with my life. In New England, we have a saying that if you don't like the weather, wait 20 minutes and it will change. The software development community is exactly like that. For anyone looking for excitement and intellectual challenge, I cannot recommend a better place to spend a career.

Posted by Peter Varhol on 01/15/2007 | 0 comments


Ten Things for the New Year, Part 1

Successful prognosticating is a skill that I don't possess. People increasingly fail to heed my sage but largely incorrect advice on technology, business, and personal affairs, with good reason. There is a mysterious art involved in determining which trends will reach an inflection point worthy of note in the future, and significant skill in determining what that inflection point is and what it means.

That's okay. Most of the people who make predictions are not, in fact, qualified to do so. But that's why I don't call mine predictions. Instead, they are things that should happen, and to a lesser extent, things that I would like to see happen, in the coming year. That way, if they don't happen, I don't look that much like a fool.

Here are the first five.

1. A killer application for Windows Vista is shipped. I have no idea what the killer application may be, but Vista, and the Windows franchise in general, needs a signature event. If I had to choose among graphics, Web services, security, or control for what that killer application will leverage, graphics seems the obvious choice. It is also likely to be a business application, because that is what typically drives big applications on Windows.

2. Open source becomes a universally accepted model for software development. I confess to being skeptical of open source for a long time, not as a technical achievement, but as a viable business model. Last year started to change my thinking, in part because I'm beginning to see how enterprises can leverage open source, and how commercial vendors can make money from it. I've seen the emergence of some good concepts for integrating open source code into projects, and also making open source a cornerstone of a commercial business strategy.

3. Eclipse supports .NET development. There is no reason why this can't happen, and someone will make it happen sooner or later. Why is it important, especially when it can never top Visual Studio in productivity and ease of use? Two reasons. First, developers will have access to a much greater array of tools that are part of the Eclipse ecosystem. Many of them are Java-specific, but an increasing number are platform-neutral or even .NET-focused. Second, developers are increasingly working cross-platform, and the Eclipse architecture has an advantage over Visual Studio for this type of development. This would have to be done by someone who licenses the .NET Framework and SDK from Microsoft, and can build a vision of a cross-platform application lifecycle based on Eclipse.

4. Second Life, or a similar avatar-based universe, goes mainstream. This deserves a longer treatment than is possible here, but the ability to create new worlds offers something for everyone – users, developers, and business people. For developers, virtual universes offer an unlimited opportunity to build tools for creating the universe and tools and applications to use within the universe. Think of it as a completely blank slate for building any application you can dream up.

5. OS and application security holes become a thing of the past. No piece of software is perfect, and those who seek out and exploit flaws are highly talented and, for some reason, dedicated to the task. But developers, testers, and system managers have to spend an inordinate and increasing amount of time trying to make applications, systems, and networks bulletproof when one hundred percent protection simply cannot be achieved. And increasingly secure systems are also increasingly difficult to use and administer. Too much attention on security is detracting from building quality applications with features users need, and I'd like to see that trend reversed in 2007.

Look for the last five things on my 2007 wish list in a few days.

Posted by Peter Varhol on 01/04/2007 | 0 comments


How Do Software Companies Make Money?

Seems like an odd question, doesn't it? Especially when your strategic vendors seem to be draining you dry on support contracts. That may in fact be the case, but we've seen a dramatic shift in the business models of development tools vendors over the last five years.

Many of the companies that I deal with on a regular basis have launched open source strategies over the last year or so. For the most part, they have offered similar software under both an open source and a commercial license. What's the difference? Well, the open source software is largely provided as is (yes, I know that commercial software also uses that as a licensing term), with no support, patches, or upgrades. The commercial license often includes professionally prepared documentation, comprehensive support, and patches and upgrades. In some cases, it also includes indemnification from patent suits.

Even if it is not strictly an open source strategy, it is often a free (as in no cost, not freedom) software strategy. Cross-compilation vendor Mainsoft gives away a version that does virtually everything that its commercial product does.

One company we talked to quite a bit about this was Terracotta, a provider of Java clustering solutions. I heard an interesting statement from Terracotta CEO Amit Pandey recently: it's not possible to make money off developers. Instead, the company seeks to let developers easily acquire and use its technology, and charges only for deployment licenses.

I'm of two minds about this trend. One of the biggest problems that both development tools vendors and the developers themselves have is with shelfware. Shelfware is, of course, software that is purchased but for one reason or another is never used. Developers either have it forced on them by management, or never quite figure out how to integrate it into their normal processes.

The idea is to make developer tools more accessible to try out and use, with no strings attached. This effectively solves the problem of shelfware. If developers don't use the software, the vendor makes no money. If developers use the software, the vendor collects its money through runtime licenses. It is a fair deal for developers, and a validation of the value of the software for the vendor.

One other item of note is that cost of sales goes down substantially; in some cases, it is close to zero. In traditional commercial software companies, the cost of sales can make up as much as 50 percent of the cost of software. A lower cost of sales means that companies can still make money by selling their software less expensively, or by selling fewer copies.

However, giving away tools to developers only works for tools that have a runtime component in production. When I worked at development tools vendor Compuware Corporation, one of the more nagging headaches with our debugging tools was that developers would get evaluation copies, or even bring in a sales engineer for a proof of concept, find and fix all of the problems in their software, and send us on our way.

Now, I understand that there are several messages in that story. Clearly developer tools vendors provide value, but that value is realized at a specific time in the application development lifecycle, rather than throughout the entire process. It is true that developers and their employers seem to have a problem with paying for that value, but that may be at least in part because the vendors haven't yet figured out how to successfully monetize that value.

Suffice it to say that this arrangement may be innovative and the wave of the future, but it is also broken in some cases. It behooves all of us to find a way to fix it, else we may find ourselves missing some of the development tools we now take for granted. And for free.

Posted by Peter Varhol on 12/19/2006 | 0 comments


The Good and the Bad, Part 2

A little while ago (http://www.ftponline.com/weblogger/forum.aspx?ID=11&DATE=11/29/2006#704), I posted a list of five good events that transpired in technology over the year. I'd like to follow that up now with five bad events. As I mentioned, bad events reflect poorly on the industry, and have negative outcomes for some or all developers.

1. Hewlett Packard. While dysfunctional management has not sidetracked its successful pursuit of sales and profits, that same dysfunctional management has achieved a notable if dubious result – the loss of forty-plus years of goodwill in one fell swoop. This is aided and abetted by a failure of the most senior management to take responsibility for the actions of those they directed. If you were ever looking for an indication that there is no such thing as an ethical employer, you have found it at HP this year.

2. Windows Vista. Yes, it was on the good list, too, because of the expected boost to the industry as a whole. But too many people are asking why we need it. A part of that stems from its late arrival in the market; too many of its features look like reactions rather than innovations. But the worst thing about Vista is that it has no overarching theme. Sure, it is presumably more secure, and presumably has a better (or at least different) user interface, but unlike past releases there is little reason for Microsoft to say, "We built this OS because..." That is not to say that it ultimately will not be successful; Microsoft has sales and marketing tools that virtually assure that. But the reality is not nearly as exciting as the hype.

3. Software patents. Why do I feel like the nuclear doomsday clock (http://en.wikipedia.org/wiki/Doomsday_clock) is ticking toward midnight here? Software companies are much like countries, and their patents are like bombs. Companies like Microsoft and IBM, with tens of thousands of patents, have in effect nuclear weapons. Much of the rest of the industry has in the back of their minds the thought that one day those weapons will be used against them.

Unlikely? Not if you go by the threats that occasionally get made, most recently by Microsoft. And while I cannot speak of the particulars concerning any one of these nuclear powers, I do know that most have legal teams that are profit centers. These legal teams are charged with identifying other companies' technologies that potentially impinge on their patents. They quietly approach these companies with the threat of a patent lawsuit, and typically make millions of dollars in what they call license fees. I call it a protection racket.

4. Security and identity theft. I received a notification of potential identity theft this year, which brought home the magnitude of this problem. You as an individual can do everything right, and still have someone holding 30-year-old records leave an unencrypted computer on an airplane. This has implications for us as individuals, and as developers. The individual aspect should be readily apparent. From the standpoint of the code we write, we have an ethical responsibility to make sure we don't open up someone else's identity to theft. In the future, that ethical responsibility may become a legal one.

5. The decline of development. Let me explain. This one was brought home to me when a vendor representative recently briefed me on the company's new open source strategy. "We shouldn't be making money off developers," he said, with the coda that his company would recoup its investment off of deployed applications. While that might be a noble sentiment, I began to wonder when we as a community decided that we would not invest in our developers. Apparently, while I wasn't looking, we decided that developers could make do with free tools. Granted, many free tools are fine, but we should be planning for development in much the same way that we plan for deployment. If we have a plan, supported by money for training, quality, and productivity, we might be surprised at the output we get.

Posted by Peter Varhol on 12/08/2006 | 0 comments


It’s All About the Data

Circa 1993, I did some work for a company called TerraLogic (don't bother looking for it; it no longer exists). TerraLogic was among the first companies involved in building mapping software, at that time a C-language API to manipulate maps provided as a part of the development kit.

At that time, the CEO of the company explained to me that the decade of the 1980s was the decade of hardware in the PC community. The true quest was to build boxes that ran faster, all within the limitations inherent in the PC backplane and memory bus.

The decade of the 1990s was the decade of software, when companies struggled to code operating systems and applications that could take advantage of what the hardware had become.

The first decade of the 21st century, he concluded, would be the decade of data. Data would become the most important thing we could generate, and companies would compete not on the basis of hardware or software, but on the quantity and quality of their data.

Well, it turns out that he was half right. In the world of the mashup, data is king. But rather than competing on the basis of data, we seem to be sharing the data. What matters is how the application uses the data rather than the data itself.

Scott Dietzen, CTO of Zimbra and keynote speaker at FTP's WebBuilder 2.0 conference, gave an expansive view of mashups, demonstrating how data from within the same application can be used in innovative ways. For example, the Zimbra e-mail client can preview documents and images sent as attachments to e-mail messages when the user hovers over them, so that it's not necessary to open them to get an idea of what they contain.

The data is already available within the e-mail client; typically, though, we have to save the attachment and open it with another application to find out what it contains. The combination of receiving and viewing within the e-mail client represents a mashup of data that was previously unavailable in a single application.

Posted by Peter Varhol on 12/04/2006 | 0 comments

