Lawyers and Other Strangers

I am not a lawyer by any stretch of the imagination, but this article ("Is AMD an American Company?" http://money.cnn.com/blogs/legalpad/index.html) on the AMD antitrust lawsuit against Intel caught my attention. The relevant antitrust law apparently does not give US courts jurisdiction over antitrust violations committed in other parts of the world, so that local jurisdictions can apply their own laws.

Apparently Intel has successfully argued that because AMD does not manufacture chips in the United States (they are manufactured by an AMD subsidiary in Germany), and the majority of those chips are sold elsewhere, US antitrust law does not apply to those activities. Instead, US law can be applied only to those chips that are imported for sale in the US, perhaps 30 percent of the total.

I am not so surprised at the ruling (which may end up being reversed on appeal, of course) as I am by the growing body of evidence that our traditional jurisdictional boundaries are breaking down. We have seemingly survived rulings that eBay cannot make Nazi items available for sale in Germany, and that Yahoo cannot present Nazi newsgroups in France. But this type of limitation based on physical jurisdiction is a one-off response to specific legal actions. It does not reconcile the fundamental differences between online and physical boundaries.

It was perhaps twenty-five years ago that, as a teen, I began hearing that the world was getting smaller, and that it was essential to have skills and experiences that spanned cultures. In fact, the world did not really get smaller until it became just as easy and inexpensive to communicate across oceans as across the street. Now, as the world truly gets smaller, we seem even less prepared to deal with the consequences.

We are no doubt going to be increasingly faced with jurisdictional issues where electronic communications or global activity come into conflict with existing laws or traditional practices. Legal experts respond that well-written laws can still be fairly applied even under changing circumstances, or that the law will eventually catch up to the reality, but I have my doubts. All too often, legal demands on the online world seem to conflict with the laws of nature.

Of course, the world is moving faster, while the legal framework seems to plod along at an old-fashioned pace. I fear that unless laws change more quickly to reflect the world as it is, the conflict will be much broader in scope than the little skirmishes we have today, like the Intel-AMD lovefest.

Posted by Peter Varhol on 10/13/2006 | 0 comments


In Memoriam

Ray Noorda, former CEO of Novell and networking visionary, passed away yesterday (http://money.cnn.com/2006/10/09/news/newsmakers/noorda.reut/index.htm?postversion=2006100919, and others). We hardly think of Novell as a networking company today, what with its merger with Cambridge Technology Partners and its acquisition of SuSE Linux over the past several years (Novell also owned AT&T System V Unix for a brief period). But there was a period in the late 1980s and early 1990s when Novell NetWare was synonymous with PC networking.

What happened? As is the case with many companies with innovative ideas, Microsoft eventually set its sights on the networking market. Because Microsoft owned the operating system, it was able to better integrate networking into the platform and eventually offer a less expensive and easier solution.

But Novell made its own mistakes. NetWare enabled PC networking, but did not make it easy. It required a dedicated PC server and significant skills (and perhaps also a measure of luck) to get NetWare installed and operating properly. And Novell did not expend a lot of effort in making it easier. Quite the contrary, in fact, because keeping it difficult enabled the company to develop an army of technical specialists who also evangelized the product.

Certified NetWare Engineer (CNE) training was the surest ticket to job security and financial success in the PC industry for several years. These skills were almost mandatory for installing and maintaining the software. While Novell was training over 50,000 CNEs, Microsoft was busy making networking simple enough in Windows 95 that little or no training was required. Ultimately, Microsoft was on the right side of history.

Novell also used its own proprietary networking protocol (IPX/SPX), rather than investing in TCP/IP, the protocol of the Internet that eventually became the industry standard. By the time the company offered TCP/IP in addition to its own protocol, NetWare's peak had already passed.

When Noorda retired from Novell, he founded Caldera Linux (for a while my preferred Linux distribution), which through a strange sequence of events eventually morphed into SCO (without involvement from Noorda), the company that has spent the past several years asserting legal claims against Linux (Novell, among others, is disputing those claims).

In retirement Noorda also founded the Canopy Group, a venture capital firm that invested in start-up companies based mostly in his home state of Utah.

Posted by Peter Varhol on 10/10/2006 | 0 comments


The Evolution of Information

When I was a youngster in the 1960s, my parents bought into the then-prevailing notion that it was essential to have an encyclopedia in the house (literally, and at a significant sacrifice on their part). We ended up with a total of three sets, one of which was pretty comprehensive. By the time I was fifteen, I had read through all of the volumes, in some instances more than once. I have a good memory, and still recall some of the things I read in that era.

My next information revelation came in the early 1990s, with the introduction of encyclopedic information on CD media. I recall being particularly entranced by Microsoft Encarta, whose hyperlinks gave me the unique opportunity to view and parse information randomly, rather than sequentially. A new discovery awaited at the other side of every click.

I'm beginning to feel a similar sense of discovery today, with the Wikipedia as the source. Increasingly I find myself turning to that site for bits of information on a variety of topics, or even for pure enjoyment. I spent most of last Saturday jumping from one link to another in tracing a particular series of historical events across the Pacific Ocean, reading and learning as I went.

My wife, back in college retraining for another career field, has her doubts about Wikipedia. These doubts spring from the apparent lack of authentication, mostly in the form of academic citations. But you will find some of those in Wikipedia entries, and comparisons of content between Wikipedia and more established (and costly) encyclopedias have found similar rates of errors and omissions.

Wikipedia has certainly had growing pains. There have been inappropriate entries, both favorable and unfavorable, about people and events. The editorial process is a community effort, rather than a rigorously designed process, which makes many people uncomfortable about depending on it as an authoritative source.

But the entries I have viewed are dispassionately written, neutral in tone and position, and seemingly accurate. It still remains to be proven that a community-based editorial process can satisfy the demanding needs of an authoritative information source, but the trend is a positive one.

I have never owned an encyclopedia, making do with the various academic and public libraries at which I have had privileges over the years. And now I'm gradually getting to the point where I will likely be able to depend entirely on the Wikipedia and similar sources. I eagerly anticipate the next phase of the information revolution.

Posted by Peter Varhol on 10/07/2006 | 0 comments


Air Travel Becomes Easier Again – Not

In my role, I do a modest amount of travel, mostly long distances away from my enclave in the wilds of New Hampshire. And I actually like flying; I earned a private pilot's license early on, and only my severely deficient eyesight kept me out of Air Force pilot training twenty-five years ago. My favorite flying experience was as a passenger in an Air Force jet preparing to make a wheels-up landing because the pilots couldn't get a green light on the gear lock indicator (ultimately, after dumping fuel, they were able to confirm a gear lock, and the landing proceeded normally, to my dismay).

These days, flying is less fun. Much of that is understandable. To make matters worse, there are persistent rumors that laptops may at some point be banned from carry-on baggage, a restriction that would kill all business travel for me and for many others. Such a restriction was actually in place for a couple of weeks in the UK after the incident this summer.

But the major problem is that the security restrictions that we all put up with defy rational analysis. While it is impossible to prove a negative, it is not at all clear that existing security measures can consistently and accurately deal with a threat.

A recent example illustrates the capricious nature of our security restrictions (http://www.cnn.com/2006/US/09/28/idiot.baggie/index.html). In this case, a man placed his liquids in a clear plastic bag, as required, but wrote "Kip Hawley is an Idiot" on the bag (Hawley heads the Transportation Security Administration). The passenger was detained, and the police were called (to no one's surprise, the police determined that no crime had been committed, and the man was ultimately released).

Aviation security must be able to distinguish between mildly obnoxious behavior and threatening behavior. No doubt the screeners were annoyed, but their response in this instance was arbitrary and capricious. In a reliable security system where lives are at stake, this should not happen. I have always been willing to give the system and its participants the benefit of the doubt. Until now.

Posted by Peter Varhol on 10/01/2006 | 0 comments


Some Data from Eclipse

I was at the Eclipse members meeting in Dallas this week, along with about forty other representatives from the Eclipse community. While the meetings at EclipseCon get better attendance, the members meetings tend to be less formal and offer better opportunities to learn what others are up to.

In this case, we received briefings from two new strategic members, one of which was Motorola. There was also a briefing from new add-in provider ARM (Advanced RISC Machines, although I don't think they go by that full name any more), a processor IP developer. ARM processors, actually implemented and manufactured by others, power many cell phones and PDAs available today.

It certainly says something about the popularity and utility of Eclipse that an embedded processor designer finds value in participating. The same can be said of Motorola. Both companies have found value in hosting development and configuration tools inside Eclipse.

We also got presentations from the Rich AJAX Platform (RAP) and the AJAX Tools Framework (ATF) projects. These projects promise to make AJAX development both easy and productive using Eclipse.

The Rich Client Platform (RCP) remains a significant initiative, and Eclipse claims that more and more users are making use of this framework to build standalone applications. It is difficult to confirm this claim as a trend, but it does make some sense. For enterprise developers who want the responsiveness of a rich client but do not want the hassles of platform dependencies, the RCP seems to be a natural solution.

A marketing survey done for Eclipse by Evans Data Corporation (and co-sponsored by FTP) showed that Eclipse use was reported by about 65 percent of the respondents. The primary barrier to Eclipse use is training. Not training in the tool, per se, but training in how to write plug-ins, how to configure a specific environment, and how to support it. This sounds like a business opportunity to me.

Eclipse director of marketing Ian Skerritt said that 30 new projects had started over the past year. This speaks to both the opportunities and the challenges for the Eclipse Foundation. For those seeking to participate in these projects or to use their output, it can be difficult even to discover that they exist in the first place.

The Eclipse Foundation is in the process of redesigning its site to address some of these issues. FTPOnline will also do its part, by regularly publishing descriptions and status reports of the various projects. We are working with the leadership of these projects to get the latest information on where they stand and how you might make use of this code.

Posted by Peter Varhol on 09/26/2006 | 0 comments


Turbo Lives!

For those seeking a return to the days when it was possible to be a serious hobbyist programmer with dreams of writing a breakout commercial product, the Turbo products announced by Borland back in August are now available for download. These include Turbo C++, Turbo Delphi, Turbo Delphi for .NET, and Turbo C#, and come in two packages: an Explorer version that can be downloaded for free or purchased on CD for a nominal charge, and a Professional version priced at less than $500.

Make no mistake; these products are fully modern IDEs, with UML modeling, drag-and-drop user interface design, and full data access objects. Developers can use the integrated Borland Database Engine (based on dBASE and Paradox), or third-party databases such as MySQL. It is possible to build serious applications using these products. The difference between the Explorer and Professional editions is that the Professional editions provide the ability to write your own user controls, and to install and use third-party user controls.

Go to www.turboexplorer.com for the free downloads. And happy coding!

Posted by Peter Varhol on 09/25/2006 | 0 comments


Simulation May Be Key for Multicore Debugging

I hate race conditions. Developers of multithreaded applications whose bugs depend on timing, both within and between threads, curse them. Race conditions go against every instinct a developer has about computers and software: they are, or at least appear to be, nondeterministic. Whether the bug appears depends on when a particular thread finishes a particular task, and that can vary with the execution path taken, or the amount of data read and written. Or with the time of day, or the phase of the moon.

Just kidding about the last part, but to many developers it can seem like a reasonable statement.

Most of us have encountered bugs in clean builds that mysteriously disappear when debug information is added. This is often related to a race condition, in that debugging information tends to slow down execution, changing timings and making the problem vanish. Of course, the problem is still there, but it cannot be reproduced while running under the debugger.
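
To make the problem concrete, here is a minimal sketch of the kind of timing-dependent bug I have in mind. It is my own illustration (the class and names are hypothetical), not an example from Virtutech: two threads increment a shared counter without synchronization, so some updates are lost, and how many are lost varies from run to run. Add enough logging or debug overhead and the interleaving changes, and the bug can seem to disappear.

```java
// RaceDemo.java -- a hypothetical example of a timing-dependent bug.
// Two threads increment a shared counter without synchronization, so
// increments are occasionally lost and the final total varies per run.
public class RaceDemo {
    private static int counter = 0;   // shared, unsynchronized state

    public static void main(String[] args) throws InterruptedException {
        Runnable work = new Runnable() {
            public void run() {
                for (int i = 0; i < 1000000; i++) {
                    counter++;        // read-modify-write: not atomic
                }
            }
        };

        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        // Expected 2000000; an unsynchronized run typically prints less.
        System.out.println("counter = " + counter);
    }
}
```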

These were the thoughts that perked me up as I participated in a briefing from Paul McLellan of Virtutech (www.virtutech.com). Virtutech makes software, primarily for embedded systems development, that enables developers to simulate the underlying hardware and operating systems. This is especially important for embedded development, such as cell phone software, where the hardware may not even be ready when application development is in full swing.

But there are lessons here for PC developers, who are increasingly facing problems that can be addressed by debugging on simulated systems. The lessons come from, of all things, debugging software. "You can run the simulator backwards," McLellan explained. "Just get to the point where the bug occurs, then step backwards."

That is an extremely powerful debugging technique, because you do not have to guess at where to start stepping forward, as you do with conventional debugging techniques. But it does not end there. "If you have a multicore system," McLellan continued, "you can run one CPU at ten times the clock rate of the other. You can change the timing until the bug disappears."

Of course, this by itself does not find the bug for you, but it does give you one more tool in your arsenal, and one that simply isn't available when you are debugging on real hardware. We would find good uses for such a tool in practical debugging situations.

I commented a few weeks ago (http://www.ftponline.com/weblogger/forum.aspx?ID=11&DATE=8/20/2006) that programming multiprocessor systems represents one of the fundamental problems of computer science for the foreseeable future, perhaps requiring new languages or new techniques that are not yet in common practice. (NB – The distinction I am making between multiprocessor and multicore systems is that in a multiprocessor system, the system has multiple complete processors on separate dies, whereas in multicore systems the processor has multiple CPUs on the same die).

One of those techniques may well be running multiprocessor and multicore application software on a simulated processor and operating system platform. Consider how this might work. You write a threaded application in which different threads are designed to run on different CPU cores. In the course of debugging, you come across an application crash that seems to occur almost randomly, even in different parts of the code.

On a simulated platform, you can do a debug build and engage the debugger. Then let the application run until the crash, and back up the execution of the simulator, watching thread activity and variable values on different cores as you normally would in the debugger, except in reverse. Once you've established that a race condition exists, you can speed up one of the processors until it disappears, confirming the condition and getting a better idea of the timing involved.

This will not make the code easier to write, of course. The developer is still responsible for locking resources and data against access and change by multiple threads, and for blocking thread execution when interleaving can be harmful. But I cannot help but think that simulated platforms will play some role in multiprocessor and multicore application development in the future.
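
As a sketch of what that developer responsibility looks like in practice, here is the same hypothetical counter from above with its shared state guarded by a lock; again, this is my own illustration rather than anything specific to the tools discussed here.

```java
// SafeCounterDemo.java -- the same two-thread increment loop, with the shared
// counter guarded by a lock so that concurrent updates cannot be lost.
public class SafeCounterDemo {
    private static final Object lock = new Object();
    private static int counter = 0;

    public static void main(String[] args) throws InterruptedException {
        Runnable work = new Runnable() {
            public void run() {
                for (int i = 0; i < 1000000; i++) {
                    synchronized (lock) {   // critical section: one thread at a time
                        counter++;
                    }
                }
            }
        };

        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        System.out.println("counter = " + counter);  // reliably 2000000
    }
}
```

The cost, of course, is that the two threads now serialize on the lock, which is exactly the kind of trade-off that makes multicore development hard.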

Posted by Peter Varhol on 09/18/2006 | 0 comments


Getting Out of Software Development

I ran into two former colleagues earlier this week at SD Expo Best Practices (www.sdexpo.com) in Boston. These gentlemen (don't let them know I called them that) now hold high-level software engineering management jobs at a major systems and software vendor (we now know which of us succeeded professionally and which of us didn't). The last time I was in touch with them, approximately two years ago, they were the vice president and software development director of an enterprise messaging product line for this vendor. Today, they make up the entirety of a two-person department whose job it is to standardize software engineering practices across the vendor's software groups.

What happened?

It was their choice, they informed me. Their enterprise messaging and message management product was failing for technical reasons. "Our product worked fine in the lab," one explained. "Then it started failing at customer sites. We went to investigate, and found that people were forwarding messages to 10,000-person distribution lists, with twenty-two attachments." They told me several other tales of e-mail use that were just as absurd.

How do you archive something like that, and make it searchable? A good part of the problem was that the APIs they were using (especially MAPI on the Microsoft side) didn't deal with these edge cases well. But there were also issues in parsing Microsoft and Unix formats to come up with a common management approach.

All of these are probably solvable, given enough time and code. But they are not particularly enjoyable problems to solve. And they are the kind of things that we would normally consider bug fixes, not features. Worse, they take time out of the schedule, which was probably overly aggressive to begin with.

So these gentlemen (there I go again), both well-paid and highly qualified professionals in their mid-40s, chose to work on improving development processes rather than delivering production software. In any other field, these types of people would be reaching up to the next rung of responsibility. What is it about software?

When I relayed the story to a colleague, he remarked, "People do crazy things with e-mail." That is undoubtedly true. But if we provide the feature, it should work consistently under all circumstances.

It is a problem of our own making. Unlike just about every other engineering profession, we have failed to standardize on shared infrastructure. Instead, those things that should be implemented, repaired, and enhanced once across all platforms are implemented separately. They work differently, depending on your language, operating system, and perhaps even your intended application.

If it is our problem, it is up to us to fix it. The infrastructure that multiple vendors depend upon must undergo some standardization. If a piece of infrastructure has a bug, or is not robust enough, it need be fixed only once to service all of its users.

If we don't do so, other talented software development leaders will abdicate the responsibility of delivering quality software.

Posted by Peter Varhol on 09/15/2006 | 1 comment


The Status Quo Continues

Hewlett Packard announced that board chairperson Patricia Dunn will relinquish that role in January, but will remain on the board of directors (http://money.cnn.com/2006/09/12/news/companies/hp_dunn/index.htm?postversion=2006091217).

This is wrong, wrong, wrong. A crime has been committed, the California attorney general has concluded, and people both inside and outside of HP appear vulnerable to indictment. That crime was committed under the direction of Patricia Dunn, even though she was undoubtedly not the person who committed the actual act. The same goes for the rest of the board. That distinction seems to be the out that the HP board is counting on.

CEO Mark Hurd will take over, vowing that the probe's methods "have no place in HP."

Wrongo, Colleen. You can't prove that one by the phone records of the journalists that HP illegally obtained. There is a lot of blame here, and the HP board seems to be determined to take none of it. Once again we find corporate executives who see themselves as above the law. We just never thought we would see them at HP. This is the sort of behavior that would get anyone summarily fired, even if they were able to avoid prison time. Unless you are on the board of HP, apparently.

Hurd's message might as well be that these methods clearly do have a place in HP, unless you are caught, in which case we blame the contractors and exonerate ourselves.

Perhaps justice will be done here. But the board has spoken, and I don't think so.

Patricia Dunn, you have no integrity.

HP, you still have a serious problem. And you don't seem to care.

Posted by Peter Varhol on 09/13/2006 | 0 comments


Accessibility More Than a Software Issue

Earlier this week Computerworld reported that a class action lawsuit was filed against Target concerning the lack of accessibility of its Web site to blind users (http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9003129). Target sought to have the suit thrown out, arguing that today's accessibility laws relate to physical, not virtual, accessibility. The judge declined to throw out the case, so it is moving forward.

That is not to say that Target will lose, at least this time. But this is clearly a point in time where online accessibility is starting to attain a standing similar to physical accessibility. At some point, a court will rule that the two are close enough that existing accessibility laws apply to Web sites.

Deciding which Web sites have to provide handicapped accessibility may be a more difficult issue, but I suspect that any entity with a physical presence requiring accessibility will also have to make its Web site accessible. For entities such as Amazon.com, with no physical presence, the decision will be made based on their similarities to other businesses. There is unlikely to be any dispensation for companies with only an online storefront.

When we talk about accessibility to physical structures, we typically speak of architecture and construction for ramps, elevators, and wide corridors. In the virtual world, physical infirmity is less of an issue, but blindness and deafness are. Today, this means more text and fewer bells and whistles on sites. In time, it may mean that we have entirely separate pages for handicapped use, in a manner similar to what sites do for mobile content today.

But the time is coming when the Web will have to be accessible, even more accessible than a physical location. If you haven't yet done so, it is time to get started.

Posted by Peter Varhol on 09/12/2006 | 0 comments


A Look Back Five Years

Many of us are likely to remember vividly what we were doing on this day five years ago. I was a product manager at Compuware, in the NuMega Lab in Nashua, New Hampshire. Just before a 9:00 AM meeting, I happened to glance at the CNN.com home page, where there was a photo of smoke coming from one of the World Trade Center towers, along with a headline that said simply that a plane had struck. The damage didn't seem bad, so it looked like a simple accident, although in retrospect the cloudless sky over Manhattan should have tipped me off.

I went off to my meeting, and it was only afterwards that we realized the magnitude of the disaster. And there was still more. Perhaps an hour later, we came to realize that two of our colleagues, Graham and Myra, were on the two planes that had departed Logan for Los Angeles that morning, making an unscheduled and permanent stop in south Manhattan. Bob, who was Myra's boss and had the office next to Graham's, didn't move from his desk all day, and just stared into his computer screen in shock.

I went out to a late lunch with a friend, and we each had a drink. Well, two drinks. There wasn't much to be said, so it was a quiet lunch. Both of us knew that the world had changed that day. That Compuware lab still holds a remembrance ceremony.

I like to think of myself as a student of history. In 1992, Francis Fukuyama published a book entitled The End of History and the Last Man, an expansion of an essay he wrote in the international affairs journal The National Interest. In the book, Fukuyama argues the controversial thesis that the end of the Cold War signaled the end of the progression of human history.

History, it seems, goes on, although perhaps not in the sense that Fukuyama meant. He was referring to the inevitability of history, the progression from one stage to another, that Karl Marx had postulated as the social development of the human condition.

But stuff still happens. Some of it is important stuff, and will be remembered as history.

I suppose this has more to do with life than with IT, but there are lessons for both. We assume that we are in a safe career, yet we are a part of the world around us and face the same dangers and uncertainties. Sometimes you are simply in the wrong place, and get caught up in something that is completely out of your control.

Most of us also associate this tragedy with the dotcom bust and the loss of the seemingly endless progression of exciting IT jobs. The two are roughly coincident in time, but there is little or no causation. The dotcom era (or perhaps more accurately, the first dotcom era) had played itself out almost a year earlier, and it took that long for its absence to be felt. Certainly the events of this day accelerated the process, but there is no denying the process itself.

The constant may be the cycles of IT activity and opportunity. It took some time, but the industry came back. While I am by nature an optimist, I have no doubt that the cycle will repeat.

Posted by Peter Varhol on 09/11/2006 | 0 comments


Will HP Do the Right Thing?

The stench continues to emanate from the boardroom of Hewlett Packard. At the same time, expressions of shock and denials of responsibility are flung from that redoubt onto the company and the public in general.

I have rarely seen such a poor and hapless example of leadership inside a major American corporation. I want to ask the question "What were they thinking?" except that I am afraid that they knew exactly what they were thinking, and doing. Here is why that is bad.

Secrecy is a deeply flawed concept. Any information exchanged between two people should be considered public knowledge. That is not a reflection on individuals' ability to keep secrets, but rather a recognition of the reality that information cannot be held under wraps. Any belief to the contrary is pathological. And if the cat is out of the bag, it does no good to expend corporate resources to assign blame. Doing so is simply stupid and vindictive, and a waste of those resources.

"Hear no evil, see no evil" is not a management strategy. It is an excuse-making strategy. It is an especially bad strategy for a corporate board of directors in the Sarbanes-Oxley era. Bad in a legal sense, certainly, but also bad in an ethical sense. If the highest level of a company cannot take responsibility for illegal behavior performed in the course of executing its strategy, then those individuals consider themselves a law unto themselves.

I have read that there are calls for the resignation of the nonexecutive chairman, Patricia Dunn. I disagree; that does not go far enough. CEO Mark Hurd, who is a newcomer and probably blameless in the fiasco, should demand, and receive, the resignation of every other board member. There is a culture in the governance of HP that has to be killed off right now.

The stockholders of HP deserve better. The tens of thousands of employees of HP deserve much, much better. Should this board of directors be allowed to continue, the culture and reputation of this great technology company will be irreparably damaged.

The problem is that a corporate board of directors has the ability to entrench itself. It can simply ignore the consequences of its actions, and continue as though nothing has happened.

We all need to stand for something in our lives. The board of one of the most important companies in our industry stands for secrecy and deceit. How they can live with that is beyond my comprehension.

We all deserve better.

Posted by Peter Varhol on 09/09/2006 | 0 comments

