In my last blog I discussed how Jason Short bought VistaDB in 2007 -- a product we reviewed in June and liked very much. Last month, I interviewed Jason via e-mail, and my first question seemed obvious to me.
Peter Vogel: What's it like being a product vendor?
Jason Short: It is honestly rough. I think that developers are a very hard crowd to reach and please. I am amazed at how many developers will spend 1,000 hours on customizing something they got for free rather than pay for a tool that works. The "not invented here" syndrome still seems to be alive and well.
I think the 100 percent managed code mark has been completely missed by a lot of companies because Microsoft doesn't do it themselves. SQL CE uses unmanaged code under the hood, but most users don't have to care because Microsoft can run their code in ways we can't (policy files are nice when you own the framework).
Sticking to only C# code for the entire engine has put us at a disadvantage in some ways, but we try to make up for it by playing to the strengths of managed code. You can't have a single assembly of mixed code that will run on both 32- and 64-bit machines; we can do it because we are 100 percent managed.
PV: How do you compete in this market?
JS: We have tried very hard to differentiate ourselves by being the only 100 percent managed code database you can embed into your application. Your users don't have to know we exist, and because it is 100 percent managed code we can run in a lot of scenarios where SQL CE and SQLite cannot.
Our goal was to build a database that desktop developers could use to build applications and then later easily scale up to SQL Server when they need it. We are the only third-party database to allow T-SQL and CLR procedures that will run in SQL Server. You can't bring everything from SQL Server down to our level, but we are a subset that works completely in SQL Server. We have a lot of small businesses that use VistaDB for their desktop tool and then also offer a business version that works with SQL Server using one codebase. It's great for them to be able to reach more than one market with a single set of code.
PV: What matters to you about the VistaDB product?
JS: There are two key things that have always driven all of our design for VistaDB.
Deployment: Having fully managed code means we have the easiest deployment possible (xcopy, with no COM, no registry access, etc.). Even at the sacrifice of speed, we stay 100 percent managed code. In the long run I honestly feel it will lead to a better, more portable engine that customers can deploy. One customer told us how they saved over $40,000 a year in support costs because VistaDB was so much easier to deploy than SQL Server Express.
Compatibility: We try very hard to ensure our subset of T-SQL is compatible with SQL Server. We want people to be able to scale up at some point without feeling the pain of a total rewrite. I have worked on many projects over the years where, when it came time to upgrade, the entire app had to be rewritten because of a change in the backend database. Sometimes customers need a server; we want to make that transition as easy as possible for them when the time comes to step up.
...
Two months after this interview, Jason is putting his company up for sale. In my next blog I'll look at why.
Posted by Peter Vogel on 07/21/2010 | 1 comment
In my last blog post I talked about how many developers build a product and then try to market it. Isn't the news full of articles about people making it big writing iPhone apps?
The poster boy for this process should be Jason Short of VistaDB (we reviewed VistaDB in the July issue of Visual Studio Magazine). Jason's story sounds like the archetypal story of the developer turned product vendor: he acquired the VistaDB product technology after using it to build some major systems at the company he worked for. The owner of VistaDB at that time was looking to move on and Jason was looking for something new to do.
How archetypal is this story? I remember it thirty years ago! Back then, it was for Remington shavers and it featured the company owner staring into the camera and saying, "I liked the product so much, I bought the company."
But, after three years, it's now Jason who's looking at selling the company, and his story is worth telling -- and worth stretching out over several blog entries. I interviewed Jason last month, before he decided to bow out of the company, and he talked about the difficulties of running VistaDB. My next blog will be that interview: a snapshot of the company before the end.
Posted by Peter Vogel on 07/19/2010 | 0 comments
I get requests from software developers asking me to review a software product that they've developed and are now marketing (in fact, I'm following up on one of those products now). I recognize that the choice of what we review here is essentially capricious. We are driven by what we think developers are interested in and what's happening in the Visual Studio/.NET toolspace. But selecting the next product to review is hardly a scientific process and, if something interesting turns up, we'll follow up on it.
The problem is that most of the requests I get from independent developers don't sound interesting. Many of the requests don't do a good job of making what I think of as the "value proposition" -- they don't tell me why a developer would care about this product.
Essentially, what I do get are a lot of technical specs but no explanation of why anyone would want to spend money on the product. After reading the request, I don't know how a developer would benefit from buying the product. Actually (as I pointed out in an earlier blog), while I get lots of technical specs, I often can't figure out what the product does.
A variation on these patterns is a request to review a "me too" product: one that sounds a whole lot like some other product (and, often, like components of Visual Studio). The request might mention some technical feature that other products don't have. But, again, it's not clear to me why this additional technical feature has value to developers.
So, if you want to attract the attention of a customer (or a reviewer) you have to stop thinking like a product developer. You have to abandon what helps you create a great product and start thinking like a customer. Only once you're thinking like a customer can you figure out what, in the customer's eyes, makes your product valuable. And that's what you need to talk about -- that's what will make your product interesting.
And, while you're thinking like a customer, remember that the customer won't know what your product does until you tell them.
Posted by Peter Vogel on 07/16/2010 | 0 comments
There are some areas of the Visual Studio/.NET toolspace that I'll probably never review (or we'll recruit a guest reviewer who knows what they're talking about). I wouldn't feel comfortable about reviewing tools supporting developers creating Geographical Information Systems (GIS), for instance, because I don't think that I know enough about the field to have an opinion. I would, on the other hand, feel comfortable about reviewing tools that allow developers to incorporate GIS output into a business application because that is a kind of application that I've created for clients in the past.
Another example of an area that I'd feel uncomfortable reviewing is "application generator tools." I just got a press release for Iron Speed Designer, for instance, which generates .NET database and reporting applications. You provide the database and it provides the application. The product is timely (always a good feature in a review) and the price isn't unreasonable (actually, at $1,000 to $2,000 it's cheap for an application generator).
The problem is that I don't believe in application generators (I also don't believe in Ruby on Rails or Dynamic Data). Fundamentally, I don't think that application development has reached the stage yet where it's possible to create applications automatically from a set of specifications (a database design, an object model, a UML diagram). I think that we're still at the "craft" stage of this profession and not yet at the mass production, "assembly line" stage.
Perhaps I've just been burned as a child. Almost two decades ago, I did some work with an application generator that was intended to reduce the programming required to create a new application. It did reduce the amount of code we wrote (at the time, I was programming in PL/I). But, instead of programming in code, we just ended up programming in the specifications we submitted to the generator. We moved the problem around but didn't actually solve it. I've worked (briefly) with some application generators since then and faced the same problem.
And with all of those tools, no matter what I did, I never really got what I wanted in the resulting application. There's a problem from a reviewer's point of view as well: perhaps I wanted the wrong things. I always tell people that, whenever you adopt a new tool, you should first figure out what the designers of the tool thought you were going to do with it. Once you know that, you should do what the tool wants.
I've seen any number of developers frustrated because they were trying to "trick" some tool into doing something it wasn't designed to do. Life would have been much easier (and more productive) for those developers if they just did what the tool wanted. So, perhaps, with application generators, I've had the wrong set of expectations.
For whatever reason, I worry that I wouldn't give a fair review to an application generator -- not out of ignorance but out of prejudice. What's your experience? Are application generators something that makes sense to you? Am I just being narrow minded?
Posted by Peter Vogel on 07/08/2010 | 5 comments
In my last blog I posted the first paragraph from three vendor news releases that failed to tell me much about the product they were selling. Here are the answers:
- The FTP tutorial in my first example was the introduction for a COM control (i.e. not for .NET, though you wouldn't discover that until the fifth paragraph) for uploading and downloading files.
- The news release that began by telling me about the product's internals turned out to be doing something with extracting information from and then displaying PDF documents.
- The product sold by the company that was advertising itself: A POP3 client control. I could actually use a good e-mail control, but the company was too busy selling itself to tell me that they had one.
Posted by Peter Vogel on 07/07/2010 | 0 comments
If I were selling a product, I'd want everyone to know what it does. I'm constantly impressed by the number of news releases that don't tell me what the product does! To honor those products, I created the "Guess That Product!" game. To play the game, I provide you with real news releases (names changed to protect the guilty) -- you guess what the product does. And no fair Googling or Binging key phrases to find the original release! I played this game once on my technical writing blog, but now you can play it here!
The first one's easy. This product appears to do something with FTP but you have to guess what it does with FTP and what platform/technologies it supports:
Transfer of files across networks is widely used to update applications and to exchange information in bulk. For the developer, FTP (File Transfer Protocol) and HTTP (Hypertext Transfer Protocol) provide a reliable means for sending and receiving files where the protocols are based on well-known and widely used standards.
In this release, the company was very proud of their product's internals. I think it does something with documents. You guess what:
OURRastafarian.NET is a 100% .NET component written entirely in C#. It has no external dependencies and consists of just one assembly making deployment truly simple. This component has a simple and straightforward object model consisting of just 3 classes: Document, Pages and Page.
This release is all about the company. What's the product they're selling (grammatical errors also corrected)?
The Lunar Software company was founded in 2007. We create, market and distribute software components and applications for the Microsoft .NET Framework. Our company also provides software development and outsourcing services using technologies such as: .NET 2.0 (ASP.NET), Java, Delphi, C++, PHP, HTML.
Answers in the next blog.
Posted by Peter Vogel on 07/06/2010 | 0 comments
I sometimes worry, looking back at the reviews we do here, that I seem unremittingly positive about most of the products we review. There are a couple of reasons for that.
First, generally speaking, we pick out products that we think have real benefit for developers and that developers are interested in. There's not much point in reviewing products that we know aren't great and that no one's buying. We do dig around looking for products to review that developers may overlook but, even then, we don't pick a product unless we think it has real merit. Life is too short to beat up on products that no one cares about.
Second, when we review a product, we review it "on its merits." In other words, I look at the surrounding marketing material and talk to the company producing the product about what the product's goals are. My goal is to measure how well the product lives up to its goals and ferret out what developers might reasonably expect from a product that has those goals.
It seems foolish to me, for instance, in a review of a set of server-side only controls to complain about the lack of AJAX support. If I thought that developers expect that all controls should have AJAX support I wouldn't review the product -- it would be a waste of everyone's time. If I thought developers would value "server-side only" controls but might assume that the controls included AJAX support, then I would make clear that these controls don't have the AJAX support. But I wouldn't criticize a product for not having AJAX support: It wasn't part of the product's goals. I would criticize the product if the controls didn't work well as "server-side only" controls, though.
And I can also critique the product's goals: Lots of products seem to be built to do things that just don't seem to need to be done.
Even if a product does live up to the company's intent, I think it's fair for me to critique products on the basis of "value for money". If a product is delivering $50 worth of value for $200 worth of cost... well, that's just wrong.
That's not to say that if the product achieves goals that the company doesn't list that we can't assess how well the product achieves those "other" goals. It's the 'Shania Twain' effect. In an interview in a magazine, Ms. Twain was asked why her skin looked so great (and, you have to admit, the woman has great skin). She said it was the result of using Bag Balm, a product originally developed to deal with chapped cows' udders (my wife speaks highly of the product also). I won't speak to Bag Balm's other uses (apparently some people eat it, believing it extends their lives and improves their health) but if I were reviewing Bag Balm -- and after seeing how it actually worked on cows' udders -- I might consider its effect on human skin.
What I can do, as a practicing .NET developer, is talk about how well the product actually works for a real developer building real applications for real clients in the world that we all live in.
Posted by Peter Vogel on 06/29/2010 | 0 comments
Several readers pointed out that, in my review of Visual Studio 2010, I gave short shrift to the extensions available in the Visual Studio Gallery from the Tools | Extension Manager menu. Actually, those readers said that I was overlooking one of the best features of Visual Studio 2010. Many of those tools also work in Visual Studio 2008 and 2005 (Visual Studio 2010's Gallery just makes it easier to get to them), so it really is a shame not to look at them.
Here are some of my favorites from the Gallery: one that works only in 2010 and three that also work in earlier versions of Visual Studio. Recognize that this list is driven almost entirely by my own preferences and the kind of things that I do with Visual Studio -- your list would almost certainly be different.
Visual Studio 2010 Power Tools
Several of the Gallery downloads are grab bags of useful tools. It's worth investigating all of them but I like this package best because of (and I know this is going to sound trivial) the new options it provides for tabs. For instance, with Power Tools installed, when you reach the maximum number of tabs that will fit at the top of your editor window (the "document well"), Visual Studio removes the least recently used tab rather than the tab furthest to the right. You can also pin tabs for the documents you don't want to ever go away. Putting these two features together means that the tabs you want are far more likely to still be on the screen when you want them.
Beyond the tabs, Power Tools provides a new, searchable Add Reference dialog, which saves you scrolling down to the start of the "System" libraries (but isn't available for "projectless" ASP.NET Web sites). My other favorite feature: Align Assignments, which lines up the equal signs over multiple lines containing assignment statements. Unfortunately, you have to invoke it with a keyboard shortcut, but it's one I'll probably actually learn. After all, neatness counts!
tangible T4 Code Editor
I wrote about code generation in my book and spent a chapter and a case study covering T4. There's still no native T4 editor for Visual Studio, but you can get the free version of tangible's T4 editor from the Gallery (builds are also available for earlier versions of Visual Studio). You really need to be investigating code generation, and T4 is a good place to start. tangible's editor makes it that little bit easier to write templates.
Open Data Protocol Visualizer
I discussed WCF Data Services in the Practical ASP.NET column -- it makes returning Entity Framework objects to clients ridiculously easy. The format that's used for moving those EF entities around is OData (the Open Data Protocol), and this visualizer lets you see what your client is actually getting.
My only complaint: I can only use the Visualizer in a client program and there've been times when, creating the service, I've wanted to know what my clients will actually be getting. This is another tool that works in multiple versions of Visual Studio.
JavaScript Parser
This tool also works with Visual Studio 2005 and 2008. As the amount of JavaScript in my aspx files has increased, just finding my way to the JavaScript routine I want has become a significant part of my programming time. JavaScript Parser provides a new tool window that lists all of my JavaScript routines and prototypes in one place and lets me jump to them just by double-clicking on an entry. It's not the most polished tool I've ever worked with (hover your mouse over the refresh button and it displays the tooltip "toolstripButton3", for instance), but it works for me.
Posted by Peter Vogel on 06/28/2010 | 0 comments
Numerous products have failed because the company wasn't able to move the product's base platform forward in the face of changing technology. The most notorious example is dBase, a product that was late to the Windows party and (when it finally did show up) was already slurring its speech and dropping things into the punchbowl.
But the problem is that, no matter what platform a company uses when it first launches its product, there will come a time when that platform is hanging around like an unwanted boyfriend. You know, the one still dressing like the lead singer of A Flock of Seagulls when you're ready to move on to the singer from Coldplay.
And delivering a product on the new platform is only half the problem. After you upgrade, what do you do about all the applications built with the old platform? It's an old joke: Why was God able to create the world in just six days? No installed base. While Microsoft Access developers got used to the fact that every time a new version of Access came out their old database format was rendered unusable, not all users are so forgiving.
More recently, Nevron is in the midst of such a transition with Nevron Diagram (we reviewed Nevron Diagram in April). For the last four years Nevron has been developing a new visualization platform for the product and expects to release products based on the new platform in the next two or three months. This includes a new serialization format that is intended to support backward compatibility, formula-based shapes (think Visio), and improved performance.
It's easy to see what's driving the new platform. Nevron wants to create a single API that supports the variety of .NET technologies/platforms: WinForms, WPF, Silverlight and even work on mobile devices. The only platform that Nevron is unsure about supporting is ASP.NET.
That "single API for all platforms" isn't an easy goal to hit: It implies that the product can integrate with the native controls in any environment and that output saved in one environment can be loaded in another. To make the problem that little bit more interesting, Nevron wants to improve performance, add support for undo/redo, roll in Adobe vector formats, and deliver some commercial-grade data analysis tools on top of the new platform.
Nevron describes it as the most important release in the company's lifetime. I'd agree with that. So it will be interesting to check in after the summer and see whether (a) there is a new product and (b) it actually, you know, works. That whole working thing is sort of a feature.
Posted by Peter Vogel on 06/24/2010 | 0 comments
Oleg Stepanov, software developer at JetBrains, continues our conversation about creating add-ins for Visual Studio 2010.
Peter Vogel: What are the major technical issues in creating add-ins for Visual Studio?
Oleg Stepanov: I won't go into detail about how complicated it is to create code analysis technology that works in real time as a user enters code in the editor. Instead, let's talk about VS integration. The first thing one must remember when integrating with Visual Studio is that it is still mostly a single-threaded C++ COM application. So you have to mind CCWs, RCWs, message loops and STA threading.
One specific thing I should mention is that Visual Studio uses a lot of different allocators to gain memory, and this quickly fragments the virtual address space. As a result, at some point the CLR is no longer able to allocate a contiguous block of memory for its heap even though there's still memory available in small chunks. This forced us to rethink some of our memory allocation patterns and even move some of the data structures out to an external process.
PV: Were there any special issues in getting out the new version of ReSharper for VS 2010?
OS: The most visible innovation in VS 2010 is of course the new shiny WPF code editor. We spent a good chunk of time just moving to the new API. Keeping in mind that ReSharper 5 supports a range of Visual Studio versions from 2005 to 2010, we had to make sure our abstraction layer over the VS API works consistently across the different versions of Visual Studio.
Another change, though not as visible, was the adoption of the Managed Extensibility Framework (MEF) to manage components in Visual Studio, which has also become a new integration point for us. Happily, the VS development team was very helpful in the course of integrating ReSharper with Visual Studio 2010. They fixed a number of issues that were affecting us and generally helped us improve the stability and performance of ReSharper.
Posted by Peter Vogel on 06/21/2010 | 0 comments
JetBrains recently released a new version of its Visual Studio add-in ReSharper (which we reviewed last year). I got the chance to ask some questions of Oleg Stepanov, a software developer at JetBrains, about creating add-ins for Visual Studio.
Peter Vogel: What are the major issues in competing in the "VS Add-in market"?
Oleg Stepanov: The market is very segmented and we're playing in the productivity tools niche where competition is very specialized. Productivity tools aim to create a productive environment for developers where programmers and tools join to become a whole. Tool vendors try to provide features that give developers superpowers, yet are very easy to use.
Approaches are different. ReSharper stretches its functionality in five main areas: navigation, code analysis and fixes, refactoring, code generation and testing. These areas are supported for code written in C#, VB.NET, ASP.NET, ASP.NET MVC, resource files, etc. Some other tools on the market focus on specific areas such as code templates. Some competitors try to mimic certain ReSharper features in their products, and often our challenge is to demonstrate that ReSharper is "the real thing."
PV: How does ReSharper compete in this market--what is ReSharper's competitive position?
OS: ReSharper's competitive advantage stands on three pillars: Technology, Usability and Community. Being a code-centric tool means deeply understanding the developer's code and modifying it with great care. Over the last seven years, we've developed a very advanced parsing technology, which tackles even minor peculiarities of C# and Visual Basic to obtain very precise understanding of the code semantics.
Some people think that since most of the code written is quite simple, simplified techniques will perform well in most cases. The reality is that when you refactor code, you don't want to double-check after your tool to make sure your code still does the same thing. Add to that the complexities caused by third-party libraries, and you'll see that you need a tool that can be trusted.
ReSharper was inspired by IDEA, JetBrains' world-famous Java IDE that is still our flagship product. ReSharper inherited part of its user experience, which has been well accepted by software developers. Over time, ReSharper's path diverged from that of IDEA to answer the needs of its own community. Still, the growth of our user base confirms that ReSharper delivers a very pleasant coding experience to .NET developers. In fact, some developers find the way ReSharper extends the Visual Studio experience so natural, they simply become unable to work productively in bare Visual Studio anymore.
Since early 2004, when the ReSharper project first became marginally useful, we've been running an Early Access Program (EAP). This program allows anyone to download pre-release builds of ReSharper, use them for free and provide feedback about the product. This program has helped us build a unique community of top-notch developers around our products. These folks have suggested many features, created a number of free and commercial plug-ins for ReSharper and generally helped us make sure we're on track with the latest trends in the community. At most of the conferences we attend, we see that a lot of presenters have ReSharper installed, and this underscores the fact that the tool is adopted by key influencers.
Posted by Peter Vogel on 06/17/2010 | 0 comments
In my last blog post I discussed the essential criteria that I'll be applying for our review of local storage databases, VistaDB and db4o, to appear in our July issue. In this blog post, I'll talk about some less essential criteria.
For my local storage database, I want as much compatibility with my existing database engine as possible. I don't want to have to use a different version of SQL to work with the local database than I use with my central database -- that's just an accident waiting to happen. I'll eventually get confused, use the wrong syntax with one of the database engines, and not discover my error until I attempt to execute my SQL statement. This is the reason I'm less interested in a Jet database solution. For instance, doing joins across databases in Jet requires a different syntax than I use with any other database engine.
Of course, if I can use LINQ (and Entity Framework, presumably) to access my local storage database then this issue goes away: I can leave it to LINQ to handle generating the appropriate SQL for whatever database I'm connected to. I assume that I'm going to have to manage connection strings, anyway, so managing the right LINQ connections will be similar. Even in this scenario, though, I need a local database that will support all the names I've given my tables and columns.
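That provider-abstraction point can be sketched with a plain LINQ query: the same query shape runs unchanged whether the provider is LINQ to Objects, LINQ to SQL or the Entity Framework, because the provider (not the developer) emits whatever SQL dialect the target engine speaks. This is a minimal sketch using LINQ to Objects over hypothetical order data; the names here are illustrative, not from any particular product.

```csharp
using System;
using System.Linq;

class LinqDialectSketch
{
    static void Main()
    {
        // Hypothetical in-memory data standing in for a table. Against a
        // real database, only the data source changes -- the query doesn't.
        var orders = new[]
        {
            new { Id = 1, Total = 120m },
            new { Id = 2, Total = 80m },
            new { Id = 3, Total = 200m },
        };

        // A database LINQ provider would translate this expression into
        // the engine's own SQL dialect; here it simply runs in memory.
        var bigOrderIds = orders
            .Where(o => o.Total > 100m)
            .Select(o => o.Id)
            .ToList();

        Console.WriteLine(string.Join(",", bigOrderIds)); // prints "1,3"
    }
}
```

The catch, as noted above, is that the table and column names still have to be legal in both engines for the generated SQL to work.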
Overall, the more that the local database implements the functionality that I expect to find in my central database, the happier I will be. While having views and stored procedures isn't essential, having them means that I have the full range of tools that I use in creating applications available to me to support local processing. Stored procedures are another blow against Jet (unless I use the version that comes with Office 2010). The problem with stored procedures is that I'd probably have to rewrite them from whatever language I use on my central database (or learn a new stored procedure language).
Replication would also be nice, to support synchronizing local storage with the central database because, otherwise, I have to write the code myself. However, expecting a replication process to work across the Internet through Web services is probably too much to ask for. On the 'ubiquitous' criterion that I mentioned in my last post, a solution that runs anywhere that .NET runs would be nice, though it's not unreasonable to expect all of my clients to be running Windows. This is another problem with SQL Server and Jet, which are "Windows-only" products.
And, of course, I want as much integration with Visual Studio as possible. After all, it's the name of the magazine.
As I asked at the end of the blog posts that described the scenarios that I'll be testing for: Do these criteria sound sane to you? What criteria would you want to apply when looking at local databases?
Posted by Peter Vogel on 06/16/2010 | 0 comments