When Microsoft released Silverlight 3 last week, there was much attention paid to its ability to support rich Internet applications outside the browser. But what does that mean for data-driven applications?
At the MIX 09 conference back in March, Microsoft announced .NET RIA Services. In a blog posting at the time, Brad Abrams, group program manager for Microsoft's .NET Framework, explained:
Microsoft .NET RIA Services simplifies the traditional n-tier application pattern by bringing together the ASP.NET and Silverlight platforms. The RIA Services provides a pattern to write application logic that runs on the mid-tier and controls access to data for queries, changes and custom operations. It also provides end-to-end support for common tasks such as data validation, authentication and roles by integrating with Silverlight components on the client and ASP.NET on the mid-tier.
So that raises the question: Will .NET RIA Services be preferred over ADO.NET Data Services for Silverlight data access? Scott Guthrie, corporate VP of Microsoft's .NET Developer Platform group, in an interview at last week's launch event, said "no."
"The bits that are being released today for RIA Services, actually build on top of ADO.NET DataServices," he said. "So you can think of ADO.NET DataServices as providing a kind of lower layer RAW/REST API, and then RIA Services as a layer on top. We definitely think that there are scenarios where you would want to have a pure REST service model. And then the .NET RIA Services gives you things like the validation, cross-tiering, and higher-level services on top. We’ve worked hard to layer them nicely, so that RIA Services isn’t a competitive technology, but actually just builds on top of ADO.NET Data Services." A complete copy of the interview is available here.
Andrew Brust, chief of new technology at twentysix New York, welcomed the fact that the team is working to integrate RIA Services with ADO.NET Data Services, based on his review of Microsoft's newly released .NET RIA Services overview paper. "This is certainly welcome news," said Brust in an e-mail.
"With the Entity Framework and ADO.NET Data Services joining bare ADO.NET and DataSets, there are already plenty of data access technologies to go around and we certainly didn't need another separate model. It looks like what they're doing with RIA Services is making it a value-added business logic/validation UI toolkit for Silverlight that works on top of ADO.NET Data Services."
What's your take on .NET RIA Services? Drop me a line at [email protected].
Posted by Jeffrey Schwartz on 07/16/2009 at 1:15 PM
Microsoft once again is renaming its forthcoming data-oriented cloud services. The company's SQL Services will be called SQL Azure, while SQL Data Services is now called SQL Azure Database. It's the third name for the platform originally known as SQL Server Data Services.
The company made the announcement today on its SQL Server Blog. The new names do not reflect any changes to the underlying services, Microsoft said. "By standardizing our naming conventions, we're demonstrating the tight integration between the components of the services platform," according to the blog posting.
SQL Azure is the forthcoming set of services that let users conduct relational queries, search, reporting, and synchronization, while the SQL Azure Database will provide the cloud-based relational database platform.
Oakleaf Systems' Roger Jennings described the move as purely cosmetic. "The change from entity-attribute-value to relational tables for SQL [Server] Data Services has been in the works for months," said Jennings, who wrote this month's cover story in Visual Studio Magazine, "Targeting Azure Storage."
Meanwhile, Microsoft this week released the July Community Technology Preview (CTP) of its PHP SDK for Windows Azure. The CTP provides Windows Azure support for the Zend Framework and it also includes PHP-based Windows Azure Table Storage APIs, according to Microsoft's [email protected] blog.
Microsoft has indicated it will announce Azure pricing and licensing terms at its Worldwide Partner Conference in New Orleans next week.
Posted by Jeffrey Schwartz on 07/08/2009 at 1:15 PM
A few weeks ago, Microsoft disclosed that it will issue a fourth Community Technology Preview (CTP) of its in-memory data caching software, code-named Project Velocity, in mid-September.
As reported, that means it will be released to manufacturing later than Microsoft had hoped. But that is no doubt good news to third-party providers of in-memory data caching software, such as GigaSpaces, GemStone Systems, Oracle, ScaleOut Software and quite a few others, who will ultimately find themselves competing with Microsoft's free offering.
I recently caught up with William Bain, ScaleOut’s CEO, who was quite appalled when Microsoft unexpectedly announced the effort a year ago. In my more recent meeting with him, I asked how he intends to compete with free. "It’s going to be a strong competitive threat to us," he admitted. "However I think we have some strong differentiators that position us well to co-exist with Velocity."
Bain's company has introduced a new feature to its ScaleOut StateServer Grid Computing Edition (SOSS/GCE) platform called parallel method invocation, or PMI. Bain explained that PMI lets applications reach peak performance by allowing computations to run on the local systems where the distributed cache hosts the data, thereby reducing the need for the data to move. Second, it provides a map-reduce framework designed to sharply minimize development time while utilizing the capacity of a compute grid.
This should ease the development of data-parallel programs such as a typical financial services application, where an app for data analysis can be implemented easily, Bain said. That's because "the developer can focus on the application code and not on having to write code to either abstract parallelism or to connect to the cache explicitly," Bain explained. The other benefit is high performance.
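ScaleOut's actual API isn't shown in this article, so the names below are invented, but the shape Bain describes is straightforward: an eval step runs against each partition of cached data where it lives, and a merge step folds the partial results together. A minimal map-reduce sketch of the idea in Python:

```python
from functools import reduce

# Each entry models the slice of cached objects one grid node hosts,
# e.g. stock prices partitioned across three machines.
grid_partitions = {
    "nodeA": [102.5, 99.1, 101.7],
    "nodeB": [98.4, 100.2],
    "nodeC": [103.9, 97.6, 100.0],
}

def eval_on_partition(prices, threshold):
    """The 'eval' step: runs where the data lives, touching only the
    local partition, so no cached objects cross the network."""
    hits = [p for p in prices if p > threshold]
    return (len(hits), sum(hits))

def merge(a, b):
    """The 'merge' step: folds two partial results into one."""
    return (a[0] + b[0], a[1] + b[1])

def parallel_invoke(partitions, threshold):
    # A real grid runs eval on every node concurrently and merges
    # results pairwise; this sketch does it serially for clarity.
    partials = [eval_on_partition(data, threshold) for data in partitions.values()]
    return reduce(merge, partials)

count, total = parallel_invoke(grid_partitions, 100.0)
```

The developer writes only `eval_on_partition` and `merge`; the grid handles distribution, which is the division of labor Bain is pointing to.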
ScaleOut, along with Lab49, a technology consulting firm specializing in financial services industry solutions, paired up to create a case study on the performance benefits of PMI in a financial services app, which can be downloaded here.
But at a price of $1,600 per server, it raises the question: Can companies like ScaleOut compete with free? Bain points out that capabilities such as PMI and support for both .NET and Java -- in fact, the company demonstrated this feature at the recent JavaOne conference -- are key differentiators.
"Velocity will be competing with nothing," said Marc Jacobs, a director at Lab49. "It will be the choice between having no distributed cache and one that’s free. I don’t see it competing with any of the commercial products simply because they operate in a different region of support, functionality and customer confidence, particularly in financial services. I think the likelihood of Velocity being adopted for any of these large scale distributed cache scenarios is just unlikely."
That may be true for a segment of large-scale applications, but Microsoft has a history of doing pretty well when it adds freebies to the mix. Just look at SQL Server Integration Services and SharePoint, to name just two. It will be interesting to see whether companies like ScaleOut can carve out a large enough niche or whether they are merely a precursor to capabilities Microsoft and its larger rivals will offer in subsequent releases.
Bain has been down that road. An earlier company he founded, Valence Research, was acquired by Microsoft more than a decade ago. That company's IP load-balancing software is now the network load balancing feature embedded in Windows Server.
Are you testing Velocity? How do you see it impacting the development of your data driven apps? Drop me a line at [email protected].
Posted by Jeffrey Schwartz on 06/25/2009 at 1:15 PM
Microsoft's decision to remove the Oracle data provider from its ADO.NET roadmap has generated a lot of buzz, with some saying it was wise for Redmond to cut bait on it, and others wondering whether it's going to mean lots of code re-writing.
As reported yesterday, Microsoft is discontinuing its System.Data.OracleClient. Though it will be available in .NET Framework 4, it will be labeled as "deprecated."
That was disappointing news to Ayub Patel, a vice president and senior technical specialist at a major New York bank that has ASP.NET 2.0 applications that need to connect to Oracle databases. Patel wants to move to the Entity Framework for the improved performance. "Entity Framework is more robust and it's C# class-based. We want to leverage that part," he said. Using third-party tools is not an option, he added, so he will just wait until Oracle or Microsoft heeds the call.
But Microsoft is pointing to third parties to fill the gap left by its now-discontinued ADO.NET data provider for Oracle. Companies such as DataDirect Technologies and Devart (formerly known as Core Lab) offer such tools. In addition, Oracle's own provider, Oracle Data Provider for .NET (ODP.NET), is by many accounts better than Microsoft's System.Data.OracleClient.
"We already have Oracle Data Provider for .NET, which is much better than [the Microsoft] version," writes Ravi Santlani from Birmingham, U.K.
"Microsoft is dropping duplicated efforts to maintain a driver that Oracle already does better," adds Lynn Crumbling of Lancaster, Pa.
"ODP.NET provides more comprehensive support for Oracle and demonstrates a more subtle understanding of, and fidelity with, Oracles data types," notes Andrew Brust, chief of new technology at twentysix New York.
Brust adds that it all boils down to this: "If we could just keep Microsoft from churning its data APIs so much, we wouldn't have to keep repeating this cycle. The journey from ODBC to OLE DB to ADO.NET has taken us through several cycles of going from broad to sparse support for and by other databases. With the advent of LINQ and the Entity Framework, we are essentially going through yet another such cycle."
If you were among those who have used the System.Data.OracleClient or if you were hoping to use the Entity Framework to connect to Oracle, let me know how this impacts you and how you're going to move forward. Drop me a line at [email protected].
Posted by Jeffrey Schwartz on 06/17/2009 at 1:15 PM
It looks like there will be another test release of Microsoft's in-memory data caching software, code-named Project Velocity. Microsoft today said it will release a fourth Community Technology Preview (CTP) in mid-September, leading to the conclusion that the company will miss its goal of a summer release to manufacturing.
Announced a year ago, Project Velocity is designed to provide scalable performance of data-driven applications by reducing the number of calls the app has to make to the data source. According to Microsoft, it offers high-speed access to data developed in .NET via partitioned, replicated or local caches. It does so by fusing memory across multiple servers to provide a single, unified cache view to apps.
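Velocity's real programming model is .NET-based, so the Python below is only a conceptual sketch with invented names. It models the two ideas in the paragraph above: keys fused across several hosts' memory behind one unified view, and a cache-aside pattern that cuts repeat calls to the data source:

```python
import hashlib

class PartitionedCache:
    """Toy model of a Velocity-style distributed cache: keys are spread
    across several hosts' memory, but callers see one get/put view."""

    def __init__(self, hosts):
        self._hosts = list(hosts)
        self._partitions = {h: {} for h in self._hosts}

    def _host_for(self, key):
        # Hash the key to pick the host that owns its partition.
        digest = hashlib.md5(key.encode("utf-8")).hexdigest()
        return self._hosts[int(digest, 16) % len(self._hosts)]

    def put(self, key, value):
        self._partitions[self._host_for(key)][key] = value

    def get(self, key):
        return self._partitions[self._host_for(key)].get(key)

db_calls = 0

def load_from_db(key):
    """Stand-in for the backing data source."""
    global db_calls
    db_calls += 1
    return "row-for-" + key

def get_customer(cache, key):
    """Cache-aside: only a miss reaches the database."""
    value = cache.get(key)
    if value is None:
        value = load_from_db(key)
        cache.put(key, value)
    return value

cache = PartitionedCache(["host1", "host2", "host3"])
for _ in range(5):
    get_customer(cache, "customer:42")
assert db_calls == 1  # four of the five lookups never hit the database
```

Which host physically owns "customer:42" is invisible to the caller; that single-view abstraction over fused memory is the pitch behind Velocity.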
Microsoft has said it will offer Project Velocity free of charge. It was expected to RTM this summer but in a blog posting by the Project Velocity team, Microsoft revealed plans for CTP 4. The new CTP will offer improved stability and security.
CTP 4 will also include at least two new features: performance monitor counters and support for setup and configuration changes. The performance counters will be available for both the host and the cache. Microsoft's Sharique Muhammed provided details in a separate posting today.
Meanwhile, CTP 3 has been in the hands of testers for the past two months, and Microsoft today posted code samples on its MSDN site.
What's your take on Project Velocity? Drop me a line at [email protected].
Posted by Jeffrey Schwartz on 06/10/2009 at 1:15 PM
With the release last week of the Visual Studio 2010 beta and the .NET Framework 4, developers are getting their first peek at Microsoft's next-generation IDE. As reported yesterday in RDN Express, many are delving into the new WPF editor. But the beta also gives developers a first look at the ADO.NET Entity Framework version 2, Microsoft's preferred model for building applications that access databases.
Given much of the backlash about the first release, the update has been eagerly awaited. EF v2 adds support for n-tier APIs and templates, increases Plain Old CLR Objects (POCO) coverage and improves persistence ignorance, among other improvements, according to Microsoft.
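POCO support and persistence ignorance are easiest to see side by side: the domain class carries no persistence base class or attributes, and all knowledge of storage lives in a separate mapper. A toy illustration in Python (EF itself is .NET; the class and repository names here are invented for the sketch):

```python
class Customer:
    """A 'plain old' object: no persistence base class, interfaces or
    attributes -- the class neither knows nor cares how it is stored."""
    def __init__(self, customer_id, name):
        self.customer_id = customer_id
        self.name = name

    def rename(self, new_name):
        self.name = new_name

class CustomerRepository:
    """All storage knowledge lives here, outside the domain class, so
    Customer can be created and unit-tested with no database around."""
    def __init__(self):
        self._rows = {}  # stand-in for a real table

    def save(self, customer):
        self._rows[customer.customer_id] = customer.name

    def find(self, customer_id):
        name = self._rows.get(customer_id)
        return Customer(customer_id, name) if name is not None else None

repo = CustomerRepository()
repo.save(Customer(7, "Contoso"))
found = repo.find(7)
```

The point of the complaints about EF v1 was that entities could not stay this plain; v2's POCO support moves the framework toward this separation.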
"The Entity Framework itself has pretty much undergone a radical transformation," said Stephen Forte, chief strategy officer a Telerik, who has tested the new IDE. "It really addressed some of the concerns of the community in that respect."
Ultimately, the question is, will it win over LINQ to SQL developers, who were up in arms last year when Microsoft made clear that it was putting all its eggs in one basket -- that is, with the Entity Framework. "As far as I am aware, we haven't seen any votes of no confidence," Forte said, referring to last summer's petition to Microsoft by those who had issues with the architecture.
When it comes to the ADO.NET Entity Framework, Nagarro Inc., a San Jose, Calif.-based provider of outsourced software development services, decided not to use the first version of the Entity Framework in any of its projects. "We were pretty happy with LINQ to SQL," said Vaibhav Gadodia, a .NET architect at Nagarro, in an e-mail. "We had invested a lot of training effort as well, since it was (and is) such a shoo-in for that ORM layer."
Of course, many who invested in LINQ to SQL felt jilted last fall when Microsoft shifted its focus to the EF. With the v2 release of the EF, Gadodia said a lot of the features of LINQ to SQL are now available within Entity Framework. "It almost feels as if Microsoft copied some of these into EF from L2S (as a result of developer feedback)," he noted. "We will continue to use L2S in the short term (our current projects); for the simple reason that most of our developers are trained in the technology."
Among the features he likes in the updated EF is the modeling support, which now generates DDL based on the model. "There are other changes which are useful (for instance POCO support). We are developing a new internal application framework to use across projects, and changes in the EF have put it in the front-running for using it in our framework," he noted.
Still, Nagarro won't migrate to the EF until there is a "well supported migration path from L2S to EF, as well as complete compatibility between the two," he said. "Until then we are going to be limited to trying EF out in new initiatives, but not offer it to all customers until we have clarity on where it is going to stabilize."
What's your take on the Entity Framework v2? Drop me a line at [email protected].
Posted by Jeffrey Schwartz on 06/03/2009 at 1:15 PM
As reported by my colleague Kathleen Richards, the Visual Studio 2010 and .NET Framework 4 Beta 1 bits were released to MSDN subscribers Monday, with the public beta set for release today.
For those developing data-driven applications, the beta is expected to give developers a first look at "Entity Framework version 2," or what Microsoft dubs "Entity Framework 4." As reported, EF 4 adds support for n-tier templates, Plain Old CLR Objects (POCO) and persistence ignorance.
Developers can learn more about the updated functionality in the ADO.NET blog.
Some of the other key improvements to the Entity Framework, according to Microsoft, include integration with the ADO.NET Entity Framework Designer and T4 Templates in Visual Studio for customized code generation. Microsoft also added lazy loading and stored procedure mapping, improved LINQ support and SQL generation readability. The new release is expected to boost T-SQL performance, Microsoft said, among numerous other upgrades.
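Lazy loading, one of the additions noted above, simply defers a related-data query until the property is first touched and then caches the result. A small sketch of the pattern in Python (names invented; EF implements this on generated .NET entities):

```python
class Lazy:
    """Defers a query until first access, then caches the result --
    the essence of lazy loading."""
    def __init__(self, loader):
        self._loader = loader
        self._value = None
        self._loaded = False

    def value(self):
        if not self._loaded:
            self._value = self._loader()
            self._loaded = True
        return self._value

queries_run = []  # records each simulated database round trip

def load_orders(customer_id):
    queries_run.append(customer_id)  # stand-in for a SQL query
    return ["order-%d-%d" % (customer_id, n) for n in (1, 2)]

class Customer:
    def __init__(self, customer_id):
        self.customer_id = customer_id
        self._orders = Lazy(lambda: load_orders(customer_id))

    @property
    def orders(self):
        return self._orders.value()

c = Customer(9)
assert queries_run == []   # constructing the entity runs no query
first = c.orders           # first touch triggers exactly one query
second = c.orders          # second touch is served from the cache
```

The appeal is that entities can be materialized cheaply and related rows fetched only if the application actually navigates to them.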
With this next generation of tooling, Microsoft increases its support for third-party databases beyond IBM's DB2. Quest Software is developing a database schema provider for Oracle that supports Visual Studio Team System 2010.
In the coming days and weeks, the editors of Redmond Developer News and Visual Studio Magazine want to hear your observations. Please drop me a line at [email protected].
Posted by Jeffrey Schwartz on 05/20/2009 at 1:15 PM
When it comes to Microsoft's Azure cloud services, it seems developers are either quite immersed in the technical underpinnings of Redmond's next-generation platform or they're ambivalent about it.
Microsoft this week showcased some of its customers' various mindsets on cloud computing at its Enterprise Developer and Solutions Conference in New York. The customers -- which include The New York Times, Merrill Lynch, Raytheon and Netsoft USA -- joined Doug Hauger, general manager of Microsoft's cloud infrastructure services, onstage Tuesday during the event's keynote address. Naturally, given the host was Microsoft, they were enthusiastic about the prospects for cloud computing and Azure. But they were tempered in their assessments by concerns over security, privacy, availability, reliability, compliance and other relevant issues.
Take John Slaby, chief engineer of defense contractor Raytheon, who pointed out that there's a movement in the Department of Defense to develop a global information grid. While cloud computing at some point will be a key solution, security is still an issue.
"You've got to get to the point where the cloud can be secured so it can handle highly classified kinds of events, but I think it is the direction that they are moving," Slaby said.
Others had similar assessments. "You do have to think about those things as you move to the cloud," Hauger said.
Despite those concerns about cloud computing, one key component of the cloud, Software as a Service, is poised to grow 22 percent this year, Gartner said in a report released today. Revenues for applications running on such services will total $9.6 billion, and will top $16 billion by 2013, Gartner said.
"Software as a Service has become a more acceptable deployment alternative in general," said Gartner analyst Sharon Mertz. "It's more often being considered as another sourcing strategy for many companies."
For Microsoft, things with Azure will heat up in the coming months. Hauger said Microsoft is on track to announce pricing at its Worldwide Partner Conference in July and roll out the service at its Professional Developers Conference in November.
Microsoft is taking every opportunity it can to get developers to download the Azure SDK to start building applications.
Its most recent effort, launched this week, is the Azure Developer Challenge, a contest the company hopes will showcase applications designed to run on Microsoft's Azure cloud platform. It's open to .NET and PHP developers. Steven Martin, developer platform product manager at Microsoft, announced the contest on his blog Monday.
"We're looking for innovative apps developed with the user experience in mind that are applicable to the real-world and highlight new opportunities cloud computing brings to developer," Martin wrote.
The company is offering awards of up to $5,000 and has categories for both .NET and PHP developers. In the .NET category, the app should use ASP.NET or Silverlight, and bring in other Azure services such as .NET Services or Live Services, in addition to other Microsoft and third-party apps, Martin noted.
In the PHP category, interoperability will be a key measuring point both with other cloud platforms and APIs.
Judging the entries will be RedMonk analyst Michael Cote and Om Malik, founder of GigaOM Network. Rules and entry information are available here.
For those who have tested the Azure SDK, have you done so on your own time, or is your company encouraging you to put it through the rigors for potential deployment? Express yourself by posting to this blog or at [email protected].
Posted by Jeffrey Schwartz on 05/07/2009 at 1:15 PM
When Oracle stunned the IT world last week and snapped up Sun Microsystems from right underneath IBM in its $7.4 billion deal, I posed the question: What will happen to the open source MySQL database platform? But the bigger question many developers are asking is: What impact will Oracle have on the future of Java?
As the new steward of the Java brand, will Oracle make it a proprietary platform like Microsoft's .NET or will it embrace and advance the existing Java Community Process (JCP) and assure that it does not become fractured? No one will know for sure until Oracle closes the deal. In the meantime, stakeholders are holding their collective breath.
Java is Sun's most valuable asset, and Oracle could change its course on Java, according to a research note by Gartner analysts last week. Because Sun owns the Java trademark, Oracle stands to gain influence over the JCP, which plays a key role in the evolution of Java standards.
"Vendors were comfortable with Sun because it is a benevolent dictator over Java," said Gartner analyst Mark Driver, in an interview. "They influenced it but there was nothing in Sun's business model that was outwardly and obviously opposed to what Oracle or IBM, BEA or SAP was. All of a sudden with Oracle acquiring Java, you do have a case where Oracle and IBM [and others] compete much more heavily."
So the dichotomy lies in the fact that Oracle's key rivals are dependent on Java. If Oracle maintains and extends the JCP, all should be fine. "Despite the hype of write once run everywhere, Java has been remarkably successful in establishing a big binary compatible platform," Driver said. "Technically and politically, my enemy controls a technology that I depend upon. So if Oracle doesn't placate those concerns, IBM will become more aggressive in forging its own open source efforts."
For example, he said IBM could decide to focus on the Apache Harmony Project, a clone of Java. IBM hasn't done much with it, he pointed out, because there was no need to date. "If there is any issue with Oracle, either the perception or the reality, that it is manipulating Java for its individual benefit or does anything to unlevel the playing field, we could get fragmentation, we would lose a Java brand or it becomes another proprietary stack," Driver said. "It would be in Oracle's interest to open it up more."
It could do that by addressing one of the biggest complaints about the JCP: The fact that each working group has a specification lead typically represented by a single vendor that has substantial influence over where a specific piece of Java goes. "They may very well need to evolve the JCP to address those concerns, open it up even more," Driver said.
Wayne Citrin, CTO and founder of JNBridge, a supplier of software that links Java and .NET applications, agrees, saying there are issues with the way Java Specification Requests (JSRs) are handled. "JSRs that get implemented are kind of messy and not particularly coherent," said Citrin, who like Driver, is betting that Oracle won't look to hijack Java. "I think if Oracle were really heavy handed, people might just drift away from Java, but I think there's so much invested in it that that's unlikely to happen, and I think Oracle knows the lure of Java is that everyone is using it. It works both ways."
The larger question is what Oracle's acquisition of Java will mean for Web services, said Rich Wolski, founder and CTO of Eucalyptus Systems, a provider of open source software to enable hybrid public-private cloud-based services. The company, which announced its formation yesterday with $5.5 million in capital from Benchmark Capital and BV Capital, said its service leverages commodity Web services that are Java-based. "Java is so entrenched in the whole Web service arena that people really are anxious about how that's going to break," Wolski said. Ironically, he said, in a worst-case scenario, tying to Microsoft's .NET could be an alternative. "The .NET Web service infrastructure is very, very powerful, and we could easily port in that direction if Java no longer became viable. There's a question of how much of that we can use as part of our open source mission, but technologically, it's very feasible."
Many believe there is no way Oracle will let Java splinter, among them Tony de la Lama, who was at Borland Software when it became the third licensee of the Java platform in 1995. "I 100 percent believe that Oracle is going to ensure that Java remains a viable and growing platform," said de la Lama, who recently joined Embarcadero Technologies (which acquired Borland's CodeGear tools business last year) as senior VP of R&D. "It has its own business interest to make sure that happens."
Posted by Jeffrey Schwartz on 04/30/2009 at 1:15 PM
While there is no shortage of questions surrounding what Oracle has in store for Sun Microsystems, perhaps the most intriguing one is what Oracle will do with MySQL. Will it live or will Oracle, which gains MySQL as a result of its $7.4 billion acquisition of Sun, throw it under the bus?
There is plenty of reason to believe Oracle would not want, in any way, shape or form, to let MySQL cannibalize the licensing revenues it has enjoyed for so many years from its flagship proprietary database platform. And there's a school of thought that Oracle doesn't walk the walk when it comes to open source.
"While Oracle has displayed an ability to participate in and benefit from open source software, I think its expectations and aspirations for open source software are limited," wrote 451Group analyst Jay Lyman.
But lest we forget: While Oracle CEO Larry Ellison talks up how acquiring Sun is a key entrée for Oracle to further its assault on IBM, his real nemesis is Microsoft. While it is unlikely MySQL was a huge factor (perhaps not even a reason at all) in Oracle's decision to make its surprise bid, putting some emphasis on the open source database could be an opportunity to go after Microsoft in a way Oracle never could with its flagship database.
In fact, that's exactly what MySQL founder Marten Mickos told Forbes yesterday, arguing that the two databases serve different application types. "Microsoft's database business is the fastest growing," Mickos said. "Oracle can use MySQL to achieve a stronger developer community."
Forrester analyst Noel Yuhanna agrees. "If Oracle plays its cards right, this could be a great move, since it continues to struggle against Microsoft SQL Server especially in the small- to moderate-sized database market, where Microsoft SQL Server enjoys dominance," Yuhanna said in an e-mail. "A combination of MySQL and Oracle DBMS can cover all bases, and put MySQL against Microsoft SQL Server more competitively. Also, we see that as databases become more automated (which is already happening), the need for tighter integration with hardware and bundling will further grow -- therefore having a database appliance (database machine) will become critical."
While the installed base of MySQL pales in comparison to SQL Server, Microsoft is well aware of the momentum around it and the open source database movement, especially for lower-end Web applications. That's why Microsoft has developed its own PHP Driver for SQL Server and last month released its PHP on Windows Training Kit, which includes technical material, best practices and code samples for building PHP applications that run on Windows, IIS 7 and SQL Server 2008.
"Microsoft is going after those folks in a pretty serious way," said Andrew Brust, a director of new technology at twenty six New York, and a Microsoft regional director. "Read what you want into that but it shows how seriously Microsoft takes MySQL."
While most MySQL applications are PHP-based, the database also supports .NET applications, Brust noted. "MySQL has done a pretty good job at working nicely with Windows and ADO.NET," he said. "But I think by and large it is PHP developers."
According to Sun's internal surveys, SQL Server is the number one platform that customers migrate from when moving to MySQL, said Robin Schumacher, MySQL's director of product management. "People using MySQL on Windows makes a very nice alternative to SQL Server," Schumacher said. For enterprise implementations, Linux is still the largest platform for MySQL "but Windows is right behind it," he added.
"They have to see the value of MySQL in the ability for it to continue to gain on the SQL Server marketplace," added Ian Abramson, president of Independent Oracle Users Group (IOUG) and a director at Toronto-based Thoughtcorp, a data warehousing and BI consultancy, who said the Oracle user community welcomes MySQL joining the fold.
Meanwhile, Sun this week coincidentally announced a preview of the next release, MySQL 5.4, which it says will be far more scalable than the current version.
What impact do you think Oracle's acquisition of Sun will have on MySQL, open source databases and SQL Server? Drop me a line at [email protected].
Posted by Jeffrey Schwartz on 04/22/2009 at 1:15 PM
Even though the release of Microsoft's SQL Server 2008 SP1 last week didn't generate much buzz, it is a noteworthy turning point for Microsoft's key database platform.
As reported by Kurt Mackie, Microsoft released SP1, which rolls Cumulative Updates 1 through 3 into the service pack. Microsoft also added some administrative improvements, including a slipstream facility, a service pack uninstall capability and Report Builder 2.0 ClickOnce.
The latter, actually released in October, lets users query SQL Server and build reports using Microsoft Office Tools.
Those incremental improvements aside, the release of SP1 gives a green light to IT organizations that insist on these key updates before putting new applications in production. It is well known that many enterprises won't put business-critical applications onto a major new software platform until that first service pack arrives.
But is this green light going to be enough to open the floodgates and encourage organizations to upgrade their older databases to SQL Server 2008? Even those who subscribe to Microsoft's Software Assurance plan, meaning the upgrade is already paid for, are looking to hold the line on costs that exceed the software license.
And as noted last week, many organizations are looking to open source alternatives to address costs. Open source databases, still a small slice of the market, are not expected to have a huge impact on the larger-scale database market despite their growth, but they are a looming factor.
Cost issues notwithstanding, SQL Server 2008 does offer higher levels of performance, scalability, policy management and security, as well as improved T-SQL for developers and support for the ADO.NET Entity Framework. While not all developers have welcomed these new features with open arms, SQL Server 2008 will also be an important component of the IDE evolution Microsoft is embarking on, the subject of this month's Visual Studio Magazine cover story by executive editor Kathleen Richards. In that piece, Richards points out that those migrating to Visual Studio Team System 2010 will need to take a hard look at SQL Server 2008:
VSTS 2010, which includes role-based client tools that incorporate VS Professional and a license to TFS, is the first major upgrade to the collaboration environment since its debut in VS 2005. TFS will drop support for SQL Server 2005 as the back-end source control system and thus require an upgrade to SQL Server 2008.
Team System also rolls up the former Developer edition into the Database edition, resulting in Architect, Tester and Database roles in addition to Team Suite, which includes all of the aforementioned functionality in a single SKU.
What's your take on SQL Server 2008? Does the release of SP1 have much effect on your organization? Drop me a line at [email protected].
Posted by Jeffrey Schwartz on 04/15/2009 at 1:15 PM
It seems the open source world is gunning for a bigger piece of the SharePoint pie these days.
As Alfresco Software Inc. continues to emerge as the leading provider of open source enterprise collaboration software, rival open source vendors are stepping up their efforts.
For its part, Alfresco last week said it finished 2008 with 103 percent year-over-year revenue growth, as well as 92 percent year-over-year growth during the last quarter of 2008 for the period ended February 28. Since it didn't disclose its revenues, it's hard to get too excited about that stat, but the company does appear to be on a roll.
Alfresco said it has added 270 enterprise customers such as Federal Express, Fox Broadcasting, the New York Police Department, the State of Kansas, Sun Microsystems and Virgin Mobile.
I attended the local Industry Association of Software Architects meeting in New York a few weeks ago, where the topic was enterprise content management. It was hosted in Microsoft's offices and the speaker, coincidentally, was Jean Barmash, Alfresco's director of technical services.
While this was a vendor-neutral technology presentation, Barmash pointed out that SharePoint 2007 rearranged the competitive stakes for ECM players. "They entered the collaboration space and all of a sudden it was a billion dollar industry and all of a sudden everyone in the industry needed to have some kind of SharePoint strategy," Barmash told attendees.
One that is making a push is Paris-based Nuxeo Corp., which last month moved into the North American market. The company, founded in 2000, offers what it calls a complete ECM suite that carries no license fees using the LGPL open source license.
Like Alfresco, it positions itself as a SharePoint Server alternative -- the company has just rolled out the Nuxeo Enterprise Platform 5.2, which adds SharePoint services support. A feature called MS WSS allows developers to implement file-based services, allowing Nuxeo to appear as a SharePoint Server via Windows Explorer and Office. "Users can save their documents into Nuxeo as if it were SharePoint," said Nuxeo CEO Eric Barroca.
The new release also includes a SQL-based storage repository, allowing for integration at the SQL level with business intelligence and ETL tools. Also new is WebWorkspace, which allows developers to create workspaces, including wikis and blogs and other collaborative Web sites.
The Paris-based company is not well-known in the United States, but it hopes to change that in the coming year, Barroca said.
Another player targeting the open source collaborative space is MindTouch Inc., which is positioned more as a wiki-based collaborative application development platform. At last week's Web 2.0 Conference, the company launched MindTouch 2009, which it describes as a developer platform for building rich collaborative apps and communities.
MindTouch was founded by two researchers who worked at Microsoft under Craig Mundie, Redmond's chief research and strategy officer. One of them is Aaron Fulkerson, MindTouch's CEO and founder.
"Collaboration is very inefficient and unproductive because you have to plug in a dozen-plus disconnected application data silos, getting access is incredibly painful and time consuming," Fulkerson said. "We built MindTouch to provide a collaborative canvas to stretch across all of your existing technological assets, inside your infrastructure, Web services, services-oriented architectures, databases and applications. We have this connective tissue for all these disconnected systems."
There are many others. If you're with an enterprise developing some of these new apps using wikis and other new capabilities to enable new forms of collaboration, we'd like to hear some of the successes and challenges you are experiencing from a development and deployment perspective. Drop me a line at [email protected].
Posted by Jeffrey Schwartz on 04/08/2009 at 1:15 PM