The big news for .NET developers this week is the release of beta 2 of Visual Studio 2010 and the .NET Framework 4, which became generally available today (see VS2010 and .NET 4 Beta 2 Go Live). For developers of data-driven applications, beta 2 included some major improvements to the ADO.NET Entity Framework, Microsoft's technology for building data access around object-relational mapping (ORM) rather than programming directly against relational schemas.
First released with Visual Studio 2008 and the .NET Framework 3.5 SP1, the Entity Framework has become a polarizing issue among database developers. As covered extensively, the Entity Framework has been a godsend to many programmers who welcome the move to model-driven development.
However, it has also been much maligned by those who prefer the tried-and-true ADO.NET data access mechanisms, or who feel there are better object-relational mapping technologies than Microsoft's, such as NHibernate or Spring.NET.
Nonetheless, Microsoft's Entity Framework stands out in one unique way: it is part of Visual Studio and the .NET Framework. And like it or not, it's here to stay.
"The Entity Data Model represents such a key strategic direction for Microsoft that spans all of our products," said Elisa Flasko, program manager in Microsoft's data modeling group, in an interview. It initially spanned SQL Server, Visual Studio and the .NET Framework, and this week support for SharePoint Server was announced (see Microsoft Gives SharePoint A Facelift).
Flasko said the older data access languages and frameworks remain important. "We want to enable our customers and partners to increase their productivity, and better enable the integration across their applications, across Microsoft applications, and across the many different data sources," she said. "That said, ADO.NET is a core part of the .NET Framework, as is the Entity Framework, which builds on top of the traditional ADO.NET SQL client and provider model APIs in general."
With this week's release of beta 2, Microsoft has added a laundry list of features that appear to be welcome additions. Among them: support for Foreign Key Associations, improvements to POCO support, Lazy Loading enabled by default in new models, support for binary keys, improvements to the Object Services API designed to enable n-tier and self-tracking entities, new extensibility APIs and improvements in LINQ to Entities. Flasko provides a much more extensive rundown in her blog posting.
Still to come though is the Feature CTP 2 version, which Flasko said should be released shortly.
Beta 2 also addresses a major complaint about the Entity Framework: its weak support for complex stored procedures, which is said to be remedied in this new release. For example, the user interface can now detect columns returned from a stored procedure and create a corresponding complex type, Flasko wrote in a blog posting. "In addition, existing complex types can be updated when a stored procedure definition changes," she noted.
"There is some heavy hitting stuff in this list," said Julie Lerman, an independent .NET consultant, who is writing an update to her book Programming Entity Framework (O'Reilly and Partners) covering the EF 4. "For example having access to foreign keys and having foreign keys in the model is going to be a game-changer for a lot of people. They think it's so important that they've made it the default, that's a really, really big thing in this new version that we didn't have in beta 1."
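To see why Lerman calls foreign key support a game-changer, consider the difference in code. This is a hedged sketch, not code from the article; the Order/Customer model, the CustomerId property and the context are all hypothetical, but they illustrate the general pattern:

```csharp
// EF4-style foreign key association: the FK is an ordinary scalar property
// on the entity, so a relationship can be set without fetching the
// principal entity from the database first.
order.CustomerId = 42;
context.SaveChanges();

// EF1-style independent association: the related entity itself had to be
// queried or attached before the relationship could be established.
// order.Customer = context.Customers.First(c => c.Id == 42);
```

The saved round trip matters most in n-tier scenarios, where the client often has only the key value in hand.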
The new lazy loading behavior will be welcomed by some and criticized by others, Lerman told me. "People who are used to using ORMs absolutely expect lazy loading to be there, while for people who are not used to using ORMs, this might be unexpected behavior for them. They can turn it off for sure, easily, but it's one of those issues that has been widely debated. Everyone using LINQ to SQL is used to lazy loading working, so it's not an insignificant change."
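For readers new to ORMs, lazy loading means related data is fetched on first access rather than up front. A hedged sketch of how the EF4 switch works (the NorthwindEntities context and its entity sets are illustrative, not from the article):

```csharp
using (var context = new NorthwindEntities())
{
    // EF4 beta 2 enables lazy loading by default in new models;
    // it can be turned off per context instance:
    context.ContextOptions.LazyLoadingEnabled = false;

    var customer = context.Customers.First();

    // With lazy loading on, touching customer.Orders would silently issue
    // a second query; with it off, related data must be loaded explicitly:
    context.LoadProperty(customer, "Orders");
}
```

That silent second query is exactly the behavior that surprises developers coming from raw ADO.NET, and delights those coming from other ORMs.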
Overall, Lerman said, beta 2 is a significant update, and she will have more to say once she spends some time testing it. "These changes are going to have a huge impact on things that we've been looking forward to in moving from Entity Framework 1 to the new version," she said.
But the Entity Framework has its critics. I was chatting with Andrew Brust, chief of new technology at the consulting firm twentysix New York, and he is not completely sold on it. "People don't like the objects that the Entity Framework generates, they're not quite everything they would do if they wrote their own, so they are putting objects around the objects, and now you've got an extra layer of indirection," said Brust, who also writes a column for Visual Studio Magazine (for his take on the Entity Framework, see The Value of Known Entities).
"Because instead of just writing your own objects around the ADO.NET code that would talk to the database directly, you are writing your objects around an Entity Framework object which then uses ADO.NET to talk to the database," Brust added. "That's the problem."
Flasko begs to differ, pointing to the support for T4 templates and the POCO improvements. "You are writing objects that then become the Entity Framework objects, so the T4 [templates] will actually generate the Entity Framework objects but it will generate them customized to your specifications," she said. "It will generate them, shape the way that you actually want them shaped for your business practices or for your applications and then with the POCO objects you are creating very, very simple plain CLR objects that become the Entity Framework objects in that instance as well."
Those partial to other ORM technologies were even more critical in their assessment of the Entity Framework. "Even with EF4, it still hasn't begun to get close to the feature set that NHibernate had 4-plus years ago, not to mention the features in NHibernate 2.1 and Fluent NHibernate," said .NET consultant Chad Myers, in an email. Myers was part of the original team that invented Fluent NHibernate.
"As far as I'm concerned, EF is a marked step backwards in progress in the .NET space and represents the 'wrong' side of the internal struggle within Microsoft to be more open, engaged, and cooperative with the flourishing .NET open source ecosystem that has erupted in spite of Microsoft's efforts," he said.
I will be writing more on this and would like to hear your views. Please drop me a line at [email protected] or comment below.
Posted by Jeffrey Schwartz on 10/21/2009 at 1:15 PM
There's good news for those interested in the Code Only extensions Microsoft started adding to the ADO.NET Entity Framework earlier this year. The company this week broadened its support for Code Only POCO (Plain Old CLR Object) entities in the Entity Framework.
In a status update posted on Microsoft's Entity Framework Design blog, the company outlined some new features coming to the next test build of the Entity Framework that further its support for Code Only development.
"Code Only is now looking much more complete," noted Microsoft Entity Framework program manager Alex James in the blog posting announcing the Code Only updates. "It isn't completely finished yet, we are still working on the rough edges."
The Code Only support in the first CTP was, by Microsoft's own account, limited. The latest enhancements were well-received. "Code Only is their way of allowing Entity Framework to be a technology for people who do not want to do modeling, which is more typical of domain-driven developers," said independent .NET consultant Julie Lerman, who is writing an update to her book Programming Entity Framework (O'Reilly and Partners) covering EF 4.
Lerman told me that while the potential audience of those preferring the Code Only features might be smaller than those using the model-driven approach, Microsoft's decision to further embrace POCO promises to broaden the reach of the Entity Framework. "Because the Entity Framework now supports those ways of programming, I think more developers like me, who are data centric, are going to start adopting these types of practices," she said.
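To make the distinction concrete, a Code Only model starts from plain CLR classes rather than from an .edmx model file. This is a hedged sketch: the class names are illustrative, and the configuration API was still shifting between CTPs, so treat it as the general shape rather than the exact surface:

```csharp
// Plain Old CLR Objects: no base class, no attributes, no designer file.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public ICollection<Order> Orders { get; set; }
}

public class Order
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}

// With Code Only, mapping is inferred by convention (Id becomes the key,
// Orders becomes an association), and anything the conventions can't infer
// is configured in code, so no EDMX/XML artifacts live in the project.
```

That absence of generated XML is precisely what appeals to the domain-driven developers Lerman describes.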
When testers get to see the next test release, they will find that the Entity Framework supports Foreign Keys (FKs), which provide simpler mapping of data sets, James wrote back in March. "FK Associations need no mapping, simply mapping the Entity(Set) is sufficient," he noted at the time. "This will simplify life for a lot of people who need the foreign keys available to them," Lerman said.
According to James, among the other Code Only enhancements coming are the ability to add fake navigation to address missing navigation properties, support for complex type configurations, associated mapping and the ability to extract EDMX produced by Code Only either from an XMLWriter or as an XDocument.
As Microsoft indicated earlier, Code Only will not be in .NET 4.0, except possibly DDL features, James noted. "The rest of Code Only will continue to evolve, and we will ship another preview or two, before getting it rolled into .NET as soon as we can after 4.0 ships."
What's your take on the Entity Framework? Please comment below or drop me a line at [email protected] .
Posted by Jeffrey Schwartz on 10/13/2009 at 7:36 PM
Microsoft last week talked up the release of a bridge designed to let Java developers utilize Microsoft's ADO.NET Data Services.
The company that actually developed the bridge, Noelios Technologies, is a small France-based consulting services firm. But the release is noteworthy because Microsoft funded the company's work on the tool and announced the release on its Interoperability Blog. The bridge, called Restlet 2.0 M5, is based on an extension to the open source Restlet Framework, which is designed to allow Java developers to create RESTful Web 2.0-type applications (see "Bridge Connects Java to ADO.NET Data Services").
Wayne Citrin, CTO of JNBridge, an established supplier of .NET-to-Java bridges, argues Noelios' new offering will have "narrow" appeal. "If you are writing Java and are calling ADO.NET Data Services, great, but if you are calling something else or you're writing .NET code and want to call Java code, you're going to be disappointed," Citrin said in an interview.
However, the bridge may attract more interest than Citrin thinks, some observers said. "JNBridge is doing hard interoperability at the protocol level, which is important to many enterprise customers in terms of merging the worlds of .NET and Java," said IDC analyst Al Hilwa in an email. But REST could prove to be a viable alternative for bridging Java and .NET using Web services, he noted.
"REST is definitely catching on as a new way of structuring loosely coupled applications as interoperable parts of other applications," Hilwa said. "It certainly allows for divergent technologies to work together even though in their guts they are made up in completely different ways."
Citrin argued his company's JNBridgePro 4.1 also provides links to both ADO.NET and ADO.NET data services, saying the former is a mainstream interface for data connectivity while the latter is "pretty new."
But the Noelios release underscores Microsoft's emphasis on ADO.NET Data Services as a mainstream vehicle for interoperability, said Andrew Brust, chief of new technology at twentysix New York. "It's based on REST so nearly any platform can talk to it with a bit of work," Brust said in an email.
"Microsoft is making it even easier for PHP and now Java developers to use ADO.NET Data Services because they don't even need to deal with the REST interface at all. Instead they just generate proxy classes and write their code as if they're talking to local objects."
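Brust's point is easiest to see at the wire level: ADO.NET Data Services exposes entity sets as addressable HTTP resources, so any client that can issue a GET can read the data, proxy classes or not. A hedged sketch in C# (the service URL and entity names are made up for illustration):

```csharp
// Plain HTTP access to an ADO.NET Data Services endpoint; no generated
// proxy classes or Microsoft libraries are required on the client side.
// Requires System.Net and System.IO.
var request = (HttpWebRequest)WebRequest.Create(
    "http://example.com/Northwind.svc/Customers('ALFKI')/Orders");
request.Accept = "application/atom+xml"; // entities come back as ATOM entries

using (var response = request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    string feed = reader.ReadToEnd(); // an ATOM feed of Order entities
}
```

A Java or PHP client does the same thing with its own HTTP stack, which is all the Restlet extension is wrapping in friendlier classes.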
Also worth pointing out, Brust noted, is that ADO.NET Data Services acts as a wrapper not just to Entity Framework data, but also to data storage on Microsoft's forthcoming Azure cloud platform.
Furthermore, the forthcoming release of SQL Server Reporting Services, due out next year, will also expose its content via ADO.NET Data Services, he noted. "This means that developers on many platforms will be able to consume data from a number of Microsoft server products and many .NET custom applications," he said. "All without needing to install Microsoft software on their side of the conversation."
Do you see ADO.NET Services as a viable means of linking your Java applications to .NET data? Drop me a line at [email protected] or feel free to comment here.
Posted by Jeffrey Schwartz on 10/08/2009 at 7:36 PM
The fate of MySQL has been top of mind since Oracle agreed to acquire Sun Microsystems earlier this year for $7.4 billion. Will Oracle spin it off, treat it as a strategic asset or let it die a slow death?
Well, Oracle CEO Larry Ellison this week finally shed some light on that question during an interview conducted by none other than Ed Zander, once president and COO of Sun. Ellison made his remarks at The Churchill Club, a non-profit Silicon Valley forum.
"We're not going to spin it off," Ellison told Zander (video courtesy of TechPulse360). "The U.S. government cleared this, we think the Europeans are aware we are not going to spin it off." As reported earlier this month, the European Commission said it is investigating the deal based on concerns about the impact it will have on MySQL. The move came just weeks after the U.S. Department of Justice approved the deal.
"MySQL and Oracle don't compete," Ellison said. Rather Oracle competes with IBM's DB2, Microsoft's SQL Server and databases from Sybase and Teradata, among others.
Forrester analyst Noel Yuhanna believes MySQL could become a strategic asset to Oracle. "MySQL has become a major force and a threat to Oracle and Microsoft," Yuhanna said in an email. He points out that companies like Facebook, Twitter, Google, Skype, Safeway and Comcast are already major users of MySQL.
"And many others are considering making it part of their database strategy, including some large Fortune 100 companies," Yuhanna noted. That said, MySQL fills an important gap in Oracle's market: small to medium-sized applications, where Microsoft SQL Server has dominated for years, he added.
"We believe MySQL will be positioned against SQL Server and also offering a migration path to Oracle databases, so this acquisition, especially MySQL, is critical for Oracle, and I am sure Microsoft is watching it very closely."
And perhaps, in this case, Microsoft is in the ironic position of rooting for the European Commission?
What's your take? Drop me a line at [email protected] or post a comment below.
Posted by Jeffrey Schwartz on 09/25/2009 at 7:36 PM
Earlier this month, Sentrigo Inc. issued a warning pointing to a vulnerability in Microsoft's SQL Server database in which unencrypted passwords could be accessed by unauthorized individuals. The way the vendor put it, someone could retrieve the passwords by reviewing the contents of SQL Server process memory using widely available tools.
Sentrigo described this as a serious issue -- so serious that it is offering a free tool to remedy the situation. That same day, Microsoft posted a bulletin saying it is not classifying the issue as a vulnerability.
"We checked with the security researchers who reported the issue and they confirmed that this is an information disclosure issue requiring the attacker to first have administrative control of the installation," according to its posting. "Therefore, we do not consider this a bulletin class vulnerability."
But the folks at Sentrigo argue Microsoft is missing the point. In a FAQ posted on its Web site, the company acknowledged that a perpetrator must have administrative privileges for the vulnerability to be a threat. But in most organizations, the vendor argued, more than one individual has administrative access. Also, because many apps run with administrative privileges, SQL injection attacks could reveal passwords as well. Running Sentrigo's tool erases the first and last characters of passwords.
Gary McGraw, chief technology officer of Cigital, a consulting firm that specializes in software security, told me the best solution is simply not to give out administrative privileges.
"Of course administrators can do evil things. That's why you shouldn't allow everyone to be an administrator," McGraw said. "A lot of people to this day tend to run all sorts of things, including database engines, Web servers, and programs, as an administrator, or as root, and that's an extremely bad idea. The good news is modern versions of Windows and modern versions of other operating systems are making it easier to run programs with much less privileges."
Still, some bloggers argue Microsoft is sweeping the issue under the rug. Do you think Microsoft needs to take a more proactive stance here, or do organizations need to look at how they assign administrative rights to those who run SQL Server and related applications?
Feel free to comment, or drop me a line at [email protected].
Posted by Jeffrey Schwartz on 09/17/2009 at 7:36 PM
Oracle yesterday released a major new upgrade to its flagship database that pushes the envelope with advances in clustering and grid computing, storage management and support for large queries from data warehouses.
The availability of Oracle 11gR2 comes two years after the release of its last upgrade and is important for any Oracle shop that is looking to have the latest and greatest in the company's database technology (a look at those features can be found here).
The question is: can Oracle, regarded as the database leader, convince its installed base to upgrade? Or, in these days of reduced IT spending, will enterprises migrate to alternatives including Microsoft's SQL Server, IBM's DB2, or open source databases such as those offered by EnterpriseDB (sponsor of Postgres), Ingres and others affiliated with the burgeoning Open Database Alliance, initiated months back by MySQL founder Monty Widenius (see "MySQL Creators Move to Keep MySQL Open")?
"Oracle continues to dominate the DBMS industry in market share and technology innovation," said Forrester Research analyst Noel Yuhanna. "Oracle has the most number of database features and no one disputes their technology leadership in this space. However, SQL Server and DB2 have been catching up with more advanced features which is definitely putting a lot of pressure on Oracle. But for now, they continue to enjoy their dominance, but have to be careful in the coming years."
Indeed, IBM and Microsoft are coming after Oracle from different vantage points. In one example, as reported last month, I visited IBM's Thomas J. Watson Research Center in Hawthorne, N.Y., where the company launched its Smart Analytics System and coincidentally reached an agreement to acquire leading predictive analytics supplier SPSS for $1.2 billion.
Meanwhile, Microsoft last week released the first Community Technology Preview (CTP) of what it describes as its own "massively scalable" data warehouse solution, known as Project Madison. Based on Microsoft's acquisition of DATAllegro, Madison is the company's planned data warehouse appliance, which it said lets organizations scale their warehouses from as little as 50 gigabytes to over a petabyte of data. "Unlike its competitors, Madison offers hardware flexibility with configurations from the major hardware vendors and low cost through industry standard hardware," said Val Fontama, a product manager in Microsoft's SQL Server group, in a blog posting last week.
Oracle's answer to that, the Oracle Database Machine, is an appliance-based package offered in partnership with Hewlett-Packard, based on the Exadata server and storage components launched last year. Queries can cache data across all the servers in a cluster and run as in-memory parallel queries, according to Oracle. "If you have a 10-node cluster, you can have a terabyte or two of memory in those clusters," said Andy Mendelsohn, Oracle's senior vice president of database server technologies, during a conference call Tuesday announcing the release. "If you have data in your database that can be cached in a terabyte or so of memory, this in-memory parallel query technology will engage and it can be used."
But the battle will also be fought on the low end. SQL Server continues to grow in share and capability, while EnterpriseDB, in addition to going after disaffected MySQL customers, was founded on the premise of convincing customers to move their Oracle databases to Postgres. "Our original founding was around providing an alternative database to Oracle, untangling the lock-in that is created between the application and the database," said EnterpriseDB CEO Ed Boyajian. At the other end of the spectrum, Boyajian told me the company has seen nearly 10,000 downloads of its migration wizard from those looking to move apps from MySQL to Postgres. "It's not insignificant, and that's just for the migration wizard," he said, admitting it remains to be seen how many actually make the switch.
As Oracle gets ready to close its acquisition of Sun Microsystems, things should get interesting. With all of these options, what's on your plate? Is your shop considering a migration or staying put? Drop me a line at [email protected].
Posted by Jeffrey Schwartz on 09/02/2009 at 1:15 PM
For Oracle shops that run .NET applications, there are two commercial alternatives. DevArt offers dotConnect for Oracle (formerly known as OraDirect .NET), which boasts Entity Framework support. And Progress DataDirect last week released its Connect for ADO.NET Entity Framework provider for Oracle, which is based on a 100 percent managed code architecture and doesn't require the Oracle client, noted Elisa Flasko, Microsoft's program manager for data programmability, in a blog posting.
Progress DataDirect would not reveal pricing for its new tooling, but suffice it to say, many developers are waiting to see whether Oracle will offer a free provider of its own. "The question is 'where is Oracle and when is it coming?'" said Julie Lerman, an MVP and author of Programming Entity Framework, in an interview. "I don't believe it's a question of if; they keep saying we don't have any dates yet," Lerman said.
Indeed, I have heard from those who don't want to pay for connectivity. When Microsoft said in June that it was discontinuing its Oracle data provider, the common wisdom was: 'no worries, Oracle has its own provider that is quite well received by .NET developers, and an updated one is coming shortly.'
Still, there are plenty of large enterprises that need the connectivity and will invest in third-party tools. "It's a pretty serious product aimed at big enterprise teams and I am sure that it is priced accordingly," Lerman said. "I've talked to one team that I know is using it and they seem to be happy with it."
It bears noting that the current release from Progress DataDirect only supports Entity Framework 1, not the forthcoming release that will be part of Visual Studio 2010 and the .NET Framework 4.
If you're working with the Entity Framework and need to connect to Oracle databases, are you looking at these third party tools or are you holding out to see what Oracle delivers? Drop me a line at [email protected]
Posted by Jeffrey Schwartz on 08/26/2009 at 1:15 PM
Microsoft this week is rolling out a twin bill of previews that will be noteworthy to any developer interested in the future of Redmond's SQL Server platform. One includes a preview of the next release of its traditional premises-based database, SQL Server, while the other will give a look at Microsoft's cloud-based relational database offering.
The community technology preview of its SQL Azure Database was slated to be available today, Wednesday, August 19, though as of this afternoon, OakLeaf Systems principal Roger Jennings said he and others had yet to receive a token.
"They are being very hazy on how long it's going to take to get tokens for SQL Azure to people who had SSDS [SQL Server Data Services] or SDS [SQL Data Services] accounts," Jennings said in an interview. In fact, Microsoft's Zach Owens warned Jennings it could take up to two weeks to get tokens.
"Over the next week or two everyone who has already signed up for a SQL Azure Invitation Code should be receiving an email sent to the address associated with your Live ID containing the token and a link to redeem it," Owens wrote in a blog posting today. "We understand that everyone would like their tokens yesterday but we need to work through the list and ramp up the service. Once the list of current requests has been processed, new requests will be fulfilled within a day or two."
But you can be sure that once he gets his hands on it, Jennings, who blogs about everything cloud and SQL (among other things), will put it through the wringer just as he has with Azure and the former SDS. See his recent Visual Studio Magazine cover story, Targeting Azure Storage. Updated August 20: Jennings informed me he received his invitation today.
For those who haven't been following Microsoft's cloud-based database initiative, SQL Azure Database will be the company's relational database based on SQL Server. It will run atop Microsoft's Azure cloud service. "With SQL Azure Database, you can easily provision and deploy relational database solutions to the cloud, and take advantage of a globally distributed data center that provides enterprise-class availability, scalability, and security with the benefits of built-in data protection, self-healing and disaster recovery," the company said in a blog posting. Developers can register here.
One thing not to look for is support for Transparent Data Encryption, or TDE, which Microsoft had already said would not be supported in this release of SQL Azure Database. Jennings also noted that it will not support column-level encryption.
"I don't think it's going to be a big issue because not many people are storing personally identifiable information in clouds yet," Jennings said, though he noted it will be a hindrance to large organizations that want to process credit card payments.
Microsoft's other key release is the CTP of SQL Server 2008 R2, which Microsoft said will add support for complex event processing (CEP) with a technology it calls StreamInsight. As reported here, Microsoft at TechEd announced plans to offer CEP, which enables algorithmic trading, fraud detection and clickstream Web analytics, among other real-time analytic capabilities.
As you look at either or both of these CTPs, let me know what you think at [email protected].
Posted by Jeffrey Schwartz on 08/19/2009 at 1:15 PM
Many database developers have loudly bemoaned Microsoft's decision late last year to marginalize LINQ to SQL in favor of its ADO.NET Entity Framework.
The angst played out as many who build applications designed to access Microsoft's SQL Server felt left holding the bag, as reported here. Make no mistake: the Entity Framework is Microsoft's object-relational mapping (ORM) technology of choice, and that will become even more evident next year with the release of Entity Framework 4, Visual Studio 2010 and the .NET Framework 4.
Hence there will be no major emphasis on LINQ to SQL from Microsoft beyond some fixes and occasional tweaks. To many, that decision was an unfortunate turn of events because, as many developers say, LINQ to SQL is faster and easier to work with than the Entity Framework. And if all you're trying to connect to is SQL Server, it does the trick, whereas the Entity Framework will at some point become more appealing to those looking to connect to multiple vendors' back-end databases.
But there have been some positive developments for those who intend to stick with LINQ to SQL. For one, Microsoft did make some improvements to it in the .NET 4 Framework, noted Damien Guard, a software development engineer within Microsoft's Data Programmability Group (see his June blog posting).
For those who don't feel that's enough, Tony Sneed, a consultant and trainer, believes those looking to use an ORM for SQL Server will be better off with LINQ to SQL than with the Entity Framework. In a blog posting last week, he pointed to a client, Credit Solutions, that successfully used LINQ to SQL for a line-of-business application. "You're going to get up and running much more quickly with LINQ to SQL than you would with Entity Framework," Sneed said in an interview.
But the question he raised in the blog posting is: who is adding new features to LINQ to SQL? The first viable alternative he has come across is PlinqO, developed by code generation tool supplier CodeSmith. Sneed explained in his posting:
The purpose of PlinqO is to generate LINQ to SQL entities that replace those created when you add a dbml file to a Visual Studio project. In fact, PlinqO will generate the dbml file for you, placing each entity in a separate .cs file under a common folder. Actually, PlinqO creates two files: one for the code-generated entity, and another for your own custom metadata that will not be overwritten by the code-generator (for example, attributes that can drive a dynamic data web site).
I spoke with Shannon Davidson, general manager at CodeSmith, to ask whether he expects a broad market for LINQ to SQL code generation. "We've had a lot of people tell us they are still not happy with the Entity Framework and that LINQ to SQL is a lighter framework, and is easier to use, and the performance speeds have been faster on LINQ to SQL than Entity Framework so far," he said.
Still, LINQ to SQL has its own quirks, Davidson said, such as creating the original DBML (Database Markup Language), the XML-based language that describes databases. As Sneed pointed out, developers who use PlinqO don't have to use the designer to create the original DBML, nor do they have to remove and re-add entities in the DBML designer. "When using regular DBML, I noticed that it's a little flaky as far as how it re-generates the code and I also noticed it deleted files that I didn't want to delete," Davidson said. With the CodeSmith templates, you make your data changes, right-click to re-generate, and your DBML is automatically updated, he said.
PlinqO also addresses a big complaint, that LINQ to SQL didn't let developers detach entities, by supporting the detaching of entities, he said. The templates are free to users of CodeSmith's tools (the professional edition costs $299). And while Davidson sees a lot of interest in LINQ to SQL, CodeSmith intends to offer tools for Entity Framework 4 and just last week released NHibernate templates (see last week's update on NHibernate here).
Indeed, while PLinqO is getting a lot of buzz in Twitter land, it's not the only option for improving LINQ to SQL. MVP Jim Wooley, an expert in both LINQ to SQL and the Entity Framework and author of the book LINQ in Action, pointed out in an email interview that there are other options. One alternative is from Huagati, a company in Bangkok, which Wooley noted offers some of the same features that PLinqO has.
Wooley pointed to some other worthy options for the initiated. One is the LINQ to SQL Templates for T4 project, designed for those looking to develop customized code generators (see Microsoft's CodePlex site). "I think the L2ST4 project has a lot to offer, particularly in terms of Visual Studio integration," he noted. "Many of those that are attracted to PLinqO like it for some of the built-in functionality that they don't want to customize. Both of these alternatives are limited in their abilities, as many of the customizations that people want require changes to the underlying provider, which is not exposed publicly in the framework."
Other options, Wooley noted, include the IQueryable Toolkit, contributed to CodePlex by Matt Warren, a software architect on Microsoft's C# programming language product team, and the Mono LINQ to SQL implementation, for those interested in extending LINQ to SQL. Wooley also said developers could try combining these projects with the Entity Framework Provider sample to implement enhancements to the Entity Framework, should they choose to go down that path.
Don’t look to any of these options to become a broad trend, according to Wooley. "None of these options are trivial undertakings, thus I wouldn't be surprised if less than 1 percent of the developer population would consider going down this route," he noted.
"The bottom line," according to Sneed in his blog posting, "is that LINQ to SQL is a perfectly viable alternative when you can guarantee that the database will be Microsoft SQL Server (2000 or later). It has support for POCO (persistence ignorance), stored procedures, lazy loading, and concurrency management, and it works well with SOA (n-tier) architectures."
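To make Sneed's point concrete, here is a minimal sketch of the kind of strongly typed, object-oriented query LINQ to SQL enables against SQL Server. The `Customer` class, its columns and the connection string are invented for illustration, not taken from any real schema:

```csharp
using System;
using System.Linq;
using System.Data.Linq;
using System.Data.Linq.Mapping;

// Hypothetical entity mapped to a hypothetical Customers table.
[Table(Name = "Customers")]
public class Customer
{
    [Column(IsPrimaryKey = true)] public int CustomerId;
    [Column] public string Name;
    [Column] public string City;
}

class Demo
{
    static void Main()
    {
        // Assumed connection string; requires a reachable SQL Server database.
        using (var db = new DataContext("Data Source=.;Initial Catalog=Shop;Integrated Security=True"))
        {
            // The query is stated in terms of objects, not SQL text,
            // and is checked by the compiler at build time.
            var londoners = from c in db.GetTable<Customer>()
                            where c.City == "London"
                            orderby c.Name
                            select c;

            foreach (var c in londoners)
                Console.WriteLine(c.Name);
        }
    }
}
```

Because the provider translates the expression tree to T-SQL, this approach is tied to SQL Server, which is exactly the trade-off Sneed describes.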
Against this backdrop, do you plan to continue your efforts with LINQ to SQL? Do you see yourself ultimately becoming proficient in the Entity Framework, or are you going to focus on NHibernate? Granted, this isn't an either-or decision for many, but we'd like to hear your opinions and issues so we can further report our findings. Feel free to post them here, or drop me a line at [email protected].
Posted by Jeffrey Schwartz on 08/10/2009 at 1:15 PM | 10 comments
A new release of NHibernate, the open source .NET port of the Java Hibernate object relational mapping tool, is now available.
NHibernate 2.1.0, which went GA last week, is comparable to Hibernate 3.2.6/3.3.1 in terms of features, according to a posting on the SourceForge site, where developers can download the new release.
"It’s a wide array of little things rather than any one big thing," said Stephen Bohlen, one of the NHibernate contributors and organizer of the New York City ALT.NET user group.
Among them, the upgrade decouples the dynamic proxy engine, which was previously hard-coded to work only with Castle Windsor's dynamic proxy engine, Bohlen told me. "That was changed so it was possible for adopters to swap in whatever dynamic proxy engine they want, whether that be one from Spring.NET, among others."
The new release also introduces a refactoring that lays the groundwork for the LINQ to NHibernate provider, which will ultimately appear in a subsequent release, probably NHibernate 3.0, Bohlen said.
"A lot of the syntax parsing for how it actually takes the object model and turns it into executable SQL that gets sent to various different databases, is now configurable in the engine," Bohlen said. "So you can use the older parser if you need to for compatibility reasons but the newer parser is also available as a configuration choice so you can begin to get familiar with how that new parsing engine is going to work with LINQ to NHibernate."
That doesn't mean that LINQ to NHibernate is in this release, he cautioned, but much of the reshaping of the way the code works can now support it, he said. "So it's kind of foundational work for what will eventually be in 3.0 for LINQ to NHibernate."
So what is the significance of LINQ to NHibernate? There are two core ways to structure queries in NHibernate today. The first uses HQL, the Hibernate Query Language, a text-based language usually assembled from strings in the same way a developer would assemble SQL strings in any other data environment. "The advantage of HQL is the query is stated in terms of your objects rather than in terms of your database," Bohlen said.
"But it is still a text-based query language, meaning there's an opportunity to get the syntax wrong. And in Visual Studio, there is no IntelliSense support for writing HQL queries."
The second way NHibernate developers can structure queries today is the Criteria API, a set of operations and methods on the Criteria class that lets developers construct queries from method calls. "The advantage of using methods, of course, is that it's strongly typed, you get IntelliSense support, and it's much easier to construct those things," he said.
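For readers who haven't used NHibernate, here is a rough sketch contrasting the two query styles Bohlen describes. The `Customer` entity, its mapping and the open `ISession` are assumptions made for illustration:

```csharp
// Assumes an NHibernate session factory has been configured and a
// Customer entity with a City property has been mapped; 'session' is
// an open NHibernate.ISession. All names here are hypothetical.

// 1. HQL: a text-based query stated in terms of objects, not tables.
//    A typo inside the string is only caught at runtime.
var fromHql = session
    .CreateQuery("from Customer c where c.City = :city")
    .SetString("city", "London")
    .List<Customer>();

// 2. Criteria API: the query is built from method calls, so it is
//    strongly typed and gets IntelliSense, at the cost of learning
//    a second, NHibernate-specific vocabulary.
var fromCriteria = session
    .CreateCriteria<Customer>()
    .Add(NHibernate.Criterion.Restrictions.Eq("City", "London"))
    .List<Customer>();
```

Both queries express "customers in London" against the object model; the difference is purely in how much of the query the compiler can check for you.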
The problem with the Criteria API, he said, is that it comes with a learning curve. LINQ to NHibernate will be a LINQ provider that lets developers use what is becoming a familiar skill set: writing LINQ queries, whether with LINQ to Objects, LINQ to SQL, LINQ to Entities or the ADO.NET Entity Framework, he said.
"As more and more people know how to structure the queries using LINQ, once LINQ to NHibernate is constructed, that will enable someone to write queries against the NHibernate object model using standard LINQ syntax, the same standard LINQ operators that people are familiar with today in writing LINQ to SQL, LINQ to Entities or the Entity Framework," Bohlen said. "You still won't be writing database queries in the sense that you won't say 'select from table so-and-so,' but you will be saying 'select from this collection of objects where the following values are' and so on."
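Bohlen's point can be sketched in hypothetical code. Since the provider hadn't shipped at the time, the `Customer` entity and the `Query<Customer>()` entry point below are assumptions about what such a LINQ query could look like, not the actual LINQ to NHibernate API:

```csharp
// Hypothetical sketch of a LINQ-style query over a mapped NHibernate
// object model; 'session' is an open NHibernate ISession and Customer
// is an invented mapped entity.
var londoners = from c in session.Query<Customer>()
                where c.City == "London"
                orderby c.Name
                select c;

// Note this is still a query over the collection of mapped objects,
// not over tables: "select from this collection of objects where the
// following values are," as Bohlen puts it. The provider's job is to
// translate the expression tree into SQL behind the scenes.
```

The appeal is that the same standard query operators work whether the backing store is objects in memory, SQL Server via LINQ to SQL, or an NHibernate-mapped database.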
So that begs the question, will LINQ to NHibernate make NHibernate more appealing as an alternative to Microsoft's own offerings including the Entity Framework?
In terms of structuring queries, it will bring NHibernate into parity on the learning curve of an object-relational mapper, according to Bohlen. "There's obviously a whole host of things that you have to get your head around as a new adopter of ORM technology, but certainly the query language supported by the object-relational mapper is one of those areas," he said. "Once LINQ to NHibernate is actually fully baked, I think there will be a much greater parity there, in terms of once you've mastered the LINQ query language you'd be able to apply those skills directly to NHibernate, in the same way that today, to a degree, you are able to apply those skills directly to the Entity Framework."
Many question how many enterprise developers will favor NHibernate over the Entity Framework once Visual Studio 2010 and the Entity Framework 4 ship next year. That release adds POCO support, template-based code generation and LINQ to Entities improvements, among other features.
Nevertheless, Stephen Forte, director of technology at Telerik, which itself offers an ORM tool, wonders how quickly developers will adopt the Entity Framework, even though it is being baked into the .NET Framework 4.
"If you look at all these new features, you say, 'Oh, that sounds cool, but when do I use it? Why should I move from ADO.NET, or why should I move from LINQ to SQL?'" Forte told me. Microsoft still needs to make the case for that, something he is hopeful will happen around the PDC 2009 timeframe in November. Forte had more to say on all these ORM and data access technologies during a recent .NET Rocks interview.
Are you partial to either ORM technology, do you see using a hybrid of both, or are you sticking with ADO.NET and/or LINQ to SQL? Drop me a line and let me know what you think at [email protected].
Posted by Jeffrey Schwartz on 07/29/2009 at 1:15 PM | 1 comment
Developers can now take a look at the Oracle Database Schema Provider that will plug into Microsoft's Visual Studio Team System, thanks to the release of the first beta of the DSP this week.
Microsoft announced back in February at the VSLive! conference in San Francisco that the Oracle database plug-in to VSTS 2010 would be offered as an option by Quest Software, maker of, among other things, the widely used Toad for Oracle tools. Quest launched the beta of the new tool, dubbed Project Fuze.
"Anybody that is interested in seeing what Oracle development will look like in Visual Studio Team System 2010 and beyond can take a look at the Project Fuze data and this will let Oracle developers start to participate in the richness of Oracle application lifecycle management," said Daniel Wood, Quest's head of development.
"Project Fuze is intended for those that have already adopted VSTS and Team Foundation Server as their ALM platform for .NET and SQL Server development. Within that community we also have Oracle installs, but they would like this same tool set. On the flip side are shops that have Oracle deployments and are looking for some type of ALM solution. Many of them do use Visual Studio Team System on the application side, so they would like to bring that over to the Oracle side."
The question is, will shops be willing to pay for this option? Quest is not revealing pricing. When I first wrote about the deal between Microsoft and Quest to provide the DSP, a reader pointed to Oracle's Developer Tools for Visual Studio, or ODT. "Why pay for a third-party plug-in when you can get it for free from the source?" the reader asked.
Wood explained: "The Oracle plug-in for Visual Studio is not meant to integrate with Team System or Team Foundation Server," he said. "On the ALM side, it's there as an extension of Oracle Data Provider, to demonstrate basic functionality of browsing a database and viewing objects within the database; it will not operate with the database professional features in Visual Studio."
The beta can be downloaded here. If you test it, let me know what you think at [email protected].
Posted by Jeffrey Schwartz on 07/23/2009 at 1:15 PM | 0 comments
When Microsoft released Silverlight 3 last week, there was much attention paid to its ability to support rich Internet applications outside the browser. But what does that mean for data-driven applications?
At the MIX 09 conference back in March, Microsoft announced .NET RIA Services. In a blog posting at the time, Brad Abrams, group program manager for Microsoft's .NET Framework, explained:
Microsoft .NET RIA Services simplifies the traditional n-tier application pattern by bringing together the ASP.NET and Silverlight platforms. The RIA Services provides a pattern to write application logic that runs on the mid-tier and controls access to data for queries, changes and custom operations. It also provides end-to-end support for common tasks such as data validation, authentication and roles by integrating with Silverlight components on the client and ASP.NET on the mid-tier.
So that begs the question: Will .NET RIA Services be preferred over ADO.NET Data Services for Silverlight data access? Scott Guthrie, corporate VP of Microsoft's .NET Developer Platform group, said "no" in an interview at last week's launch event.
"The bits that are being released today for RIA Services actually build on top of ADO.NET Data Services," he said. "So you can think of ADO.NET Data Services as providing a kind of lower-layer raw REST API, and then RIA Services as a layer on top. We definitely think that there are scenarios where you would want to have a pure REST service model. And then the .NET RIA Services gives you things like the validation, cross-tiering and higher-level services on top. We've worked hard to layer them nicely, so that RIA Services isn't a competitive technology, but actually just builds on top of ADO.NET Data Services." A complete copy of the interview is available here.
Andrew Brust, chief of new technology at twentysix New York, welcomed the fact that the team is working to integrate RIA Services with ADO.NET Data Services, based on his review of Microsoft's newly released .NET RIA Services overview paper. "This is certainly welcome news," said Brust in an e-mail.
"With the Entity Framework and ADO.NET Data Services joining bare ADO.NET and DataSets, there are already plenty of data access technologies to go around and we certainly didn't need another separate model. It looks like what they're doing with RIA Services is making it a value-added business logic/validation UI toolkit for Silverlight that works on top of ADO.NET Data Services."
What's your take on .NET RIA Services? Drop me a line at [email protected].
Posted by Jeffrey Schwartz on 07/16/2009 at 1:15 PM | 2 comments