"I have seen simple select statements [in Entity Framework] with 4 or 5 includes result in nearly 5,000 line SQL statements when an equivalent hand-written SQL statement is [about] 15 lines."
So reads the first of 21 comments on Microsoft's "ADO.NET Entity Framework (EF) Feature Suggestions" site, where developers can post, vote and comment on proposed EF enhancements. "Improved SQL Generation" is by far the No. 1 feature suggestion, with some 1,400 votes.
Several developer complaints about SQL generation concerned bloated code or slow performance. The comment quoted above is an example of the former; the latter is exemplified by this one:
"I just documented a case where the Contains() operator reduces EF query performance by a factor of 300. Further diagnosis revealed that the slowdown occurs in the query-generation portion of the request, so I agree that this needs some serious attention."
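To make the complaint concrete, here's a hedged sketch of the query shape involved (the context and entity names are hypothetical): EF translates a `Contains()` call over an in-memory list into a SQL `IN` clause with one parameter per element, and in EF 4.x building that translation for a large list could dominate the total request time.

```csharp
using System.Collections.Generic;
using System.Linq;

// A list of a few thousand keys built in memory...
List<int> customerIds = Enumerable.Range(1, 5000).ToList();

// ...then used in a LINQ to Entities query. EF turns Contains() into
// "WHERE CustomerId IN (@p0, @p1, ...)", and the reported slowdown was
// in generating that query, not in executing the resulting SQL.
var orders = context.Orders
                    .Where(o => customerIds.Contains(o.CustomerId))
                    .ToList();
```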
Microsoft's Diego Vega, program manager for DataFx, responded that the core EF library update coming in .NET 4.5 will provide better SQL generation in some cases, but not for all of the listed scenarios. "I am going to leave this idea as open for now, until we figure out a better way to track individual scenarios, which we need to do to get more actionable data," Vega said.
Speaking of .NET 4.5 and EF core libraries, some comments concerned the attempted separation of EF releases from .NET. The EF team shipped the first two releases as part of the .NET Framework, but then released separate updates to get them out to developers faster, as mentioned in this October blog post. That post said the June 2011 CTP was the first attempted full EF shipment separate from the .NET Framework. But technical problems resulted, so several feature improvements -- such as support for enums, spatial types and others -- that rely on .NET core library updates will have to wait for the next EF preview, which will ship along with the upcoming public preview of .NET 4.5.
When Vega reiterated this on the feature request page, a couple of readers found fault with that, as evidenced by this comment:
"Diego, No offense but moving to v4.5 of the framework is really not an option for ... most corporate software."
"I guess because EF core library is included in .NET framework we should wait until then! That is awful, please consider put EF libraries out of .NET framework. Good ORM needs frequent release cycle."
Umm ... isn't that what the team said they were trying to do back in October?
Why, yes, replied Vega in a response to the two reader comments that were posted earlier this month:
"We have indeed looked at taking the whole of EF out of .NET Framework for this same reason. The June 2011 CTP of EF was a first attempt that showed us it is going to be harder than we thought because of the impact on existing applications, partner teams and ADO.NET provider writers."
You can read more about the long, strange trip to EF 5.0 here and bone up on coming EF 5.0 improvements here.
Besides improved SQL generation, the top five feature requests include "Batch CUD Support" (1,179 votes); "EF Support for Second Level Cache" (655 votes); "Support for multiple databases" (595 votes); and "Entity Designer: Speed up & Optimize for using with 200+ entities" (524 votes).
What feature requests would you like to see in EF? Comment here or drop me a line.
Posted by David Ramel on 02/27/2012 at 1:15 PM
The Microsoft ADO.NET team today announced that the upcoming Entity Framework 5.0 could boost application performance by some 67 percent over EF 4.0.
Reduced data access overhead in the O/RM and other performance tweaks paid off in internal testing: in one test, "repeat execution time of the same LINQ query has been reduced by around 6x," the team said in a blog post.
One improvement involves automatically compiling LINQ to Entities queries. Developers have long been able to compile such queries explicitly via a CompiledQuery.Compile method call, but apparently they weren't aware of the feature or found the API more difficult to work with than regular LINQ, the team said. (You can read about the nuts and bolts of automatic compilation here.) Now the translation of inline LINQ queries is automatically cached and CompiledQuery is no longer necessary.
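For context, here is a minimal sketch of the explicit pattern that auto-compilation makes unnecessary. The context and entity names are invented for illustration; note that `CompiledQuery.Compile` requires an `ObjectContext`-derived type, which is part of why the API was harder to use than plain LINQ.

```csharp
using System;
using System.Data.Objects;  // home of CompiledQuery in EF 4.x
using System.Linq;

// Pre-EF 5.0 pattern: compile the LINQ to Entities query once, by hand.
// (BloggingObjectContext and its Blogs set are assumed.)
static readonly Func<BloggingObjectContext, string, IQueryable<Blog>> blogsByOwner =
    CompiledQuery.Compile((BloggingObjectContext ctx, string owner) =>
        ctx.Blogs.Where(b => b.OwnerName == owner));

// EF 5.0 on .NET 4.5: the translation of the equivalent inline query is
// cached automatically, so repeat executions skip re-translation without
// any call to CompiledQuery.
var blogs = context.Blogs.Where(b => b.OwnerName == owner).ToList();
```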
In comparison tests of EF 4.0 and EF 5.0 involving a repeated query executed with ADO.NET, LINQ to SQL and various EF methodologies, "we've improved the performance of LINQ to Entities queries nearly 600%," the team said.
Of course, you shouldn't expect that kind of improvement in your real-world applications. "It would be amazing if we could give you 600% improved performance across the board," the team said. "Unfortunately, most applications do more interesting things than run the same query repeatedly."
However, a suite of tests simulating typical usage in real-world apps showed a 67 percent performance increase after the server was upgraded from EF 4.0 to EF 5.0. "These numbers are based on an internal build, but we're working hard to get a build available for public verification of these improvements," the team said.
Just upgrading to .NET 4.5 will provide these performance improvements to EF 4.0 apps. The ADO.NET team has stated it wants to start shipping EF versions separately from the .NET Framework, but there were some upgrade issues that required core library updates, so some EF improvements have to wait for .NET 4.5. "This includes support for Enum Types, Spatial Types, Table-Valued Functions, Stored Procedures with Multiple Results and Auto-Compiled LINQ Queries," an earlier blog post said.
Also, last week the final go-live release of EF 4.3 was announced, including improvements to Code First migrations, some bug fixes and several other tweaks.
The ADO.NET team blog post about EF 5.0 improvements is brand-new at the time I write this, so no readers have weighed in with comments yet, but I'll keep an eye out for any developer concerns or interesting feedback. In the meantime, you can comment here or drop me an e-mail.
Posted by David Ramel on 02/14/2012 at 1:15 PM
Along with the "SQL Azure Security Services" mentioned earlier this week by Kurt Mackie at Redmondmag.com, Microsoft has also released a SQL Azure Compatibility Assessment.
Microsoft offers an introductory video tutorial for the service, with the following description:
"More and more companies are moving to the cloud. It often starts with moving data that already exists on premises. 'SQL Azure Compatibility Assessment' is a first step in solving the problem of knowing how easily your data can move from SQL Server on premises into the cloud."
A SQL Azure account isn’t required, just a Windows Live ID.
If you’re considering moving to the cloud, give it a try and let us know how it works by commenting here or dropping me a line.
Posted by David Ramel on 02/02/2012 at 1:15 PM
Tutorials, forums, tips-and-tricks sites and the like abound on the Web and I use them constantly to improve my developer skills, but they often leave much to be desired.
For example, it can take a long time to find just what I'm looking for. One of my pet peeves is tutorials that seem to offer just what I'm looking for but are undated, or whose publication date is hard to find, so I waste time checking them out only to discover that the content has since been rendered obsolete.
And the quality of information can vary greatly, as can the presentation. Well-meaning and informative tutorials can be spoiled when the author's grasp of English is so lacking that it becomes distracting and problematic. Also, where do you go if you have questions about the content? Authors may or may not respond to e-mail inquiries or comments, and forum responses can also be hit-and-miss. And how do you judge how well you've learned the material? How does your performance rank with others? What if you want to go more in-depth and really drill down into similar material?
I'm thinking that free online courses offered by some of the top universities in the world -- such as Stanford and MIT -- might solve a lot of these issues.
For example, Stanford just finished up a course titled Introduction to Databases. Instructor Jennifer Widom said that "Over 90,000 accounts were created, 25,000 students submitted at least some work for grading, and 6,500 students did well enough to receive a 'statement of accomplishment.' " You can still access all the online resources at the course Web site if you want to get a taste of the experience, which will soon be improved by optimizing the site for "self-serve" learning, Widom said.
And Stanford next month will offer up Computer Science 101. Of course, these are introductory courses, but other new courses this year include Machine Learning, Game Theory and Design and Analysis of Algorithms I (alas, no database-specific offerings are on tap for you database developers).
At MIT, Introduction to Computer Science and Programming is the No. 1 most-visited course in the school's OpenCourseWare initiative. This program "is a web-based publication of virtually all MIT course content" and doesn't offer certificates and structured teacher-pupil interaction. Other popular MIT offerings include Introduction to C++ and Introduction to Algorithms. Again, these are introductory, but you could also delve into Performance Engineering of Software Systems or the graduate-level Spatial Database Management and Advanced Geographic Information Systems. You can check out the OpenCourseWare Consortium for information on a huge amount of courseware available from hundreds of other schools and organizations.
Even better, MIT this spring will launch a more structured online learning program, similar to Stanford's, called MITx.
Carnegie Mellon University will be offering Secure Coding, Principles of Computing and other courses through its Open Learning Initiative (OLI) program. The university will release details when they become available.
The Harvard University Extension School participates in the OLI with courses such as Bits: The Computer Science of Digital Information.
These are just a few examples and there's a lot more out there, so fire up your browser and take a look. And let me know what you find by commenting here or by dropping me an e-mail.
Posted by David Ramel on 01/26/2012 at 1:15 PM
Microsoft last week shipped the Entity Framework 4.3 Beta 1, with some NuGet integration enhancements and bug fixes in preparation for the final go-live release, expected in the next couple of months.
"We are planning for this to be the last pre-release version of migrations and our next release will be the final RTM of EF 4.3," said a post on the ADO.NET team blog. It said the team is "still on-track to get a full supported, go-live, release of EF 4.3 published this quarter (first quarter of 2012)."
EF 4.3 Beta 1 includes some Code First migration work done last November in a separate beta, now integrated into the EF NuGet package. The Code First Migrations enhancements include new commands, command-line tools, XML documentation, improved logging and more.
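For readers who haven't tried the migrations tooling, these are the basic Package Manager Console commands it centers on (the migration name below is made up for illustration; run these from Visual Studio's Package Manager Console against a Code First project):

```powershell
# Turn on Code First Migrations for the project (adds a Migrations folder).
Enable-Migrations

# Scaffold a migration class for any pending model changes.
Add-Migration AddCustomerPhoneNumber

# Apply pending migrations to the database; -Verbose echoes the SQL.
Update-Database -Verbose

# Or generate a SQL script instead of touching the database directly.
Update-Database -Script
```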
The new beta also fixes some bugs in the DbContext API and Code First.
Another piece of good news for database developers was the reminder of upcoming -- and long sought-after -- enum support: "EF 5.0 (Enum support is coming... finally!)" Along with enums, the 5.0 release will include spatial data types and performance improvements. "As soon as the next preview of the .NET Framework 4.5 is available we will be shipping EF 5.0 Beta 1, which will include all these new features," the blog said.
But some readers were concerned with more than technical nuts and bolts. The disparate versioning system resulting from the separation of Entity Framework and .NET Framework releases led to several comments, such as:
"Oh? EF 5.0 in .NET 4.5? You renamed EF 2.0 to 4.0 to match .NET version, now you are jumping ahead without reasons? Why not keep EF and .NET versions synced? So .NET 4.5 has C# 5.0 and EF 5.0, that's a mess."
"EF 5.0 is comming. What does it mean? In one of your previous articles you mentioned that EF in .NET 4.5 will be renamed to EF Core libraries and DbContext API will be renamed to EF. So what is EF 5.0? Is it DbContext API supporting new features from EF Core libraries included in .NET 4.5? Or is it EF Core libraries."
Microsoft's Diego B. Vega addressed the issue in his own comment:
"EF 5 will ship as a new version of the EntityFramework NuGet package at the same time NET 4.5 and VS 11 ship.
We are not really trying to differentiate EF from .NET. We are simply trying to evolve EF at a different pace. Indeed, we have decided to version EF separately from .NET and to follow semantic versioning rules strictly."
Vega went on to discuss the issue in more detail, and the entire EF versioning issue is explained in this blog post, if you're curious.
The ADO.NET team also posted some EF 4.3 hands-on tips and guidance in a Code-Based Migrations Walkthrough, an Automatic Migrations Walkthrough and a post on EF 4.3 Configuration File Settings.
How do you database developers like the new EF enhancements such as Code First and the DbContext API? Or do those mere technicalities pale in comparison to the odd versioning structure? Comment here or drop me a line.
Posted by David Ramel on 01/19/2012 at 1:15 PM
A recent salary survey indicates that database-related jobs provide good job security, and don't rank too badly on the salary side of things, either.
Visual Studio Magazine's 2012 .NET Developer Salary Survey noted that, "In terms of top job functions for security and retention, database administrator/developer ranked highest (46.5 percent), followed by senior engineer/senior software developer (43.5 percent) and software architect (43 percent)."
As far as technologies that provided perceived job security/retention, SQL Server was No. 2.
Salary-wise, the average base for database administrator/developer types was $91,276, pretty much aligned with the median base salary of all respondents, $92,000.
That compares to a $95,212 average base salary reported by database developers in Redmondmag.com's 2011 Windows IT Salary Survey last August. Interestingly, in that survey, the data devs' salary had fallen from No. 1 the previous year to No. 4.
Some more tidbits for you data types in the new .NET developer survey:
"Only 4.2 percent of survey respondents categorized their role as database administrator/developer. However, 67.5 percent of 1,104 respondents reported a background -- they had worked on a project for at least six months -- in database development: 45.3 percent in database administration and 24.2 percent in data warehousing."
It seems to me in this still-shaky economic climate that high job security is comparatively better than a high salary. Remember, if you're a working database developer, you're lucky to have a job, and probably thousands of equally qualified unemployed workers would gladly trade places with you at just about any salary.
Or, as one respondent put it, "There is a salary freeze and I do not anticipate any changes (which is fine with me ... I'm employed)."
What is it about the database field that provides (relative) job security? Comment here or drop me a line.
Posted by David Ramel on 01/12/2012 at 1:15 PM
As Microsoft continues to make news about opening up its developer technologies (the latest being opening its Windows Azure cloud platform to Linux servers), it's easy to forget how the process works both ways. Witness last week's under-the-radar release by Oracle of the production data provider "for Entity Framework and LINQ developers." This lets Oracle developers do all their work in Visual Studio for certain projects while taking advantage of almost all the latest Microsoft database APIs.
My, how open source has changed things. Remember the old days when proprietary software vendors fought tooth and nail to convert users to their proprietary technologies? For you database developers, it used to be Microsoft (SQL Server) vs. Oracle vs. Borland vs. Sybase, and, on a broader scale, it evolved into .NET vs. Java. Developers were firmly entrenched in one camp or the other and felt free to viciously (and usually anonymously) flame the non-believers in forums, comments and blog posts.
Now, it seems, every software development tool will soon just work with every other software development tool. We're heading for one big, happy family of developers.
Anyway, back to the news of special importance to you data developers. I guess Oracle decided to bury the announcement of "ODAC 11.2 Release 4 and Oracle Developer Tools for Visual Studio (11.2.0.3.0)" because the beta has been out for quite some time. The 11.2 Release 3 download was posted exactly a year earlier.
Release 4 "introduces tools and data provider support for ADO.NET Entity Framework, Language Integrated Query (LINQ), and WCF Data Services," according to an Oracle data sheet (PDF here).
The release's database client works with Oracle Database 9.2 and above. On the Microsoft side, it supports Visual Studio 2010 and the .NET Framework 4, with support for Entity Framework 4.1 and 4.2. It also supports OData, LINQ to Entities and "implicit REF CURSOR parameter binding." However, it doesn't support some of the newer Entity Framework features, such as Code First and (apparently) DbContext. (Non-support of the latter isn't mentioned explicitly in the latest announcement, but it wasn't included in earlier versions.)
To show developers how to use the Entity Framework with the data provider, Oracle has posted this article and an "Entity Framework, LINQ and Model-First for the Oracle Database" tutorial. Much more related information can be found at the Oracle Data Provider for .NET Developer's Guide.
The new production release comes in 32-bit and 64-bit downloads, with different installer/deployment options, including Xcopy.
The Oracle data provider is just one of about a dozen third-party ADO.NET providers, including one for MySQL.
What do you think of Oracle's support for Entity Framework and move toward more interoperable technologies in general? Comment here or drop me a line.
Posted by David Ramel on 01/05/2012 at 1:15 PM
It was about two years ago when I first wrote about the exciting development possibilities of "Mining the Cloud," with new data markets such as the "Dallas" project on Windows Azure.
Well, Dallas has matured into the Windows Azure Marketplace, and at least one forward-looking research organization is predicting the fruition of that effort into something really big. One of O'Reilly Radar's "Five big data predictions for 2012" published last week is the "Rise of data marketplaces." It reads:
"Your own data can become that much more potent when mixed with other datasets. For instance, add in weather conditions to your customer data, and discover if there are weather related patterns to your customers' purchasing patterns. Acquiring these datasets can be a pain, especially if you want to do it outside of the IT department, and with some exactness. The value of data marketplaces is in providing a directory to this data, as well as streamlined, standardized methods of delivering it. Microsoft's direction of integrating its Azure marketplace right into analytical tools foreshadows the coming convenience of access to data."
Indeed, from the "dozens of feeds" I discovered in my initial exploration of Dallas, Windows Azure Marketplace now boasts "thousands of subscriptions and trillions of data points," with more coming online regularly, such as historical weather data and a "Stock Sonar Sentiment Service" added last month.
Two years ago I demonstrated how easy it was to subscribe to a data feed and incorporate it into custom reports and visualizations. Imagine what developers can do now.
While Microsoft may be the vanguard of new data-centric initiatives, it's not alone, of course. ReadWriteWeb summarized the emerging data market ... uh, market that developers might tap into in this July piece, and reviewed some of the other players such as Datamarket.com, Factual, CKAN Data Hub and Kasabi. But it looks like Microsoft is indeed the frontrunner. The site even wondered "Is Microsoft's Future in Data-as-a-Service?"
But one worrisome trend that could curtail this movement is the possible loss of hundreds of thousands of raw data sources that come from the federal government as the tanking economy threatens to impose cost-cutting measures that will eliminate or severely curtail services such as Data.gov. "When the current budget cuts were revealed to include cuts to the e-government fund that supports Data.gov, everyone starting questioning Data.gov's value," reads a blog posting from the Sunlight Foundation last April when budget cuts were announced. "The cuts could spell the end of Data.gov," warned a Washington Post blog at the time. And this is with a Democrat in the White House!
The site is still up for the time being, but it's somewhat alarming that the last blog posting on the Data.gov site's Open Data section announced the resignation of the program executive last summer. And there's little activity on the forums in the "Developer's Corner" section of the site.
But with demand, there will be supply, of course, so data markets such as Windows Azure Marketplace will continue to provide valuable information that can be incorporated into exciting new development opportunities -- you just might have to pay more for less. But that's nothing new these days.
What do you think about the Windows Azure Marketplace and data markets and opportunities for development of new apps? What's the coolest app you've found that utilizes this data? Do you think the government should continue to fund sites such as Data.gov in this dire economy? Comment here or drop me a line.
Posted by David Ramel on 12/20/2011 at 1:15 PM
There were a few database-related goodies in Microsoft's announcement today about multiple Windows Azure updates, including a new Metro-like UI for the management portal, SQL Azure Federation, increased database size and lower cost-per-gigabyte for the biggest databases.
The Metro-style UI for the SQL Azure Management Portal includes new features such as "new workspaces with the ability to more easily monitor databases, drill-down into schemas, query plans, spatial data, indexes/keys, and query performance statistics," said an announcement in a Windows Azure blog post by Bob Kelly. The post explained that the updates were part of the new "SQL Azure Q4 2011 Service Release," the details of which were posted on another page, by Gregory Leake.
The size of the largest allowable database increases to 150GB from 50GB, Leake said, while a new price cap will decrease the cost-per-gigabyte by 67 percent for the biggest databases. The cap is $499.95 per month.
SQL Azure Federation means "databases can be elastically scaled out using sharding based on database size and application workload," the post said. Federation will be supported in the new portal.
Other improvements include an updated CTP for the DAC Import/Export Service, which reportedly fixes several issues and allows easy import and export of databases between SQL Azure and BLOB storage.
Also, user-controlled collations are now supported, which means users can choose which type of collation to use when creating databases.
Microsoft said to stay tuned for more posts explaining SQL Azure Federation and the new management portal in more detail.
Posted by David Ramel on 12/12/2011 at 1:15 PM
It took about three years from the release of the first Windows-specific SQL Server to a kind of opening up of the architecture with the inclusion of an Open Database Connectivity (ODBC) driver with SQL Server 7.0 in 1998. Some 13 years later, Microsoft has released the first preview of an ODBC driver for Linux.
Announced at the PASS conference in October, the Linux driver was released earlier this week. Specifically, it's a 64-bit driver (32-bit is planned) only for Red Hat Enterprise Linux 5, but it's a start.
This is just the latest in an openness campaign underway at Microsoft (or what the company calls "Microsoft's Commitment to Interoperability"), something that would've been unheard of not that long ago, it seems. At about the same time as the Linux announcement, the company dropped the CTP3 of the JDBC Driver 4.0.
In August 2010, Microsoft Drivers for PHP for SQL Server 2.0 were released, for the first time including the addition of the PDO_SQLSRV driver, which supports PHP Data Objects.
A few months ago, Microsoft announced it was jumping all the way onto the ODBC bandwagon and planning to phase out the OLE DB technology it invented.
And, of course, I recently wrote about another opening up of SQL Server: the discontinuation of the LINQ to HPC project, replaced by support for the open source Apache Hadoop "big data" technology.
You can read more about Microsoft's database connectivity initiatives for ODBC, Java, PHP and more here. The company just continues to embrace new technologies and attract new developers. Welcome to the party.
What's the next open source move you'd like to see Microsoft make? Comment here or drop me a line.
Posted by David Ramel on 12/01/2011 at 1:15 PM
The SQL Server world was abuzz lately with last week's announcement that Microsoft was discontinuing its LINQ to HPC (high performance computing) "big data" project in favor of supporting the open source Apache Hadoop in Windows Server and Windows Azure.
This was an interesting development in the larger context of Microsoft's turn-around embrace of the open source world and many who have questioned its motives and commitment (remember long-ago headlines such as "Microsoft raps open-source approach"?).
But if Denny Lee is representative of Microsoft's motives and commitment, it seems pretty genuine to me. Check out the blog he posted earlier this week, "What's so BIG about 'Big Data'?"
"We are diving deeper into the world of Big Data by embracing and contributing to the open source community and Hadoop," Lee said. And under a heading of "Openness - yes, we're serious about it!", he said "A key aspect is openness and our commitment to give back to the Open Source community." He then talks about Microsoft's participation in last week's "ultimate open source conference," ApacheCon North America 2011.
Lee said Hadoop is important to his Customer Advisory Team because "it is important for our customers," which may sound like marketing-speak, but he notes "we work on some of the most complex Tier-1 Enterprise SQL Server implementations" and goes on to discuss technical aspects of projects such as Yahoo's "largest known cube."
Lee explained more on his personal blog about why he left the BI world to so enthusiastically embrace open source: "It's about the openness of the Open Source community (apologies for the pun) that allows us to focus on solving the actual problem instead of trying to understand how a particular system works."
So say what you will about Microsoft and its marketing strategies, it looks to me like the company has some good people who are doing good work to solve problems that affect real-world users, regardless of the technology used. Sure, it might be a matter of survival in the new IT world, but if it benefits you, take it and run.
What do you think about Hadoop? Comment here or drop me a line.
Posted by David Ramel on 11/17/2011 at 1:15 PM
Coinciding with a new SQL Server 2012 licensing model, OpSource Inc. introduced a cloud-based service that offers developers and others purportedly cheaper pay-as-you-go access to major database systems.
Called OpSource Cloud Software, the new product offers access to Microsoft SQL Server 2008 R2 Standard and other software. OpSource said the cloud service is "ideal for testing and development" in a news release.
While SQL Server 2012 comes with two licensing options -- "one that is based on computing power, and one that is based on users or devices," according to a six-page datasheet -- Cloud Software is available with hourly and monthly on-demand charges, OpSource said. According to a company Web site, SQL Server 2008 R2 costs 66 cents per hour per server. The pricing scheme is a little confusing to me, however. Although the news release stated: "Per Server priced Cloud Software incurs a specific rate per hour when a server is running and a specific rate per hour when a server is stopped," I couldn’t find any information about the rate for a stopped server. So I chatted with Chris, who kind of cleared it up a little, maybe, I think:
You are now chatting with 'Chris'
Chris: Thank you for your interest in OpSource. How may I help you?
me: I'm interested in the Microsoft SQL Server 2008 R2 Cloud Software product. How much is the hourly rate for a stopped server?
Chris: Well for the SQL server license, it has a built in rate of 0.66 cents per hour
Chris: And there will be additional costs for the device footprint as well
Chris: In regards to storage, CPU, and RAM
Chris: In a stopped state, you only pay for the storage
me: For a running server or stopped server? Your news release said there are two different rates for these?
Chris: You will pay the cost of the storage footprint in a standby state
me: What is that pricing structure, for storage?
Chris: However, you will be committed to a 0.66 cent rate even if the device is on standby for SQL
Chris: Well you are only being charged based on the footprint
Chris: Generally, the cost is close to 21.6 cents per GB
Chris: Per month
me: OK. One more question: Do you plan on offering SQL Server 2012 when it's available next year?
Chris: I'm not sure at this moment, I would anticipate us keeping up to date with that version in our new Application Layers
Chris: If you are interested, I can provide you with some trial credit to sandbox the environment
me: No thanks. That's all I had. Bye.
Chris: If you apply our promo code, you can get $200 worth of credit
Chris: Thank you for visiting. Please contact us at anytime.
Our questions and answers got a little out of sync (the chat box didn’t have one of those helpful "Chris is typing" indicators, so I asked more questions before I knew he wasn’t done replying), but you might get the idea, sort of, I hope.
The Cloud Software service also offers several editions of Oracle database products, with "monthly pricing based on number of processors, sockets and server configuration."
OpSource said the SQL Server product "supports up to four processors, up to 64 GB of RAM, one virtual machine, and two failover clustering nodes." It comes bundled with a Windows Server 2008 R2 image.
What do you think? Could this be a cheaper way for developers to test their SQL apps in a pseudo-production environment? Or would you be likely to forget to turn off a server and get one of those nasty cellphone-service-like bill shocks? Comment here or drop me a line.
Posted by David Ramel on 11/10/2011 at 1:15 PM