Ready for SQL Azure? New Service Tells You

Along with the "SQL Azure Security Services" mentioned earlier this week by Kurt Mackie at Redmondmag.com, Microsoft has also released a SQL Azure Compatibility Assessment.

Microsoft offers an introductory video tutorial for the service, with the following description:

"More and more companies are moving to the cloud. If often starts with moving data that already exists on premises. 'SQL Azure Compatibility Assessment' is a first step in solving the problem of knowing how easily your data can move from SQL Server on premises into the cloud."

A SQL Azure account isn't required -- just a Windows Live ID.

If you’re considering moving to the cloud, give it a try and let us know how it works by commenting here or dropping me a line.

Posted by David Ramel on 02/02/2012


A University Education -- for Free

Tutorials, forums, tips-and-tricks sites and the like abound on the Web and I use them constantly to improve my developer skills, but they often leave much to be desired.

For example, it can take a long time to find just what I'm looking for. One of my pet peeves is tutorials that seem to offer just what I'm looking for but are undated, or the publication date is hard to find, so I waste time checking them out only to find that the content has been rendered obsolete for various reasons.

And the quality of information can vary greatly, as can the presentation. Well-meaning and informative tutorials can be spoiled when the author's grasp of English is so lacking that it becomes distracting and problematic. Also, where do you go if you have questions about the content? Authors may or may not respond to e-mail inquiries or comments, and forum responses can also be hit-and-miss. And how do you judge how well you've learned the material? How does your performance rank with others? What if you want to go more in-depth and really drill down into similar material?

I'm thinking that free online courses offered by some of the top universities in the world -- such as Stanford and MIT -- might solve a lot of these issues.

For example, Stanford just finished up a course titled Introduction to Databases. Instructor Jennifer Widom said that "Over 90,000 accounts were created, 25,000 students submitted at least some work for grading, and 6,500 students did well enough to receive a 'statement of accomplishment.' " You can still access all the online resources at the course Web site if you want to get a taste of the experience, which will soon be improved by optimizing the site for "self-serve" learning, Widom said.

And Stanford next month will offer up Computer Science 101. Of course, these are introductory courses, but other new courses this year include Machine Learning, Game Theory, and Design and Analysis of Algorithms I (alas, no database-specific offerings are on tap for you database developers).

At MIT, Introduction to Computer Science and Programming is the No. 1 most-visited course in the school's OpenCourseWare initiative. This program "is a web-based publication of virtually all MIT course content" but doesn't offer certificates or structured teacher-pupil interaction. Other popular MIT offerings include Introduction to C++ and Introduction to Algorithms. Again, these are introductory, but you could also delve into Performance Engineering of Software Systems or the graduate-level Spatial Database Management and Advanced Geographic Information Systems. You can check out the OpenCourseWare Consortium for information on a huge amount of courseware available from hundreds of other schools and organizations.

Even better, MIT this spring will launch a more structured online learning program, similar to Stanford's, called MITx.

Carnegie Mellon University will be offering Secure Coding, Principles of Computing and other courses through its Open Learning Initiative (OLI) program. The university will release details when they become available.

The Harvard University Extension School participates in the OLI with courses such as Bits: The Computer Science of Digital Information.

These are just a few examples and there's a lot more out there, so fire up your browser and take a look. And let me know what you find by commenting here or by dropping me an e-mail.

Posted by David Ramel on 01/26/2012


Entity Framework 4.3 Gets Final Tune-Up -- Enum Support Promised in 5.0

Microsoft last week shipped the Entity Framework 4.3 Beta 1, with some NuGet integration enhancements and bug fixes in preparation for the final go-live release, expected in the next couple of months.

"We are planning for this to be the last pre-release version of migrations and our next release will be the final RTM of EF 4.3," said a post on the ADO.NET team blog. It said the team is "still on-track to get a full supported, go-live, release of EF 4.3 published this quarter (first quarter of 2012)."

EF 4.3 Beta 1 includes some Code First migration work done last November in a separate beta, now integrated into the EF NuGet package. The Code First Migrations enhancements include new commands, command-line tools, XML documentation, improved logging and more.
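To give a flavor of what Code First Migrations produces, here's a minimal sketch of a code-based migration of the kind the Add-Migration command scaffolds in the NuGet Package Manager Console; the class, table and column names are hypothetical, not taken from Microsoft's announcement:

```csharp
using System.Data.Entity.Migrations;

// A hypothetical code-based migration. The Up method is applied when
// migrating the database forward; Down reverses it on rollback.
public partial class AddCustomerEmail : DbMigration
{
    public override void Up()
    {
        // Add a nullable nvarchar(256) Email column to the Customers table.
        AddColumn("dbo.Customers", "Email", c => c.String(maxLength: 256));
    }

    public override void Down()
    {
        // Remove the column if the migration is rolled back.
        DropColumn("dbo.Customers", "Email");
    }
}
```

Running Update-Database from the console then applies any pending migrations such as this one to the target database.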

The new beta also fixes some bugs in the DbContext API and Code First.

Another piece of good news for database developers was the reminder of upcoming--and long sought-after--enum support: "EF 5.0 (Enum support is coming... finally!)" Along with enums, the 5.0 release will include spatial data types and performance improvements. "As soon as the next preview of the .NET Framework 4.5 is available we will be shipping EF 5.0 Beta 1, which will include all these new features," the blog said.
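For those who have been waiting, the promised enum support means an enum property can sit directly in a Code First model and be persisted as an ordinary column. A minimal sketch, assuming EF 5.0 on .NET 4.5, with illustrative type names:

```csharp
using System.Data.Entity;

// An enum mapped directly by EF 5.0; earlier versions forced
// workarounds such as storing a raw int and converting by hand.
public enum OrderStatus
{
    Pending,
    Shipped,
    Cancelled
}

public class Order
{
    public int Id { get; set; }
    public OrderStatus Status { get; set; } // persisted as an integer column
}

public class StoreContext : DbContext
{
    public DbSet<Order> Orders { get; set; }
}
```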

But some readers were concerned with more than technical nuts and bolts. The disparate versioning system resulting from the separation of Entity Framework and .NET Framework releases led to several comments, such as:

"Oh? EF 5.0 in .NET 4.5? You renamed EF 2.0 to 4.0 to match .NET version, now you are jumping ahead without reasons? Why not keep EF and .NET versions synced? So .NET 4.5 has C# 5.0 and EF 5.0, that's a mess."

Another wrote:

"EF 5.0 is comming. What does it mean? In one of your previous articles you mentioned that EF in .NET 4.5 will be renamed to EF Core libraries and DbContext API will be renamed to EF. So what is EF 5.0? Is it DbContext API supporting new features from EF Core libraries included in .NET 4.5? Or is it EF Core libraries."

Microsoft's Diego B. Vega addressed the issue in his own comment:

"EF 5 will ship as a new version of the EntityFramework NuGet package at the same time NET 4.5 and VS 11 ship.

We are not really trying to differentiate EF from .NET. We are simply trying to evolve EF at a different pace. Indeed, we have decided to version EF separately from .NET and to follow semantic versioning rules strictly."

Vega went on to discuss the issue in more detail, and the entire EF versioning issue is explained in this blog post, if you're curious.

The ADO.NET team also posted some EF 4.3 hands-on tips and guidance in a Code-Based Migrations Walkthrough, an Automatic Migrations Walkthrough and a post on EF 4.3 Configuration File Settings.

How do you database developers like the new EF enhancements such as Code First and the DbContext API? Or do those mere technicalities pale in comparison to the odd versioning structure? Comment here or drop me a line.

Posted by David Ramel on 01/19/2012


Database Jobs Provide Job Security, Says Survey

A recent salary survey indicates that database-related jobs provide good job security, and don't rank too badly on the salary side of things, either.

Visual Studio Magazine's 2012 .NET Developer Salary Survey noted that, "In terms of top job functions for security and retention, database administrator/developer ranked highest (46.5 percent), followed by senior engineer/senior software developer (43.5 percent) and software architect (43 percent)."

As far as technologies that provided perceived job security/retention, SQL Server was No. 2.

Salary-wise, the average base for database administrator/developer types was $91,276, pretty much aligned with the median base salary of all respondents, $92,000.

That compares to a $95,212 average base salary reported by database developers in Redmondmag.com's 2011 Windows IT Salary Survey last August. Interestingly, in that survey, the data devs' salary had fallen from No. 1 the previous year to No. 4.

Some more tidbits for you data types in the new .NET developer survey:

"Only 4.2 percent of survey respondents categorized their role as database administrator/developer. However, 67.5 percent of 1,104 respondents reported a background -- they had worked on a project for at least six months -- in database development: 45.3 percent in database administration and 24.2 percent in data warehousing."

It seems to me in this still-shaky economic climate that high job security is comparatively better than a high salary. Remember, if you're a working database developer, you're lucky to have a job, and probably thousands of equally qualified unemployed workers would gladly trade places with you at just about any salary.

Or, as one respondent put it, "There is a salary freeze and I do not anticipate any changes (which is fine with me ... I'm employed)."

What is it about the database field that provides (relative) job security? Comment here or drop me a line.

Posted by David Ramel on 01/12/2012


Oracle Developers Playing in the Microsoft Sandbox? Indeed.

As Microsoft continues to make news about opening up its developer technologies (the latest being opening its Windows Azure cloud platform to Linux servers), it's easy to forget how the process works both ways. Witness last week's under-the-radar release by Oracle of the production data provider "for Entity Framework and LINQ developers." This lets Oracle developers do all their work in Visual Studio for certain projects while taking advantage of almost all the latest Microsoft database APIs.

My, how open source has changed things. Remember the old days when proprietary software vendors fought tooth and nail to convert users to their proprietary technologies? For you database developers, it used to be Microsoft (SQL Server) vs. Oracle vs. Borland vs. Sybase, and, on a broader scale, it evolved into .NET vs. Java. Developers were firmly entrenched in one camp or the other and felt free to viciously (and usually anonymously) flame the non-believers in forums, comments and blog posts.

Now, it seems, every software development tool will soon just work with every other software development tool. We're heading for one big, happy family of developers.

Anyway, back to the news of special importance to you data developers. I guess Oracle decided to bury the announcement of "ODAC 11.2 Release 4 and Oracle Developer Tools for Visual Studio (11.2.0.3.0)" because the beta has been out for quite some time. The 11.2 Release 3 download was posted exactly a year earlier.

Release 4 "introduces tools and data provider support for ADO.NET Entity Framework, Language Integrated Query (LINQ), and WCF Data Services," according to an Oracle data sheet (PDF here).

The release's database client works with Oracle Database 9.2 and above. On the Microsoft side, it supports Visual Studio 2010 and the .NET Framework 4, with support for Entity Framework 4.1 and 4.2. It also supports OData, LINQ to Entities and "implicit REF CURSOR parameter binding." However, it doesn't support some of the newer Entity Framework features, such as Code First and (apparently) DbContext. (Non-support of the latter isn't mentioned explicitly in the latest announcement, but it wasn't included in earlier versions.)

To show developers how to use the Entity Framework with the data provider, Oracle has posted this article and an "Entity Framework, LINQ and Model-First for the Oracle Database" tutorial. Much more related information can be found at the Oracle Data Provider for .NET Developer's Guide.
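Once an Entity Framework model has been generated over an Oracle schema with the new provider, querying it is ordinary LINQ to Entities -- the Oracle-specific work happens underneath. A minimal sketch; the context and entity names here are hypothetical, not from Oracle's documentation:

```csharp
using System;
using System.Linq;

// Query an Oracle database through a generated EF model context
// (HrEntities is a hypothetical model-first context name).
class Program
{
    static void Main()
    {
        using (var db = new HrEntities())
        {
            // Translated by the Oracle data provider into Oracle SQL.
            var topEarners = from e in db.Employees
                             where e.Salary > 100000
                             orderby e.LastName
                             select new { e.LastName, e.Salary };

            foreach (var e in topEarners)
                Console.WriteLine("{0}: {1}", e.LastName, e.Salary);
        }
    }
}
```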

The new production release comes in 32-bit and 64-bit downloads, with different installer/deployment options, including Xcopy.

The Oracle data provider is just one of about a dozen third-party ADO.NET providers, including one for MySQL.

What do you think of Oracle's support for Entity Framework and move toward more interoperable technologies in general? Comment here or drop me a line.

Posted by David Ramel on 01/05/2012


Microsoft's Windows Azure Leads the Data Revolution

It was about two years ago when I first wrote about the exciting development possibilities of "Mining the Cloud," with new data markets such as the "Dallas" project on Windows Azure.

Well, Dallas has matured into the Windows Azure Marketplace, and at least one forward-looking research organization is predicting the fruition of that effort into something really big. One of O'Reilly Radar's "Five big data predictions for 2012" published last week is the "Rise of data marketplaces." It reads:

"Your own data can become that much more potent when mixed with other datasets. For instance, add in weather conditions to your customer data, and discover if there are weather related patterns to your customers' purchasing patterns. Acquiring these datasets can be a pain, especially if you want to do it outside of the IT department, and with some exactness. The value of data marketplaces is in providing a directory to this data, as well as streamlined, standardized methods of delivering it. Microsoft's direction of integrating its Azure marketplace right into analytical tools foreshadows the coming convenience of access to data."

Indeed, from the "dozens of feeds" I discovered in my initial exploration of Dallas, Windows Azure Marketplace now boasts "thousands of subscriptions and trillions of data points," with more coming online regularly, such as historical weather data and a "Stock Sonar Sentiment Service" added last month.

Two years ago I demonstrated how easy it was to subscribe to a data feed and incorporate it into custom reports and visualizations. Imagine what developers can do now.

While Microsoft may be the vanguard of new data-centric initiatives, it's not alone, of course. ReadWriteWeb summarized the emerging data market ... uh, market that developers might tap into in this July piece, and reviewed some of the other players such as Datamarket.com, Factual, CKAN Data Hub and Kasabi. But it looks like Microsoft is indeed the frontrunner. The site even wondered "Is Microsoft's Future in Data-as-a-Service?"

But one worrisome trend that could curtail this movement is the possible loss of hundreds of thousands of raw data sources that come from the federal government, as the tanking economy prompts cost-cutting measures that could eliminate or severely curtail services such as Data.gov. "When the current budget cuts were revealed to include cuts to the e-government fund that supports Data.gov, everyone starting questioning Data.gov's value," reads a blog posting from the Sunlight Foundation last April when budget cuts were announced. "The cuts could spell the end of Data.gov," warned a Washington Post blog at the time. And this is with a Democrat in the White House!

The site is still up for the time being, but it's somewhat alarming that the last blog posting on the Data.gov site's Open Data section announced the resignation of the program executive last summer. And there's little activity on the forums in the "Developer's Corner" section of the site.

But with demand, there will be supply, of course, so data markets such as Windows Azure Marketplace will continue to provide valuable information that can be incorporated into exciting new development opportunities -- you just might have to pay more for less. But that's nothing new these days.

What do you think about the Windows Azure Marketplace and data markets and opportunities for development of new apps? What's the coolest app you've found that utilizes this data? Do you think the government should continue to fund sites such as Data.gov in this dire economy? Comment here or drop me a line.

Posted by David Ramel on 12/20/2011


SQL Azure Gets Tune-Up

There were a few database-related goodies in Microsoft's announcement today about multiple Windows Azure updates, including a new Metro-like UI for the management portal, SQL Azure Federation, increased database size and lower cost-per-gigabyte for the biggest databases.

The Metro-style UI for the SQL Azure Management Portal includes new features such as "new workspaces with the ability to more easily monitor databases, drill-down into schemas, query plans, spatial data, indexes/keys, and query performance statistics," said an announcement in a Windows Azure blog post by Bob Kelly. The post explained that the updates were part of the new "SQL Azure Q4 2011 Service Release," the details of which were posted on another page, by Gregory Leake.

The size of the largest allowable database increases to 150GB from 50GB, Leake said, while a new price cap will decrease the cost-per-gigabyte by 67 percent for the biggest databases. The cap is $499.95 per month.

SQL Azure Federation means "databases can be elastically scaled out using sharding based on database size and application workload," the post said. Federation will be supported in the new portal.

Other improvements include an updated CTP for the DAC Import/Export Service, which reportedly fixes several issues and allows easy import and export of databases between SQL Azure and BLOB storage.

Also, user-controlled collations are now supported, which means users can choose which type of collation to use when creating databases.

Microsoft said to stay tuned for more posts explaining SQL Azure Federation and the new management portal in more detail.

Posted by David Ramel on 12/12/2011


Linux Added to the SQL Server Driver Parade

It took about three years from the release of the first Windows-specific SQL Server to a kind of opening up of the architecture with the inclusion of an Open Database Connectivity (ODBC) driver with SQL Server 7.0 in 1998. Some 13 years later, Microsoft has released the first preview of an ODBC driver for Linux.

Announced at the PASS conference in October, the Linux driver was released earlier this week. Specifically, it's a 64-bit driver (32-bit is planned) only for Red Hat Enterprise Linux 5, but it's a start.

This is just the latest move in the openness campaign underway at Microsoft (or what the company calls "Microsoft's Commitment to Interoperability"), something that would've been unheard of not that long ago, it seems. At about the same time as the Linux announcement, the company dropped the CTP3 of the JDBC Driver 4.0.

In August 2010, Microsoft Drivers for PHP for SQL Server 2.0 were released, for the first time including the PDO_SQLSRV driver, which supports PHP Data Objects.

A few months ago, Microsoft announced it was jumping all the way onto the ODBC bandwagon and planning to phase out the OLE DB technology it invented.

And, of course, I recently wrote about another opening up of SQL Server: the discontinuation of the LINQ to HPC project, replaced by support for the open source Apache Hadoop "big data" technology.

You can read more about Microsoft's database connectivity initiatives for ODBC, Java, PHP and more here. The company just continues to embrace new technologies and attract new developers. Welcome to the party.

What's the next open source move you'd like to see Microsoft make? Comment here or drop me a line.

Posted by David Ramel on 12/01/2011


Microsoft Says It's Serious About Hadoop

The SQL Server world was abuzz lately with last week's announcement that Microsoft was discontinuing its LINQ to HPC (high performance computing) "big data" project in favor of supporting the open source Apache Hadoop in Windows Server and Windows Azure.

This was an interesting development in the larger context of Microsoft's turn-around embrace of the open source world and many who have questioned its motives and commitment (remember long-ago headlines such as "Microsoft raps open-source approach"?).

But if Denny Lee is representative of Microsoft's motives and commitment, it seems pretty genuine to me. Check out the blog he posted earlier this week, "What's so BIG about 'Big Data'?"

"We are diving deeper into the world of Big Data by embracing and contributing to the open source community and Hadoop," Lee said. And under a heading of "Openness - yes, we're serious about it!", he said "A key aspect is openness and our commitment to give back to the Open Source community." He then talks about Microsoft's participation in last week's "ultimate open source conference," ApacheCon North America 2011.

Lee said Hadoop is important to his Customer Advisory Team because "it is important for our customers," which may sound like marketing-speak, but he notes "we work on some of the most complex Tier-1 Enterprise SQL Server implementations" and goes on to discuss technical aspects of projects such as Yahoo's "largest known cube."

Lee explained more on his personal blog about why he left the BI world to so enthusiastically embrace open source: "It's about the openness of the Open Source community (apologies for the pun) that allows us to focus on solving the actual problem instead of trying to understand how a particular system works."

So say what you will about Microsoft and its marketing strategies, it looks to me like the company has some good people who are doing good work to solve problems that affect real-world users, regardless of the technology used. Sure, it might be a matter of survival in the new IT world, but if it benefits you, take it and run.

What do you think about Hadoop? Comment here or drop me a line.

Posted by David Ramel on 11/17/2011


Developers Offered Pay-For-Use Database Cloud Service

Coinciding with a new SQL Server 2012 licensing model, OpSource Inc. introduced a cloud-based service that offers developers and others purportedly cheaper pay-as-you-go access to major database systems.

Called OpSource Cloud Software, the new product offers access to Microsoft SQL Server 2008 R2 Standard and other software. OpSource said the cloud service is "ideal for testing and development" in a news release.

While SQL Server 2012 comes with two licensing options--"one that is based on computing power, and one that is based on users or devices," according to a six-page datasheet--Cloud Software is available with hourly and monthly on-demand charges, OpSource said. According to a company Web site, SQL Server 2008 R2 costs 66 cents per hour per server. The pricing scheme is a little confusing to me, however. Although the news release stated: "Per Server priced Cloud Software incurs a specific rate per hour when a server is running and a specific rate per hour when a server is stopped," I couldn't find any information about the rate for a stopped server. So I chatted online with a company rep named Chris, who kind of cleared it up a little, maybe, I think:

You are now chatting with 'Chris'

Chris: Thank you for your interest in OpSource. How may I help you?
me: I'm interested in the Microsoft SQL Server 2008 R2 Cloud Software product. How much is the hourly rate for a stopped server?
Chris: Well for the SQL server license, it has a built in rate of 0.66 cents per hour
Chris: And there will be additional costs for the device footprint as well
Chris: In regards to storage, CPU, and RAM
Chris: In a stopped state, you only pay for the storage
me: For a running server or stopped server? Your news release said there are two different rates for these?
Chris: You will pay the cost of the storage footprint in a standby state
me: What is that pricing structure, for storage?
Chris: However, you will be committed to a 0.66 cent rate even if the device is on standby for SQL
Chris: Well you are only being charged based on the footprint
Chris: Generally, the cost is close to 21.6 cents per GB
Chris: Per month
me: OK. One more question: Do you plan on offering SQL Server 2012 when it's available next year?
Chris: I'm not sure at this moment, I would anticipate us keeping up to date with that version in our new Application Layers
Chris: If you are interested, I can provide you with some trial credit to sandbox the environment
me: No thanks. That's all I had. Bye.
Chris: If you apply our promo code, you can get $200 worth of credit
Chris: Thank you for visiting. Please contact us at anytime.

Our questions and answers got a little out of sync (the chat box didn’t have one of those helpful "Chris is typing" indicators, so I asked more questions before I knew he wasn’t done replying), but you might get the idea, sort of, I hope.

The Cloud Software service also offers several editions of Oracle database products, with "monthly pricing based on number of processors, sockets and server configuration."

OpSource said the SQL Server product "supports up to four processors, up to 64 GB of RAM, one virtual machine, and two failover clustering nodes." It comes bundled with a Windows Server 2008 R2 image.

What do you think? Could this be a cheaper way for developers to test their SQL apps in a pseudo-production environment? Or would you be likely to forget to turn off a server and get one of those nasty cellphone-service-like bill shocks? Comment here or drop me a line.

Posted by David Ramel on 11/10/2011


Developers Can Test 'Denali' in Amazon Cloud

Microsoft and Amazon are collaborating to offer developer testing of the next version of SQL Server in the Amazon cloud, promising an easier and cheaper evaluation than you could get with a local implementation.

The marriage of Microsoft SQL Server "Denali" (now, SQL Server 2012) and the Amazon Elastic Compute Cloud means developers only have to pay standard Amazon Web Services (AWS) rates to test the beta database software, currently in Community Technology Preview 3. AWS pricing for "standard on-demand instances" ranges from 12 cents to 96 cents per hour.

An AWS site promises easy deployment in five minutes. "With AWS, companies can utilize the Cloud to easily test the new functionality and features of 'Denali,' without having to purchase and manage hardware," the site says. "This provides customers with faster time to evaluation, without any of the complexity related to setting up and configuring a test lab for beta software."

Sounds good to me. I earlier wrote about how a beta evaluation of SQL Server nearly wrecked my system and caused hours of frustration (for me and many others) when I tried to remove it and install the free, Express version.

The Denali program is part of a broader initiative in which Microsoft has developed Amazon Machine Images (AMI) for testing of Web-based products such as WebMatrix and database-related software--basically SQL Server 2008 R2--all running on Windows Server 2008 R2. The Denali AMI was created just a couple weeks ago.

Have you tried testing any Microsoft products on the Amazon cloud? We'd love to hear about your experience. Comment here or drop me a line.

Posted by David Ramel on 10/27/2011


Some Bumps in the Separation of Entity Framework and .NET Framework

It's almost like a feuding spouse who leaves the partner, only to find out how much they're missed, and decides not to cut ties completely but maybe to hang out now and then. Well, almost.

The Entity Framework team disassociated itself from the .NET Framework release schedule after EF 4.0 was released with .NET 4.0. The first manifestation of that new policy came last spring when the EF team released an update, EF 4.1, with developer-requested improvements such as Code First capability and a DbContext API.

"This is the first time we've released part of the Entity Framework as a stand-alone release and we're excited about the ability to get new features into your hands faster than waiting for the next full .NET Framework release," said a posting on the ADO.NET team blog announcing EF 4.1. That was followed up in August with the release of the EF 4.2 Beta 1 preview.

But today comes news that the trial separation didn't work so well and some new EF features--including much-wanted enum support--will have to wait for a full .NET Framework upgrade.

"Our new features that require updates to our core libraries will need to wait for the next .NET Framework release. This includes support for Enum Types, Spatial Types, Table-Valued Functions, Stored Procedures with Multiple Results and Auto-Compiled LINQ Queries" reads an entry on the ADO.NET team blog. [Editor's note: The preceding italicized text was changed due to an error; the italicized text that follows was also changed and refers to this same blog post. We apologize for the errors.]

The post explained that the EF team at first wanted to address these core library updates with a separate, full release of EF instead of waiting for .NET 4.5. The June EF Community Technology Preview was the result, offering up that "The Enum data-type is now available in the Entity Framework."

Well, not so fast. "While we are still pursuing this option it has become clear that from a technical standpoint we are not ready to achieve this immediately," the post said. No details about the technical problems were mentioned. The aforementioned list of EF enhancements "will reappear in a preview of Entity Framework that we will ship alongside the next public preview of .NET 4.5," the post said. The post didn't indicate when that might be.

The .NET Framework 4.5 developer preview was introduced in September at the BUILD conference.

What do you think of the EF and .NET Framework previews? When do you think you'll finally get that enum support? Comment here or drop me a line.

Posted by David Ramel on 10/20/2011
