Linux Added to the SQL Server Driver Parade

It took about three years from the release of the first Windows-specific SQL Server for Microsoft to open up the architecture somewhat by shipping an Open Database Connectivity (ODBC) driver with SQL Server 7.0 in 1998. Some 13 years later, Microsoft has released the first preview of an ODBC driver for Linux.

Announced at the PASS conference in October, the Linux driver was released earlier this week. Specifically, it's a 64-bit driver (a 32-bit version is planned) for Red Hat Enterprise Linux 5 only, but it's a start.
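
If you want to kick the tires, here's a minimal sketch of what talking to SQL Server from Linux through the new driver might look like, using the pyodbc library. The driver name, server and credentials are my assumptions, not anything from Microsoft's documentation; check what name the preview actually registers in odbcinst.ini:

import pyodbc

# Connect through the new Linux ODBC driver; the driver name below is
# an assumption -- use whatever the preview registers in odbcinst.ini.
conn = pyodbc.connect(
    "DRIVER={SQL Server Native Client 11.0};"
    "SERVER=myserver.example.com;"   # hypothetical server
    "DATABASE=AdventureWorks;"
    "UID=dev;PWD=secret")
cursor = conn.cursor()
cursor.execute("SELECT TOP 5 name FROM sys.tables")
for row in cursor:
    print(row.name)
conn.close()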

This is just the latest move in the openness campaign underway at Microsoft (or what the company calls "Microsoft's Commitment to Interoperability"), something that would've been unheard of not that long ago, it seems. At about the same time as the Linux announcement, the company dropped CTP3 of the JDBC Driver 4.0.

In August 2010, Microsoft Drivers for PHP for SQL Server 2.0 were released, for the first time including the PDO_SQLSRV driver, which supports PHP Data Objects.

A few months ago, Microsoft announced it was jumping all the way onto the ODBC bandwagon and planning to phase out the OLE DB technology it invented.

And, of course, I recently wrote about another opening up of SQL Server: the discontinuation of the LINQ to HPC project, replaced by support for the open source Apache Hadoop "big data" technology.

You can read more about Microsoft's database connectivity initiatives for ODBC, Java, PHP and more here. The company just continues to embrace new technologies and attract new developers. Welcome to the party.

What's the next open source move you'd like to see Microsoft make? Comment here or drop me a line.

Posted by David Ramel on 12/01/2011 at 1:15 PM


Microsoft Says It's Serious About Hadoop

The SQL Server world was abuzz over last week's announcement that Microsoft was discontinuing its LINQ to HPC (high performance computing) "big data" project in favor of supporting the open source Apache Hadoop in Windows Server and Windows Azure.

This was an interesting development in the larger context of Microsoft's turnaround embrace of the open source world--an embrace whose motives and commitment many have questioned (remember long-ago headlines such as "Microsoft raps open-source approach"?).

But if Denny Lee is representative of Microsoft's motives and commitment, it seems pretty genuine to me. Check out the blog he posted earlier this week, "What's so BIG about 'Big Data'?"

"We are diving deeper into the world of Big Data by embracing and contributing to the open source community and Hadoop," Lee said. And under a heading of "Openness - yes, we're serious about it!", he said "A key aspect is openness and our commitment to give back to the Open Source community." He then talks about Microsoft's participation in last week's "ultimate open source conference," ApacheCon North America 2011.

Lee said Hadoop is important to his Customer Advisory Team because "it is important for our customers," which may sound like marketing-speak, but he notes "we work on some of the most complex Tier-1 Enterprise SQL Server implementations" and goes on to discuss technical aspects of projects such as Yahoo's "largest known cube."

Lee explained more on his personal blog about why he left the BI world to so enthusiastically embrace open source: "It's about the openness of the Open Source community (apologies for the pun) that allows us to focus on solving the actual problem instead of trying to understand how a particular system works."

So say what you will about Microsoft and its marketing strategies, it looks to me like the company has some good people who are doing good work to solve problems that affect real-world users, regardless of the technology used. Sure, it might be a matter of survival in the new IT world, but if it benefits you, take it and run.

What do you think about Hadoop? Comment here or drop me a line.

Posted by David Ramel on 11/17/2011 at 1:15 PM


Developers Offered Pay-For-Use Database Cloud Service

Coinciding with a new SQL Server 2012 licensing model, OpSource Inc. introduced a cloud-based service that offers developers and others purportedly cheaper pay-as-you-go access to major database systems.

Called OpSource Cloud Software, the new product offers access to Microsoft SQL Server 2008 R2 Standard and other software. In a news release, OpSource said the cloud service is "ideal for testing and development."

While SQL Server 2012 comes with two licensing options--"one that is based on computing power, and one that is based on users or devices," according to a six-page datasheet--Cloud Software is available with hourly and monthly on-demand charges, OpSource said. According to a company Web site, SQL Server 2008 R2 costs 66 cents per hour per server.

The pricing scheme is a little confusing to me, however. Although the news release stated: "Per Server priced Cloud Software incurs a specific rate per hour when a server is running and a specific rate per hour when a server is stopped," I couldn’t find any information about the rate for a stopped server. So I chatted with Chris, who kind of cleared it up a little, maybe, I think:

You are now chatting with 'Chris'

Chris: Thank you for your interest in OpSource. How may I help you?
me: I'm interested in the Microsoft SQL Server 2008 R2 Cloud Software product. How much is the hourly rate for a stopped server?
Chris: Well for the SQL server license, it has a built in rate of 0.66 cents per hour
Chris: And there will be additional costs for the device footprint as well
Chris: In regards to storage, CPU, and RAM
Chris: In a stopped state, you only pay for the storage
me: For a running server or stopped server? Your news release said there are two different rates for these?
Chris: You will pay the cost of the storage footprint in a standby state
me: What is that pricing structure, for storage?
Chris: However, you will be committed to a 0.66 cent rate even if the device is on standby for SQL
Chris: Well you are only being charged based on the footprint
Chris: Generally, the cost is close to 21.6 cents per GB
Chris: Per month
me: OK. One more question: Do you plan on offering SQL Server 2012 when it's available next year?
Chris: I'm not sure at this moment, I would anticipate us keeping up to date with that version in our new Application Layers
Chris: If you are interested, I can provide you with some trial credit to sandbox the environment
me: No thanks. That's all I had. Bye.
Chris: If you apply our promo code, you can get $200 worth of credit
Chris: Thank you for visiting. Please contact us at anytime.

Our questions and answers got a little out of sync (the chat box didn’t have one of those helpful "Chris is typing" indicators, so I asked more questions before I knew he wasn’t done replying), but you might get the idea, sort of, I hope.
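
For what it's worth, here's some back-of-the-envelope math using only the numbers quoted above--the 66-cents-per-hour server rate and Chris' roughly 21.6 cents per GB per month for storage--and assuming (a big assumption, given the chat) that storage is the only charge while a server is stopped:

# Rough monthly cost under the figures quoted above; the stopped-server
# rate is unclear, so this assumes storage is the only standby charge.
HOURLY_RATE = 0.66        # dollars per hour while the server runs
STORAGE_RATE = 0.216      # dollars per GB per month

def monthly_cost(running_hours, storage_gb):
    return running_hours * HOURLY_RATE + storage_gb * STORAGE_RATE

# A dev box run 8 hours a day for 22 workdays with a 50GB footprint:
print("$%.2f" % monthly_cost(8 * 22, 50))   # about $127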

The Cloud Software service also offers several editions of Oracle database products, with "monthly pricing based on number of processors, sockets and server configuration."

OpSource said the SQL Server product "supports up to four processors, up to 64 GB of RAM, one virtual machine, and two failover clustering nodes." It comes bundled with a Windows Server 2008 R2 image.

What do you think? Could this be a cheaper way for developers to test their SQL apps in a pseudo-production environment? Or would you be likely to forget to turn off a server and get one of those nasty cellphone-service-like bill shocks? Comment here or drop me a line.

Posted by David Ramel on 11/10/2011 at 1:15 PM


Developers Can Test 'Denali' in Amazon Cloud

Microsoft and Amazon are collaborating to offer developer testing of the next version of SQL Server in the Amazon cloud, promising an easier and cheaper evaluation than you could get with a local implementation.

The marriage of Microsoft SQL Server "Denali" (now SQL Server 2012) and the Amazon Elastic Compute Cloud means developers only have to pay standard Amazon Web Services (AWS) rates to test the beta database software, currently in Community Technology Preview 3. AWS pricing for "standard on-demand instances" ranges from 12 cents to 96 cents per hour.

An AWS site promises easy deployment in five minutes. "With AWS, companies can utilize the Cloud to easily test the new functionality and features of 'Denali,' without having to purchase and manage hardware," the site says. "This provides customers with faster time to evaluation, without any of the complexity related to setting up and configuring a test lab for beta software."
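
If the five-minute promise holds, launching the image should be about as simple as starting any other AMI. Here's a hedged sketch using boto, the Python AWS library; the AMI ID, region, instance type and key pair below are placeholders, not the real Denali image details:

import boto.ec2

# Credentials come from the environment or boto config; the AMI ID is
# hypothetical -- look up the actual "Denali" CTP3 image in the AWS console.
conn = boto.ec2.connect_to_region("us-east-1")
reservation = conn.run_instances(
    "ami-xxxxxxxx",             # placeholder for the Denali AMI
    instance_type="m1.large",   # a "standard on-demand instance"
    key_name="my-keypair")
instance = reservation.instances[0]
print(instance.id)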

Sounds good to me. I earlier wrote about how a beta evaluation of SQL Server nearly wrecked my system and caused hours of frustration (for me and many others) when I tried to remove it and install the free, Express version.

The Denali program is part of a broader initiative in which Microsoft has developed Amazon Machine Images (AMI) for testing of Web-based products such as WebMatrix and database-related software--basically SQL Server 2008 R2--all running on Windows Server 2008 R2. The Denali AMI was created just a couple weeks ago.

Have you tried testing any Microsoft products on the Amazon cloud? We'd love to hear about your experience. Comment here or drop me a line.

Posted by David Ramel on 10/27/2011 at 1:15 PM


Some Bumps in the Separation of Entity Framework and .NET Framework

It's almost like a feuding spouse who leaves, finds out how much he or she is missed, and decides not to cut ties completely--maybe even hang out together now and then. Well, almost.

The Entity Framework team disassociated itself from the .NET Framework release schedule after EF 4.0 was released with .NET 4.0. The first manifestation of that new policy came last spring when the EF team released an update, EF 4.1, with developer-requested improvements such as Code First capability and a DbContext API.

"This is the first time we've released part of the Entity Framework as a stand-alone release and we're excited about the ability to get new features into your hands faster than waiting for the next full .NET Framework release," said a posting on the ADO.NET team blog announcing EF 4.1. That was followed up in August with the release of the EF 4.2 Beta 1 preview.

But today comes news that the trial separation didn't work so well and some new EF features--including much-wanted enum support--will have to wait for a full .NET Framework upgrade.

"Our new features that require updates to our core libraries will need to wait for the next .NET Framework release. This includes support for Enum Types, Spatial Types, Table-Valued Functions, Stored Procedures with Multiple Results and Auto-Compiled LINQ Queries" reads an entry on the ADO.NET team blog. [Editor's note: The preceding italicized text was changed due to an error; the italicized text that follows was also changed and refers to this same blog post. We apologize for the errors.]

The post explained that the EF team at first wanted to address these core library updates with a separate, full release of EF instead of waiting for .NET 4.5. The June EF Community Technology Preview was the result, offering up that "The Enum data-type is now available in the Entity Framework."

Well, not so fast. "While we are still pursuing this option it has become clear that from a technical standpoint we are not ready to achieve this immediately," the post said. No details about the technical problems were mentioned. The aforementioned list of EF enhancements "will reappear in a preview of Entity Framework that we will ship alongside the next public preview of .NET 4.5," the post said. The post didn't indicate when that might be.

The .NET Framework 4.5 developer preview was introduced in September at the BUILD conference.

What do you think of the EF and .NET Framework previews? When do you think you'll finally get that enum support? Comment here or drop me a line.

Posted by David Ramel on 10/20/2011 at 1:15 PM


Google, Apple Play Catch-Up to Microsoft (for a change)

I'm no Microsoft fanboi, but I noticed an interesting tidbit when I recently wrote a news article about Google Cloud SQL, which adds a MySQL database service to the company's App Engine development stack.

In the comments section of the blog post announcing the new service was this from reader Jeff King:

"Microsoft has had SQL Azure for ages so why would you need this?"

Now that's a switch. Usually it's the other way around: The slow, ponderous, bureaucratic, out-of-touch Redmond software giant is chastised for being behind the times and playing clumsy catch-up to the hip, nimble Web 2.0 pioneer.

Indeed, SQL Azure was introduced in March 2009. Truth be told, after Amazon basically pioneered the cloud phenomenon in 2006, Google beat Microsoft to the punch in the fight for the sky when it introduced App Engine in April 2008, about six months before Windows Azure was unveiled.

But, looking at the database component, it's clear that Microsoft has had a leg up on Google, which heretofore offered a datastore queried with GQL, a language with SQL-like syntax. OK, how many of you developers have liked, or even used, GQL? Raise your hands (or flame me; your choice).
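
For those who never touched it, this is roughly what GQL looked like in the App Engine Python SDK--SQL-ish on the surface, but with no joins and no aggregates (the model and values here are made up for illustration):

from google.appengine.ext import db

class Greeting(db.Model):
    author = db.StringProperty()
    date = db.DateTimeProperty(auto_now_add=True)

# Familiar-looking SELECT syntax, minus joins and aggregates:
greetings = db.GqlQuery(
    "SELECT * FROM Greeting WHERE author = :1 ORDER BY date DESC LIMIT 10",
    "jeff")
for greeting in greetings:
    print(greeting.date)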

"One of App Engine's most requested features has been a simple way to develop traditional database-driven applications," said the Google Cloud SQL program manager in the previously mentioned blog post. Well, yaaah!

And today I noticed a news report that Apple is preparing to launch its iCloud. I know the products don't really compare--with Apple's focus on music and consumer entertainment as opposed to enterprise development--but launching an iCloud service in late 2011 seems a little iBehind.

And stodgy old Microsoft seems to have acquitted itself well in the cloud despite its late start, judging from this recent Ars Technica headline: "Windows Azure beats Amazon EC2, Google App Engine in cloud speed test."

I've even noticed some positive buzz about Windows Phone in the media as of late. Is Microsoft finally turning things around, like a huge supertanker that takes miles to change direction? Will it (gasp!) become cool? Well, let's not go overboard here.

What do you think about Microsoft: dying dinosaur or comeback kid? Comment here or drop me a line.

Posted by David Ramel on 10/13/2011 at 1:15 PM


Not Your Typical Data Driver Column

Dear ‹FirstName›,

In these trying times you occasionally just need to take a break from the business of data and have a good laugh. Which is what I did when I received the following e-mail, purportedly from a real data-related vendor. I'll protect that innocent by anonymizing the company/personal details in italics, but otherwise the message is presented as received:

Dear ‹FirstName›,

We're ‹insert emotion› to announce our research is nearly complete. In just a few ‹random time duration›, we'll be announcing the new Company Name Telepathy Source and Destination, allowing the everyday man and woman to read minds into an SSIS data stream.

Imagine being able to:

Read the entire encyclopedia in a matter of minutes

Output your wife's thoughts to find out how she really feels

Learn a new skill in seconds like Neo from the Matrix

Over the past week we've run a contest to see who can be the first to view this amazing research and I'm happy to announce that Person's Name is our winner. If you are Person's Name, please click the below link to see our research. If you are not Person's First Name, please do not click below. We operate solely on the honor system at Company Name.

Person's Name Click Here

‹Emotional Stub›,

CEO's Name, Founder of Company Name

So, I don't know if it was meant to harvest contact information or install malware or what, but it certainly provided some much-needed ‹insert pleasant emotion› to the Data Driver. What's the clumsiest troll you've ever received? Comment here or drop me a line.

Posted by David Ramel on 10/05/2011 at 1:15 PM


Windows 8 Ups the Data Transfer Ante

Talk about driving data: the audience broke into applause at last week’s BUILD conference when Microsoft’s Bryon Surace demonstrated some of the new blazing-fast data transfer capabilities during a keynote address.

“With Windows Server 8, we can use multiple NICs [network interface controllers] simultaneously to help improve throughput and fault tolerance,” Surace said.

To demonstrate the new speedy data-transfer capabilities, Surace used a server running Hyper-V with two virtual machines, one of which was connected to two disks. One disk was connected using a 1Gb Ethernet connection, a setup he described as “very typical, very commonplace in today’s environment.”

The other disk was connected “using multiple high-speed NICs that are leveraging SMB 2.2 multi channel and RDMA [remote direct memory access].” Starting up a SQL load generator and going to a performance monitor, Surace pointed out how the 1Gb Ethernet card was transferring data at less than 100MB/sec., which he said was “pretty typical.” The second disk, however, was transferring data at more than 2GB/sec. That’s when the applause broke out.

“Now, previously, these technologies were only available in high-performance computing, but now with Windows Server 8, we're building them for one of the most common roles in Windows,” Surace said. He went on to show that the NIC wasn’t saturated, but rather was using only about 15 percent of the available throughput.
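
The numbers hang together, by the way. A quick sanity check (my arithmetic, not Microsoft's):

# 1Gb Ethernet tops out around 125MB/sec in theory, so "less than
# 100MB/sec" observed is indeed typical for the single-NIC disk.
theoretical_mb_per_sec = 1e9 / 8 / 1e6          # 125.0

# And 2GB/sec at roughly 15 percent utilization implies the multi-NIC
# setup has on the order of 13GB/sec of capacity available.
implied_capacity_gb_per_sec = 2.0 / 0.15        # ~13.3

print(theoretical_mb_per_sec, round(implied_capacity_gb_per_sec, 1))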

“This is a clear indication that we haven't even scratched the surface of what's possible with Windows Server 8,” he said. “And as we move over and take a look at the performance, we're only using about 1 percent of the CPU on the server to be able to push this throughput.”

Surace also demonstrated the simplified storage array management capabilities of Windows Server 8. For this, he used a server connected to 16 solid-state drives (SSDs), with no specialized controllers, “just a bunch of disks, or JBOD, directly connected to our server and being managed by Windows.” He noted how the disks were used to create a storage pool for which some space was carved out and represented as a drive on the server. He also showed file shares connected by the “improved SMB 2.2 protocol.”

“So, the key here is you don't need a PhD in storage,” Surace said. “You can simply attach just a bunch of disks to Windows and have it all managed and deployed right there.”

The full keynote can be viewed via Microsoft’s Channel 9 video service.

What are the software development ramifications of the new Windows Server 8? Comment here or drop me a line.

Posted by David Ramel on 09/21/2011 at 1:15 PM


Is Microsoft Really Embracing Big Data?

In the continuing effort to reach détente with the open source community, Microsoft is making inroads in the big data movement.

Last month, it released CTPs of Hadoop connectors for SQL Server and Parallel Data Warehouse "to promote interoperability between Hadoop and SQL Server." That's not so surprising--the Redmond software giant has made similar moves with other open source technologies.

But now there are signs the company may even be opening up to the so-called "NoSQL" data store world!

Microsoft's MSDN Magazine may be a harbinger of this trend. Up until now, basically only two articles in the publication have dealt with NoSQL products, and both of them discussed MongoDB. One was a three-part series by columnist Ted Neward, at the time an independent consultant. The other was by developer evangelist Brandon Satrom--actually a Microsoft employee.

But for the November issue, there are two articles slated on the subject: an exploration of document databases by columnist Julie Lerman and an article on embedding the RavenDB data store into an ASP.NET MVC 3 app by Justin Schwartzenberger. And magazine editor Michael Desmond interviews the two in an editor's note titled "NoSQL? No Problem." I can't go into more detail now because the articles haven't been published (I just know about them because I'm technical editor of MSDN Magazine).
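
If the document-database angle is new to you, here's the flavor in a few lines of Python against MongoDB--a hedged sketch using pymongo's current API (the 2011-era library spelled some of these calls differently). Schema-free documents, stored and queried by example:

from pymongo import MongoClient

client = MongoClient("localhost", 27017)
posts = client.blog.posts                    # database and collection created lazily

# No schema to declare; just store a document...
posts.insert_one({"title": "NoSQL? No Problem.",
                  "tags": ["nosql", "mongodb"]})

# ...and query by example; this matches inside the tags array.
doc = posts.find_one({"tags": "nosql"})
print(doc["title"])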

But is it just a coincidence that there have basically been only two previous NoSQL-themed articles and the November issue alone will double that? We'll see.

What do you think Microsoft is up to? Comment here or drop me a line.

Posted by David Ramel on 09/15/2011 at 1:15 PM


Looking at Juneau's Integrated Database Development

One of the nice things about my day job as technical editor at MSDN Magazine is getting early looks at cutting-edge technologies and how-to guidance from some of the top experts in the world.

This month, for example, Jamie Laflen and Barclay Hill explore “The ‘Juneau' Database Project,” which promises that “you can now perform your database development in the same environment as your application development.” That sounds nice. No more jumping around from one tool to another.

I found it particularly intriguing that the new Database Project in the next version of Visual Studio enables offline SQL Server development. The two SQL Server Developer Tools experts explain that this “project-based development” provides the following advantages over using a shared live database:

  • Isolation of developer changes
  • Rich T-SQL editing support
  • Verification of source prior to deployment and enforcement of team coding standards through code analysis rules
  • Automated migration-script generation

Furthering the move to more self-contained development is SQL Server Express LocalDB, as introduced in a sidebar. It provides a kind of simplified user instance and lets you develop against SQL Server Express without having to fuss with managing a full-fledged desktop Express instance, cutting way back on setup time. Check out the article for more technical details. Your job as a .NET/SQL Server developer is about to get a lot easier.
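
To make the LocalDB point concrete, here's a hedged sketch of connecting to an automatic LocalDB instance--no instance to manage, the engine just spins up on first connect. The "(localdb)\v11.0" instance name, driver name and file path are my assumptions from the Denali-era bits, worth verifying against the article:

import pyodbc

conn = pyodbc.connect(
    r"DRIVER={SQL Server Native Client 11.0};"
    r"SERVER=(localdb)\v11.0;"               # assumed default automatic instance
    r"Trusted_Connection=yes;"
    r"AttachDBFileName=C:\dev\App.mdf;")     # hypothetical database file
print(conn.execute("SELECT @@VERSION").fetchone()[0])
conn.close()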

What are you looking forward to in Juneau? Comment here or drop me a line.

Posted by David Ramel on 09/07/2011 at 1:15 PM


New NoSQL Language Unveiled As Debate Rages On

Pretty much every blog, article or discussion you see about the SQL vs. NoSQL debate includes sage advice from a reasonable voice of authority, along these lines:

Whoa! Let's calm down. No need to fight. It's not a which-is-better issue, because each (tool/approach/language/philosophy) has its use. They should be used together as needed to solve different kinds of problems according to their strengths ...

And so on.

So it was interesting to read a comment on a blog post that went against that grain:

I wish it was as simple as SQL & RDBMS is good for this and NoSQL is good for that. For me at least, the waters are much muddier than that.

Tony Bain made that comment on a blog post by Conor O'Mahony titled "The Future of the NoSQL, SQL, and RDBMS Markets." Bain goes on to discuss the issue in detail, with much of the discourse from the perspective of a database developer. It's definitely worth a read for you data devs. It's also noteworthy that the blog post was prompted by an article in The Register with the subhead "World says 'No' to NoSQL."

If only it were that easy. Just a week or so earlier, in fact, there was much buzz when Couchbase released a "flagship NoSQL database" and an entirely new NoSQL query language called UnQL.

Does that sound like the desperate last gasp of a major player in a dead movement? Or will we one day look back and recognize it as a major step in an industry transformation?

You tell me. Comment here or drop me a line. I'm just happy that Couchbase provided some pronunciation guidance, a pet peeve of mine. UnQL is pronounced like the word "Uncle."

Posted by David Ramel on 08/22/2011 at 1:15 PM


How are Database Developers Doing, Salary-Wise?

I noticed in the comprehensive 16th annual IT Salary Survey that database developers lost their No. 1 spot in the category of average base salary by job title, actually falling three rungs down the ladder to No. 4.

Not that $95,212 is that bad. But still, that seemed like kind of a big drop in statistics that don't usually change that much from year to year. In fact, editor Michael Domingo said "Database programmers have been fairly consistent in the rankings, but dropped from the top spot to fourth from a dollar perspective. Still, based on percentages, they managed to go up nearly 7 percent from last year's result." Besides being consistent, database programmers "often rank highest," Domingo said in the more extensive PDF document, downloadable with registration.

So, what does a healthy 7 percent average salary hike combined with the reduced job title salary ranking really mean? It seems that while data devs are doing OK, others--especially networking project leads--are just doing a little better.

Indeed, when it comes to "salary by technology expertise," Domingo said, "The biggest gains from a year ago are those with database development skills, earning 6.2 percent higher." In that category, the average salary for those with database development skills was $92,460.

Another survey, conducted by Microsoft Certified Professional Magazine, also had good news for database developers. Domingo edited this one, too (the guy is everywhere). "Network project leads also often do well, but in the scheme of things, it's the DBAs and database developers who came in right above on the salary scale," he said. "DBAs and developers often tell us that they're well compensated and happy with their pay, and this year is no different. It's data, after all, that is at the heart of many businesses and good data people are often plied with incentives to either stay put or lured away to companies who can afford to pay higher salaries."

Whew! I guess that initial ranking drop I mentioned isn't that worrisome after all. Data still rules, and pays the big bucks. Now, for me, it's with relief that I return to wrestling with outer joins and normalization in that "Become a Database Developer in 21 Days" course I paid so much for.

What the heck is a tuple? Clue me in or comment otherwise, or drop me a line.

Posted by David Ramel on 08/04/2011 at 1:15 PM

