Is LINQ to SQL Dead?

Developers are reckoning with the fact that Microsoft's LINQ to SQL object-relational mapping technology is getting short shrift in Redmond these days as the company sharpens its focus on the second version of the ADO.NET Entity Framework.

Some would argue LINQ to SQL was DOA when it arrived in the .NET 3.5 Framework just over a year ago, but Microsoft's recent messaging leaves little doubt that the company has no major plans to further enhance LINQ to SQL. For many, the blog post published during PDC by Tim Mallalieu, the program manager for both LINQ to SQL and the Entity Framework, sealed its fate.

"We're making significant investments in the Entity Framework such that as of .NET 4.0 the Entity Framework will be our recommended data access solution for LINQ to relational scenarios," he wrote on Oct. 29. Two days later, as people were returning home from PDC, he added: "We will continue to make some investments in LINQ to SQL based on customer feedback."

Many read that as code for LINQ to SQL being finished. "It is dead as a door knob," said Stephen Forte, chief strategy officer at Telerik Inc. and a Microsoft regional director, speaking at a .NET User Group meeting in New York two weeks ago.

To put Forte's remarks in context, he was giving a talk on the various data access alternatives, including the Entity Framework, ADO.NET Data Services with REST, and ASP.NET Dynamic Data, among others. "In my opinion there is going to be a shakeout; the first casualty will be LINQ to SQL," Forte told the group.

For his part, Mallalieu explains in his Oct. 31 post that Microsoft has been weighing how to evolve both LINQ to SQL and LINQ to Entities. "At first glance one may assert that they are differentiated technologies and can be evolved separately," Mallalieu wrote at the time. "The problem is that the intersection of capabilities is already quite large and the asks from users of each technology takes the products on a rapid feature convergence path."
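To see why the two technologies are on a convergence path, consider how similar the code already looks on each side. The sketch below is minimal and hypothetical -- it assumes designer-generated NorthwindDataContext and NorthwindEntities classes over a Northwind-style database, which are not part of this story -- but the query expression is identical; only the context class differs.

    using System;
    using System.Linq;

    class QueryConvergenceSketch
    {
        static void Main()
        {
            // LINQ to SQL: assumes a designer-generated NorthwindDataContext
            // exposing a Customers table (hypothetical for this sketch).
            using (var db = new NorthwindDataContext())
            {
                var customers = from c in db.Customers
                                where c.City == "London"
                                orderby c.CompanyName
                                select new { c.CustomerID, c.CompanyName };

                foreach (var c in customers)
                    Console.WriteLine("{0}: {1}", c.CustomerID, c.CompanyName);
            }

            // LINQ to Entities: assumes an EDM-generated NorthwindEntities
            // ObjectContext; the query expression itself is unchanged.
            using (var ctx = new NorthwindEntities())
            {
                var customers = from c in ctx.Customers
                                where c.City == "London"
                                orderby c.CompanyName
                                select new { c.CustomerID, c.CompanyName };

                foreach (var c in customers)
                    Console.WriteLine("{0}: {1}", c.CustomerID, c.CompanyName);
            }
        }
    }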

Andrew Brust, director of new technology at twentysix New York, said given both are relatively new, Microsoft's moves shouldn't prove disruptive to most developers. "Both are new and neither has gathered so much steam that the victorious emergence of the other could be viewed as a huge imposition," Brust writes in an e-mail. "To me it's like Blu Ray winning out over HD DVD. While people who bought HD DVD players and discs are not happy about the outcome, they represent a small group of early adopters, all of whom were warned of and understood the risks in making an early commitment."

Roger Jennings, principal of Oakland, Calif.-based OakLeaf Systems, authored this month's Visual Studio Magazine cover story on object/relational mapping with LINQ to SQL. Jennings explains that while Microsoft may forgo any significant enhancements to LINQ to SQL, the technology remains part of the .NET 3.5 Framework, and despite Microsoft's messaging on the next version of the Entity Framework, many developers may still be inclined to work with LINQ to SQL.

"LINQ to SQL is alive and well," Jennings says. "They can't remove it because it's part of the .NET 3.5 Framework."

Jennings believes many developers will continue to use LINQ to SQL, given the direction Microsoft is taking Entity Framework v2. He, for one, laments Microsoft's announcement last month that v2 won't support n-tier architectures.

Jennings says Microsoft appears to be backing off on other features that were presumed to be slated for EF version 2. In a blog posting Tuesday, Microsoft explained how developers should migrate stored procedures developed with LINQ to SQL to EF using Visual Studio 2010.

But Jennings says Microsoft's latest messaging is less definite than before about whether that support will make the EF v2 cut. "What they are saying now is support for stored procedures might be implemented in EF v2, instead of will be," Jennings says. "Basically what they are doing is backpedaling on their original commitment."

Jennings also pointed to the LINQ to SQL Designer, which lets developers map stored procedures that return scalar values. While acknowledging that such automatic code generation of methods is missing, Microsoft is now saying "this is something that is being strongly considered for the next release of Entity Framework." Jennings said it was presumed that feature would make the EF v2 release. "That's called it's fallen off the list," Jennings says. "The upshot is it appears the team is paring their list of what they are going to implement in EF v2 from what the original plan was."
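For readers unfamiliar with the designer feature Jennings is referring to, the sketch below shows roughly the kind of method the LINQ to SQL designer generates when a stored procedure that reports a scalar through its integer return value is mapped onto a DataContext. The context class, stored procedure name and parameter are hypothetical; this illustrates the mapping pattern, not code from either product team.

    using System;
    using System.Data.Linq;
    using System.Data.Linq.Mapping;
    using System.Reflection;

    // Hypothetical DataContext illustrating designer-style mapping of a
    // stored procedure that returns a scalar via its integer return value.
    public class OrdersDataContext : DataContext
    {
        public OrdersDataContext(string connectionString)
            : base(connectionString) { }

        [Function(Name = "dbo.GetOrderCountForCustomer")]
        public int GetOrderCountForCustomer(
            [Parameter(DbType = "NChar(5)")] string customerID)
        {
            // ExecuteMethodCall routes the call to the mapped stored procedure
            // and surfaces its scalar result via IExecuteResult.ReturnValue.
            IExecuteResult result = this.ExecuteMethodCall(
                this, (MethodInfo)MethodInfo.GetCurrentMethod(), customerID);
            return (int)result.ReturnValue;
        }
    }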

As a result, Jennings believes many developers might still opt to use LINQ to SQL via a Visual Studio add-in developed by Huagati Systems, based in Bangkok, Thailand. Huagati's DBML/EDMX add-in provides menu options for synchronizing LINQ to SQL designer diagrams with changes in the database.

"It's ironic that a lone developer can provide add-ins for features that the ADO.NET Entity Framework v2 team aren't even proposing for their upgrade," Jennings says.

What's your take on this? Drop me a line at [email protected].

Posted by Jeffrey Schwartz on 12/04/2008 at 1:15 PM


The Query Optimizer: Q&A with Microsoft's David DeWitt

Database administrators and developers converged on Seattle for this week's annual Professional Association for SQL Server (PASS) conference, where Microsoft is talking up its recently released SQL Server 2008 and the forthcoming upgrade, code-named "Kilimanjaro." You can read all about that here.

One of the key advances that will enable Kilimanjaro is "Madison," the code name for the technology that will allow SQL Server to handle massively parallel processing. Microsoft's acquisition of DATAllegro back in September is providing the key assets for developing Madison.

It turns out that much of that work is happening in Madison, Wis., where Microsoft back in March announced its database research lab, the Jim Gray Systems Lab, located at a facility not far from the University of Wisconsin-Madison campus. To run the lab, Microsoft brought on board as a technical fellow David DeWitt, who spent 32 years in academic research at that university. DeWitt will make his first public appearance as a Microsoft employee, in front of his largest audience ever, in a keynote address at PASS on Friday.

I had the opportunity, joined by my colleague Kathleen Richards, to talk with DeWitt this week. Here's an excerpt:

Was this lab built from the ground up?
I am still building it. It's a lot of work. I currently just have three staff members; we'll be finding up to six graduate students next semester. I have some open staff positions but I am very fussy on who I hire. I'm never going to have 50, and the goal is to have 10 to 15 full-time staff, mostly Ph.D.s and some masters students, but people that like to build systems. I am a real hands-on systems builder.

What are you working on?
We are working with the DATAllegro team to look at parallel query optimization techniques. Optimizing queries is hard, optimizing for a scalable database system is even harder, and query optimization is something I've been interested in for a long time. We have one project that involves looking at some optimization techniques that will come out in a future release of the DATAllegro product.

What role did you have in proposing, suggesting the DATAllegro acquisition?
Zero. I had absolutely no role in the acquisition process. I knew about it soon after I joined, but Microsoft is very careful about how it does acquisitions these days. I was not involved in any way in the technical decision on whether to buy it or not. But I think it's a great acquisition. They've got a great product and I think Microsoft's expertise will be able to take it to an entirely new level. It's a great team. We were there last week. We are excited about working with them. It was like a big Christmas present as far as I am concerned because now, all of a sudden, I am working at a company that has a really seriously scalable parallel database system. Having built three in my life, getting a chance to work on a fourth [is] just like Christmas.

How do you see taking it to the next level?
First of all, replacing Ingres with SQL Server will certainly drastically improve the kinds of performance we should be able to get. SQL Server is a modern database system and Ingres is an old system. The DATAllegro system avoided using indices because the indexing in Ingres was not very effective. I think we'll get all of the benefits SQL Server has as the underlying engine. We're going to get this huge boost, and I think that the DATAllegro is a startup and they have a great system but it's a startup, and there are a lot of things that were done in the area of query optimization [that] I think we can improve on. Having built a number of parallel database systems in the past, I think we can offer something when it comes to optimization of queries that will allow us to scale even higher.

How else do you see SQL Server advancing as a platform?
SQL Server will advance as a platform by using DATAllegro as the base. Will DATAllegro make SQL Server more scalable? Absolutely. I think query optimization is the main unsolved problem in data warehousing today. I think we know how to build parallel database systems that scale to hundreds of thousands of nodes. DATAllegro already has one customer that's 400 terabytes. eBay has a competitor's system that has 5 petabytes. But there are really serious challenges of optimizing queries for hundreds of nodes and thousands of spindles. I think those are the opportunities that a team like mine can get its teeth into and make some forward progress. Query optimization is something that will come for a very long time, and we have some ideas for some new models for optimizing and executing queries that we will be exploring as part of the DATAllegro process.

You mentioned it can take 10 years for research to make it into a commercial product. Is that timeframe changing?
That's one of the goals of the lab. One of our ideas in setting up this lab was to have a much shorter path from the innovation by the graduate students and by my staff, into the product line. That's one of the reasons I am not part of Microsoft Research, even though I'm an academic researcher. I am part of the SQL Server organization and we intentionally put this lab as part of the SQL Server organization so that we had a direct path from the university into the SQL Server organization. It would not have made much sense to try to do this lab as part of Microsoft Research because we don't have a direct path.

What will you be talking about in your keynote later this week?
The other keynotes, they get to introduce new products and do fancy demos. I am just the academic guy. The talk is really going to be about the key components of a parallel or scalable database system: how partitioning works, here's the relationship between partitioning and indices, here's what happens to a SQL query when it gets compiled on scalable parallel database systems. It will really be a lecture on the fundamental technologies behind today's scalable database products.

If you had to sum up your key message, what is your vision for where you'd like to see your efforts at Microsoft take the SQL Server platform moving forward?
I'd like to have us become the world leader in data warehousing. I think that we have a great SMP product; it's easy to use, it's got great performance. We can take on Teradata. I don't see any reason why we should not become the premier solution for very large-scale data warehousing.

Posted by Jeffrey Schwartz on 11/19/2008 at 1:15 PM


Will Dublin Replace BizTalk?

Among the many pressing questions that came up at last month's Professional Developers Conference (PDC) was whether Microsoft's new Dublin app server extensions will replace BizTalk Server. Microsoft says that's not the plan, but it is important to understand what Dublin is.

Microsoft released the first CTP of its new distributed application server extensions to Windows Server, code-named Dublin, at PDC. Microsoft first disclosed its plans to build these extensions in concert with the introduction of its new modeling platform, code-named Oslo, last month.

According to Microsoft, Dublin will incorporate key components of the new .NET Framework 4.0 -- specifically the second iterations of Windows Communication Foundation (WCF) and Windows Workflow Foundation (WF). In addition to improving scalability and manageability, Microsoft said Dublin will allow Windows IIS to function as a host for apps that use workflow or communications.

I attended a session at PDC that outlined Dublin, where Product Unit Manager Dan Eshner explained where Dublin fits. In short, if Quadrant, the modeling tool in Oslo, lets developers create models or domain-specific languages (DSLs), think of Dublin as one deployment destination for those models. Dublin is scheduled to ship roughly three months after Visual Studio 2010, Eshner said, and will initially extend Windows Server, though it will ultimately be built into future versions of the platform.

"Dublin really is a hosting environment for WF and WCF services," Eshner said. The goal, he added, was to take the heavy lifting and skill requirements out of invoking WCF and WF services. "You can make these services work without Dublin, you just got to do some stuff. You've got to get all the configs set up and you got to do some work, create services out of them," he said.

Within Visual Studio, Dublin will add project templates, and in IIS Manager it will add WF and WCF management modules. It also adds discovery within the hosting environment, a SQL persistence provider, application monitoring, and versioning, partitioning and routing for messaging.

But questions abound regarding Dublin. To my original point, several people were trying to get a grasp on whether Dublin will ultimately subsume BizTalk during the Q&A portion of the session. Microsoft architect Igor Sedukhin said he doesn't see that happening. "Dublin is not intended to be an integration server at all," he said. "We aren't trying to put all the adaptors in Dublin. BizTalk is really focused on that integration scenario."

Cutting to the chase, one attendee asked: "Three years from now, will BizTalk as a product exist, and if it does, why would I want to pay for it?"

Yes, it will still exist, Eshner said. "We really believe that there is a ton of scenarios on BizTalk that we will not address in Dublin, or you would have to do a whole bunch of work to make Dublin work in the same kind of way that BizTalk does," he said, adding Dublin won't have the transforms and adaptors found in BizTalk. "BizTalk as an integration server is much more powerful than what you get with an app server like Dublin."

Eshner and his team addressed a few more questions regarding Dublin, among them:

To what degree will Dublin scale to support enterprise-class applications?
That will become clearer over the next six months. Dublin probably won't be as scalable as some would like at first, but partners should be able to close the gap.

If Dublin is going to rely heavily on persistence, will it require shops to purchase SQL Server?
The details are still being worked out, but to scale, that will probably be a safe assumption.

What about transactions beyond the existing transaction services in Windows?
It's not clear how much will get added into version 1.

Will developers be able to deploy both locally and to Azure?
The Dublin team will be working with the IIS team using MS Deploy (Microsoft's new IIS Web deployment tool) to see if it can be leveraged. "That's a great thing to add to our future list to see how we can do that," Eshner said.

Have you looked at the Dublin bits? If so drop me a line at [email protected] and let me know what you think.

Posted by Jeffrey Schwartz on 11/12/2008 at 1:15 PM


PDC: Microsoft Goes Into the Blue

The names keep on changing at Microsoft. This week, SQL Data Services or SDS (formerly SQL Server Data Services or SSDS) became part of a broader group called "SQL Services." The technology is exciting even if the naming conventions leave some developers scratching their heads.

SQL Services is part of the rollout for Windows Azure -- another name that got a lot of people talking about Microsoft's inability to communicate its promising technology to developers...or the world at large, for that matter.

"I don't know how they come up with these names," voiced one Microsoft partner during his presentation. "I just hope I'm pronouncing it right." If he did, he was ahead of several Microsoft presenters and even some keynoters who offered several "variations" of Azure in the same speech.

SQL Services is the data storage component of the Azure Services Platform for building cloud-based apps. Just for showing up -- and for paying the $1,000-plus conference fee -- PDC attendees got the coveted "goods," which included a preview of Windows 7, the first Visual Studio 2010 CTP and an invitation to register for components of the Azure Services Platform, including SDS provisioning.

Redmond Developer News Executive Editor Jeffrey Schwartz and I got to sit down with Dave Campbell, the Microsoft Technical Fellow leading the SDS effort. We didn't really touch on the name change except to confirm it, but we did ask him all about Microsoft's evolving data platform. Look for our Q&A in the Nov. 15 issue of RDN. And see "PDC: Microsoft's Cloud-based SQL Services Redefined" for more data-related announcements at PDC.

Is the economic climate piquing your interest in cloud-based utility services? What would you like to see in SDS? Weigh in on SDS and Microsoft's naming habits at [email protected].

Posted by Kathleen Richards on 10/29/2008 at 1:15 PM


Can Next Release of SQL Server Bring BI To Masses?

When Microsoft outlined its BI strategy for future releases of SQL Server at its Business Intelligence Conference 2008 in Seattle last week, the company put forth an ambitious road map that looks to broaden the reach of its data management platform.

Ted Kummert, corporate vice president of Microsoft's Data and Storage Platform Division, showcased three efforts in play. First is the next release of SQL Server, code-named "Kilimanjaro," due out in 2010 and intended to further penetrate the enterprise database market dominated by Oracle and IBM.

Second is Project "Gemini," a set of tools Microsoft is developing with the aim of bringing BI to a wider audience of information workers. The third project he outlined was Madison, aimed at taking the technology Microsoft acquired from DATAllegro, an Aliso Viejo, Calif.-based provider of data warehouse appliances, and developing Microsoft's own offering to be sold as a hardware appliance.

Beyond trying to up the ante with enterprise deployments, what is perhaps more notable about Kilimanjaro is that "it signifies a greater emphasis towards supporting the needs of end users by leveraging the capabilities of SQL Server and the ubiquity of Excel," writes Ovum senior analyst Helena Schwenk in a bulletin to clients.

"These are unchartered waters for Microsoft," Schwenk warns. "While Excel is a pervasive BI tool, it has certain technical limitations that prevent it from being used as a full-blown reporting and analysis tool."

Despite the challenge, the next release of SQL Server promises to address these limitations, she adds. If Microsoft makes its delivery goals and can price it competitively, Schwenk believes Microsoft could make further inroads into the BI market at the expense of other BI vendors.

IBM, Oracle and SAP aren't sitting still, either. With all three having made huge acquisitions over the past year, the battle to broaden BI remains at an early stage of evolution.

Posted by Jeffrey Schwartz on 10/15/2008 at 1:15 PM


Upshot to VSTS Database and Developer Edition Integration

As I pointed out in my last post, Microsoft is rolling VSTS Database Edition into VSTS Developer Edition, and effective immediately those with Microsoft Software Assurance licenses can use both for the cost of one.

The company's goal: get more traditional developers delving into the database and vice versa. But Randy Diven, CIO of Modesto, Calif.-based produce supplier Ratto Brothers Inc., raised an interesting question:

"Will I end up with two installations, one being the Development install and one being the Database install or are the product features designed to integrate with each other?," Diven wrote. "I am not overly excited about installing VSTS twice on my machine."

After all, VSTS is a big install, and the last thing he wants to do is end up rebuilding his workstation. "I am very interested in an integrated solution," Diven said.

Not to worry, said Cameron Skinner, product unit manager for Visual Studio Team System. "They integrate with each other," Skinner said in an e-mail. That helps, but Diven said he'd like to see an integrated install or clearer instructions that address interactions with Visual Studio 2008 SP1. "This is a big deal for VSTS programmers," he replied.

Skinner agreed and said that problem will go away with the next release. "This is a point-in-time problem with the current products and making them available given our decision to merge the SKUs," Skinner added. "Once we ship VS and VSTS 2010, the install will be integrated."

Posted by Jeffrey Schwartz on 10/03/2008 at 1:15 PM


Will VSTS 2010 Drive SQL Server 2008 Upgrades?

With Microsoft this week adding more information about its plans for the next release of its Visual Studio Team System, it bears noting that those who were not keen on upgrading from SQL Server 2005 to the new 2008 release may need to reconsider that stance.

That's because those who upgrade to TFS "Rosario" will need to use SQL Server 2008, as reported by my colleague, Redmond Developer News senior editor Kathleen Richards, who points to VSTS lead Brian Harry's blog. "That was a controversial decision, but it is a final decision," Harry writes. "The primary driving force behind it is that the Report Server feature in SQL Server 2008 is sooooo much improved over that in previous versions that we simply could not pass up taking advantage of it for Rosario."

But considering the substantial new reporting capabilities in SQL Server 2008 and the likely release date of VSTS 2010, there's a "compelling" case to be made for Microsoft's decision, according to Andrew Brust, chief of new technology at twentysix New York.

"While it's a tough call to tether one new release to another, and doing so risks alienating some users, it's also true that if Microsoft released a version of TFS that didn't take advantage of now-released SQL Server 2008 technology, that a year or so post-release, Rosario would look under-featured," Brust responded in an e-mail, when I asked how customers might react to this latest change.

Even presuming Microsoft upholds its practice of including a SQL Server Standard Edition license with TFS moving forward, some organizations have strict policies about allowing new releases into their shops. Brust believes that, too, should not be a show stopper for VSTS shops. "Even in corporate environments where new versions of SQL need to be approved before deployment, one could make the argument that SQL 2008 is an intrinsic component of TFS Rosario and would thus qualify for a 'waiver' of sorts."

Another point worth noting: Microsoft is rolling VSTS Database Edition into VSTS Developer Edition, and effective immediately those with Microsoft Software Assurance licenses can use both for the cost of one. The goal: get more traditional developers delving into the database and vice versa, said Dave Mendlen, Microsoft's director of developer marketing, in an interview last week.

"Developers are more hybrid today than they were in the past ... this needs to work not just with the core source code but also with the database becoming more and more important to them," he said.

What's your take on these latest moves? Drop me a line.

Posted by Jeffrey Schwartz on 10/01/2008 at 1:15 PM


Can HPC Help Mitigate Risk In These Turbulent Times?

In its latest bid to show that the Windows stack is suited to the most mission-critical applications, Microsoft this week released Windows HPC Server 2008, which promises to extend the limits of Redmond's data platform.

I attended the High Performance on Wall Street conference in New York, where Microsoft launched Windows HPC Server, and the timing was quite ironic. On the one hand, Wall Street is undergoing a historic crisis -- indeed, the landscape of the entire financial services industry has unraveled. Meanwhile, IT vendors made the case that performing complex risk analysis across large clusters could yield better transparency and performance for methodologies such as algorithmic trading.

For its part, Windows HPC Server 2008 will push the envelope for those looking to run such applications on the Windows platform. But with everything that's going on, it will be interesting to see whether the potential rewards of such capabilities increase investment in high-performance computing or whether the risk becomes more than organizations are willing to bear.

Posted by Jeffrey Schwartz on 09/24/2008 at 1:15 PM


Reaching For The Clouds With SSDS

If you're a database developer, you may be wondering how SQL Server Data Services will affect how you build data-driven applications. SSDS is Microsoft's cloud-based repository that is available for testing through the company's community technology preview program.

Within its emerging cloud strategy, Microsoft is giving a lot of airplay to SSDS because the service epitomizes the company's mantra that enterprise customers are most likely to adopt a hybrid approach of on-premises and cloud-based services, which it calls "software-plus-services."

To be sure, Microsoft is not currently targeting SSDS for transaction-oriented applications, though if you are developing or administering OLTP applications, SSDS could become a repository for referential and/or backup data.

But of all the new data-driven technologies Microsoft is offering these days, SSDS will be viewed as the simplest, according to Jim Williams, an account architect at Microsoft. Williams gave a session on SSDS at VSLive! New York last week.

"You're not going to write SQL against SQL Server Data Services," Williams said. "You are not going to see tables, you are not going to see foreign keys, you're not going to see the concept of referential integrity that you are used to."

Among some questions Williams addressed in his session:

Will SSDS support transactions?
There's no transaction semantics in this offering today. There certainly could be one in the future... Since a SOAP interface is supported, it would certainly be possible to offer Web services transactions.

If it doesn't need a SQL interface, what's on the client?
Any technology that knows how to do SOAP or REST. The samples in the documentation cover Ruby, Java, and C#.

How will developers write queries against SSDS?
If you know LINQ, you know more than you need to make queries against SSDS the way it is today.
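To make Williams' point concrete, here is a rough sketch of what a REST-style query against a flexible entity container might look like from C#. The endpoint URL, authority and container names, and the LINQ-like query text are all placeholders rather than the documented SSDS CTP API; the point is simply that the client needs nothing beyond HTTP.

    using System;
    using System.IO;
    using System.Net;

    class SsdsQuerySketch
    {
        static void Main()
        {
            // Placeholder authority, container and query text -- not the
            // documented SSDS endpoints or syntax, just the general shape.
            string authority = "myauthority";
            string container = "customers";
            string query = Uri.EscapeDataString(
                "from e in entities where e[\"City\"] == \"London\" select e");

            string url = string.Format(
                "https://{0}.data.example.net/v1/{1}?q='{2}'",
                authority, container, query);

            var request = (HttpWebRequest)WebRequest.Create(url);
            request.Method = "GET";
            request.Credentials = new NetworkCredential("user", "password");

            using (var response = (HttpWebResponse)request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                // Entities come back as a payload of flexible property bags --
                // no tables, no foreign keys, no referential integrity.
                Console.WriteLine(reader.ReadToEnd());
            }
        }
    }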

If you're interested in SSDS, you won't want to miss the detailed TechBrief by Roger Jennings, principal with OakLeaf Systems, which is in the current issue of Redmond Developer News.

Posted by Jeffrey Schwartz on 09/15/2008 at 1:15 PM


Oslo Coming To PDC

Oslo, the code name for Microsoft's next-generation modeling platform championed by chief software architect Ray Ozzie, is shaping up to have a prominent role at next month's Professional Developers Conference in Los Angeles.

While tidbits of information continue to unfold, it became apparent at this week's VSLive! New York show that Oslo will be one of many key technologies Microsoft showcases, and that it will center around Microsoft's BizTalk Services, as several speakers pointed out (not to be confused with Microsoft's BizTalk Server, for which the company is planning an upgrade).

Douglas Purdy, a product unit manager at Microsoft, revealed in a blog posting earlier this week that he will be giving a presentation on Oslo. In his posting he broke Oslo down into three components:

  1. A tool that helps people define and interact with models in a rich and visual manner
  2. A language that helps people create and use textual domain-specific languages and data models
  3. A relational repository that makes models available to both tools and platform components

"That is it," Purdy wrote. "That is all Oslo is. Oslo is just the modeling platform." The question is what does that mean to .NET developers? Speaking during a panel session at VS Live! Brian Randell, a senior consultant at MCW Technologies, is that it will broaden programming to a much wider audience.

"His vision is that everyone can be a programmer," said Randell. "The idea behind this is they want to make building complex systems easier, and where the big word is modeling."

Still there was a fair amount of skepticism at VSLive! about Oslo as well. "It's important to realize that this whole Oslo initiative is an umbrella term that's talking essentially about a 10 year vision," said Rockford Lhotka, principal technology evangelist at Magenic Technologies, who was on the same panel.

Microsoft's announcement yesterday that it will join the Object Management Group was also a sign that Oslo will embrace the Unified Modeling Language.

Posted by Jeffrey Schwartz on 09/10/2008 at 1:15 PM


Despite Help From Microsoft, SQL Injections Remain A Threat

While the spate of SQL injection attacks appears to have died down from its peak earlier this year, it is still a considerable problem that should be on the radar of all database developers and DBAs.

Any SQL-based database server is vulnerable to SQL injection, but the attacks that have wreaked havoc this year have targeted Microsoft's SQL Server by embedding malicious code in a SQL query string passed to the database through a Web app.
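For developers who haven't looked at the mechanics, the sketch below contrasts the vulnerable pattern with a parameterized query in ADO.NET. The Users table and column names are hypothetical; the point is that input should travel as parameter values, never as concatenated SQL text.

    using System.Data.SqlClient;

    class SqlInjectionContrast
    {
        // Vulnerable: user input is concatenated straight into the SQL text,
        // so input such as  ' OR 1=1;--  rewrites the statement the server runs.
        static SqlCommand BuildUnsafeCommand(SqlConnection conn, string userName)
        {
            string sql = "SELECT * FROM Users WHERE UserName = '" + userName + "'";
            return new SqlCommand(sql, conn);
        }

        // Safer: the input travels as a typed parameter value, never as SQL text,
        // so the shape of the query cannot be altered by the caller.
        static SqlCommand BuildSafeCommand(SqlConnection conn, string userName)
        {
            var cmd = new SqlCommand(
                "SELECT * FROM Users WHERE UserName = @userName", conn);
            cmd.Parameters.AddWithValue("@userName", userName);
            return cmd;
        }
    }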

As reported last week, the number of unpatched Web sites exposing malicious code is still alarmingly high -- some seven in 10 Web apps are unsafe, according to Cenzic's Intelligent Analysis Lab report.

Of particular concern to database developers is the finding that one in five of the Web apps Cenzic measured had SQL injection vulnerabilities. The finding comes as Microsoft has released a new security filter for its Internet Information Services (IIS) Web server aimed at thwarting such attacks.

Microsoft's UrlScan 3.0 is an upgraded version of a five-year-old tool; it now examines the query string in incoming requests. That allows developers to create more granular rules for specific types of requests, Wade Hilmo, senior development lead on Microsoft's IIS team, which wrote UrlScan, told Redmond Media Group online editor Kurt Mackie. "For example, you can write a rule that only applies to ASP pages or PHP pages," Hilmo says.

While a step in the right direction, Kevin Beaver, founder and principal information security consultant of Atlanta-based Principle Logic LLC, tells Mackie that the features in UrlScan are rather basic. "It's good the features are now available, but getting admins and developers to actually upgrade is a whole different issue," Beaver tells Mackie.

And therein lies the problem. Until patching systems becomes a priority at the CIO level, hackers are going to continue to have a field day.

Is your organization taking these threats more seriously? Drop me a line.

Posted by Jeffrey Schwartz on 09/03/2008 at 1:15 PM


Microsoft Unleashes SDK For SSDS

Microsoft yesterday released a software development kit for SQL Server Data Services, its forthcoming cloud-based service that will let organizations store and query data.

The first beta of SSDS was released back in March, announced with much fanfare at the MIX08 conference by Microsoft chief software architect Ray Ozzie.

The SDK includes the command-line tool and the SSDS Explorer demonstrated by Soumitra Sengupta at TechEd back in Orlando in June. "The team would appreciate if you can give it a spin and let us know what you like, what you do not like and above all file bugs that you see," Sengupta wrote in an MSDN posting yesterday. Testers do need an SSDS account in order to use the SDK, he noted. The SDK can be downloaded here.

Sengupta also suggests Microsoft may open up the SSDS tools. "I am personally curious to find out if there is any interest in the community to take over the code base for these tools," he asked in a follow-up post late yesterday.

If you haven't paid much attention to SSDS, perhaps you should -- it appears to be a key component of Microsoft's plan to offer a cloud-based repository for data-driven content. Microsoft, Google and others are looking at the success Amazon.com is having with its S3 cloud-based storage service, and they have come to the conclusion that this is the future of enterprise computing.

Have you looked at SSDS and the new SDK? Please share your stories with us.

Posted by Jeffrey Schwartz on 08/20/2008 at 1:15 PM

