In the spirit of the back-to-school season, which is now upon us, I thought I'd share a story about how Gautam Arora, a Georgia Tech graduate student, spent his summer. Arora spent 11 weeks as a paid intern at Morgan Stanley, where he helped bridge the gap between the .NET-based order generation process used by portfolio managers and operational systems built in Java.
It's worth noting that Arora, a native of Bombay, India, is a Java developer who came to New York this summer with no prior .NET programming experience.
The clincher, though, was that his project outshone the presentations of five other seasoned developers, who each outlined their own programming case studies at an "American Idol"-themed bakeoff called "Speaker Idol," held by the New York City .NET User Group last month.
During his 10-minute presentation to a room full of local developers at Microsoft's midtown offices, Arora, the last of the six presenters, described his project. When he began his summer internship at Morgan Stanley's Investment Management group, the firm's asset management unit, he took on the task of providing better interoperability between the trade-order generation tools used by portfolio managers (built in Visual Basic for Applications, with Microsoft Office as a front end) and disparate Java-based back-end systems.
Arora said replacing Office with a Java-based UI was not an option. "Office in the enterprise is ubiquitous, it's familiar, it's powerful and extensible -- why take it away from my users and try to make a Java UI?" Arora explained.
As a result, he spent six weeks building reusable service-oriented applications using Visual Studio Tools for Office (VSTO). The SOA-based components provide connectivity between Office Business Applications and Java-based back-end infrastructures such as IBM's DB2 and Sybase databases. Those repositories are typically non-Windows-based, running on Linux and Apache servers.
VSTO's rapid application development environment allowed him to create .NET services that could be consumed by different Java-based systems, Arora said. The goal was to improve the business process of how trade orders are generated. "It provides access to business process services in a standards-based approach," he said of the way processes are defined in VSTO.
"The code is not just written in VBA [where just] one portfolio manager is going to use it," he said. "[Now] everybody is going to use it; it will run on a Java back-end."
To build the order generator, Arora used VSTO 2003 with C# and WinForms. On the back end he used Sun Microsystems' Java EE -- Apache CXF, JAXB, Hibernate, Spring and Apache Tomcat, among other tools.
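The interop hinge in a project like this is that .NET services speak standard SOAP over HTTP, which any Java client can call. As a rough illustration only (not Arora's actual code -- the endpoint, operation and namespace below are hypothetical, and a real project would generate stubs from the service's WSDL with a tool like Apache CXF), a Java client can hand-build and post a SOAP envelope using nothing but the standard library:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class OrderServiceClient {
    // Hypothetical SOAP envelope for a trade-order call; the real service,
    // operation and namespace names from the project are not public.
    static String buildEnvelope(String symbol, int quantity) {
        return "<?xml version=\"1.0\" encoding=\"utf-8\"?>"
            + "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
            + "<soap:Body>"
            + "<SubmitOrder xmlns=\"http://example.com/orders\">"
            + "<Symbol>" + symbol + "</Symbol>"
            + "<Quantity>" + quantity + "</Quantity>"
            + "</SubmitOrder>"
            + "</soap:Body></soap:Envelope>";
    }

    // POST the envelope to the (hypothetical) .NET service endpoint.
    static void submit(String endpoint, String envelope) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        conn.setRequestProperty("SOAPAction", "http://example.com/orders/SubmitOrder");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(envelope.getBytes("UTF-8"));
        }
        System.out.println("HTTP " + conn.getResponseCode());
    }
}
```

In practice the generated stubs hide all of this plumbing; the sketch just shows why a Java back end and a .NET front end can meet in the middle on plain XML over HTTP.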
In an interview following his presentation, Arora told me the biggest challenge was that he had never used .NET or Visual Studio before. "I'm a Java developer at heart," Arora said, though he said he did not encounter any difficulty figuring out .NET and VSTO. "It's a different world, but it's been very nice."
Does that make Arora a .NET convert? "I can think of some use cases where I might expand on my .NET experience," he said.
Arora was the only student in the competition. The six developers showed a variety of programming efforts, ranging from the use of F# to arithmetic algorithms, before a panel of four judges: Andrew Brust, chief of new technology at twentysix New York; blogger Mary Jo Foley; Peter Laudati, a developer evangelist; and Kathleen McGivney, a software consultant. Stephen Forte, chief strategy officer at Telerik, organized and moderated the event.
"We thought his presentation was the most broad-based, well rounded and he did the best job of conveying a business case, and really explaining it clearly," Brust said of the panel's decision to name Arora the winner. The prize: an Xbox 360.
Posted by Jeffrey Schwartz on 08/13/2008 at 1:15 PM
It's official: Microsoft has released to manufacturing its SQL Server 2008 database. Developers can download the much-anticipated upgrade to the company's database server immediately from MSDN or TechNet.
Officials from Microsoft's Data Platform Group held a conference call for analysts and press to announce the RTM. The company had indicated last month that the release was imminent, despite skepticism from some observers.
Among the key new features Microsoft is touting are support for policy management, improved use of data encryption, the ability to store and query spatial data, a new report builder, and improved support for analysis, reporting and administration. It also boasts new data compression capability, which the company said makes better use of storage and provides faster queries.
Microsoft officials belabored the point that organizations can upgrade without having to modify their software. "Customers can adopt these enhancements and features without making changes to their applications," said Ted Kummert, corporate vice president of Microsoft's Data and Storage Platform division.
But that raised another question on the call: Will Microsoft's cloud-based incarnation of its forthcoming database platform, dubbed SQL Server Data Services, or SSDS, be just as seamless to developers or will they require new interfaces or development methodologies?
"As we move things forward, I think you will see things change," Kummert said. "Our focus today is on SQL Server 2008. I think in the next year, you will see a lot of clarity emerge around SSDS and how SSDS relates to our overall data platform. But the overall commitment is clear, that we are spanning this data platform vision to the cloud and we will provide a consistent application model across all tiers -- that is the edge, the data center and the cloud."
With so much buzz about cloud computing, this will certainly be something to watch.
Note to database developers: Tell us what you think of SQL Server 2008. Drop me a line at [email protected].
Posted by Jeffrey Schwartz on 08/06/2008 at 1:15 PM
Microsoft's SQL Server database continued to outpace its larger rivals Oracle and IBM in the overall database market in 2007, according to this year's annual database server market share reports by IDC and Gartner. But providers of open source database servers, while still a small slice of the market, could have a significant impact over the coming years.
Both IT research firms last month released their annual market share reports for database software. SQL Server revenues grew 16.5 percent in 2007 over the prior year, compared with Oracle, which grew a more modest 14.9 percent, and IBM, which was up just 10 percent, according to Gartner analyst Donald Feinberg. However, Oracle's and IBM's revenues come from a much higher base -- $48.6 billion and $20.7 billion, respectively, compared with SQL Server's $18.1 billion.
Microsoft's license fees tend to be lower than those of its larger rivals, suggesting even more substantial unit growth for SQL Server. But SQL Server runs only on Windows, limiting its proliferation within the enterprise.
In the Windows server market, SQL Server accounts for a dominant share of 51.4 percent, compared with 28.8 percent for Oracle and 9.8 percent for IBM. In the Windows market, Oracle's growth of 21 percent outpaced Microsoft's 16.5 percent.
All that noted, Gartner this year drilled deeper into the open source database market.
The steady growth of open source database software could affect license fees going forward, Feinberg predicts. Until this year, Gartner had only broken out open source databases as a whole, accounting for about 1 percent of the market. Having broken out the key players, here is the rundown: MySQL revenues were $65 million, up 46.9 percent; Ingres was just shy of $40 million, up 100 percent; and EnterpriseDB, which commercializes the popular Postgres database, had revenues of nearly $9 million, up 282 percent.
Feinberg points out that since open source database licenses are free, it's hard to compare these vendors to suppliers of licensed software; their revenues come entirely from support, services and the like. Because open source database software today lags licensed software in features, it has not been a threat, but Feinberg says that could change over time.
"Looking out five years, if open source databases can start to be used in mission critical situations, they could have a major impact on pricing," he said.
Posted by Jeffrey Schwartz on 07/24/2008 at 1:15 PM
When I spoke last week with Fausto Ibarra, Microsoft's new director of product management for SQL Server, I asked why Microsoft moved his predecessor, Francois Ajenstat (who is now working on Microsoft's green initiatives), off the team before the official RTM of SQL Server 2008. Ibarra explained that the product was officially kicked off in February during the Heroes Happen Here launch and the timing was right for the transition.
For Ibarra's part, it's onward and upward to the next release of SQL Server, which, if history is any guide, will come out somewhere around 2011 -- though to be clear, that's what observers suspect; it didn't come from Ibarra or anyone else at Microsoft. All he would offer on that front is Microsoft's goal of making it easier to manage all content across multiple tiers, ranging from mobile devices to the cloud.
Key to that, he offered, will be Project "Velocity" and SQL Server Data Services. You can read more about Ibarra's plans in his new role here.
Data quality apparently is another key area of focus for SQL Server. The company's announcement this week that it will acquire Israel-based Zoomix gives it entrée into that space. The little-known startup offers what it calls Data Accelerator, server-based software that it says provides a scalable and fast approach to synchronization of critical data.
Data quality is an important aspect for numerous business operations, among them identity management, but it remains to be seen how much of a focus Redmond and its other key rivals will place on this area.
"If any of their customers have a need for a powerful matching engine, Zoomix has a pretty interesting product," says Forrester Research analyst Rob Karel. "It's relatively new; they have some customers but for the most part it's really an early stage technology that Microsoft acquired."
It remains to be seen whether Microsoft's key goal was to acquire the product or the team and presence in Israel, adds Gartner analyst Donald Feinberg. "Instead of buying a major vendor of data quality, they bought a development team in Israel," Feinberg says. "It will pay off in that they get the product that they got to date, but they also get the developers out of it."
Posted by Jeffrey Schwartz on 07/17/2008 at 1:15 PM
Six weeks ago I raised the question: Will SQL Server 2008 slip again? The skepticism arose because Microsoft announced plans to release Service Pack 3 (SP3) of SQL Server 2005.
Also raising doubts were SQL Server 2008 testers, who said there were still numerous bugs in the CTP. But Redmond officials insisted that SQL Server 2008 was still on track to be released to manufacturing in the third quarter, and urged testers to report any bugs.
Now it appears it won't slip. Yesterday at Microsoft's Worldwide Partner Conference in Houston, the company said the database is on pace to be released by September 30.
Fausto Ibarra, Microsoft's new director of product management for SQL Server, told me that he is quite confident that SQL Server 2008 will be ready to ship.
"The release candidate we have is feature complete, so our customers and partners can try out absolutely everything SQL Server offers," Ibarra says.
Still, among those who had raised doubts back in May was Andrew Brust, chief of new technology at twentysix New York, speaking at a meeting of the NYC .NET User Group -- a group he not so coincidentally founded. I asked Brust for his reaction to Microsoft's latest announcement.
"There is mounting evidence that a Q3 RTM is the reality," Brust said in an e-mail, though he remains skeptical. "I'm still dubious, but that might be irrational at this point. I would, in any case, greatly prefer to see one more RC before RTM. That would raise my comfort level enormously."
It's not clear, however, whether Brust will see his comfort level rise. While Microsoft's Ibarra hasn't ruled out another release candidate, he sounded quite comfortable with the stability of the current build.
"We are getting some reports of bugs," he admits. "It's usual -- it's actually a small number. By now the product is very solid. We already have a lot of customers in production, we have a lot of applications that Microsoft is running on SQL Server 2008, including SAP -- we all get paid because of SQL Server 2008. At this point we are just in the final stages of testing."
Ibarra says a decision whether to issue another release candidate will come within a few weeks. What's your take? Is SQL Server 2008 almost good to go? Drop me a line.
Posted by Jeffrey Schwartz on 07/10/2008 at 1:15 PM
A vocal group of ADO.NET Entity Framework testers that has issued a "vote of no-confidence" is illuminating a long-standing conflict between a segment of the .NET development community and Redmond. The petition raises a big question: Is there a storm brewing among developers of data-driven applications looking toward the latest iterations of the .NET Framework, or is this just a tempest in a teapot?
Ironically, I got wind of the protest on Tuesday while attending the Data Services World conference in New York, where none other than Michael Pizzo, a principal architect on Microsoft's data programmability team, was giving a session on LINQ, the Entity Framework and ADO.NET Data Services.

In his session, he demonstrated LINQ queries being used against the Entity Framework. "We expect [that] when you're using the Entity Framework and writing applications against the Entity Framework, LINQ will be the primary way you do that," he explained in an interview following his demo.
As for the online petition, Pizzo was unaware of it until I pointed it out to him (it was posted just this week). However, he said the discord comes from what he called a "vocal subgroup." Pizzo said the issues raised are under consideration for a future release.

"While we certainly take them seriously and are planning on addressing many of their concerns, specifically around persistence ignorance in the next release, we have a large number of customers who see value in the Entity Framework as it is in version 1," Pizzo said.

For his part, Tim Mallalieu, a Microsoft product manager, posted a detailed response to the no-confidence vote in his blog.
Not surprisingly, the controversy has generated a lot of buzz. OakLeaf Systems Principal Consultant Roger Jennings pointed out in his own blog that this vocal group is spearheaded by NHibernate proponents. Jennings pointed to a blog post by Ian Cooper, a Microsoft MVP and proponent of LINQ to SQL but an opponent of the Entity Framework, who argued that this is history repeating itself (see circa 2000, when complexities around Enterprise JavaBeans surfaced).
"EJBs were an ambitious attempt to support persistence, transactions, events, RPC, etc. in a single component," Cooper wrote in his blog. "While there was an initial rush to adoption, they quickly proved a millstone around the Java communities' neck, because of their complexity. Technologies like Spring and Hibernate emerged as less complex ways of delivering enterprise-class solutions. Many of them were later to be incorporated into drastically revised EJB visions."

Cooper believes the .NET community can learn from that. "Ports of Hibernate and Spring offered the community the chance to avoid the mistakes of the past," he added. "However, seemingly unaware of the lessons of history, the EF team embarked on a project to produce a complex framework, of which persistence is just one aspect, reminiscent of the EJB initiative."
Jennings described Cooper's assessment as a "level-headed critique of the Entity Framework v1."

So what's your take on this? Will this affect the choices you make as you embark on developing new data-driven apps using the Entity Framework and related technologies? Drop me a line at [email protected].
Posted by Jeffrey Schwartz on 06/26/2008 at 1:15 PM
Last week, I pointed to Microsoft's unexpected launch of Project Velocity, the code-name for a distributed in-memory data caching platform, which it announced at Tech-Ed Developers in Orlando.

Many jaws dropped when Microsoft unveiled Velocity -- not just as a project but with enough code to allow developers to test.
William Bain, CEO of ScaleOut Software, told me this week he was blindsided by Velocity, only getting wind of it days before its announcement. Particularly troubling, said Bain, is that ScaleOut is a Microsoft partner that is on the fourth release of its distributed cache server, dubbed ScaleOut StateServer (SOSS), which is used by a number of large organizations -- including Home Shopping Network and Reuters -- for high-performance applications.
"This was a surprise to us," Bain said. "We are a Microsoft partner focused on high-performance computing, particularly in the financial services market and we've had a positive and successful relationship with them over the past year."

Because Microsoft has hedged on its product plans, many are speculating that the software giant might offer Velocity free of charge or, at the very least, put the squeeze on other players like ScaleOut.
"The steps they take over the next year will determine whether it has an effect on the market," Bain said. "If Microsoft should elect to make distributed caching available as its own standalone SKU free of charge, that does undermine the market for products of this type, coming from vendors that have worked for years to develop and evangelize this market and that has a negative effect on investment for new Microsoft-related technologies."
ScaleOut's pain, of course, could mean customers' gain. "A free entry into this arena from a major vendor is sure to make the other vendors think about lowering their prices," wrote Marc Adler, technical head of complex event processing at a major investment bank on Wall Street, on his blog. "In these economic times, financial companies will certainly welcome the chance to embrace this technology without having to spend a lot of money."
There are still many unknowns about Velocity, Adler points out. Among his questions: What product group will commercialize it? How will it be tied to LINQ and SQL Server? Will it support third-party (even Java-based) message buses? And will it link to non-.NET environments and database servers other than SQL Server?
As for whether it will commoditize in-memory data caching, ScaleOut's Bain believes time is on his side. Based on the current CTP, Velocity only provides a subset of the functionality in SOSS, Bain said. For example, SOSS already supports high availability of stored data, offers a self-configuring and self-healing architecture, quorum-based updating of stored objects, push notification of events within the cache, asynchronous replication between data centers for disaster recovery, and parallel query of cached objects.
If Microsoft ends up giving away Velocity or bundling it with other tools, Bain said the onus on suppliers of distributed caching technology will be to add value in other ways.

That said, he also knows if you can't beat them, join them. For example, he didn't rule out supporting Velocity and the APIs that come with it.
"I think the Velocity API may become a de facto standard as the lowest common denominator API that everybody will support," he said. "In addition, I think vendors like ourselves and our competitors will add value; I know that we will add value beyond those APIs."
Posted by Jeffrey Schwartz on 06/19/2008 at 1:15 PM
On the data management front, one of the big surprises at last week's TechEd Developers conference in Orlando was Microsoft's release of Velocity, the code-name for a distributed, in-memory data caching platform that has been quietly under development in Redmond.
Velocity is designed to give .NET applications high-speed access to data via partitioned, replicated or local caches. "It's an application cache," said Anil Nori, a distinguished engineer in Microsoft's SQL Server group. Naturally, given his roots, I asked him if this is seen as an alternative to using the SQL Server repository. Nori explained the rationale for data caching software, which he sees as a separate tier in the application stack.
"Today we have ASP.NET applications, which are going off to SQL Server," Nori said. "One of the bottlenecks we have is how do we scale our session state. The way we solve this is to use a distributed cache and you keep the session state in the cache, and it can provide a really large-scale place to really hold a session cache. Then you can provide better performance and scale."
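The session-state pattern Nori describes is essentially cache-aside: check the cache first, fall back to the backing store on a miss, then populate the cache so later requests are served from memory. Here's a single-process toy sketch of that pattern -- nothing here is Velocity's actual API, and a real distributed cache like Velocity partitions and replicates entries across nodes rather than holding one in-process map:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Toy cache-aside layer: the loader function stands in for the round trip
// to SQL Server that the cache is meant to avoid on repeat requests.
public class SessionCache {
    private final Map<String, String> cache = new ConcurrentHashMap<>();
    private final Function<String, String> loader;
    private int misses = 0;

    public SessionCache(Function<String, String> loader) {
        this.loader = loader;
    }

    public synchronized String get(String sessionId) {
        String state = cache.get(sessionId);
        if (state == null) {          // cache miss: hit the backing store once
            misses++;
            state = loader.apply(sessionId);
            cache.put(sessionId, state);
        }
        return state;                 // subsequent calls are served from memory
    }

    public int misses() { return misses; }
}
```

The scale win Nori describes comes from the distribution: once the cache tier is spread across many nodes, session state no longer funnels every request through the database.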
What is not clear are Microsoft's intentions for commercializing Velocity. The company released it as a community technology preview (CTP) last week.
"We haven't figured out what is a long-term set of facilities, but one thing we feel strongly is that it is aligned with the .NET stack rather than SQL Server," he said.
Nori did explain that the first Velocity CTP offers dynamic scaling by adding nodes in memory and provides automatic load balancing by distributing copies of data across multiple nodes. However, it does not yet support high availability, Nori said.
In CTP 2, Nori said, Microsoft will offer greater functionality and round out the release with the utilities necessary for loading the caches, along with high availability and improved manageability. Microsoft is targeting its Professional Developers Conference in late October for that release, followed by a full release sometime in the second half of next year.
Meanwhile, Velocity caught a number of industry watchers off guard, though others say it makes sense for Microsoft to offer an in-memory data cache to help rev .NET applications.
"That wasn't something I saw coming," said Daniel Chait, managing director of New York-based Lab49, in an interview during this week's Securities Industry and Financial Markets Association (SIFMA) conference. What remains to be seen, Chait said, is whether Microsoft plans to release Velocity as a product and in what form.
There are a number of vendors who offer their own distributed cache products, among them GigaSpaces, GemStone Systems, Oracle's Tangosol and ScaleOut Software. "[These companies] spent a lot of research and development money; they are naturally going to be upset," Chait said.
In fact, ScaleOut took the unusual step of expressing its dismay in a press release. "We are disappointed that this surprise announcement has created confusion for our customers," the release stated.
Gideon Low, GemStone's principal architect, said during an interview at SIFMA that caching is just one component of its product family, which includes support for, among other things, complex event processing.
"It's really our roots. We began our product as a distributed caching product, but today it's really just a component of a much larger technology suite that we have," says Low. "Managing data in memory is very core to what we do because of the performance advantages of in-memory data management, but it really is a subset -- we're not a caching vendor so to speak."
Have you looked at the CTP yet? Let me know your thoughts.
Posted by Jeffrey Schwartz on 06/12/2008 at 1:15 PM
While Microsoft chairman Bill Gates talked up everything from Silverlight to robotics in his TechEd keynote yesterday, he also gave a plug for database development and the forthcoming SQL Server 2008 release.
"It's a very big release in terms of what people can do in the data center, how these various pieces connect together, different types of data that we can understand in a very rich way. And so this is central, and it's a big investment for us, something that is very key," Gates said.
Interestingly, he talked up how SQL Server is becoming the engine for other data-driven Microsoft platforms including Active Directory and even SharePoint. "Microsoft has always had a central focus on SQL Server as the place to store data," said Gates. "It's where we have the greatest capacity, the ability to distribute, update, query in very rich ways. In fact, what you see us doing is taking all our different data driven activities, and pushing them into SQL."
For example, the metadirectory store in Active Directory, where objects are replicated, is now SQL-based. Gates said that in the future, Exchange will use the SQL store as well. While SharePoint already uses SQL, "we'll expose more and more of that native SQL power to the SharePoint developer for them to do easy application development," Gates said.
Gates was joined on stage by Dave Campbell, who talked up SQL Server 2008's spatial data as well as its support for file types.
Meanwhile, Gates also threw a bone to shops that use Visual Studio Team System and manage apps developed for IBM's DB2 database: with the new support, developers will be able to perform DB2 development within Visual Studio.
Posted by Jeffrey Schwartz on 06/04/2008 at 1:15 PM
Database developers going to Microsoft's annual North American Tech-Ed conference will have plenty of opportunities to fine-tune their development skills. This year, Microsoft has split the Orlando, Fla., Tech-Ed into two weeks of events; the first week, Tech-Ed North America 2008 Developers, begins on June 3 with a keynote address from departing chairman and founder Bill Gates.
While it remains to be seen what Gates may have to say about data-driven development, it wouldn't be surprising to hear him talk up Microsoft's Language-Integrated Query (LINQ) and Microsoft's "Oslo" project. Microsoft, of course, released LINQ with Visual Studio 2008 as a means of making it easier for developers to build database queries into their applications, and it's a significant extension to the .NET Framework. As for Oslo, Microsoft's next-generation modeling platform for building service-oriented architecture applications, it would be disappointing if some new details didn't emerge from Gates or chief software architect Ray Ozzie next week.
Regardless of what the big brass has to say, those interested in Oslo may want to check out Jon Flanders' "Framework and Microsoft BizTalk Best Practices with an Eye Toward Oslo" session on June 4th and David Chappell's "The Road to Oslo" session the following day.
If your interest is more down-and-dirty SQL Server development, Ward Pond, a technology architect in the SQL Server Center of Excellence within Microsoft's business online service delivery group, points out that he will be delivering several sessions that run the gamut. Among them is "Set-Based Thinking for the OLTP Developer" on Tuesday, June 3. A day later, Pond will give three more sessions: "SQL Tricks," "Data Modeling for OLTP" and "Version-Stamping of Database Objects."
Posted by Jeffrey Schwartz on 05/28/2008 at 1:15 PM
The release of the public betas of the first Visual Studio 2008 and .NET 3.5 Framework service packs last week is an important milestone for database developers looking to bring Microsoft's newest data access technologies into production.
As reported last week, the beta of the first .NET 3.5 service pack includes updates to the ADO.NET Entity Framework for mapping object and relational data and ADO.NET Data Services (formerly code-named Astoria), a framework for building on-premise REST-based data services, layered on top of Windows Communication Foundation (WCF).
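Because ADO.NET Data Services exposes entities at plain HTTP URIs, a consumer in any language can query them with ordinary GET requests; the `$filter` query option is part of the Astoria addressing scheme. A small sketch of composing such a query URI -- the service root and entity set here are illustrative placeholders, not a real deployment:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class DataServiceQuery {
    // Build an Astoria-style query URI such as
    //   {root}/Customers?$filter=City eq 'London'
    // with the predicate URL-encoded for transport.
    public static String filter(String serviceRoot, String entitySet, String predicate) {
        return serviceRoot + "/" + entitySet
            + "?$filter=" + URLEncoder.encode(predicate, StandardCharsets.UTF_8);
    }
}
```

Issuing an HTTP GET against the resulting URI returns the matching entities as an XML (or JSON) feed, which is what makes the service consumable well beyond the .NET world.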
The release of the beta also signifies that SP1 is on pace for a summer release. That matters: as with many new releases, Microsoft's latest framework and toolset may not find their way into many production shops until the first service pack ships.
I spoke last week with Clint Wood, application manager for a water district based in Brooksville, Fla., which has built a service-oriented architecture (SOA) on the .NET Framework 2.0 and Visual Studio 2005. While the goal is to move to VS 2008 and the 3.5 Framework, the district won't even consider the move until SP1 is out and validated, according to Wood.
"We'll feel much better once that first service pack is out," Wood told me. "We really can't afford to get knocked out of the water because we are waiting for bug fixes."
Presuming the service pack is released this summer as scheduled, the water district plans to start transitioning its .NET development to the new release in the October-to-November time frame, Wood says.
Posted by Jeffrey Schwartz on 05/21/2008 at 1:15 PM
The SQL injection saga first outlined here last week continues in the form of new attacks, while others are talking about what developers need to do to minimize their exposure.
The Shadowserver Foundation, a volunteer watchdog consortium of security pros who track various threats, today reported that the latest SQL injection exploit involves the winzipices.cn domain. According to today's posting by Steven Adair, the attack has hit more than 4,000 pages.
Web sites that have been hacked will find telltale HTML source code injected into the affected pages.
CNET senior editor Robert Vamosi posted an enlightening interview with Jeremiah Grossman, CTO of WhiteHat Security, who describes how these latest SQL attacks are carried out in comparison to attacks in the past. Notably, he touches on a sensitive spot with database developers and the theme of my last post: Microsoft's contention that this exploit is not a bug in SQL Server but rather takes advantage of a feature of the database -- one that is not at risk if developers employ the proper practices.
Indeed, most who responded to last week's entry here agree that while the latest attack affects only SQL Server, critics should not be so quick to pounce on Microsoft.
"In this case I absolutely agree that it's not the fault of the DBMS," writes Lee from the Solomon Islands. "Since I first heard of SQL insertion attacks I developed a small set of functions to filter every input from my Web sites (or others). I can't see any excuse for not doing this once, and then simply copying the code from one project to the next."
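The filtering approach Lee describes can be sketched as a simple whitelist check. To be clear, this is an illustrative stub, not his actual functions, and it supplements rather than replaces parameterized queries (e.g., JDBC's PreparedStatement on the Java side), which remain the standard defense against injection:

```java
// Minimal whitelist-style input filter: reject anything outside a small
// set of known-safe characters, rather than trying to blacklist SQL syntax.
public class InputFilter {
    // Accept only letters, digits, spaces and a few safe punctuation marks.
    public static boolean isSafe(String input) {
        return input != null && input.matches("[A-Za-z0-9 .,@_-]*");
    }

    // Gatekeeper to call on every externally supplied value before use.
    public static String require(String input) {
        if (!isSafe(input)) {
            throw new IllegalArgumentException("rejected suspicious input");
        }
        return input;
    }
}
```

Whitelisting is far more robust than scanning for SQL keywords, since it rejects quote characters, semicolons and parentheses outright instead of guessing what an attacker might type.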
Ralph Wilson, from Boerne, Texas, who was a developer and now is moving toward becoming a database modeler, says many developers are unaware of the impact SQL injections can have. "I can tell you from personal experience that there are a lot of developers out there who do not realize the possibilities and vulnerabilities of insertion attacks," Wilson wrote. "I know that, before encountering one, I didn't realize how easy it could be."
However, Wilson believes fault lies with managers who pressure developers to move projects along. "The Injection Attack Fault Line starts at the top of the IT chain of command, if not at the top of the organization," he says. "The management culture that tries too hard to run a 'lean, mean, coding machine' usually sets IT up so that it is about 90 percent staffed to do the 125 percent work load."
Managers and developers alike shouldn't underestimate the threat of SQL injections, writes Robert Robbins from Williamsport, Pa. "I've seen the SQL being injected in these attacks and it is really devious," writes Robbins. "It casts a lengthy series of numbers into the actual SQL command so just parsing for a few SQL keywords may not catch this. It also uses table cursors and system objects to discover the table and column names. That allows it to infect the entire database without knowing the schema. Even a Web site that protects against SQL injection may be unprepared for this particular example."
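Robbins' point about the "lengthy series of numbers" is worth unpacking: these attacks wrapped their payload in a hex literal (via a CAST to varchar), so a filter scanning for SQL keywords saw only digits. A small decoder shows how innocuous-looking hex conceals SQL -- the payload below is a stand-in keyword, not the actual exploit string:

```java
public class HexPayload {
    // Decode a hex string (the form used in SQL Server 0x... literals) back
    // to text, demonstrating why keyword filters miss hex-wrapped payloads.
    public static String decodeHex(String hex) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < hex.length(); i += 2) {
            sb.append((char) Integer.parseInt(hex.substring(i, i + 2), 16));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // The digits below decode to the SQL keyword "DECLARE".
        System.out.println(decodeHex("4445434c415245"));
    }
}
```

This is why the defenses above matter in combination: parameterized queries stop the injection at the source, while naive keyword scanning of the raw input would wave this payload through.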
What effect has the latest spate of attacks had on your organization's development practices? If you've been hit, what are you doing about it? Drop me a line.
Posted by Jeffrey Schwartz on 05/07/2008 at 1:15 PM