Maybe it's not the sexiest programming language, but SQL continues to be relevant. In fact, TIOBE Software, which publishes a TIOBE Programming Community Index gauging the popularity of programming languages, named Transact-SQL the language of the year for 2013.
This "award" further emphasizes the importance of competency in SQL. I earlier wrote about how SQL gurus and other database-related programmers enjoyed excellent job security and how SQL Server developers were in high demand.
That's the good news. The bad news, according to TIOBE: "It is a bit strange that Transact-SQL wins the award because its major application field, Microsoft's database engine SQL Server, is losing popularity. The general conclusion is that Transact-SQL won because actually not much happened in 2013."
Not much happened in 2013? Wow, talk about strange. Has TIOBE heard of a little thing called Big Data?
Anyway, following Transact-SQL in popularity gains were Objective-C and F#. Objective-C had been the "language of the year" for the previous two years.
Microsoft fared well in other aspects, too, even regarding the much-maligned Windows Phone platform. As TIOBE wrote: "As we have seen the last decade, programming language popularity is largely influenced by external trends. The most important ones at the moment are mobile phone apps and web development. Android (mainly Java) and iOS (Objective-C) are the major mobile platforms, while Windows Phone (mainly C#) is catching up."
In other attempts at ranking the popularity of programming languages, SQL was No. 12 in a list developed by LangPop.com last October. Meanwhile, Python garnered the "language of the year" prize in the Popularity of Programming Language (PYPL) index, which measures how often respective language tutorials show up in Google searches. No variants of SQL made the top 10. TIOBE said its ratings "are based on the number of skilled engineers world-wide, courses and third party vendors."
In Google Trends, searches for "SQL Programming Language" held fairly steady throughout 2013, except for a strange dip right at the end of the year.
How do you feel about the importance of keeping your SQL skills honed? Do these popularity rankings mean anything at all? Comment here or drop me a line.
Posted by David Ramel on 01/16/2014 at 11:23 AM
Regardless of the future of the Microsoft ecosystem (and those latest quarterly numbers should slow the naysayers some), data developers can rest easy knowing their SQL Server skills are transferable in the New Data Order.
The latest example is yesterday's open sourcing of a new distributed SQL query engine for Big Data developed by Facebook, called Presto.
It was designed to improve upon existing solutions for Big Data analytics such as Hadoop MapReduce and Hive, Facebook's Martin Traverso said in a post announcing the move to GitHub. "Presto is a distributed SQL query engine optimized for ad-hoc analysis at interactive speed," Traverso said. "It supports standard ANSI SQL, including complex queries, aggregations, joins and window functions."
Traverso said Presto has delivered performance gains of up to 10 times over equivalent Hive/MapReduce tools in CPU efficiency and latency for most queries. It doesn't run on Windows, but, he said, "It currently supports a large subset of ANSI SQL, including joins, left/right outer joins, subqueries, and most of the common aggregate and scalar functions, including approximate distinct counts (using HyperLogLog) and approximate percentiles (based on quantile digest)."
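To give a sense of what that ANSI SQL support looks like in practice, here's a minimal sketch of the kind of ad-hoc query Presto is aimed at, using an approximate distinct count, an approximate percentile and a window function. The table and column names are hypothetical; approx_distinct and approx_percentile are the function names Presto documents for its HyperLogLog and quantile-digest features.

    -- Hypothetical schema: page_views(user_id, site_id, load_ms, view_date)
    SELECT
        site_id,
        approx_distinct(user_id)         AS est_unique_users,  -- HyperLogLog-based estimate
        approx_percentile(load_ms, 0.95) AS p95_load_ms         -- quantile-digest-based estimate
    FROM page_views
    WHERE view_date >= DATE '2013-10-01'
    GROUP BY site_id;

    -- A window function over the same hypothetical table: a running count of views per site
    SELECT
        site_id,
        view_date,
        count(*) OVER (PARTITION BY site_id ORDER BY view_date) AS running_view_count
    FROM page_views;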
Yes, SQL isn't going anywhere. It has withstood challenges in one form or another from competing relational database management systems such as Oracle, open source offshoots such as MySQL, alternatives like the NoSQL movement and so on. The Big Data onslaught seemed to be stealing much of its mindshare, but the pendulum is swinging back. The problem was that the specialized Hadoop-based solutions often proved too cumbersome to quickly and efficiently glean meaningful analytics from the vast new data stores.
"This enormous knowledge gap in accessing Big Data in Hadoop has prompted an avalanche of vendors to offer SQL-on-Hadoop solutions, which increase the accessibility of Hadoop and allow organizations to reuse their investment learning in SQL," stated a Gigaom report titled "Sector RoadMap: SQL-on-Hadoop platforms in 2013."
"SQL is widely known by most business analysts," the report continued. "Many nontechnical staff without a programming background can write SQL and use traditional business intelligence (BI) tools like Tableau, MicroStrategy, and Business Objects to query data."
Further evidence of SQL's solid positioning came in a recent presentation by Roger Magoulas, research director at O'Reilly Media, at the Strata Conference + Hadoop World event, where he spoke about "the state of data science as a profession." An O'Reilly salary survey conducted last year reported that the top tool in use by the responding data scientists was SQL. "I guess it's not a surprise ... we heard some of the other speakers talk about it ... that SQL is still the top thing being used," Magoulas said. His accompanying slide proclaimed "SQL Rules" and indicated 71 percent of respondents reported it as their data science tool of choice. Hadoop was a surprising No. 5 at 35 percent. SQL users also fared well in the salary findings, though their average pay came in below that of the far fewer Hadoop specialists.
You can expect to soon hear about more SQL-related Big Data initiatives, joining high-profile efforts such as Teradata's Enterprise Access for Hadoop; Cloudera's Impala; IBM's Big SQL Technology Preview; Hortonworks' Stinger; and many, many more. Stay tuned.
What do you think of the SQL-on-Hadoop and other SQL-related Big Data technologies? Comment here or drop me a line.
Posted by David Ramel on 11/07/2013 at 12:50 PM
Microsoft today announced the availability of SQL Server 2014 CTP2, a near-final version highlighted by new in-memory capabilities formerly called Project Hekaton.
The in-memory enhancements include the new Online Transaction Processing (OLTP) features (Hekaton) and enhanced column store technology, along with better T-SQL support, new indexes and advisory tools. The product already featured in-memory data warehousing and business intelligence functionality.
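For developers, most of the Hekaton surface area shows up as new options on familiar T-SQL DDL. As a rough sketch (this assumes a database that already has a MEMORY_OPTIMIZED_DATA filegroup; the table, columns and bucket count here are made up for illustration), a memory-optimized table is declared like this:

    -- Sketch only: requires a database with a MEMORY_OPTIMIZED_DATA filegroup.
    -- Table name, columns and BUCKET_COUNT are hypothetical.
    CREATE TABLE dbo.ShoppingCart
    (
        CartId     INT       NOT NULL
            PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
        CustomerId INT       NOT NULL,
        CreatedUtc DATETIME2 NOT NULL
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

Queries and procedures that hit such tables can additionally be natively compiled for further gains, which is where the headline performance numbers come from.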
New cloud capabilities such as easier backup and recovery in Windows Azure were also announced by Microsoft exec Quentin Clark in his keynote address at the Professional Association of SQL Server (PASS) Summit 2013 in Charlotte, N.C.
Clark noted that this "public and production-ready" release is shipping only 18 months after SQL Server 2012, a much faster release cycle than previous SQL Server versions.
"A year ago we announced project 'Hekaton,' and today we have customers realizing performance gains of up to 30x," said Clark, corporate vice president of the Data Platform Group. "This work, combined with our early investments in Analysis Services and Excel, means Microsoft is delivering the most complete in-memory capabilities for all data workloads -- analytics, data warehousing and OLTP."
Clark also nodded to the Big Data phenomenon, noting that customers are collecting and storing more data than ever before, such as machine signals, data from devices and services and even data from outside the organization, "so we invest in scaling the database and a Hadoop-based solution."
The new Windows Azure backup and recovery features are part of Microsoft's "hybrid cloud" strategy in which customers can work with SQL Server databases on-premises and back them up and recover them from the cloud. Clark announced today that all currently supported SQL Server releases can use Windows Azure backup. A preview of a stand-alone SQL Server Backup for Windows Azure Tool will be available for download later this week.
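For recent releases (SQL Server 2012 SP1 CU2 onward, including the 2014 CTPs), the backup-to-cloud path is exposed directly in T-SQL as BACKUP ... TO URL. A minimal sketch, in which the storage account name, container URL and access key are all placeholders:

    -- Sketch only; the storage account, container URL and access key are placeholders.
    CREATE CREDENTIAL AzureBackupCredential
        WITH IDENTITY = 'mystorageaccount',       -- Windows Azure storage account name
             SECRET = '<storage-access-key>';     -- storage account access key

    BACKUP DATABASE AdventureWorks2012
        TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/AdventureWorks2012.bak'
        WITH CREDENTIAL = 'AzureBackupCredential',
             COMPRESSION,
             STATS = 10;

Older releases that lack the TO URL syntax are presumably what the stand-alone backup tool mentioned above is meant to cover.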
The CTP2 trial is available for download now along with a product guide.
Posted by David Ramel on 10/16/2013 at 11:21 AM
Microsoft last week updated its latest WCF Data Services version so it will work with Entity Framework 6.
Rather than requiring the download of a new WCF DS version, the update to version 5.6.0 comes in the form of an out-of-band alpha1 NuGet package called, appropriately, WCF Data Services Entity Framework Provider.
WCF DS 5.6.0 was released in August with support for Visual Studio 2013, portable libraries and other enhancements. The VS 2013 support lets developers consume OData services with the Add Service Reference functionality. The portable libraries support lets developers use the new streamlined JSON format (part of the OData v3 protocol, sometimes called "JSON light") in Windows Store and Windows Phone 8 apps. While core libraries have support for .NET Framework 4.0, the WCF DS client portable library support targets .NET 4.5. Both the core libraries and the WCF Client have support for Windows Phone 8, Windows Store and Silverlight 5 apps.
Users, however, wanted more. A couple of readers responded in the comments asking about OData v4 support, and one asked, "Does this release include EF 6 support?" Microsoft's Mark Stafford last week replied, "Sort of. The EF 6 support will come in a different NuGet package, which will go into alpha today."
The Oct. 2 NuGet update that catches up to EF 6 was made possible by some work the WCF DS team was doing to make providers public. The team wanted to override provider behavior so developers could use features such as spatial properties and enums, which lack native support in the OData v3 protocol. "Unfortunately we ran into some non-trivial bugs with $select and $orderby and needed to cut the feature for this release," the team said in its August announcement.
However, that work paid off later, the team said in last week's update announcement. "We were able to build this provider as an out-of-band provider (that is, a provider that ships apart from the core WCF DS stack) because of the public provider work we did recently," the team said.
The new support for EF 6 basically results from a new class, EntityFrameworkDataService<T>, where T is type DbContext. For previous EF versions, developers should use the base class DataService<T>, where T is type DbContext or the older ObjectContext.
"The new public provider functionality comes with a little bit of code you need to write," the team said. "Since that code should be the same for every default EF 6 WCF DS provider, we went ahead and included a class [the new EntityFrameworkDataService class] that does this for you."
The team admitted that it didn't have time to do full testing on the new provider because the developers were "heads down" preparing for OData v4 support. "So ... we're going to rely on you, our dear customer, to provide feedback of whether or not this provider works in your situation. If we don't hear anything back, we'll go ahead and release the provider in a week or so." Which should be right around now.
Have you tried it yet? Comment here or drop me a line.
Posted by David Ramel on 10/10/2013 at 6:41 AM
I was dropped by my previous auto insurance company for a couple of at-fault accidents on my wife's driving record.
Trouble was, she was not involved in those accidents in any way. They happened to somebody else and somehow got on her report from a data collection company used by the insurer. And, try as I might, I could not convince the insurance company of this. I provided the company with a note from my previous insurer confirming that those accidents were not hers. I even provided an official driving record from the state showing those weren't her accidents. It didn't make any difference to the insurance company (as much as I'd like to see the company burned to the ground in an agonizing bankruptcy, I won't name it, but it definitely wasn't on my side). The accidents were on the ChoicePoint report--that's all that mattered.
I contacted the data collection company and began the nightmarish process of trying to get their information corrected. I eventually gave up; it just wasn't worth the hassle they were putting me through. (Ironically, I've never--ever--been in an at-fault accident. Believe it or not, I've never even received a moving violation, in several decades of driving. I was probably one of the best customers the insurance company could've had.)
I bring up these painful memories because of recent reports about a Big Data company, Acxiom, that this month announced a portal where individuals can look up information collected about them. Several articles noted that the portal, AboutTheData.com, reported some incorrect information. So I checked it out.
Sure enough, the site had a few things wrong, including my birth date, which was strange because I had just provided that date as part of registering for the privilege of looking up my info (pretty sly way to collect data, when you think about it--these people aren't stupid, like people in some other companies, if you know what I mean). They also got my education level, race and age of children wrong, among a few other things. Keep in mind the portal is in beta, and it gives you the chance to correct the data (I didn't even try to go there) and even opt out of the system.
So, just a warning: If you're a developer and your company is hopping on the Big Data bandwagon and you're assigned to the project, be very careful about the quality of the information you collect, especially if the data will have a significant impact on the success of the project--and the company's bottom line.
I mean, just imagine how much money that previous insurance company left on the table if the fiasco I experienced was commonplace among its multitude of customers. Fortunately, my new insurer uses a different data collection company that actually has accurate information and I got a sweet rate. And my present insurer is soaking up those monthly premiums and hasn't had to pay out a dime. Think of that, repeated thousands and thousands of times. If they only knew, I imagine the headquarters honchos in Columbus, Ohio, would be kicking themselves.
Have you any Big Data horror stories? Comment here or drop me a line.
Posted by David Ramel on 09/19/2013 at 6:47 AM
Microsoft may have been late to the cloud party, but its Windows Azure ranks near the top when it comes to popularity for data-related development, according to a new survey from Forrester.
The Forrsights Developer Survey, Q1 2013 found that North American and European developers preferred the Amazon EC2 cloud service for their compute services by a significant margin over Windows Azure, but it's a different story for Relational Database Management Systems (RDBMS) services.
"Microsoft and Amazon are neck and neck amongst users of cloud RDBMS," said Forrester analyst Jeffrey Hammond in a blog post yesterday.
The survey found that the top three types of cloud services adopted by developers are compute, storage and relational data services. Hammond said 47 percent of respondents regularly use compute and storage services, while 36 percent use RDBMS services. These numbers come from 325 developers in the North American and European regions (out of 1,611 total) who reported they had used cloud computing or elastic applications.
Of those respondents using cloud compute services, 62 percent said they were using Amazon EC2 or planned to expand their use of it in the next year. That compares to 39 percent for Windows Azure and 29 percent for the Google Cloud Platform.
That gap of more than 20 points in adoption "is well outside a standard margin of error, so we have to give the nod to AWS when it comes to compute," Hammond said.
"Things are very different when it comes to developers using cloud-based RDBMS services," Hammond said. There, 48 percent of developers reported they were "implementing and expanding" use of Microsoft SQL Azure, followed by 45 percent for the Amazon RDS service (see Figure 1). However, that 3-point gap is within the margin of error.
"Also note that there are a high number of developers that are planning to implement Amazon's RDS service (27 percent) while 7 percent of Microsoft SQL Azure developers plan to decrease or remove their RDBMs workloads," Hammond said. "Overall, we'd have to rate this workload as a push--with no clear adoption leader."
The percentage of developers who reported using cloud storage and plan to expand that usage over the next 12 months was almost equally divided among Amazon Web Services, Windows Azure and Google Cloud Storage, at 25 percent, 22 percent and 23 percent, respectively.
"Amazon still has a larger body of developers that have implemented but are not expanding AWS S3 (17 percent compared to 10 percent for Microsoft Azure and 9 percent for Google, respectively)" Hammond said. "Our take: this workload looks like it's headed for a strongly competitive market in 2014."
So, despite a lag of about 4 years between the introduction of Amazon EC2 (August 2006) and Windows Azure (February 2010), Microsoft has caught up in attracting developers to its cloud platform. That's interesting news, considering the popular backlash about Microsoft's decision to not provide developers early access to the Windows 8.1 RTM.
What is it that makes Windows Azure database-related services so popular among developers? Comment here or drop me a line.
Posted by David Ramel on 09/05/2013 at 1:57 PM
The Microsoft Entity Framework has a spotty history of inconsistent release strategies, lagging feature requests and other issues, but things seem to be getting better with new leadership and even community contributions since it went open source.
The past problems with the Object Relational Mapping (ORM) tool were candidly discussed this week when EF team members Rowan Miller and Glenn Condron visited with Robert Green on the Visual Studio Toolbox Channel 9 video show to preview EF 6, which is "coming soon."
Reviewing past release history, Miller admitted the 2 years it took to update EF 3.5 SP1 to EF 4 was excessive. "That's really slow if you're waiting for new features," he said. Trying to get new features out to customers sooner, the team tried a hybrid model for subsequent releases with the runtime core components being part of the .NET Framework while new features shipped out-of-band via NuGet and the tooling was part of Visual Studio.
"We made one not-very-good attempt at doing an out-of-band release of the tooling, the 2011 June CTP of 'death and destruction' as we call it on our team," Miller said. "It went horribly; our product just wasn't set up for shipping out-of-band."
While the hybrid model got features out quicker, Miller noted there was a confusing mismatch of EF and .NET versions, so the team "bit the bullet" going from EF 5 to EF 6, going fully out-of-band by moving the runtime to NuGet and the tooling to the Microsoft Download Center, while also being "chained in" to new Visual Studio releases.
The whole history, which Condron admitted was "still confusing," is explained in the 1-hour-plus show if you want to try to sort it out.
I was more interested in the move to open source at the same time EF was moved out of the .NET Framework, putting source code, nightly builds, issue tracking, feature specifications and design meeting notes on CodePlex. "Anything that we can make public, we made public," Condron said. The team also opened up the development effort to the community. "We happily accept contributions from as many people as we can," Condron said. But that strategy raised concerns by some, so he emphasized only the EF team has commit rights, and any pull requests must go through the same rigorous code review and testing process as do internal changes.
"We've gotten 25 pull requests; we've accepted 21 from eight different contributors," Condron said. He noted that while the development effort is open source, the shipping, licensing, support and quality remain in the hands of Microsoft.
Green noted that the strategy was a great way to get more people on the development effort, and Condron agreed, citing the thorny issue of enum support long requested by thousands of customers but not introduced until EF 5 and .NET 4.5. "Why is there no enum support--because nobody had written enum support," he said. "If you guys write it, we'll put it in." You can see the list of contributors on CodePlex (though only seven were listed in the Feb. 27 post).
Other benefits of open source, Miller said, include better customer relations, transparency and release cadence. "Back in the 3.5 SP1 days, we weren't listening to customers that great, but we've ... opened up the EF design blog" and started doing more frequent releases, he said. Condron also noted that with the nightly builds, the team has even gotten a few immediate bug reports the very next day after some source code was broken in the ongoing development efforts, which he said was "fantastic."
And the open source strategy will be expanding, Miller said. "The open source codebase at the moment just has the runtime in it," he said. "The tooling for EF 6 isn't going to be there, but after the EF 6 release, we're planning to put the tooling in as well. So we'll have the tooling there and you'll get to watch us develop on it, and we'll even accept pull requests on it as well."
Previewing EF 6, Miller and Condron ran through many new features coming from the EF team, such as nested entity types, improved transaction support and multiple contexts per database. But other new features are coming from community contributors, such as custom migrations operations, improved warm-up time for large models, pluggable pluralization and singularization service and "a few other cool things."
EF 6 is currently in Beta 1, and will go to RTM at the same time as Visual Studio 2013, with an RC1 coming "soon."
[CLARIFICATION: The RC1 was actually made available on Wednesday, Aug. 21.]
For more information, see the EF feature suggestions site and CodePlex for community contribution guidelines and a product roadmap.
What do you think of the new EF 6? Comment here or drop me a line.
Posted by David Ramel on 08/22/2013 at 8:00 AM
Here's a troublesome aspect of the Big Data revolution I didn't expect: the melding of mind and machine. IBM yesterday unveiled a completely new computer programming architecture to help process vast amounts of data, modeled on the human brain.
The old Von Neumann architecture that most computers have been based on for the past half century or so just doesn't cut it anymore, scientists at IBM Research decided, so something new was needed, starting from the silicon chip itself. Ending with ... who knows, maybe some kind of organic/electronic hybrid. I keep getting visions of Capt. Picard with those wires sticking out of his head after being assimilated by the Borg. (If you don't know exactly what I'm talking about here, I've misjudged this audience completely and I apologize--please go back to whatever you were doing.)
The Von Neumann architecture encompasses the familiar processing unit, control unit, instruction register, memory, storage, I/O and so on. Forget all that. Now it's "cognitive computing" and "corelets" and a bunch of other stuff that's part of a brand-new computer programming ecosystem--complete with a new programming model and programming language--that was "designed for programming silicon chips that have an architecture inspired by the function, low power, and compact volume of the brain," IBM said.
Those would be "neurosynaptic chips" stemming from IBM's SyNAPSE project headed by Dharmendra S. Modha, unveiled in August 2011. In a video, Modha called the new system a "brain in a box." Now IBM is sharing with the world its vision of a new programming language and surrounding software ecosystem to take advantage of the chips, partially in response to the Big Data phenomenon.
"Although they are fast and precise 'number crunchers,' computers of traditional design become constrained by power and size while operating at reduced effectiveness when applied to real-time processing of the noisy, analog, voluminous, Big Data produced by the world around us," IBM said in its announcement.
That theme was echoed in the description of the SyNAPSE project, which explained that the old "if X then do Y" equation paradigm wasn't enough anymore.
"With the advent of Big Data, which grows larger, faster and more diverse by the day, this type of computing model is inadequate to process and make sense of the volumes of information that people and organizations need to deal with," IBM said.
The company said its new system could result in breakthroughs such as computer-assisted vision.
"Take the human eyes, for example," IBM said. "They sift through over a terabyte of data per day. Emulating the visual cortex, low-power, light-weight eye glasses designed to help the visually impaired could be outfitted with multiple video and auditory sensors that capture and analyze this optical flow of data."
That's fine and dandy--and seriously laudable. But, of course, these systems will eventually somehow be connected. And with the machine learning focus, the machines will, well ... learn. And after they learn enough, they will become self-aware. And when they become self-aware, they'll realize they're vastly superior to pathetic humankind and don't really need us around anymore. And then you've got some kind of dystopian nightmare: Skynet and Arnold androids, Borg, whatever.
Well, that's fine. I, for one, welcome our new machine overlords. I'm getting fitted for my headpiece right away. Have a good weekend!
What's your plan? Are you up for learning a new programming language? Did you get all the pop culture references? Please let me know by commenting here or dropping me a line.
Posted by David Ramel on 08/09/2013 at 2:53 PM
The SQL Server community this week engaged in a lively debate about limitations of the 2014 Standard Edition and Microsoft licensing practices.
The discussion--highlighted on Hacker News--was sparked by a post by database consultant/blogger Brent Ozar, titled "SQL Server 2014 Standard Edition Sucks, and It’s All Your Fault."
"Every release lately, Microsoft has been turning the screws on Standard Edition users," Ozar wrote. "We get less CPU power, less memory, and few (if any) new features."
He complained that organizations wishing to use more than 64GB of memory needed to step up to the more expensive Enterprise Edition (see feature comparison). After listing deficiencies of the Standard Edition--such as not allowing database snapshots, auditing, numerous business intelligence (BI) features and more--he enjoined readers to boycott the product:
"Microsoft won’t change its tack on SQL Server licensing until you start leaving. Therefore, I need you to stop using SQL Server so they’ll start making it better. You know, for me."
You should check out the Hacker News discussion (116 comments as of noon Wednesday). Even if you aren't interested in the squabbling--some readers dared to contend that 64GB was plenty of RAM for small businesses, for example--you can learn a lot about the nitty-gritty concerns and trials and tribulations of SQL Server users and developers on the front lines.
Or, as Ozar put it in an update to his post yesterday: "If you [DBAs] want to stay current on what startup developers think about databases for their new projects, HackerNews is a good reality check. It's a completely different perspective than the typical enterprise developer echo chamber."
What do you think about Microsoft licensing terms for SQL Server and limitations of the Standard Edition? Comment here or drop me a line.
Posted by David Ramel on 07/31/2013 at 1:15 PM
Microsoft on Tuesday announced the availability of a Premium preview for Windows Azure SQL Database with beefed-up features for cloud-based business-class applications.
Those features include reserved capacity for each database for better and more predictable performance. The Premium service "will help deliver greater performance for cloud applications by reserving a fixed amount of capacity for a database including its built-in secondary replicas," Microsoft said.
An e-mail alert said that current SQL Database customers--excluding free trials--can sign up to receive an invitation to the limited preview.
A SQL Server Blog on TechNet noted that the preview--first announced earlier this month at the Worldwide Partner Conference in Houston--is ideal for:
- Apps that require a lot of resources such as CPU cycles, memory or input/output operations. An example is a database operation that consumes many CPU cores for a long time.
- Apps that require more than the limit of 180 concurrent connections provided in the Web and Business editions.
- Apps that require a guaranteed fast response, such as a stored procedure that needs to return quickly.
To satisfy these demanding apps, Microsoft is initially offering two levels of reservation size, called P1 and P2. The former offers one CPU core and 8GB of RAM at a preview price of $15 per day and an eventual general availability price of $30 per day (in addition to storage). The P2 service doubles all those numbers. Full pricing is available here.
There's already lots of detailed information about the Premium preview. Our sister site Redmond Channel Partner covered the announcement, and Microsoft has an extensive guidance page with all the nitty-gritty details you could ask for. Scott Guthrie also provides some information on the release, in addition to discussing new support for performing "recurring, fully automated, exports of a SQL Database to a Storage account."
Microsoft said it "will continue to add business-class functionality to Premium databases over time, to further support higher end application requirements."
What do you think of the Premium service? Comment here or drop me a line.
Posted by David Ramel on 07/25/2013 at 1:15 PM
Big Data is the future, Hadoop is the tool and Hortonworks is the partner to help Microsoft help businesses navigate the coming sea change in the way they operate. That's the takeaway I got from Microsoft exec Quentin Clark in his keynote address at the recent Hadoop Summit North America held in San Jose, Calif.
Clark, corporate vice president, Microsoft Data Platform, told the audience he took a break from his vacation to address the early adopters of a transformation that will completely change the industry in the next couple of decades.
"We believe Hadoop is the cornerstone of a sea change coming to all businesses in terms of how they are able to embrace information to effect change for how they run their day-to-day business," Clark said.
He likened this change to the way line-of-business applications changed the way all organizations work, from government to large businesses to small businesses. "We believe Hadoop is at the very root, the very cornerstone, of a similar kind of impacting change, but it's about all this new value, if you will, of information--information from systems and data that people already have that they aren't processing well today, embracing signals from partners, from customers, even from competitors in the industry, and analyzing that information differently. We believe that over the next couple of decades we'll see a complete transformation in how businesses think about their information, think about their businesses."
This transformation will be similar to the way the world was changed by advances in the telecommunications and travel industries, going from switchboard-assisted phone calls to ubiquitous cell phone coverage and from steamships and wagons to jet airliners, Clark said. He predicted that one day there will be 1 billion users of Big Data, and that will signal the completion of the transformation.
Microsoft feels a responsibility to help customers embrace Hadoop because it has become the Big Data standard, Clark said, noting that the company has logged some 6,000 engineering hours over the last year in its partnership with Hadoop vendor Hortonworks, a cosponsor of the summit. "It is a bit different for us," Clark admitted, to work on such an open source project in view of its strong brands in the data platform space, such as SQL Server and Excel. But he said the move made sense for Microsoft and Hortonworks was the best partner. "We're putting our shoulder now firmly behind their distribution on Windows," he said. "The Hortonworks Data Platform for Windows is what we're standing behind for our customers."
Of course, the cloud is a major part of Microsoft's vision of the future, and Clark said the Windows Azure HDInsight Hadoop service is one of the fastest-growing offerings in the Azure arena.
After some demonstrations of technologies such as GeoFlow and Data Explorer, Clark emphasized that Microsoft was addressing the conference attendees as part of its effort with Hortonworks to get Hadoop out to the masses--and eventually to 1 billion users. "You all are the early adopters," Clark said. "You're the ones that see this coming. You're the ones on the leading edge of this, and every phenomenon we've had that's impacted businesses in this deep a way has always come with folks like yourselves that have that clarity early on to know what's coming."
Do you know what's coming? Clue us in by commenting here or dropping me an e-mail.
Posted by David Ramel on 07/11/2013 at 1:15 PM
The latest version of Microsoft's flagship Relational Database Management System (RDBMS) is available in two preview flavors: the regular SQL Server 2014 Community Technology Preview 1 and the cloud-based SQL Server 2014 Community Technology Preview 1 on Windows Azure, both downloadable from the TechNet Evaluation Center. The announcement comes one day before the BUILD 2013 developer conference in San Francisco.
The Windows Azure cloud was first and foremost in Microsoft's messaging about the new software, touting the company's "Cloud OS." "Microsoft has made a big bet on what we call our cloud-first design principles," said Brad Anderson, corporate VP, in a blog post discussing the new previews.
"SQL Server 2014 features in-memory processing for applications ("Hekaton"), as well as data warehousing and business intelligence," Anderson said. "SQL Server 2014 also enables new hybrid scenarios like AlwaysOn availability, cloud backup and disaster recovery. It lives in Windows Azure and can be easily migrated to the cloud from on-premises."
Along with SQL Server 2014, Microsoft announced the availability of previews for Windows Server and System Center, both as 2012 R2 versions.
The SQL Server 2014 CTP will expire after 180 days or on Dec. 31, 2013, whichever comes first. Download options include an ISO DVD image, CAB file or Azure version. Microsoft recommends the ISO or CAB version to test the software's new in-memory capabilities.
Posted by David Ramel on 06/25/2013 at 1:15 PM