.NET Tips and Tricks


CA ERwin and the Data Modeling World

After working with CA Technologies' ERwin Data Modeler 7.3 for our review in the December issue of Visual Studio Magazine (Designing Databases with CA ERwin Data Modeler 7.3), I spoke with Donna Burbank, the product marketing manager for CA ERwin, about what it's like competing in the data modeling market.

Peter Vogel: What market is CA ERwin Data Modeler part of? What are the essential issues/characteristics in this market?
Donna Burbank: Data modeling is a core component in a number of markets that manage or use data, including Data Integration, Business Intelligence (BI), Data Warehousing, Master Data Management (MDM), and Data Governance. Data modeling provides a blueprint for data assets that can be used by both business and technical staff to define and modify data structures. For example, business analysts use CA ERwin Data Modeler (CA ERwin) to define core business requirements, which are then integrated with the technical architecture at the database design level.

Challenges with data management include ensuring that all data is of the same high quality, maintaining data integrity, and standardizing data fields across different data sources. Companies that understand their customer, product, and competitive data have a distinct advantage. Data modeling helps organizations maintain a clear, consistent view of their strategic information, helping them understand the data they have, what it means, and the relationships between data elements.

PV: What challenges do you see in CA ERwin Data Modeler's future?
DB: The major challenge is keeping up with new data formats and on-demand/cloud platforms. Whereas much of an organization's data was historically stored on-premises in traditional relational databases (RDBMSs), today there are new platforms and data formats to deal with: unstructured data in emails and documents; XML and related formats for Web development; cloud-oriented databases; and so forth. CA ERwin's challenge is to provide the same level of data management and model-driven development options for these new data sources as we have historically provided for the traditional database market.

Posted by Peter Vogel on 12/22/2010

