Click to learn more about author Thomas Frisendal. I recently (finally) had the opportunity to read Doug Laney’s fine book about Infonomics. I have followed his work on this for years, because we share the ambition that data and information should be recognized as assets, just like factory equipment and trucks. Because data keep the […]
Data Modeling vs. Data Architecture
In the second edition of the Data Management Book of Knowledge (DMBOK 2): “Data Architecture defines the blueprint for managing data assets by aligning with organizational strategy to establish strategic data requirements and designs to meet these requirements.” Another way to look at it, according to Donna Burbank, Managing Director at Global Data Strategy: “Data […]
Five Tips to Completing Analytics’ Infamous “Last Mile” in 2019
Click to learn more about author Mike Lamble. In 2017, the term “Data Scientist” was LinkedIn’s fastest growing job title; yet, in the same year, McKinsey reported that less than 10 percent of Analytic Models that are developed actually make it to production where they can deliver ROI. The bottleneck lies in what industry insiders call […]
Domino Data Lab Announces Domino 3.0 to Power Model-Driven Organizations
According to a new press release, “Domino Data Lab, provider of an open data science platform, today announced Domino 3.0 featuring Domino Launchpad, a module designed to help companies maximize the impact of their data science investments by addressing the operational challenges and bottlenecks they face getting models into production. ‘Following the introduction of the […]
Data Science and Data Analysis
Click to learn more about author Steve Miller. I met up with a grad school friend of 40 years the other day. While he earned a doctorate and became an academic luminary, I departed the program with a master's in statistical science and went on first to the not-for-profit and then to the business worlds. […]
Building the Pillars of Data Modeling and Enterprise Architecture
Ron Huizenga believes it's possible for an organization to reach an enlightened state where users can "understand the journey of their data through the entire organization." That entails knowing when the data was created, which processes use it, how it's transformed on its way through the organization, and ultimately, when […]
Next and Prior: Pointing in Data Models
Click to learn more about author Thomas Frisendal. Pointers have been in and out of data models. From the advent of the rotating disk drive in the 1960s until around 1990, pointers were all over the place (together with "hierarchies," which were early versions of aggregates of co-located data). But relational and SQL made them […]
Implementing Bitemporal Modeling for the Best Value
Click to learn more about author Mike Brody. Bitemporal Modeling is an extremely useful tool for documenting historical data. It allows you to recreate databases as they existed at any point in the past and see whether the records were correct — based on what you know to be true now. This information can not only […]
Bitemporal Data Modeling: How to Learn from History
Click to learn more about author Mike Brody. Have you ever called about a real estate listing only to learn that the house has been taken off the market? Or had to pick up mail that should have been routed to your new home? Sometimes our records don’t reflect reality, and bitemporality exists to keep track […]
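The real-estate example above hinges on bitemporality's two time axes: valid time (when a fact was true in the real world) and transaction time (when the database recorded it). A minimal illustrative sketch in Python follows; the names, dates, and `as_of` helper are invented for illustration, not taken from the article:

```python
from dataclasses import dataclass
from datetime import date

FOREVER = date.max

@dataclass
class Row:
    key: str
    value: str
    valid_from: date            # when the fact became true in the real world
    valid_to: date = FOREVER    # when it stopped being true
    tx_from: date = date.min    # when we recorded this version of the fact
    tx_to: date = FOREVER       # when a correction superseded this record

def as_of(rows, valid_at, known_at):
    """Rows that were true at `valid_at`, per what we knew at `known_at`."""
    return [
        r for r in rows
        if r.valid_from <= valid_at < r.valid_to
        and r.tx_from <= known_at < r.tx_to
    ]

# A house listed for sale, actually taken off the market on Mar 1,
# but the correction was only recorded on Mar 10.
rows = [
    Row("123 Main St", "for sale", date(2019, 1, 1),
        tx_from=date(2019, 1, 1), tx_to=date(2019, 3, 10)),
    Row("123 Main St", "for sale", date(2019, 1, 1), date(2019, 3, 1),
        tx_from=date(2019, 3, 10)),
]

# What did we believe on Mar 5? (the listing still looked live)
print([r.value for r in as_of(rows, date(2019, 3, 5), date(2019, 3, 5))])
# → ['for sale']
# What do we know now about Mar 5? (it was already off the market)
print([r.value for r in as_of(rows, date(2019, 3, 5), date(2019, 3, 15))])
# → []
```

Keeping both axes means corrections never overwrite history: the superseded record survives with a closed transaction interval, so both "what we believed then" and "what we know now" remain queryable.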
The Value of Strong Metadata Discovery
Too often, professionals such as Data Analysts, Data Architects, and Compliance Specialists struggle to discover, understand, and use on their own the rich Metadata within large enterprise ERP and CRM applications. It becomes complicated, costly, and time-consuming for these users to pull relevant business-context subsets […]