
Better Data Modeling with Lean Methodology


The process used today in systems development started with principles developed for assembly lines in the 1950s, when manufacturers wanted a more disciplined approach to producing goods and services. Products would come off an assembly line and be inspected; when defects were found, the items would be sent back for rework or scrapped entirely.

This process remained relatively unchanged until the 1980s, when the idea of “total quality management” emerged, and the focus changed to embedding quality throughout the entire manufacturing process. By the 1990s, the concept of “just in time” manufacturing enabled faster cycle times, shortening work in progress time and reducing inventories.


Ron Huizenga, Principal Program Manager at Microsoft, spoke at the DATAVERSITY® Enterprise Data World Conference about how Data Modeling can benefit from the principles and practices of Lean methodology.

A Brief History Lesson: Software Development

Software development took a similar course, starting in the 1950s as structured programming concepts began to emerge. The traditional Waterfall methodologies had their roots in the 1960s, and by the 1970s, iterative and incremental development techniques began to appear.

The 1980s saw the rise of prototyping, spiral methodology, and an increasing focus on Data Modeling and rapid application development (RAD). Agile methods started to gain traction in the 2000s.

In the earlier decades leading up to the 1990s, Huizenga said, it was all about trying to be predictive and realizing that current methods really weren’t keeping up with changing business needs. Since the 1990s, the approach has become more adaptive, not only in software development, but in all aspects of system design, including Data Modeling.

Methodologies

  • Waterfall is a linear, sequential approach to the development lifecycle, used in software engineering and product development, emphasizing a logical progression of steps: Requirements → Analysis → Design → Develop → Test → Deploy → Maintain.
  • Agile is based on iterative development using self-organizing, cross-functional teams, with requirements and solutions evolving through collaboration. Agile increases productivity and reduces time-to-benefits relative to Waterfall, he said.
  • Scrum is a variant of Agile, defined as a collaborative, lightweight approach to software development. Scrum features fixed-duration, focused iterations called sprints (typically 30 days or less), driven by a product backlog. Teams are self-organizing, with requirements maintained by a designated product owner and a designated Scrum master managing the overall process. Sprints typically have a formal kickoff at the start and a retrospective at the end, with daily Scrum meetings in between.
  • Extreme Programming (XP) is another variant of Agile, with a focus on responsiveness to changing needs in the software development process. XP emphasizes face-to-face discussion using a whiteboard (virtual or in person), constant feedback and adjustment, respectful collaboration, encouraging and empowering individuals to contribute, and the pursuit of a simple solution.

Data Modeling’s Increasing Value

Huizenga sees the role of the data modeler as particularly important for success. Because the iterative process is focused on rapid development, quite often there’s disdain for the role of data modeler because modeling is perceived as a waste of time, he said. In contrast, Huizenga proposes a process of “Model-Driven Development,” where modeling is a genuine design step whose output manifests directly in the code and data deliverables produced during delivery.

Role of the Data Modeler

Sometimes the lack of support for modeling is a result of a data modeler viewing themselves as a gatekeeper rather than an enabler, he said. But it is more effective to encourage team members to share ideas and then synthesize them into a data model that works.

Like any team member, a data modeler needs to have full engagement in sprint planning to ensure deliverables are complete from the data perspective, and have input into dependency prioritization. It also helps if the data modeler understands the business perspective, and can relate to the developers and work closely with them as a team as well.

Lean Methodology: A Better Approach

Lean is a methodology for organizational management based on Toyota’s manufacturing model of the 1930s, which has been adapted for knowledge work. Where Agile was developed specifically for software development, Lean was developed for organizations as a whole, and focuses on continuous small improvements combined with a sound management process to minimize waste and maximize value. Quality standards are maintained through collaborative work and repeatable processes.

Lean Software Development Principles

Eliminate anything not adding value as well as anything blocking the ability to deliver results quickly. At the same time, empower everyone in the process to take responsibility for quality. Automate processes wherever possible, especially those prone to human error, and get constant test-driven feedback throughout development.

Improvement is only possible through learning, which requires proper documentation of the iterative process so knowledge is not lost. All aspects of communication, the way conflicts are handled, and the onboarding of team members should always occur within a culture of respect.

Shutting Down the Assembly Line

An analogy for this, Huizenga said, is assembly line workers with the quality movement. Workers on the assembly line are empowered to shut down the entire line if they see a defect or something that could affect quality. This process prevents problems from going downstream and affecting other things, he said. “We need to have exactly that same type of attitude and process in place when we’re doing these types of development initiatives.”

Applying Lean Principles to Data Modeling

Within the iteration workflow, every change gets modeled and associated with the proper task or user story for traceability, and appropriate incremental DDL scripts are generated. Some designs may be originated by the data modeler, and in other contexts, the modeler adjusts or refactors as ideas are generated by developers experimenting in a sandbox.

Everyone involved should be building off the same officially sanctioned image of the database. Ensuring there’s a full script at the end of each iteration is extremely important, as well, so it’s always possible to go back and compare it to the end of any previous iteration. Data modelers should fully participate in development, retrospectives, and all parts of the process.
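The mechanics of those incremental DDL scripts can be sketched in a few lines. The snippet below is a minimal, illustrative Python sketch (not from Huizenga's talk, and far simpler than what a real modeling tool does): it treats each iteration's sanctioned schema as a simple table-to-columns mapping, diffs two snapshots, and emits the DDL statements needed to migrate one to the other. All table and column names are hypothetical.

```python
# Illustrative sketch: generate an incremental DDL script by diffing two
# schema snapshots. Each snapshot maps table name -> {column name: SQL type}.
# A real modeling tool also handles type changes, constraints, indexes, etc.

def incremental_ddl(previous, current):
    """Return DDL statements that migrate `previous` to `current`."""
    statements = []
    for table, columns in current.items():
        if table not in previous:
            # Whole table is new this iteration.
            cols = ", ".join(f"{c} {t}" for c, t in columns.items())
            statements.append(f"CREATE TABLE {table} ({cols});")
            continue
        for col, sql_type in columns.items():
            if col not in previous[table]:
                statements.append(f"ALTER TABLE {table} ADD COLUMN {col} {sql_type};")
    for table in previous:
        if table not in current:
            statements.append(f"DROP TABLE {table};")
    return statements

# Hypothetical example: this iteration adds a column and a new table.
v1 = {"customer": {"id": "INT", "name": "VARCHAR(100)"}}
v2 = {"customer": {"id": "INT", "name": "VARCHAR(100)", "email": "VARCHAR(255)"},
      "order_item": {"id": "INT", "customer_id": "INT"}}

for stmt in incremental_ddl(v1, v2):
    print(stmt)
```

Keeping a full schema snapshot at the end of each iteration is what makes this kind of diff possible later: any two iterations can be compared, and each generated statement can be associated with the task or user story that motivated the change.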

Learn from the Past

Applications come and go, but data has always been important, he said. Organizations need to preserve and maintain their data, but to utilize it going forward, it must be designed and documented correctly.

Although there has been a lot of evolution of methodologies within systems development, the foundations have all been derived from manufacturing principles and practices of the 1950s. “So what we really want to do with organizations is learn and adapt based on the cumulative body of knowledge built from the last 70 years.”  

The Best of Both Worlds

Data models are more important than ever for managing complexity, maintaining or increasing quality, delivering value, and avoiding failure, because models provide a true picture of how all the pieces fit together. Using Lean principles improves systems development, as well as operations in general, because the focus is on value, efficiency, and waste reduction. Most of all, keeping business stakeholders engaged in the process creates customer satisfaction within the business.

Generally speaking, Huizenga said, approaches using Lean are the most successful, because Lean is adaptive.

When individuals have an adaptive mentality, they are able to change with the business needs. It’s also important to know what’s ahead, so “predictive” capabilities should be incorporated as well, he said. “Having that best of both worlds is what really helps you to deliver the value.”


