
7 Tips for Staying Relevant and Valued as a Data Modeler

By Karen Lopez  /  January 24, 2013

1. Learn about NoSQL database technologies

You aren’t going to get any warning when your company’s first NoSQL project shows up.  It might be the result of a 19th-hole acquisition project (when C-level executives are wined and dined by a software vendor) or it might be part of a package solution that someone has purchased.  And it will need to be installed and running by tomorrow.

I recommend you check out Hadoop-related data technologies, a graph database, a key-value pair database and a document-based database.  Attend user group meetings for NoSQL technologies.  Attend a whole conference.  Read.  Watch videos.  You’ll need to understand where NoSQL technologies fit within a modern data architecture.
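To make those categories concrete, here is a minimal, product-neutral sketch (plain Python, no real NoSQL driver — the data and keys are illustrative) of how a document store and a key-value store would hold the same order that a relational design would split across normalized tables:

```python
import json

# A relational design splits an order across normalized tables; a document
# database would typically store the same order as one nested document.
order_document = {
    "order_id": 1001,
    "customer": {"name": "Pat Chen", "email": "pat@example.com"},
    "lines": [
        {"sku": "A-100", "qty": 2, "price": 9.99},
        {"sku": "B-200", "qty": 1, "price": 24.50},
    ],
}

# A key-value store reduces this further: one opaque value per key.
kv_store = {}
kv_store[f"order:{order_document['order_id']}"] = json.dumps(order_document)

restored = json.loads(kv_store["order:1001"])
print(restored["customer"]["name"])                    # Pat Chen
print(sum(line["qty"] for line in restored["lines"]))  # 3
```

Notice what is *not* here: no declared keys, no constraints, no schema — which is exactly why the modeling disciplines discussed below still matter.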

2. Learn how to correctly reverse engineer application databases

I have worked with more than a few data architects who have never reverse or forward engineered a database.  I’m not sure how they have survived all these years, but data architects need to have the skills to reverse engineer a database, even with package solutions.  Sure, commercial software might be missing primary keys, foreign keys and all kinds of constraints and other database-hosted data quality features, but we need to know that, too.

It’s not enough to know how to click NEXT, NEXT, NEXT in your data modeling tool, either.  You’ll need to understand what those hundreds of options in that wizard actually do.
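As a small illustration of what reverse engineering surfaces, this sketch (SQLite stands in for a packaged application’s database; the tables are hypothetical) walks the catalog and flags tables that ship without a primary key — exactly the kind of gap commercial software often has:

```python
import sqlite3

# Hypothetical "package" database: one table with a primary key, one without.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE audit_log (logged_at TEXT, message TEXT);
""")

# Reverse engineering starts with the catalog: list every table, then
# flag the ones with no declared primary key.
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]

missing_pk = []
for table in tables:
    cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
    # Column 5 of table_info is the PK ordinal; 0 means not part of the PK.
    if not any(col[5] for col in cols):
        missing_pk.append(table)

print(missing_pk)  # ['audit_log']
```

Your modeling tool’s wizard is doing a more elaborate version of this catalog walk behind every one of those NEXT clicks.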

3. Get experience data modeling for integration projects

Not all data modeling results in a brand new database built from scratch.  One of the most common reasons I’ve been doing modeling lately is to build canonical models for integration projects.  These are often implemented as XML schemas, but I start with a logical data model of requirements, using techniques very similar to those I use for a database design project.
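As a sketch of that flow, here is a hypothetical canonical Party entity from a logical model rendered as the XML message an integration layer might exchange (the element names and values are illustrative, not from any real schema):

```python
import xml.etree.ElementTree as ET

# A canonical "Party" entity from a logical model, rendered as the XML
# message the integration layer would pass between systems.
party = {"party_id": "P-42", "name": "Acme Ltd", "party_type": "ORGANIZATION"}

root = ET.Element("Party")
for attribute, value in party.items():
    ET.SubElement(root, attribute).text = value

xml_message = ET.tostring(root, encoding="unicode")
print(xml_message)
```

The logical model does the hard work of deciding what a Party *is*; the XML schema is just one physical rendering of that decision.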

4. Learn a pattern or universal data model

There’s no strong reason why models of core data (people, organizations, products, categories, addresses, contact mechanisms) should vary across models, especially at the same company.  Using pattern data models can significantly reduce the time it takes to produce a correct and complete model.  Sure, tailoring may be needed, but if you are familiar with common data modeling patterns, you can free up a huge amount of time to focus on those data requirements that are proprietary and unique to your business.
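For example, the classic Party pattern from the universal data model literature can be sketched like this (Python classes stand in for model notation; all names are illustrative):

```python
from dataclasses import dataclass

# A minimal sketch of the Party pattern: Person and Organization are
# subtypes of Party, so anything that needs a "customer" or "supplier"
# references Party instead of duplicating those structures.
@dataclass
class Party:
    party_id: int
    name: str

@dataclass
class Person(Party):
    email: str = ""

@dataclass
class Organization(Party):
    registration_number: str = ""

parties: list[Party] = [
    Person(1, "Pat Chen", email="pat@example.com"),
    Organization(2, "Acme Ltd", registration_number="12345"),
]

# One address or contact structure can now reference either subtype.
print([type(p).__name__ for p in parties])  # ['Person', 'Organization']
```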

Understanding a pattern model can take some time; that’s why you need to start reading and studying now.

5. Learn more features of your data modeling tool

Like many productivity tools, data modeling tools have evolved a great deal over the last few years.  We architects benefit from a significant number of mature features in our tools.  And like office productivity users, many of us use only 10-25% of those features.  Automation is a key part of many of the leading tools, yet some modelers are reluctant to learn the coding and scripting skills needed to master it.  You should.  The productivity payoff can be significant after a short learning curve.
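To give a taste of that scripting: real tools expose model metadata through their own APIs, so this sketch uses a plain dictionary as a stand-in model and enforces a naming standard over it — a chore that is tedious by hand and trivial in a script:

```python
# Stand-in for the metadata a modeling tool's scripting API would expose.
model = {
    "Customer": ["customer_id", "CustName", "created_date"],
    "Order":    ["order_id", "customer_id", "OrderTotal"],
}

def violations(model):
    """Return columns that break a lower_snake_case naming standard."""
    bad = []
    for entity, columns in model.items():
        for column in columns:
            if column != column.lower():
                bad.append(f"{entity}.{column}")
    return bad

print(violations(model))  # ['Customer.CustName', 'Order.OrderTotal']
```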

6. Learn about the new features of your target DBMSs

The list of new data types, identifier types, constraints and other features grows with every DBMS release.  As data modelers, we need to understand the pros and cons of choosing those over more traditional features.  Take formal training if you have to.  Attend user group meetings.  Get your DBAs and developers to lead some brown bag lunches.  Read online tutorials.

Nothing says “out of touch” like having to ask what something means in a design when that feature has been around for five years.  It gets worse if you get caught specifying a deprecated approach to a design.

7. Build a data modeling process that allows you to produce releases quickly

Whether or not you follow an Agile approach, modern development projects need faster and shorter iterations.  If it takes three days to publish and share your data models, that’s too long.  Modeling tool automation and documentation features should be your primary method for producing the artifacts related to data models.
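As an illustration of automation-first artifact production, this sketch (again, a plain dictionary standing in for the tool’s model metadata; table and column names are hypothetical) generates DDL in seconds rather than days:

```python
# Stand-in model metadata; a real tool would supply this via its API.
model = {
    "customer": {"customer_id": "INTEGER PRIMARY KEY", "name": "TEXT"},
    "orders":   {"order_id": "INTEGER PRIMARY KEY", "customer_id": "INTEGER"},
}

def generate_ddl(model):
    """Render each table in the model as a CREATE TABLE statement."""
    statements = []
    for table, columns in model.items():
        cols = ",\n  ".join(f"{name} {dtype}"
                            for name, dtype in columns.items())
        statements.append(f"CREATE TABLE {table} (\n  {cols}\n);")
    return "\n\n".join(statements)

print(generate_ddl(model))
```

The same metadata can just as easily feed a data dictionary, a diagram export, or release notes — which is what makes short iterations achievable.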

As little as 15 minutes a day can keep your skills sharp

If these tips sound a bit more physical than you are used to, they are.  Yes, we still need to produce beautiful conceptual and logical data models, but our relevance is also being measured by how well we are able to provide business solutions.  More often than not, that means being able to contribute effectively to the designing and building of those solutions.

Spending just 15 minutes a day working your way through this list will make a huge difference in ensuring your data relevance on future projects.  Your project teams will find you more valuable and be happier with your contributions.

About the author

Karen Lopez is Sr. Project Manager and Architect at InfoAdvisors. She has 20+ years of experience in project and data management on large, multi-project programs. Karen specializes in the practical application of data management principles. She is a frequent speaker, blogger and panelist on data quality, data governance, logical and physical modeling, data compliance, development methodologies and social issues in computing. Karen is an active user on social media and has been named one of the top 3 technology influencers by IBM Canada and one of the top 17 women in information management by Information Management Magazine. She is a Microsoft SQL Server MVP, specializing in data modeling and database design. She’s an advisor to the DAMA International Board and a member of the Advisory Board of Zachman International. She’s known for her slightly irreverent yet constructive opinions and rants on information technology topics. She wants you to love your data. Karen is also moderator of the InfoAdvisors Discussion Groups at www.infoadvisors.com and dm-discuss on Yahoo Groups. Follow Karen on Twitter (@datachick).

  • Excellent article.

    I would make one cautionary comment regarding NoSQL and “Big Data”: don’t forget everything one has learned from one’s data modeling experience. One still should do some form of a logical data model and then use it as the basis of the physical design. Just because the technology does not reference the terms relational or SQL does not relieve us of the responsibility to rationally organize our data.

    • I agree. It’s like when we learned XML. Turns out that tags aren’t all it takes to define data.

  • Raj Yarramasu

    Excellent article. The key to success and survival in the industry is “being relevant”.
