
Instilling Industrial Discipline in Enterprise Analytics

By James Kobielus  /  September 19, 2016


We’re living in a supposedly post-industrial era. So it can feel a bit odd to refer to the need to bring “industrialization,” an “assembly line” approach, or “factory”-like processes to the development of business analytics.

Behind any such discussion, the specter of automation usually lurks, as the author of this recent O’Reilly article makes explicit in her first sentence. But I think she greatly overstates the importance of automation in the industrialization of any function. She doesn’t go so far as to predict massive unemployment for human data scientists, but one wouldn’t be surprised to see others read that into her otherwise excellent discussion of trends in business analytics.

Contrary to what many believe, automation isn't the essence of industrialization. You can achieve many of the same benefits without turning personnel into de-skilled drones doomed to stoke a heartless machine. Even in enterprises that we take as the very definition of industrial, such as manufacturing, mining, and milling, a wide range of indispensable human functions remains. Fundamentally, industrializing any business function, even one we associate with the post-industrial economy, means organizing it to achieve greater scale, speed, efficiency, and predictability.

With that in mind, you can industrialize enterprise analytics by reconstituting its operational processes around the following core principles:

  • Role specialization: This has been the core principle of industrial production since long before there were machines to do the work. The key specializations in an industrialized analytics process consist of data scientists, data engineers, application developers, business analysts, and subject-domain specialists. As I discussed in this recent post, role specialization within collaborative environments enables team members to pool their skills and specialties in the exploration, development, deployment, testing, and management of machine learning and other data-driven business-analytics models.
  • Workflow patterning: This is the foundation of scale, speed, efficiency, and quality control in any industrial process. The principal patterns in the analytics lifecycle consist of standardized tasks, flows, and rules governing the creation and deployment of machine-learning models, statistical algorithms, and other repeatable data/analytics artifacts. As I discussed in this recent post, the primary workflow patterns fall into such categories as data discovery, acquisition, ingestion, aggregation, transformation, cleansing, prototyping, exploration, modeling, governance, logging, auditing, and archiving. In a typical enterprise analytics practice, some of these patterns may be largely automated, while others may fall toward the manual, collaborative, and agile end of the spectrum.
  • Tool-driven acceleration: This is what people usually associate with task automation. As I discussed in this recent post, many analytics lifecycle processes are being accelerated through cloud-based tools and platforms such as massively parallel Spark runtime engines; through unified workbenches for fast, flexible sharing and collaboration within analytics development teams; and through on-demand, self-service collaboration environments that provide each specialist with tools and interfaces geared to their specific tasks. To ensure industrial-grade quality controls, collaboration tools should enable the necessary checks and balances for monitoring and assuring automated outputs. For example, data science teams should be able to create workflows for manual reviews of machine-generated models prior to their deployment into production applications. This is analogous to how high-throughput manufacturing facilities dedicate personnel to test samples of their production runs before they're shipped to the customer.
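To make the workflow-patterning and review-gate ideas above concrete, here is a minimal Python sketch of a standardized analytics pipeline with a human sign-off step before deployment. All function and field names are hypothetical illustrations, not the API of any particular platform:

```python
# Hypothetical sketch: a standardized task flow (ingest -> cleanse ->
# model -> governance review) with a manual approval gate, as described
# in the "workflow patterning" and "tool-driven acceleration" principles.

def ingest(raw_rows):
    """Acquisition/ingestion: keep only complete records."""
    return [r for r in raw_rows if r.get("value") is not None]

def cleanse(rows):
    """Cleansing/transformation: drop out-of-range outliers."""
    return [r for r in rows if 0 <= r["value"] <= 100]

def train_model(rows):
    """Modeling: a trivial stand-in for fitting a real ML model."""
    mean = sum(r["value"] for r in rows) / len(rows)
    return {"type": "baseline-mean", "prediction": mean}

def review_gate(model, approver):
    """Governance: a human reviewer must sign off before deployment."""
    return {"model": model, "approved": approver(model)}

def run_pipeline(raw_rows, approver):
    """The standardized, repeatable flow: each stage is a named pattern."""
    rows = cleanse(ingest(raw_rows))
    model = train_model(rows)
    return review_gate(model, approver)

if __name__ == "__main__":
    data = [{"value": 10}, {"value": 30}, {"value": None}, {"value": 500}]
    # The "approver" stands in for a manual review workflow; here it is
    # simulated with a simple check on the model's type.
    result = run_pipeline(data, approver=lambda m: m["type"] == "baseline-mean")
    print(result["approved"])             # True: reviewer signed off
    print(result["model"]["prediction"])  # 20.0 (mean of the clean rows)
```

The point of the sketch is that the stages and the approval gate are fixed, named, and repeatable, which is what lets some of them be automated while others stay manual and collaborative.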

Increasingly, we see the nouveau coinage “InsightOps” referring to the intensifying industrialization of enterprise data-science processes. Check out an excellent recent column by my colleagues Tim Vincent and Bill O’Connell on this topic. Obviously based on the established DevOps paradigm, this term refers to a development and deployment lifecycle for data-driven enterprise analytics assets that has the agility to incorporate shifting blends of automated, collaborative, and self-service processes.

As this trend picks up speed, and even as automation pushes more deeply into their core functions, enterprise analytics developers will be in even greater demand. Industrialization of their work processes will enable them to focus on problem solving, domain knowledge, interactive exploration, and application development.


About the author

James Kobielus, Wikibon, Lead Analyst Jim is Wikibon's Lead Analyst for Data Science, Deep Learning, and Application Development. Previously, Jim was IBM's data science evangelist. He managed IBM's thought leadership, social and influencer marketing programs targeted at developers of big data analytics, machine learning, and cognitive computing applications. Prior to his 5-year stint at IBM, Jim was an analyst at Forrester Research, Current Analysis, and the Burton Group. He is also a prolific blogger, a popular speaker, and a familiar face from his many appearances as an expert on theCUBE and at industry events.
