Big Data Trends in 2019

The accessibility of data has given rise to a new generation of technology and shifted the business focus toward data-driven decision making. Big Data Analytics is now an established part of gathering Business Intelligence. Many businesses, particularly those online, consider Big Data a mainstream practice, and these businesses are constantly researching new tools and models to improve their Big Data utilization.

In 2019, some tools and trends will be more popular than others. New Big Data concepts and technologies are constantly appearing on the market, and older technologies fade away, or get used in new ways. The continuous growth of the Internet of Things (IoT) has provided several new resources for Big Data. New technologies change not only how Business Intelligence is gathered, but how business is done.

Streaming the IoT for Machine Learning

There are currently efforts to use the Internet of Things (IoT) to combine Streaming Analytics and Machine Learning. In 2019, we can anticipate significant research on this theme, and possibly a startup or two marketing their services or software.

Typically, Machine Learning uses “stored” data for training, in a “controlled” learning environment. In this new model, streaming data from the Internet of Things provides useful information that supports Machine Learning in real time, in a less controlled environment. A primary goal in this process is to provide more flexible, more appropriate responses to a variety of situations, with a special focus on communicating with humans.

Changing from a training model that uses a controlled environment and limited training data to a much more open training system requires more complex algorithms. Machine Learning then trains the system to predict outcomes with reasonable accuracy. As the primary model adjusts and evolves, models at the edge or in the Cloud will coordinate to match the changes, as needed. Ted Dunning, the Chief Application Architect at MapR, said:

“We will see more and more businesses treat computation in terms of data flows rather than data that is just processed and landed in a database. These data flows capture key business events and mirror business structure. A unified data fabric will be the foundation for building these large-scale flow-based systems.”
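As a loose illustration of the flow-based pattern Dunning describes, the sketch below treats incoming IoT readings as a stream and updates a model incrementally, rather than training on a stored dataset. It is a minimal sketch under assumptions: the Kafka topic name, broker address, and message layout are hypothetical, and any streaming source with a similar interface would work.

```python
# Minimal sketch: incremental ("online") learning over an IoT stream.
# The topic name, broker address, and JSON message layout below are
# hypothetical placeholders, not from the article.
import json

import numpy as np
from kafka import KafkaConsumer              # pip install kafka-python
from sklearn.linear_model import SGDClassifier

consumer = KafkaConsumer(
    "iot-sensor-readings",                   # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

# SGDClassifier supports partial_fit(), so it can learn from each
# reading as it arrives instead of from a fixed, stored training set.
model = SGDClassifier()
classes = np.array([0, 1])                   # e.g., normal vs. anomalous

for message in consumer:
    reading = message.value                  # assumed: {"features": [...], "label": 0}
    X = np.array([reading["features"]])
    y = np.array([reading["label"]])
    model.partial_fit(X, y, classes=classes)
```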

AI Platforms

Big Data as a tool of discovery continues to evolve and mature, with some enterprises reaping significant rewards. A recent advancement is the use of AI (Artificial Intelligence) platforms, which will have a major impact over the next decade. Using AI platforms to process Big Data is a significant improvement in gathering Business Intelligence and improving efficiency. Anil Kaul, CEO and Co-Founder of Absolutdata, stated:

“We started an email campaign, which I think everybody uses Analytics for, but because we used AI, we created a 51 percent increase in sales. While Analytics can figure out who you should target, AI recommends and generates what campaigns should be run.”


AI platforms will gain in popularity in 2019. These platforms are frameworks designed to work more efficiently and effectively than traditional frameworks. When an AI platform is designed well, it provides faster, more efficient communication with Data Scientists and other staff. This can help reduce costs in several ways, such as by preventing the duplication of effort, automating basic tasks, and eliminating simple but time-consuming activities (copying, data processing, and constructing ideal customer profiles).

AI platforms will also provide Data Governance, making best practices available to Data Scientists and staff. The AI becomes a trusted advisor that helps ensure work is spread more evenly and completed more quickly. Artificial Intelligence platforms are arranged into five layers of logic:

  • The Data & Integration Layer gives access to the data. (Critical, as developers do not hand-code the rules. Instead, the rules are being “learned” by the AI.)
  • The Experimentation Layer lets Data Scientists develop, test, and prove their hypothesis.
  • The Operations & Deployment Layer supports model governance and deployment. This layer offers tools to manage the deployment of various “containerized” models and components.
  • The Intelligence Layer organizes and delivers intelligent services and supports the AI.
  • The Experience Layer is designed to interact with users through the use of technologies such as augmented reality, conversational UI, and gesture control.
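As a rough illustration of what the Operations & Deployment layer manages, here is a minimal sketch of a single “containerized” model component: a trained model wrapped in a small HTTP service that a platform could package into a container image. The endpoint path, port, and model file name are assumptions for illustration, not a specific platform’s API.

```python
# Minimal sketch of a "containerized" model component: a trained model
# behind a small HTTP endpoint. The /predict route, port, and model.pkl
# file are hypothetical; a real platform adds auth, logging, and scaling.
import pickle

from flask import Flask, jsonify, request   # pip install flask

app = Flask(__name__)

with open("model.pkl", "rb") as f:           # assumed: a pre-trained model
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    features = request.get_json()["features"]   # assumed request layout
    prediction = model.predict([features])[0]   # assumed numeric output
    return jsonify({"prediction": float(prediction)})

if __name__ == "__main__":
    # Inside a container this would run behind a production WSGI server,
    # not Flask's development server.
    app.run(host="0.0.0.0", port=8080)
```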

The Data Curator

In 2019, many organizations will find the position of Data Curator (DC) has become a necessity. The Data Curator’s role combines responsibility for managing the organization’s metadata with responsibility for Data Protection, Data Governance, and Data Quality. Data Curators not only manage and maintain data, but may also be involved in determining best practices for working with that data. Data Curators are often responsible for presentations, with the data shown visually in the form of dashboards, charts, or slideshows.

The Data Curator regularly interacts with researchers and also schedules educational workshops. The DC communicates with other curators to collaborate and coordinate, when appropriate. (Good communication skills are a plus.) Tomer Shiran, co-founder and CEO of Dremio, said:

“The Data Curator is responsible for understanding the types of analysis that need to be performed by different groups across the organization, what datasets are well suited for this work, and the steps involved in taking the data from its raw state to the shape and form needed for the job a data consumer will perform. The data curator uses systems such as self-service data platforms to accelerate the end-to-end process of providing data consumers access to essential datasets without making endless copies of data.”
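To make the curation workflow Shiran describes concrete, here is a hypothetical sketch of the kind of catalog entry a Data Curator might maintain for each dataset, folding metadata, Data Quality status, and governance information into one record. Every field name here is invented for illustration.

```python
# Hypothetical sketch of the metadata a Data Curator might track per
# dataset in a catalog. All field names are invented for illustration;
# real catalogs record far more.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DatasetEntry:
    name: str
    owner: str                      # accountable team or curator
    source: str                     # raw system of origin
    quality_checked: bool           # Data Quality sign-off
    allowed_uses: List[str]         # Data Governance: approved purposes
    tags: List[str] = field(default_factory=list)

catalog = [
    DatasetEntry(
        name="customer_orders_2018",
        owner="data-curation-team",
        source="orders_db",
        quality_checked=True,
        allowed_uses=["sales_reporting", "demand_forecasting"],
        tags=["orders", "curated"],
    )
]

# A data consumer can discover governed, analysis-ready datasets by tag,
# instead of making another copy of the raw data:
ready = [d.name for d in catalog if d.quality_checked and "curated" in d.tags]
print(ready)  # ['customer_orders_2018']
```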

Politics and GDPR

The European Union’s General Data Protection Regulation (GDPR) went into effect on May 25, 2018. While GDPR is focused on Europe, some organizations, in an effort to simplify their business and promote good customer relations, have stated they will provide the same privacy protections for all their customers, regardless of where they live. This approach, however, is not the general position taken by businesses and organizations outside of Europe. Many corporations have instead chosen to revamp their consent procedures and data handling processes, and to hire new staff, all in an effort to maximize the private data they “can” gather.

Businesses relying on “assumed consent” for all processing operations can no longer make this assumption when doing business with Europeans. Businesses have had to implement new procedures for notices and receiving consent, and many are currently trying to plan for what’s next, while simultaneously struggling with problems in the present.
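What “new procedures for notices and receiving consent” look like in software will vary, but at minimum they imply recording an explicit, purpose-specific, timestamped opt-in and checking it before any processing. The sketch below illustrates that idea only; the field names are assumptions, and this is an engineering sketch, not legal guidance.

```python
# Hypothetical sketch: purpose-specific, timestamped consent records,
# checked before any processing. Field names are illustrative only;
# this is an engineering sketch, not legal guidance.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Dict, Optional, Tuple

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str                          # e.g., "email_marketing"
    granted_at: datetime                  # explicit opt-in time
    withdrawn_at: Optional[datetime] = None

consents: Dict[Tuple[str, str], ConsentRecord] = {}

def record_consent(user_id: str, purpose: str) -> None:
    """Store an explicit, purpose-specific opt-in."""
    consents[(user_id, purpose)] = ConsentRecord(
        user_id, purpose, granted_at=datetime.now(timezone.utc)
    )

def has_consent(user_id: str, purpose: str) -> bool:
    """Processing is gated on this check, never on assumed consent."""
    record = consents.get((user_id, purpose))
    return record is not None and record.withdrawn_at is None

record_consent("user-42", "email_marketing")
assert has_consent("user-42", "email_marketing")
assert not has_consent("user-42", "analytics")   # no blanket consent
```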

Several organizations have assigned GDPR responsibilities to their Chief Security Officers (CSOs), making the CSO responsible for having these changes made. Though GDPR fines can be quite large (as high as 20 million Euros or four percent of annual global turnover, whichever is higher), many businesses, especially in the United States, are still not prepared.

In 2019, the U.S. government could make an effort to imitate the GDPR and hold businesses accountable for how they handle privacy and personal data. In the short term, it would make sense for online businesses to begin implementing new privacy policies, or simply to shift to a GDPR policy format. Making the shift now, and advertising it on the company’s website, has the potential to build a good relationship with the customer base.

5G Not Likely in 2019

Switching to a 5G (fifth generation) system is expensive and comes with some potential issues. While the expense may not stop 5G implementation in 2019, other problems might.

Though the U.S. Federal Government fully supports the implementation of a 5G system, some communities have passed ordinances halting the installation of 5G infrastructure. It seems likely this will become a standard tactic for blocking 5G systems.

An additional factor blocking 5G is a decision by the United States FCC, which eliminated the regulations supporting net neutrality. Net neutrality is the principle that internet providers should treat all data, and all users, equally, without discrimination and without charging different rates based on such things as speed, content, websites, platforms, or applications. It offered internet providers, and their users, a level playing field, and promoted competition.

Hybrid Clouds Will Gain in Popularity

Clouds and Hybrid Clouds have been steadily gaining in popularity and will continue to do so. While an organization may want to keep some data secure in its own data storage, the tools and benefits of a hybrid system make it worth the expense. Hybrid Clouds combine an organization’s private Cloud with the rental of a public Cloud, offering the advantages of both. Expect a significant increase in the use of Hybrid Clouds in 2019.

Generally speaking, the applications and data in a Hybrid Cloud can be transferred back and forth between on-premises (private) Clouds and IaaS (public) Clouds, providing more flexibility, deployment options, and tools. A public Cloud, for example, can be used for the high-volume, low-security projects, such as email advertisements, and the on-premises Cloud can be used for more sensitive projects, such as financial reports.
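The public/private split described above ultimately reduces to a routing decision: classify each workload by sensitivity, then choose the environment. A toy sketch of that decision follows, with the categories and rules invented purely for illustration.

```python
# Toy sketch: route workloads to the private or public side of a
# hybrid Cloud based on data sensitivity. The categories and rules
# here are invented for illustration.
SENSITIVE_CATEGORIES = {"financial_reports", "customer_pii", "payroll"}

def choose_cloud(workload_category: str) -> str:
    """Return which part of the hybrid Cloud should run a workload."""
    if workload_category in SENSITIVE_CATEGORIES:
        return "on-premises (private) cloud"
    return "public (IaaS) cloud"

print(choose_cloud("email_advertisements"))  # -> public (IaaS) cloud
print(choose_cloud("financial_reports"))     # -> on-premises (private) cloud
```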

“Cloud Bursting” is a feature of Hybrid Cloud systems: an application runs within the on-premises Cloud until there is a spike in demand (think Christmas shopping online, or filing taxes), at which point it “bursts” into the public Cloud and taps into additional resources.
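In practice, bursting behaves like an overflow rule: private capacity absorbs the base load, and anything above a threshold spills over to rented public capacity. A simplified sketch of that decision follows, with every number invented.

```python
# Simplified sketch of a cloud-bursting decision: private capacity
# handles the base load; anything above the threshold "bursts" to
# the public Cloud. All capacities and numbers are invented.
PRIVATE_CAPACITY = 1_000   # requests/sec the on-premises Cloud can absorb

def place_load(current_load: int) -> dict:
    """Split incoming load between private and public Clouds."""
    private = min(current_load, PRIVATE_CAPACITY)
    public = max(0, current_load - PRIVATE_CAPACITY)
    return {"private": private, "public_burst": public}

print(place_load(400))    # normal day: {'private': 400, 'public_burst': 0}
print(place_load(2_500))  # holiday spike: {'private': 1000, 'public_burst': 1500}
```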
