Cloud Architecture Trends in 2018

People are constantly coming up with new and intelligent ways to use the Cloud. As a consequence, Cloud Architecture designs are constantly being tweaked, adjusted, and improved upon. Today’s businesses need to be flexible, move quickly, and understand their customers’ desires and expectations. To do this, businesses are relying on the Cloud to provide private communication systems, data storage, and Big Data processing. As Cloud technologies evolve, organizations continue to find more and more uses for them.

Cloud Architecture describes the “organization of software systems” used to deliver Cloud Computing services. It normally involves multiple elements within the Cloud that communicate with one another through a loose coupling mechanism, such as a messaging queue. The architecture of a Cloud must be flexible enough to give a variety of systems access. Scalability and security are two important elements in the design.
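To make loose coupling concrete, here is a minimal sketch, using only Python’s standard library, of two Cloud elements exchanging work through a message queue rather than calling each other directly. The service names and message fields are illustrative assumptions, not part of any particular Cloud platform.

```python
# A minimal sketch of loose coupling through a message queue: the
# producer and consumer never call each other directly, only the queue.
# Standard library only; a real Cloud would use a managed queue service.
import queue
import threading

events = queue.Queue()  # the loose-coupling mechanism

def storage_service():
    # One Cloud element: emits an event without knowing who consumes it.
    events.put({"event": "file_uploaded", "name": "report.csv"})

def analytics_service():
    # Another element: reacts to events without knowing the sender.
    msg = events.get()  # blocks until a message arrives
    print("processing", msg["name"])

threading.Thread(target=storage_service).start()
threading.Thread(target=analytics_service).start()
```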

Cloud Architecture should not be confused with Data Architecture, which is following a similar evolution; rather, it is one part within the much larger topic of Data Architecture.

Hybrid Clouds

More and more organizations will adopt a combination of in-house data processing, private Cloud processing, public Cloud processing, and Software-as-a-Service (SaaS) to provide research results and Business Intelligence. While this approach offers a great deal of flexibility, it is generally not a streamlined system, and is often unnecessarily expensive and clumsy to manage. A recent DATAVERSITY® survey showed 47.7 percent of the organizations polled are using a hybrid system.

The architecture of Hybrid Clouds involves integrating on-premises resources with Cloud resources. For most businesses with on-premises technology, using the Cloud as a service requires operating within a hybrid architecture. One Cloud service may have an architecture that is a good match for the client’s, while others do not. Selecting a Cloud provider with a compatible system can minimize setup costs, without requiring new investments in equipment and software.

When investigating Cloud providers, it is also important to consider the integration of “information supplying applications” across both the Cloud and on-premises systems. Hybrid architecture should include the ability to integrate Big Data from the Internet of Things and from sensors at remote locations. Compatible systems eliminate the need to purchase new hardware to support the apps, and can streamline the process.

Machine Learning and Deep Learning

Deep Learning and Machine Learning are both based on algorithms. Machine Learning gives computers the capacity to learn from repetitive experience. Deep Learning takes the process further with the help of Graphics Processing Units (GPUs) and massive amounts of data, and is often used to train AI entities.
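As a minimal illustration of “learning from repetitive experience,” the sketch below fits a simple classifier to a handful of labeled examples and then predicts an unseen case. It assumes the scikit-learn library is available; the transaction data and the fraud-detection framing are purely illustrative.

```python
# A minimal sketch of Machine Learning: a model "learns" from repeated,
# labeled experience and generalizes to new cases.
# Assumes scikit-learn is installed; the data is illustrative.
from sklearn.linear_model import LogisticRegression

# Past experience: [transaction amount, foreign-transaction flag],
# labeled 1 for fraud, 0 for legitimate.
X = [[120.0, 1], [15.5, 0], [980.0, 1], [22.0, 0], [640.0, 1], [8.9, 0]]
y = [1, 0, 1, 0, 1, 0]

model = LogisticRegression()
model.fit(X, y)                        # learn from the examples
print(model.predict([[700.0, 1]]))     # classify a new, unseen transaction
```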

A number of business leaders believe Machine Learning will maximize the insights gained from Big Data and provide them with a competitive edge. There are also business leaders who are still uncertain about what Machine Learning is. Those who understand it, and are using it, have gained useful Business Intelligence. In just a few years, Machine Learning has gone from a lab experiment to a very useful research tool. A recent survey by Harvard Business Review Analytic Services showed 60 percent of the respondents believe their organization’s future success depends on Machine Learning. Many have started using Deep Learning for pattern recognition, workflow management, and predictive recommendations, and to detect fraud.

An unlikely partnership between Amazon Web Services and Microsoft Azure underscores the importance of Machine Learning and Deep Learning. The partnership produced Gluon, an open source Deep Learning library designed to automate specific processes, in turn making Machine Learning more streamlined. These two companies, along with IBM, Google, and other tech giants, see the great potential of Machine Learning within the Cloud.

Cloud Containers

Cloud containers (a version of application containers), and container management platforms, will gain in popularity in 2018 because they are useful, efficient, and functional. Dave Bartoletti, Vice President and Principal Analyst at Forrester, has suggested 10 percent of businesses are currently using containers in production, but up to a third have begun testing them. The term “application container” represents a new kind of technology providing consistency and design efficiency. Basically, containers are an alternative to virtual machines and hypervisors. They use storage resources, memory, and CPU more efficiently than virtual machines, making it possible to support more apps on the same infrastructure.

An application container uses a system called container-based virtualization. It is a “virtual” container designed for data and processing, and it insulates all of its internal elements from the environment of the physical computer. It uses the computer as a platform while acting independently.

This insulation prevents the container’s apps and the server from damaging each other due to a misstep. IT experts claim container-based virtualization supports a more efficient design by removing the need to set up infrastructure systems on the host computer: containers come with their own infrastructure.

Additionally, Cloud containers are very portable. A container can be deployed to a variety of different servers quite easily, and can be copied just as easily, allowing it to be used for testing, development, and integration without reconfiguring the host computer. (Note: Application containers are still new and do not always live up to the “ideal” of being compatible with all servers, so double-check a container’s limitations.)
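As a hedged sketch of how a container isolates an app while using the host only as a platform, the example below uses the Docker SDK for Python (installable with pip install docker) to run a short command inside a container. The image name and command are illustrative assumptions; any Docker-compatible host should behave similarly.

```python
# A minimal sketch using the Docker SDK for Python to run an app inside
# an isolated container. Assumes a local Docker daemon is running and
# the docker package is installed; image and command are illustrative.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# The container carries its own userspace ("infrastructure"), so the
# host needs nothing installed beyond Docker itself.
output = client.containers.run(
    "python:3.6-slim",                                   # portable image
    ["python", "-c", "print('hello from an isolated container')"],
    remove=True,                                         # clean up afterward
)
print(output.decode())
```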

Artificial Intelligence

Cloud providers have historically focused on offering infrastructure and software. Of late, however, the focus has shifted to offering an Intelligent Cloud. IBM’s Watson, which is available only on the IBM Cloud, has taken the lead in Artificial Intelligence (AI).

A major challenge in developing an AI entity is the amount of time and money involved in training it. GPU technology on the Cloud has been used to provide Deep Learning. IBM has recently announced it can reduce the training time of an AI entity by as much as 65 percent. This is considered a significant breakthrough, and a major step in reducing the time and money needed to train an AI entity.

Few organizations can afford to research Artificial Intelligence on their own. However, major Cloud vendors have invested heavily in AI research and development. Their goal is to develop a Cloud that works with the customer more quickly, efficiently, and intuitively than their competition’s. Access to a Cloud with Artificial Intelligence can provide:

  • Inexpensive AI Research: Cloud services with an integrated AI can provide organizations with the resources for AI research and development, with the client paying only for the time used. An unsuccessful pilot project can easily be shut down, without leaving behind expensive hardware that is no longer needed.
  • Ease of Use: Cloud vendors are constantly striving to make their system more “user-friendly,” and are using Artificial Intelligence to achieve this goal.
  • Access to the Latest Technology: Using AI Cloud services allows organizations to stay at the cutting edge of technology. Major Cloud vendors consistently offer new AI services as they compete with one another.

AI and Cloud Security

Expect Artificial Intelligence to be used more and more for security purposes, both inside and outside of the Cloud. Security is a significant consideration when working in the Cloud, and AI offers a way to combat cyber-attacks by recognizing threats and cutting down response time. In the Cloud, where businesses interact with multiple systems, there is an increased chance the Cloud’s service may become compromised, leading to cyber-fraud and attacks on an organization’s private computers. Security is an important factor when choosing a Cloud.

In October of 2017, Vipre Cloud announced its use of Artificial Intelligence to prevent endpoint attacks. The company is working to protect this vulnerable area with a security system that interlinks behavioral analytics, crowdsourced data collection, Machine Learning, and unified management. The Vipre Cloud is constantly updated with the latest cyber-attack information, and has a companion agent, installed locally, which keeps the endpoints secure.

Blockchain

Blockchain is a Distributed Ledger Technology (DLT) invented to support Bitcoin and other cryptocurrencies. It can store large amounts of data by using a network of computers instead of a single, localized server. Blockchain doesn’t offer direct security by itself, but security apps can be built on top of it. ABI Research states Blockchain offers three basic features which support security:

  • Immutability: Data cannot be altered after it has been created (see the sketch after this list).
  • Transparency: Everyone can see what’s taking place.
  • Autonomy: It is self-governing.
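To show what immutability means in practice, here is a minimal sketch, using only Python’s standard library, of a hash-chained ledger: each block records the hash of its predecessor, so rewriting old data breaks every later link. The payment strings are illustrative, and a real Blockchain adds consensus across its network of computers.

```python
# A minimal sketch of Blockchain-style immutability: each block stores
# the hash of the previous block, making tampering detectable.
import hashlib
import json

def block_hash(block):
    # Hash a block's canonical JSON form.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

# Build a tiny chain: a genesis block plus two payments.
chain = [{"prev": "0" * 64, "data": "genesis"}]
for payment in ["Alice pays Bob 10", "Bob pays Carol 4"]:
    chain.append({"prev": block_hash(chain[-1]), "data": payment})

def verify(chain):
    # Every block must point at the true hash of its predecessor.
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

print(verify(chain))                      # True: the ledger is intact
chain[1]["data"] = "Alice pays Bob 1000"  # attempt to rewrite history
print(verify(chain))                      # False: tampering is detected
```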

The technology, theoretically, will work with almost any kind of online transaction involving money, goods, or property. It has great potential for online financial transactions, ranging from tax collection to allowing migrants to send money to family in other countries, where banking can be difficult.

In October 2017, IBM created a new Blockchain payment platform that can speed up cross-border payments. Several international banks worked together on the project, including National Australia Bank and Bank Danamon. Later in October, the central banks of Hong Kong and Singapore agreed to collaborate on a cross-border network using Blockchain technology.

Data Virtualization

Data Virtualization describes a method of Data Management that allows an application to retrieve and manipulate data without requiring the technical details of the data – for example, where it is located or how it is formatted. It provides both integrated and abstracted data, in real time, from a variety of sources, to multiple users. Data Virtualization models are described as easy to understand, easy to build, and easy to maintain.

The Data Virtualization process involves integrating data from disparate sources. The primary goal of Data Virtualization technology is to provide access to data from a variety of sources through a single point, allowing applications to access the data without needing to know its exact location.
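The sketch below illustrates the single-access-point idea: a small Python view that integrates two disparate sources (a CSV file and a SQLite database) behind one interface, so callers never see locations or formats. The source names, field names, and class are illustrative assumptions, not any vendor’s API.

```python
# A minimal sketch of Data Virtualization: one access point abstracts
# away where data lives and how it is formatted. Standard library only;
# the sources and field names are illustrative.
import csv
import sqlite3

class VirtualCustomerView:
    """A single point of access over two disparate data sources."""

    def __init__(self, csv_path, sqlite_path):
        self.csv_path = csv_path                # e.g., an on-premises export
        self.db = sqlite3.connect(sqlite_path)  # e.g., a Cloud replica

    def customers(self):
        # Integrate both sources into one uniform record shape.
        with open(self.csv_path, newline="") as f:
            for row in csv.DictReader(f):
                yield {"name": row["name"], "source": "csv"}
        for (name,) in self.db.execute("SELECT name FROM customers"):
            yield {"name": name, "source": "sqlite"}

# Callers work only with the view, never with paths, SQL, or formats:
# for record in VirtualCustomerView("crm.csv", "sales.db").customers():
#     print(record)
```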

Data Virtualization has recently been adapted to the Cloud and will be used increasingly in 2018.

 
