
Insight

mechanical systems. When data is self-aware, it can be tagged so it controls who sees what parts of it and when, without additional time-consuming and potentially error-prone human intervention to subdivide, approve and disseminate the valuable data.
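
To make the idea concrete, the sketch below (in Python, with tag names and roles that are illustrative assumptions rather than any product's schema) shows how metadata tags attached to individual fields of a record could decide which parts of the data each role is allowed to see, with no manual subdividing or approval step in between.

```python
# Minimal sketch: field-level access control driven by tags attached to the data.
# The tag names ("public", "finance", "pii") and roles are illustrative assumptions.

record = {
    "device_id":   {"value": "sensor-042",      "tags": {"public"}},
    "temperature": {"value": 21.7,              "tags": {"public"}},
    "unit_cost":   {"value": 3.15,              "tags": {"finance"}},
    "owner_email": {"value": "ops@example.com", "tags": {"pii"}},
}

# Which tags each role is cleared to read.
role_clearance = {
    "analyst": {"public"},
    "finance": {"public", "finance"},
    "auditor": {"public", "finance", "pii"},
}

def visible_fields(record, role):
    """Return only the fields whose tags are all covered by the role's clearance."""
    clearance = role_clearance.get(role, set())
    return {
        name: field["value"]
        for name, field in record.items()
        if field["tags"] <= clearance
    }

print(visible_fields(record, "analyst"))   # only the two "public" fields
print(visible_fields(record, "auditor"))   # all four fields
```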

2. Virtual machines become “rideshare” machines

It will be faster, cheaper and more convenient to manage increasingly distributed data using virtual machines, provisioned on webscale infrastructure, than it will be on real machines. This can be thought of in terms of buying a car versus leasing one or using a rideshare service like Uber or Lyft. If you are someone who hauls heavy loads every day, it would make sense for you to buy a truck. However, someone else may only need a certain kind of vehicle for a set period of time, making it more practical to lease. And then, there are those who only need a vehicle to get them from point A to point B, one time only: the type of vehicle doesn’t matter, just speed and convenience, so a rideshare service is the best option.

This same thinking applies in the context of virtual versus physical machine instances. Custom hardware can be expensive, but for consistent, intensive workloads, it might make more sense to invest in the physical infrastructure. A virtual machine instance in the cloud supporting variable workloads would be like leasing: users can access the virtual machine without owning it or needing to know any details about it. And, at the end of the “lease,” it’s gone. Virtual machines provisioned on webscale infrastructure (that is, serverless computing) are like the rideshare service of computing, where the user simply specifies the task that needs to be done. They leave the rest of the details for the cloud provider to sort out, making it more convenient and easier to use than traditional models for certain types of workloads.
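
As a rough illustration of the “rideshare” end of that spectrum, a serverless function reduces the user’s contribution to the task itself; provisioning, scaling and tearing down the underlying machine are left to the provider. The handler below follows the familiar AWS Lambda calling convention, but the event fields are assumptions invented for this example.

```python
# Illustrative serverless handler (AWS Lambda style): the code describes only the task;
# the provider decides where and on what machine it runs, and for how long.
import json

def lambda_handler(event, context):
    # 'event' carries the work to be done; the field names here are assumptions
    # for illustration, not a fixed schema.
    numbers = event.get("numbers", [])
    result = {
        "count": len(numbers),
        "total": sum(numbers),
        "mean": (sum(numbers) / len(numbers)) if numbers else None,
    }
    # No machine to lease or return: when the function finishes, the capacity is simply gone.
    return {"statusCode": 200, "body": json.dumps(result)}
```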

3. Data will grow faster than the ability to transport it...and that’s OK!

It’s no secret that data has become incredibly dynamic and is being generated at an unprecedented rate that will greatly exceed the ability to transport it. However, instead of moving the data, the applications and resources needed to process it will be moved to the data, and that has implications for new architectures like edge, core, and cloud. In the future, the amount of data ingested in the core will always be less than the amount generated at the edge, but this won’t happen by accident. It must be enabled very deliberately to ensure that the right data is being retained for later decision making.

For example, autonomous car manufacturers are adding sensors that will generate so much data that there's no network fast enough between the car and data centers to move it. Historically, devices at the edge haven’t created a lot of data, but now with sensors in everything from cars to thermostats to wearables, edge data is growing so fast it will exceed the capacity of the network connections to the core. Autonomous cars and other edge devices require real-time analysis at the edge in order to make critical in-the-moment decisions. As a result, we will move the applications to the data.
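
A minimal sketch of what “moving the application to the data” can look like in practice: raw readings are analyzed where they are generated, and only a compact summary, plus the anomalies worth keeping, crosses the network to the core. The threshold and field names here are assumptions for illustration only.

```python
# Sketch: analyse raw sensor readings at the edge and forward only a compact summary
# (plus anomalies worth retaining) to the core. Threshold and field names are assumed.
from statistics import mean

ANOMALY_THRESHOLD = 90.0  # assumed domain-specific limit

def process_at_edge(raw_readings):
    """Reduce a window of raw readings to what the core actually needs to retain."""
    anomalies = [r for r in raw_readings if r["value"] > ANOMALY_THRESHOLD]
    summary = {
        "count": len(raw_readings),
        "mean_value": mean(r["value"] for r in raw_readings),
        "anomalies": anomalies,  # keep the "right data" for later decision making
    }
    return summary               # this is all that crosses the network to the core

raw = [{"sensor": "lidar-1", "value": v} for v in (42.0, 43.5, 97.2, 41.8)]
print(process_at_edge(raw))
# Four raw readings in, one small summary out: edge data can grow faster than the
# link to the core because most of it never needs to travel.
```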

4. Evolving from “Big Data” to “Huge Data” will demand new solid-state-driven architectures

As the demand to analyze enormous sets of data ever more rapidly increases, we need to move the data closer to the compute resource. Persistent memory is what will allow ultra-low latency computing without data loss, and these latency demands will finally force software architectures to change and create new data-driven opportunities for businesses. Flash technology has been a hot topic in the industry; however, the software being run on it didn’t really change, it just got faster.

This is being driven by the evolution of IT’s role in an organization. In the past, IT’s primary function would have been to automate and optimize processes like ordering, billing, accounts receivable and others. Today, IT is integral to enriching customer relationships by offering always-on services, mobile apps and rich web experiences. The next step will be to monetize the data being collected through various sensors and devices to create new business opportunities, and it’s this step that will require new application architectures supported by technology like persistent memory.
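
The programming model that persistent memory enables can be sketched roughly as follows. Real deployments would map a DAX-enabled persistent-memory device and typically use a library such as PMDK; the Python below substitutes an ordinary memory-mapped file purely to illustrate the idea of updating durable state in place, byte-addressably, rather than serializing it through a storage stack.

```python
# Sketch of the persistent-memory programming model, using an ordinary memory-mapped
# file as a stand-in (real deployments would map a pmem device via DAX and use a
# library such as PMDK; this only illustrates the idea).
import mmap, os, struct

PATH, SIZE = "counter.pmem", 8   # assumed file name; one 64-bit counter

# Create the backing region once.
if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        f.write(b"\x00" * SIZE)

with open(PATH, "r+b") as f:
    region = mmap.mmap(f.fileno(), SIZE)
    (count,) = struct.unpack("<Q", region[:8])   # read state directly from the mapping
    count += 1
    region[:8] = struct.pack("<Q", count)        # update in place, no serialize/write path
    region.flush()                               # make the update durable
    print("counter is now", count)               # value survives process restarts
    region.close()
```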

5. Emergence of decentralized immutable mechanisms for managing data

Mechanisms to manage data in a trustworthy, immutable and truly distributed way (meaning no central authority) will emerge and have a profound impact on the datacenter. Blockchain is a prime example of this.

Decentralized mechanisms like blockchain challenge the traditional sense of data protection and management. Because there is no central point of control, such as a centralized server, it is impossible to change or delete information contained on a blockchain, and all transactions are irreversible. Current datacenters and applications operate like commercially managed farms, with a central point of control (the farmer) managing the surrounding environment. The decentralized immutable mechanisms for managing data will offer microservices that the data can use to perform necessary functions. The microservices and data will work cooperatively, without overall centrally managed control.
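
The immutability argument can be seen in a toy hash chain: because each block commits to the hash of the one before it, altering any earlier record breaks every later link and is immediately detectable. The sketch below leaves out consensus and networking entirely; it is only meant to show why changing history on such a structure is impractical.

```python
# Toy hash chain: each block commits to the hash of the previous block, so editing
# any earlier record breaks every later link. No consensus layer; illustration only.
import hashlib, json

def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, data):
    prev_hash = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev_hash})

def verify(chain):
    """True only if every block still points at the unmodified block before it."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
for tx in ("alice->bob:5", "bob->carol:2", "carol->dave:1"):
    append_block(chain, tx)

print(verify(chain))                   # True
chain[0]["data"] = "alice->bob:500"    # tamper with an old "transaction"
print(verify(chain))                   # False: the change is detectable downstream
```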

– Mark Bregman, CTO, NetApp, outlines 5 key CTO predictions for 2018.

