
STORAGE MAGAZINE

November/December 2023 - Vol 23, Issue 6

The UK’s number one in IT Storage

INDUSTRY ROUNDTABLE: Is AI reliant on Flash?
D.R. STRATEGIES: Application recovery planning
DATA PROTECTION: Ransomware fears rising
ARCHIVING: The resurgence of tape

COMMENT - RESEARCH - INTERVIEWS - CASE STUDIES - OPINIONS - PRODUCT REVIEWS


Hybrid storage architecture, optimized for SSD cache acceleration
The TS-hx87XU-RP rackmount NAS series features unique 6 + x drive bay architectures*, with 6 2.5-inch SSD bays and 12/16/24 HDD bays.

High-speed 10GbE & 2.5GbE connectivity for bandwidth-demanding tasks
Dual 10GBASE-T and dual 2.5GbE high-speed connectivity enhances virtualization, intensive file access, and large backup/restoration tasks.

High-speed PCIe Gen 4 slots for additional NAS functionality
PCIe expandability allows for installing 10/25GbE adapters, QM2 cards, or Fibre Channel cards to increase application performance.

All-in-one, license-free business backup solution
QNAP’s backup solution covers most business systems, including PC/Mac, VMs, containers, popular cloud services, SaaS, and file servers.

* The TS-h987XU-RP NAS has a different drive bay architecture compared to the other three models.

www.qnap.com
Copyright © 2022 QNAP Systems, Inc. All rights reserved.


STORAGE MAGAZINE
The UK’s number one in IT Storage

CONTENTS
November/December 2023 - Vol 23, Issue 6

COMMENT.......................................................................4
Turning the ransomware tide

OPINION: STORAGE STRATEGIES.....................................6
Dene Lewis, CTO at CAE Technology Services Ltd, looks at how and why data storage has evolved, what is being done to innovate and solve the potential issues modern storage presents - as well as what the future may hold

RESEARCH: DATA PROTECTION.......................................8
A new Veeam survey has revealed business leaders' growing anxiety about dealing with ransomware and the psychological, human and financial damage the attacks are causing

CASE STUDY: MR DATENTECHNIK.................................10
Leading German IT provider MR Datentechnik has launched a new data storage service, built on Quantum ActiveScale object storage

ANALYSIS: DISK BASED BACKUP...................................12
Jerome M. Wendt of DCIG argues that cyber security and resilience are effectively redefining today's 100-plus disk-based backup target models

OPINION: CYBER RESILIENCE........................................14
If you haven't already, make sure you put enterprise storage cyber resilience and recovery on your to-do list, argues JT Lewis, Director of Channels EMEA and APJ at Infinidat

MANAGEMENT: DATA ARCHITECTURE..........................16
Matt Peachey, Vice President, International at Dremio, argues that open is the smart way forward for data management

STRATEGY: TAPE STORAGE............................................18
Matt Ninesling, Senior Director Tape Portfolio, Spectra Logic, shares three current use cases that are driving organisations to take a second look at tape technology

CASE STUDY: HOSTED.NL..............................................20
Dutch IT firm Hosted.nl is 'breaking the boundaries of traditional hosting infrastructure' with high-performance managed storage clusters running on StorPool and VMware

RESEARCH: RANSOMWARE...........................................22
Nearly 60% of companies are 'very' to 'extremely' concerned about ransomware attacks, according to new research from Hornetsecurity

ROUNDTABLE: FLASH/AI...............................................24
Storage magazine gathered a selection of industry experts to discuss how flash storage and AI have impacted each other, and how flash prices are set to change

ANALYSIS: CONTENT GROWTH....................................28
Tom Dunning, CEO of Ad Signal, looks at the potential environmental impact of relentless content growth

MANAGEMENT: DATA MIGRATION...............................30
Kevin Wild, head of presales at Syniti, explains how to manage complex data migration in the real world

STRATEGY: DISASTER RECOVERY.................................32
Sam Woodcock, Senior Director of Cloud Strategy, 11:11 Systems, explains the importance of architecting a DR plan for application recovery in the cloud

RESEARCH: CYBER-ATTACKS.......................................34
To mitigate ransomware attacks, IT professionals must consider both business-related and infrastructure data equally, suggests new research

www.storagemagazine.co.uk  @STMagAndAwards  Nov/Dec 2023  STORAGE MAGAZINE  03



COMMENT

EDITOR: David Tyler
david.tyler@btc.co.uk
SUB EDITOR: Mark Lyward
mark.lyward@btc.co.uk
REVIEWS: Dave Mitchell
PUBLISHER: John Jageurs
john.jageurs@btc.co.uk
LAYOUT/DESIGN: Ian Collis
ian.collis@btc.co.uk
SALES/COMMERCIAL ENQUIRIES:
Lucy Gambazza
lucy.gambazza@btc.co.uk
Stuart Leigh
stuart.leigh@btc.co.uk
MANAGING DIRECTOR: John Jageurs
john.jageurs@btc.co.uk
DISTRIBUTION/SUBSCRIPTIONS:
Christina Willis
christina.willis@btc.co.uk

PUBLISHED BY: Barrow & Thompkins Connexions Ltd. (BTC)
35 Station Square, Petts Wood, Kent BR5 1LZ, UK
Tel: +44 (0)1689 616 000
Fax: +44 (0)1689 82 66 22

SUBSCRIPTIONS:
UK: £35/year, £60/two years, £80/three years;
Europe: £48/year, £85/two years, £127/three years;
Rest of World: £62/year, £115/two years, £168/three years.
Single copies can be bought for £8.50 (includes postage & packaging).
Published 6 times a year.

No part of this magazine may be reproduced without prior consent, in writing, from the publisher.
© Copyright 2023 Barrow & Thompkins Connexions Ltd

Articles published reflect the opinions of the authors and are not necessarily those of the publisher or of BTC employees. While every reasonable effort is made to ensure that the contents of articles, editorial and advertising are accurate, no responsibility can be accepted by the publisher or BTC for errors, misrepresentations or any resulting effects.

TURNING THE RANSOMWARE TIDE
BY DAVID TYLER, EDITOR

Welcome to the November/December issue of Storage magazine, which features not one, not two, but three surveys into ransomware/cyberattacks, and how prepared - or otherwise - organisations are for what increasingly feels like a 'not if, but when' scenario for most of us these days.

In a piece from Zerto about ransomware preparedness, we learn that only one in seven businesses is able to fully recover its data following an attack - a statistic that should really have all of us rushing off to check just how ready our business systems are!

The most recent high-profile news story on the subject featured the British Library, which appears in the last few weeks to have lost vast amounts of HR data including passport information - information that could be of value to ID fraudsters, for example. "Given the high frequency of ransomware attacks and the impacts of successful ones such as data and infrastructure loss, many organisations are left with damages that have an effect well beyond IT," comments Christophe Bertrand, practice director at ESG.

Elsewhere, a recent Veeam survey suggests that ransomware is now seen by most UK businesses as a greater concern than either the current state of the economy or the effects of Brexit. The consequences of any kind of ransomware attack can be far wider than many might imagine: according to Veeam, 20% of businesses considered dissolving within a year of an attack, 32% reported that their staff worked longer hours, and 42% of respondents said they experienced greater than normal customer losses.

Given the significant financial damage caused by ransomware, it's clear why some businesses simply don't make it through. As well as the cost of the ransom itself - if indeed it is paid - companies lost an average of 35% of their annual turnover in the three months following an attack, and 39% lost over 40%. 28% experienced a revenue-hitting drop in productivity.

So, what are organisations doing to prepare for and combat these threats? Our third piece on the subject, courtesy of Hornetsecurity, gives some indications: their annual survey suggests that well over 90% of respondents rank ransomware protection as 'very' to 'extremely' important in terms of IT priorities for their organisation, and over 85% confirmed they have a disaster recovery plan in place for a ransomware attack.

There is some reassurance as well in their finding that the number of ransomware victims actually appears to have gone down slightly in 2023. It can only be hoped that this is the beginning of a turning of the tide, as organisations become more vigilant in their data protection.



DON’T GET YOUR SaaS KICKED!

TAKE CONTROL NOW AND PROTECT YOUR SaaS DATA

Global SaaS vendors like Microsoft, Google and Salesforce don’t assume any responsibility for your data hosted in their applications. So, it’s up to you to take control and fully protect your SaaS data from cyber threats or accidental loss. Arcserve SaaS Backup offers complete protection for your SaaS data, eliminating business interruptions due to unrecoverable data loss.

Arcserve SaaS Backup
Complete protection for all your SaaS data.
arcserve.com
The unified data resilience platform



OPINION: STORAGE STRATEGIES

WHAT IS THE FUTURE OF DATA STORAGE?

DENE LEWIS, CTO AT CAE TECHNOLOGY SERVICES LTD, LOOKS AT HOW AND WHY DATA STORAGE HAS EVOLVED, WHAT IS BEING DONE TO INNOVATE AND SOLVE THE POTENTIAL ISSUES MODERN STORAGE PRESENTS - AS WELL AS WHAT THE FUTURE MAY HOLD


The data storage industry has evolved significantly over the last 30 years, but there has been a real step change within the last decade. Not only has the amount of data, especially unstructured data, increased, but cloud-based storage has prompted companies - and individuals - to adopt a 'store everything' mindset rather than take the time to consider the true value of that information.

While cloud access to data has been a critical driver for business innovation, many data storage strategies have not kept pace with the diverse needs of digital transformation. How many businesses can confidently point to their five-year strategy and demonstrate not only an ability to scale to meet data growth but also the key availability, performance, sustainability and security features?

UNCAPPED GROWTH

The data we produce and utilise has grown at a phenomenal pace - by 2025 we will be storing 160 zettabytes a year, much of it in the cloud. The challenge for businesses - and the IT industry - is that the growth in data volumes, especially unstructured data, is not gradual. Each technology innovation, in high-resolution imagery, for example, can double storage demands overnight.

While the millions, possibly even billions, of personal photographs and videos stored and posted to multiple social media locations are, without a doubt, a significant contributing factor, businesses are also struggling with data growth that has fundamentally outstripped expectations.

Just consider the storage implications when an NHS Trust upgrades its CT scanner software, improving the quality and hence the size of high-resolution images two, three, or even four-fold in one moment. Or the rapid evolution of IoT, which is allowing businesses to generate increasingly high levels of valuable data. Organisations are investigating the use of AI to optimise business processes, which creates an additional data source that ultimately will support business growth, but it will need to be stored - and that storage costs money as well as having an environmental impact.

Every aspect of stakeholder engagement and interaction now demands increasing data volumes - all of which need to be stored multiple times to ensure business continuity and support disaster recovery plans. The evolution of storage technology means large quantities of data can be stored in smaller footprints at the edge, the data centre, or in the cloud - supporting distributed data growth.

But is this financially, or environmentally, sustainable? Unfortunately, there is no clear-cut 'yes' or 'no' answer. Why? Because, as much as the individual components of data storage have become super-efficient, the reality is that the power consumption required has exponentially increased - leaving some real considerations for organisations looking to optimise or create a data storage strategy.

TIME TO TAKE RESPONSIBILITY

The IT industry continues to innovate and address the rising data storage challenge. Storage density improvement has seen the price per gigabyte for storage reduce in recent years, dropping by over 80% from 2009 to 2022, allowing businesses, in theory, to scale up data storage and positively impact the bottom line.

The storage market has moved on from the old days of installing rack upon rack of mechanical spinning disks, which, aside from being low on capacity, gave IT teams a






constant headache of ensuring optimum performance. Those days are behind us, but there are now different solutions available that bring their own financial, operational, and sustainability challenges to overcome.

It is important to note that storing data in the cloud can be perceived as easy and relatively pain-free to manage, but shifting or migrating data between cloud services, or from on-premises to the cloud, is complex. Organisations dealing with high volumes of data at scale need to consider where they want to put the data and make the most appropriate and well-informed decisions about their data storage strategy up-front, rather than getting deep into a project and then making the choice.

Although cost in financial terms is always considered, it may not always be fully understood by businesses comparing different options for hybrid cloud solutions. And, increasingly, important costs such as environmental and sustainability factors are rarely considered by businesses at the planning stage.

MUST WE STORE EVERYTHING?

So, is it viable to retain our 'store everything' approach, when data centres are estimated to be responsible for up to three per cent of global electricity consumption today, and projected to touch four per cent by 2030? And if so, what is the sustainability compromise?

Businesses face ever more serious demands regarding sustainable operations and reporting - as well as stakeholder expectations. The volume of data stored is not an issue that can be influenced by regulators, as it will differ from business to business. Governments globally, however, are keen to achieve ESG goals.

I firmly believe that the industry will continue to innovate and build with sustainability at the core of its objectives: to build sustainable ways of storing data and sustainable storage device technology, from hard drives to storage arrays and data services that maximise efficiency and enable every business to reduce the carbon footprint per terabyte of storage.

STRATEGIC PLANNING - THE BALANCING ACT

This is not an issue for the IT industry alone. The responsibility also sits with businesses to put in place robust data management and storage strategies. However fast the data grows, what are the implications of having a reactive approach to investment in cloud storage without considering the business needs for cost, performance, security and sustainability?

Given the scale of data growth, businesses must plan ahead, taking a multi-year approach to data storage. A comprehensive five-year forward view should encompass both the intended strategy and structural framework, along with a well-defined plan. It should take into account not just the anticipated data expansion and its influencing factors but also utilise this data to evaluate strategic alternatives for data storage that align with the organisation's objectives.

Put simply - how and where should data be stored? And for what reason?

THINKING LATERALLY

Data storage cannot be considered in isolation from the rest of the IT and wider business strategy. We need to be thinking strategically and holistically about data storage - not just picking it off as a tactical issue in an isolated area of the business. It must be considered as a broad strategic approach that spans edge, data centres, the cloud and everything in between.

Creating the right data strategy is far more nuanced than simply spinning up another cloud storage subscription and, at every step, organisations should be considering the sustainability implications of storage decisions.

Of course, there are difficult questions ahead as the demands for data change again. Generative AI and Machine Learning algorithms will require access to petabytes of data to accurately train AI models, which will impact financial and sustainability factors.

Ultimately, data growth has significantly increased and will continue to do so. What happens when everybody gets online? There is set to be another massive step change in data volumes and data usage. The question is: how is the world, as both individuals and businesses, going to respond?

More info: www.thisiscae.com




RESEARCH: RANSOMWARE

RANSOMWARE A BIGGER CONCERN THAN THE ECONOMY OR BREXIT FOR UK BUSINESS

A NEW VEEAM SURVEY HAS REVEALED BUSINESS LEADERS' GROWING ANXIETY ABOUT DEALING WITH RANSOMWARE AND THE PSYCHOLOGICAL, HUMAN AND FINANCIAL DAMAGE THE ATTACKS ARE CAUSING

Surging cyberattacks have significantly elevated UK business leaders' anxiety about ransomware, with 43% ranking it as more of a concern than all other critical macroeconomic and business challenges, a new local survey from Censuswide, commissioned by Veeam Software, has revealed.

Earlier this year, the Veeam Data Protection Trends Report 2023 found that 85% of global businesses surveyed suffered at least one attack last year, so it's not surprising that UK business leaders rate ransomware as a more significant threat to their organisation than the economic crisis (41%), skills shortages (34%), political uncertainty (31%), and Brexit (30%). However, the psychological and human impact that this is having on business leaders and their employees is far more deep-rooted than previously understood.

RISK OF BUSINESS COLLAPSE

Censuswide surveyed 100 directors of UK companies with over 500 employees who had suffered a ransomware attack in the past 18 months. The study unearthed several concerning results: 61% are anxious about the prospect of another ransomware attack. This might well be explained by the fact that 71% agree that their business would collapse if it suffered another attack, and 56% believe another incident would force the organisation to make redundancies. In fact, 77% of organisations reduced staff numbers after the last attack and over 50% were unable to make new hires due to paying a ransom.

The survey also uncovered the severe toll cybercrime places on people's wellbeing: 54% of respondents said they experienced a decline in their overall health, while 26% left the role they were in altogether. Troublingly, it's not just security and IT staff that are affected: 71% of respondents believe that ransomware attacks critically disrupted most departments in the company. The marketing team is seen as bearing the






largest brunt, with 82%, while other business units not usually associated with dealing with the aftermath, such as operations (73%), production/R&D (73%) and HR (70%), were also adversely influenced.

The undesirable consequences people are experiencing may be partly attributed to the effect ransomware attacks can have on their careers and livelihoods. According to the survey, 20% of businesses considered dissolving within a year of an attack, 32% reported that their staff worked longer hours, and 42% of respondents said they experienced greater than normal customer losses.

Given the significant financial damage caused by ransomware, it's clear why some businesses don't make it. As well as the cost of the ransom itself - if paid - companies lost an average of 35% of their annual turnover in the three months following an attack, and 39% lost over 40%. 28% experienced a revenue-hitting drop in productivity.

SKILLS ISSUES

At the same time, businesses are battling the ongoing skills shortage and challenging economic conditions, making the effects of ransomware even greater. In the wake of an incident, 56% said they had increased hiring costs, nearly half (49%) experienced increased customer complaints, and 47% reported team stress.

Dan Middleton, Regional Vice President UK&I at Veeam, said about the findings: "It's understandable that ransomware is a leading cause of stress for business leaders and their employees, especially as it's now a case of 'how often' rather than 'if' or 'when' cyber-attacks will strike. With cybercriminals constantly evolving the pursuit of their next victim, businesses must do all they can to reduce ransomware's human and economic consequences by protecting and backing up their data to ensure rapid recovery after an attack. This will not only keep businesses running as usual in the face of the very real threat of ransomware but will also considerably alleviate the ripple effects it can have on people and businesses."

The findings highlight the urgent need for businesses to build up cyber resilience. Fortunately, companies are taking steps to tackle the ransomware threat head-on: 43% of those surveyed implemented a backup and recovery strategy after experiencing an attack, and 37% optimised their backup and recovery strategy, showing how backup is increasingly viewed as the best line of defence.

More info: www.veeam.com

09



CASE STUDY: MR DATENTECHNIK

SIMPLE, SCALABLE, RESILIENT

LEADING GERMAN IT PROVIDER MR DATENTECHNIK HAS LAUNCHED A NEW DATA STORAGE SERVICE, BUILT ON QUANTUM ACTIVESCALE OBJECT STORAGE

Headquartered in the German state of Bavaria, MR Datentechnik offers a full range of IT solutions and managed services. Organisations engage the company for everything from infrastructure deployment and systems integration to digitisation initiatives and fully outsourced IT management.

Recently, the leadership team at MR Datentechnik decided to launch a new storage service to support customers' needs to preserve and protect fast-growing data volumes. The service, designed for online storage of object data, could be used for backup and recovery, archiving, and data security. This online service would enable organisations to retrieve data rapidly - anytime, from anywhere.

Creating an S3-compatible service was a top priority. The MR Datentechnik team wanted to support S3 applications and workflows and facilitate integration with S3 cloud storage environments.

For the service's launch, the MR Datentechnik team decided to focus first on the backup use case. As a result, the underlying storage platform for the service had to integrate seamlessly with the latest version of Veeam Backup & Replication.

BUILDING A NEW STORAGE SERVICE

In designing the new service, the MR Datentechnik team explored numerous storage solutions, including solutions from several leading public cloud providers. Ultimately, the team selected Quantum ActiveScale object storage.

Quantum ActiveScale is S3 compatible, which allows MR Datentechnik's clients to use the storage service with S3-enabled apps and workflows. For example, using this service based on ActiveScale, organisations can easily replicate data to S3 cloud storage environments.

"Quantum ActiveScale provides the reliable, highly scalable S3-compatible object storage we needed for building our new storage service," says Jochen Kraus, Managing Director, MR Datentechnik. "The platform is stable even under high loads, and it offers sophisticated software that is extremely useful for our multi-tenant environment."
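To illustrate what S3 compatibility means in practice: an S3-enabled application only needs to be pointed at the service's endpoint rather than at AWS itself. The sketch below builds the connection settings such an application (for example, via boto3's `client(**cfg)`) would use; the endpoint URL and credentials shown are invented placeholders, not MR Datentechnik's actual values.

```python
def s3_client_config(endpoint_url: str, access_key: str, secret_key: str) -> dict:
    """Connection settings an S3-enabled app would use to talk to an
    S3-compatible object store such as ActiveScale. With boto3, these
    would be passed as boto3.client(**cfg)."""
    return {
        "service_name": "s3",
        "endpoint_url": endpoint_url,          # the provider's endpoint, not AWS
        "aws_access_key_id": access_key,       # credentials issued by the provider
        "aws_secret_access_key": secret_key,
    }

# Hypothetical values for illustration only:
cfg = s3_client_config("https://s3.example-provider.de", "ACCESSKEY", "SECRETKEY")
```

Because only the endpoint and credentials change, the same S3 calls (put_object, get_object, bucket replication and so on) work unmodified against the service, which is what makes existing S3 workflows portable to it.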

SEAMLESS VEEAM INTEGRATION<br />

Quantum ActiveScale provides tight integration<br />

with Veeam Backup & Replication - the<br />

industry-leading backup, recovery, and data<br />

security solution for on-premises and cloud<br />

workloads. ActiveScale is part of a<br />

comprehensive portfolio of Quantum Veeam<br />

Ready products, which also includes Quantum<br />

flash storage, backup appliances, and tape<br />

storage solutions.<br />

ActiveScale is fully qualified as a Veeam Ready Object Repository with Immutability. This designation reflects ActiveScale's ability not only to act as a backup repository, but also to provide outstanding ransomware protection with immutable Veeam backups based on ActiveScale's object locking feature. ActiveScale also now supports integrated space reporting using the Veeam Smart Object Storage API (SOSAPI) for simple monitoring of Veeam storage usage.
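In S3 terms, immutability is exposed through object locking: a backup written with a retention date cannot be overwritten or deleted until that date passes. The toy store below is not the ActiveScale implementation - just a stdlib sketch of the semantics an immutable Veeam repository relies on:

```python
import datetime

class ObjectLockStore:
    """A toy object store with S3-style object lock semantics.

    Illustrative only - real S3-compatible stores enforce this server-side,
    so not even an administrator with stolen credentials can bypass it.
    """

    def __init__(self, clock=datetime.datetime.utcnow):
        self._objects = {}   # key -> (data, retain_until)
        self._clock = clock  # injectable clock, for deterministic testing

    def put(self, key, data, retain_days):
        if key in self._objects and self._clock() < self._objects[key][1]:
            raise PermissionError(f"{key} is locked and cannot be overwritten")
        retain_until = self._clock() + datetime.timedelta(days=retain_days)
        self._objects[key] = (data, retain_until)

    def delete(self, key):
        if self._clock() < self._objects[key][1]:
            raise PermissionError(f"{key} is locked and cannot be deleted")
        del self._objects[key]

    def get(self, key):
        return self._objects[key][0]

# A backup written with a 30-day lock survives deletion attempts.
now = datetime.datetime(2023, 12, 1)
store = ObjectLockStore(clock=lambda: now)
store.put("veeam/backup-001.vbk", b"<backup data>", retain_days=30)
try:
    store.delete("veeam/backup-001.vbk")   # blocked while the lock holds
except PermissionError as exc:
    print(exc)
now = datetime.datetime(2024, 1, 15)       # after retention expires
store.delete("veeam/backup-001.vbk")       # now permitted
```

Ransomware that reaches the repository can attempt deletion, but while the retention period holds the store refuses - which is precisely what makes the backup a trustworthy recovery point.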

With ActiveScale as the foundation for their new storage service, the MR Datentechnik team - and its clients - can take advantage of the powerful new capabilities of Veeam Backup & Replication version 12. For example, Veeam v12 introduces direct-to-object storage capabilities, enabling ActiveScale to act as a standalone repository as well as a primary backup repository in the Veeam Performance tier. This eliminates the need to copy production data to an intermediary system before moving it to ActiveScale. This capability helps MR Datentechnik offer a faster, simpler, and more cost-effective service to customers.

Also, with Veeam v12, MR Datentechnik's service provides an ideal target for backing up NAS devices and file shares. Veeam v12 optimises performance with metadata caching, thus reducing overall data transfer requirements when backing up to the MR Datentechnik storage cloud.

ACHIEVING GROWTH TARGETS EARLY
Building on ActiveScale has helped MR Datentechnik to drive swift growth for the new storage service. "Thanks to ActiveScale's ability to automate the onboarding process, we can onboard customers in under a day," says Kraus. "With that fast onboarding, we've been able to add customers to this service much more quickly than we anticipated. We have reached our two-year growth targets in less than a year."

In addition to protecting customer environments, MR Datentechnik's internal teams are also benefiting from the new service by employing it as a backup target for their own internal IT systems and other managed services infrastructure.

As MR Datentechnik continues to add customers, the company can scale its ActiveScale-based service easily and without disruption. ActiveScale enables MR Datentechnik to scale compute, networking, and storage resources to support billions of objects and exabytes of capacity for its own disaster recovery and backup needs, as well as for its S3 service offerings to customers. The platform's Dynamic Data Placement feature automatically optimises the placement of objects across resources, eliminating the need for manual rebalancing.

Capacity-on-demand software licensing and seamless, non-disruptive expansion give MR Datentechnik the flexibility to accommodate the company's precise needs, incrementally, without large-scale overhauls. "The simple scalability of ActiveScale means we can continue to onboard new customers and provide those customers with a rapidly expandable service," says Kraus.

MAXIMISED RESILIENCY FOR ALWAYS-ON SERVICE
Managed service providers need to build their offerings on reliable, resilient platforms to reduce any possibility of customer downtime. ActiveScale's rolling system upgrade capability and erasure-coded data durability help MR Datentechnik deliver an always-on service that can tolerate component and site failures without jeopardising availability.
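Erasure coding spreads each object across many drives or sites together with mathematically derived parity, so lost shards can be rebuilt from the survivors. A single-parity XOR scheme - far simpler than the wide erasure codes production object stores actually use, but the same principle - can be sketched as:

```python
def encode(shards):
    """Append one XOR parity shard to a list of equal-length data shards."""
    parity = bytes(len(shards[0]))
    for shard in shards:
        parity = bytes(a ^ b for a, b in zip(parity, shard))
    return shards + [parity]

def reconstruct(shards, lost_index):
    """Rebuild a single missing shard by XOR-ing all surviving shards."""
    survivors = [s for i, s in enumerate(shards) if i != lost_index]
    rebuilt = bytes(len(survivors[0]))
    for shard in survivors:
        rebuilt = bytes(a ^ b for a, b in zip(rebuilt, shard))
    return rebuilt

data = [b"obj-part-1", b"obj-part-2", b"obj-part-3"]
stored = encode(data)              # 3 data shards + 1 parity shard
rebuilt = reconstruct(stored, 1)   # simulate losing shard 1 (a failed drive)
print(rebuilt == data[1])          # True: the lost shard is recovered
```

Real deployments use schemes that tolerate multiple simultaneous failures (for example, many data shards plus several parity shards spread across sites), which is what allows upgrades and component faults to pass without downtime.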

CONTROLLING COMPLEXITY
To provide a fully managed service as economically as possible, MR Datentechnik needed a storage platform that could minimise administrative complexity. The ActiveScale web-based interface helps simplify management of accounts, users, access keys, health monitoring, capacity, performance, and more.

In addition, the platform's DevOps features enable easy integration. "ActiveScale offers extensive options for connecting to our existing management infrastructure - including through RESTful APIs, a command-line interface, SSH, and Prometheus-based metrics, monitoring, and alerting," says Kraus.
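Prometheus-based monitoring means the platform exposes metrics in the standard text exposition format, which any scraper or script can consume. A minimal sketch of parsing such output follows - the metric names are invented for illustration, not ActiveScale's actual metric names:

```python
def parse_prometheus_text(payload):
    """Parse a Prometheus text-format exposition into {metric: value}.

    Handles the simple case only: no labels, '#' comment lines ignored.
    """
    metrics = {}
    for line in payload.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip HELP/TYPE comments and blank lines
        name, value = line.split()[:2]
        metrics[name] = float(value)
    return metrics

# Hypothetical scrape output from an object store's /metrics endpoint.
sample = """\
# HELP objectstore_used_bytes Bytes of capacity in use.
# TYPE objectstore_used_bytes gauge
objectstore_used_bytes 7.5e+14
objectstore_s3_requests_total 1284392
"""
metrics = parse_prometheus_text(sample)
used_pb = metrics["objectstore_used_bytes"] / 1e15
print(f"Capacity used: {used_pb:.2f} PB")  # feeds dashboards and alert rules
```

In a real deployment a Prometheus server scrapes the endpoint directly and alerting rules fire on thresholds; the point is that the format is open and needs no vendor-specific tooling.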

For backup services, the MR Datentechnik team can take a fully integrated approach to administration. "With the close integration of ActiveScale and Veeam, we can handle every aspect of the managed backup service efficiently - from implementation to onboarding to ongoing management of customer environments," says Kraus.

FLEXIBILITY FOR ADDITIONAL USE CASES
MR Datentechnik continues to onboard new customers who are using the storage service for backup with Veeam Backup & Replication. At the same time, Kraus envisions additional use cases: "With S3 compatibility, we can expand the storage service to other workflows. By building on Quantum ActiveScale, we have a very flexible platform for meeting a broad spectrum of customer needs."

More info: www.quantum.com


ANALYSIS: DISK-BASED BACKUP

MOVING TARGETS
JEROME M. WENDT OF DCIG ARGUES THAT CYBER SECURITY AND RESILIENCE ARE EFFECTIVELY REDEFINING TODAY'S 100-PLUS DISK-BASED BACKUP TARGET MODELS

Until quite recently, organisations largely defined disk-based backup targets by three characteristics: how fast they backed up data, their data deduplication ratios, and the data protection protocols they offered and/or supported. While these attributes still matter, organisations increasingly prioritise new cyber security and resilience features when acquiring disk-based backup targets.

RANSOMWARE PUTS FOCUS ON BACKUP
Ransomware, perhaps more so than any other factor, has cast a spotlight on organisational backup processes. Backups no longer represent a mundane task that organisations must complete. Now organisations need confirmation of successful backups, that ransomware does not compromise them, and that they can recover them quickly.

Further, these new requirements will ideally accompany the deduplication, performance, and data protection protocol features already available on backup targets. The challenge becomes how to deliver both these established and new features in a manner that meets today's organisational expectations.

Providers have certainly made advancements in delivering more cyber security and resilience features to satisfy these desires. However, no offering yet checks all the boxes for delivering every established and new feature that organisations may want.

This has resulted in a generation of disk-based backup products that offer a mixed bag of features. Almost all disk-based backup targets, regardless of the provider, will ensure fast, successful backups. However, organisations will then need to determine which features, established or new, they want to prioritise in their disk-based backup target.

Offerings from established providers will better support deduplication and data protection protocols, while offerings from new providers tend to better meet new demands for fast restores at scale and resilience. Cyber security remains in a state of flux, with providers at various stages of implementing different aspects of it.

INITIAL INSIGHTS
As DCIG researches and prepares to release one or more 'Top 5' reports on disk-based backup targets in 2024, here are some initial insights it has into these offerings.

Over 100 disk-based backup target models: Organisations, and large enterprises especially, may view disk-based backup targets as being primarily available from Dell, ExaGrid, HPE, Quantum, and Veritas. However, the new demands for cyber security and resilience have spawned a wave of innovation. Both emerging and established primary storage providers have refocused their storage solutions to optimise them for backup. Enterprise storage providers such as Huawei, NetApp, and Pure Storage now specifically optimise their solutions for backup. This has also prompted new entrants into this space, including Arcserve, Infinidat, Infortrend, iXsystems, Nexsan, Nimbus Data, RackTop Systems, and VAST Data. Each of these has at least one model and, in some cases, over a dozen models. This total does not even include the growing number of software-defined storage providers who now play in this space.

Implementations of cyber security features vary widely: Almost every disk-based backup target provider says its product includes cyber security features. That largely holds true. However, each product's cyber security features may not match the needs of your organisation.

For instance, most if not all products now offer data immutability. However, on at least one product, a backup it hosts does not become immutable until an hour after the backup completes. Other providers permit administrators to log into the product's management console and change or delete immutable backups. This represents only one example of how providers implement cyber security features differently on their respective solutions.
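The practical consequence of such differences is a window of exposure: if a target only seals a backup some time after the job completes, the backup remains deletable - by ransomware or a compromised administrator account - during that gap. A small sketch of the comparison (the one-hour delay mirrors the example above; actual vendor behaviours vary and should be verified per product):

```python
from datetime import datetime, timedelta

def exposure_window(backup_completed, lock_delay):
    """Return (start, end) of the period during which an 'immutable'
    backup can still be altered or deleted, given the target's lock delay."""
    return backup_completed, backup_completed + lock_delay

completed = datetime(2023, 12, 1, 2, 0)  # a nightly job finishing at 02:00

# Hypothetical target A seals backups immediately; target B seals an hour later.
for name, delay in [("target-A", timedelta(0)),
                    ("target-B", timedelta(hours=1))]:
    start, end = exposure_window(completed, delay)
    minutes = (end - start).total_seconds() / 60
    print(f"{name}: mutable for {minutes:.0f} minutes after backup completion")
```

Evaluating this window per product, rather than taking "immutability" at face value, is one way to turn the vendor claims above into a comparable measurement.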

Cyber security ≠ resilience: Organisations may equate a backup target's cyber security features with resilience. While overlap exists between cyber security and resilience, one does not automatically encompass the other. For instance, cyber security does not automatically equate to fast restores or a highly available backup target. If anything, a cyber secure disk-based backup target may even delay restores and recoveries. Rather, fast restores and high availability fall more under the classification of resilience. This becomes problematic if organisations view those two features as prerequisites in their environment. In cases like this, they may need to make a trade-off between cyber security and resilience when selecting a backup target.

EMERGING PRIORITIES
As these three initial insights from DCIG's research into today's disk-based backup targets reveal, their features have evolved significantly. Notably, organisations no longer prioritise data deduplication ratios in the same way they once did.

While these factors still matter, cyber security and resilience now emerge as new priorities in disk-based backups. This has led to multiple new entrants - both existing and new players - looking for a foothold in this space. Further, some have experienced success because they better deliver on the new cyber security and resilience features that more organisations prioritise.

More info: www.dcig.com

OPINION: CYBER RESILIENCE

KEY TAKEAWAYS FOR CYBER RESILIENCE
IF YOU HAVEN'T ALREADY, MAKE SURE YOU PUT ENTERPRISE STORAGE CYBER RESILIENCE AND RECOVERY ON YOUR TO-DO LIST, ARGUES JT LEWIS, DIRECTOR OF CHANNELS EMEA AND APJ AT INFINIDAT

Pressures in the C-Suite have been rising for some time, due to escalating fears about what cybercriminals can do to threaten business operations. The 2023 Voice of the CISO report highlights this, showing that expectations of cyberattack risks among CISOs have risen to 68%, compared with 48% a year ago. Over 60% believe their organisations are insufficiently prepared to cope with a targeted attack, compared with just 50% a year ago.

They are not alone in worrying about cyber vulnerabilities. CEOs also rank cyber threats as one of their top worries, and recent research suggests they do not feel confident when it comes to mitigating the threat of a cyber security incident either. This was a key finding of a recent CEO study conducted by Saïd Business School at the University of Oxford. Three quarters of the leaders surveyed admitted to feeling uncomfortable making decisions about cyber threats and security.

One area that is consistently overlooked when planning a cybersecurity strategy is the resilience and recovery of enterprise data storage systems. Infinidat has dedicated years of research and product development to a proven portfolio of technology solutions that help detect cyber attacks and, if you do suffer an attack, deliver near-instantaneous recovery. But before investigating these solutions, every member of the C-Suite should appreciate some key facts about why ensuring storage systems are cyber resilient is so important.

#1 NOT IF, BUT WHEN…
Every 39 seconds, an organisation somewhere in the world suffers a cyber attack. You are right to be concerned. The question is not if your enterprise will suffer a cyber attack, but when and how often. And if penetration of the firewall is a given, it is also highly likely that the primary and secondary data stored by an enterprise will be compromised at some point too.

#2 CYBER CRIMINALS ARE PATIENT BEASTS
When cyber attackers target an enterprise, they don't immediately pounce, but wait for a while before demanding a ransom. Sometimes they will have planned their eventual move for over six months. Research conducted by the Ponemon Institute verifies this, suggesting that the average number of days before a data breach is identified can be as high as 287.

It means the hackers have a much greater chance of their ransomware demands being met, because without the right controls in place, stored data can be fully compromised. In that timeframe, data could have been exposed to all kinds of criminal activity.

#3 PREVENTION IS ALWAYS BETTER THAN CURE
Data is one of the most important strategic assets for an enterprise - McKinsey has coined the phrase 'data-driven enterprise' to describe how data is behind every decision, interaction and process. If effective cyber security is about being ready to thwart the problems that arise from a security breach, enterprises should be trying a different approach to protecting their data: one that involves thinking beyond the traditional toolkits of firewalls or cyber management software and being ready with an antidote to stop damage from spreading.

TIME TO PROTECT DATA, BUT HOW?
When it comes to securing an enterprise's data storage, there are some essentials to building a storage cyber defence strategy. These include ensuring the immutable nature of the data, recovered from a copy you can trust; air-gapping to separate the management and data planes to protect the data; and a secure forensic environment in which to analyse the data thoroughly and ensure the fastest recovery speeds possible.

Immutable snapshots are like the vital 'secret sauce' of storage cybersecurity. They allow the end user to effectively roll back the clock and recover guaranteed, uncorrupted copies of their data from before the execution of any malware or ransomware code introduced by an attacker. Immutable snapshots ensure data integrity because they prevent data copies from being altered or deleted by anyone. Even internal systems administrators are locked out of manipulating immutable snapshots. It means that the enterprise can be confident that any disruption or damage caused by the intrusion can be minimised.

Logical air gapping adds a further layer of security by separating the storage management and data planes of the immutable snapshots. There are three types of air gapping: local air gapping keeps the data on premises, remote air gapping makes use of a remotely hosted system, and hybrid air gapping combines the two.

Fenced forensic environments help speed up the recovery process by providing a secure area in which to perform post-attack forensic analysis of the immutable snapshots. The purpose here is to carefully curate data candidates and find a known good copy. The last thing an enterprise wants to do after an attack is to start restoring infected data that has malware or ransomware within it. Once the forensic analysis is complete, it is safe to restore a copy to primary storage systems.
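Conceptually, the forensic step is a search backwards through the snapshot chain for the newest copy that still verifies as clean. A stdlib sketch of that curation logic follows - checksum verification stands in for the far richer malware analysis a real fenced environment performs:

```python
import hashlib

def take_snapshot(snapshots, data):
    """Record a snapshot along with a checksum of its contents."""
    snapshots.append({"data": data,
                      "sha256": hashlib.sha256(data).hexdigest()})

def last_known_good(snapshots):
    """Scan newest-to-oldest for a snapshot whose checksum still verifies."""
    for snap in reversed(snapshots):
        if hashlib.sha256(snap["data"]).hexdigest() == snap["sha256"]:
            return snap  # a candidate safe to restore to primary storage
    return None

snapshots = []
take_snapshot(snapshots, b"payroll v1")
take_snapshot(snapshots, b"payroll v2")

# Simulate an attacker corrupting the newest snapshot's contents.
# (Possible only in this toy - genuinely immutable snapshots prevent it.)
snapshots[-1]["data"] = b"ENCRYPTED-BY-RANSOMWARE"

good = last_known_good(snapshots)
print(good["data"])  # b'payroll v1' - the copy safe to restore
```

The value of the fenced environment is that this candidate selection happens in isolation, so nothing infected touches production systems while the search is under way.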

The right cyber storage resilience solution is part of a "set it and forget it" process. Once the immutable snapshots, logical air gapping, fenced forensic environment and cyber attack recovery processes have been established, the whole restoration will progress like clockwork. This is all part of being an agile enterprise - one that's cyber resilient as well as cyber secure. Significantly, very few enterprise storage vendors can offer this level of cyber resiliency on both primary and secondary data.

Data is an important strategic asset, critical to long-term business success, and yet many enterprises lack a fully integrated cyber storage resilience programme. Storage must be regarded as an essential part of effective cyber resilience, so make sure it's on your to-do list!

More info: www.infinidat.com

MANAGEMENT: DATA ARCHITECTURE

EMBRACING OPEN DATA ARCHITECTURE
MATT PEACHEY, VICE PRESIDENT, INTERNATIONAL AT DREMIO, ARGUES THAT OPEN IS THE SMART WAY FORWARD FOR DATA MANAGEMENT

For the past few decades, data has been propelling business operations. Whether an organisation offers tangible goods or intangible services, crucial information about partners, workforce, processes, and clients forms the backbone of a company's wellbeing. At the heart of any computing system is data storage, so the selection of the appropriate solution to store it will significantly impact how efficiently an organisation's network and accompanying infrastructure cater to business requirements.

The primary expectation of a data storage system is to safely keep valuable data while allowing users and applications to retrieve it seamlessly and swiftly when required. However, with the volume of data growing exponentially and rarely being deleted, businesses have simply kept adding more storage capacity.

The issue deepens where data warehouse vendors store data in a proprietary format. Data gets locked into the platform, making it difficult and costly to extract if and when a business wants to. Further, maintaining and troubleshooting issues often requires teams with deep subject matter expertise in the ecosystem - an expensive outlay.

Given the multitude of data storage alternatives and system setups, organisations can get dragged down the rabbit hole whilst adding more data to their systems - a very inefficient approach. Organisations must embrace open-source standards, technologies and formats to ensure fast and cost-effective analytics with the best engine for each workload. This provides the agility to innovate with the next wave of technology without draining resources or time.

EVOLUTION OF DATA ARCHITECTURE
Previously, companies depended on conventional databases or warehouses for their Business Intelligence (BI) demands. However, these systems presented certain difficulties. The typical data warehouse setup requires investing in expensive on-premises hardware, maintaining structured data in proprietary formats, and depending on a centralised IT and data department for analysis. Other obstacles included technical interoperability, system orchestration and, more critically, scalability.

However, things changed in 2006 with the launch of Hadoop, built on the MapReduce paradigm and capable of parallel processing of enormous data sets over large clusters of commoditised hardware. This framework facilitated handling vast datasets distributed over computer clusters, making it immensely appealing for businesses accumulating more data with each passing day. Still, databases like Teradata and Oracle encapsulated storage, computation, and data within a single, interconnected system, offering no separation of compute and storage components.

Between 2015 and 2020, however, the widespread usage of the public cloud altered this approach, enabling the separation of compute and storage. Cloud data vendors like AWS and Snowflake facilitated this separation in cloud warehouses, enhancing scalability and efficiency. Nevertheless, data still had to be ingested, loaded, and duplicated into a single proprietary system, which was attached to a solitary query engine. Employing multiple databases or data warehouses necessitated the storage of multiple data copies. Moreover, companies were still charged for transferring their data into and out of the proprietary system, which resulted in excessive costs.

Enter more contemporary and open data architecture, where data exists as an independent layer, with a clear division between data and compute. Data is stored in open-source file and table formats and accessed by decoupled, elastic compute engines. Consequently, different engines can access the same data in a loosely tied architecture. In these architectures, data is stored as its own independent tier in open formats within the company's cloud account and made accessible to downstream consumers through various services.
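The decoupling can be illustrated in miniature. Below, one dataset is written once in an open, self-describing format (newline-delimited JSON standing in for columnar formats such as Parquet or Iceberg tables), and two independent "engines" query the same data with no copies and no proprietary loading step:

```python
import io
import json

# One copy of the data, in an open format any engine can read.
# (NDJSON stands in here for open table/file formats like Parquet/Iceberg.)
open_data = io.StringIO()
for row in [
    {"region": "EMEA", "revenue": 120},
    {"region": "APAC", "revenue": 80},
    {"region": "EMEA", "revenue": 50},
]:
    open_data.write(json.dumps(row) + "\n")

def engine_sum_by_region(source):
    """'Engine' one: an aggregation query over the shared dataset."""
    totals = {}
    for line in source.getvalue().splitlines():
        row = json.loads(line)
        totals[row["region"]] = totals.get(row["region"], 0) + row["revenue"]
    return totals

def engine_filter(source, min_revenue):
    """'Engine' two: a different tool reading the same bytes directly."""
    return [json.loads(line) for line in source.getvalue().splitlines()
            if json.loads(line)["revenue"] >= min_revenue]

print(engine_sum_by_region(open_data))  # {'EMEA': 170, 'APAC': 80}
print(engine_filter(open_data, 100))    # [{'region': 'EMEA', 'revenue': 120}]
```

Swap in a SQL engine, a machine learning library, or tomorrow's tool and the data stays put - that is the substance of the "loosely tied" architecture described above.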

This transformation parallels the shift in applications from monolithic architectures to microservices. A comparable transition is presently occurring in data analytics, with companies migrating from proprietary data warehouses and ceaseless ETL (Extract, Transform, Load) processes to open data architectures like cloud data lakes and lakehouses.

SEPARATING COMPUTE AND STORAGE FOR EFFICIENCY
Over the years, there have been many discussions within the industry around the detachment of compute from storage, primarily due to its contribution to enhancing efficiency, which resulted in several advantages.

Firstly, the reduction in raw storage costs was so significant that they practically disappeared from IT budget spreadsheets. Secondly, compute costs became segregated, leading to customers paying only for what they utilised during data processing, which lowered overall expenses. Lastly, the independent scalability of both storage and compute facilitated on-demand, elastic resource precision, adding flexibility to architecture designs.

However, these changes took time to materialise. Expensive Storage Area Networks (SANs) and less costly but often complex Network Attached Storage (NAS) systems have existed for quite a while. Both storage models were limited due to administrative and procurement overheads. Mass adoption of separating compute and storage only became feasible with public cloud computing.

Decoupling compute and storage in public clouds is more straightforward to administer and relatively inexpensive. Besides, these compute and storage cloud services are virtually unlimited in scalability, eliminating legacy hardware procurement issues. They also offer supreme levels of availability and performance. Therefore, the separation of compute from data brings forth three immediate benefits:

- A significant reduction in complicated and expensive data copies and movements, as the data warehouse as the sole source of truth is replaced by accessing data in open formats in the data lake, eliminating data silos.

- Open data standards and formats provide universal data access from infinite services and applications, creating the freedom to pick the best solutions.

- An open architecture ensures that future cloud services can directly access the data, avoiding going through a data warehouse vendor's proprietary format or moving/copying data from the data warehouse.

THE OPPORTUNITIES OF OPEN ARCHITECTURE
Cloud data warehouse providers enticed firms with the allure of scalability and cost-efficiency that was unsustainable with on-premises solutions. However, after uploading their data into the warehouse, organisations were restricted entirely to the vendor's ecosystem or denied access to other promising technologies that could extract more value from their data.

Open architecture is a significant advantage of the cloud data lake/lakehouse over the data warehouse. As a result, organisations are reassessing their strategies to use an open architecture that promotes flexibility and re-establishes ownership of their data. This shift signifies three things:

The flexibility to utilise various superior<br />

services and engines on the company's<br />

data. This allows the use of diverse<br />

technologies like superior SQL, Databricks<br />

or any other data-processing tool. Given<br />

that companies have numerous use cases<br />

and requirements, utilising the best-suited<br />

tool yields higher productivity - especially for<br />

data teams - and lower cloud costs. It's also<br />

important to remember that no single<br />

vendor can offer all the processing<br />

capabilities a company requires.<br />

Not being confined to one vendor. Platform<br />

changes become profoundly challenging<br />

when dealing with a data warehouse<br />

holding up to a million tables and hundreds<br />

of complex ingestion pipelines.<br />

Comparatively, if an organisation uses a<br />

particular SQL engine on its cloud data lake today<br />

and a new tool emerges tomorrow, it's<br />

possible to query the existing data with the<br />

new system without migrating it.<br />

The ability to benefit from future<br />

technological advancements. Avoiding<br />

becoming locked-in is crucial, as it keeps<br />

vendors from exploiting a company<br />

financially. But more significant is the<br />

capacity to adopt and benefit from<br />

emerging technology, even if the current<br />

vendor remains favourable. If a superior<br />

machine learning service or a better batch<br />

processing engine is invented, organisations<br />

can have peace of mind that they can use<br />

the tool freely.<br />
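The second point above - query the same data with a new engine, no migration - can be sketched in miniature. In this illustrative Python example, JSON Lines stands in for open table formats such as Parquet or Iceberg, and the two functions stand in for independent query engines; none of the names come from the article.<br />

```python
import io
import json

# One dataset in an open, self-describing format (JSON Lines here,
# standing in for Parquet/Iceberg in a real cloud data lake).
data = io.StringIO(
    '{"id": 1, "region": "EU"}\n'
    '{"id": 2, "region": "US"}\n'
)

def engine_a_row_count(f):
    """Today's engine: counts the rows."""
    return sum(1 for _ in f)

def engine_b_regions(f):
    """Tomorrow's engine: reads the very same bytes, with no migration step."""
    return {json.loads(line)["region"] for line in f}

data.seek(0)
rows = engine_a_row_count(data)      # 2
data.seek(0)
regions = engine_b_regions(data)     # {"EU", "US"}
```

Because neither "engine" owns the data, swapping one for the other changes nothing about the stored bytes - which is the lock-in argument in a nutshell.<br />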

Application architectures have demonstrated<br />

that a service-oriented approach allows<br />

maximum scale, flexibility, and agility. While<br />

separating compute and storage marked an<br />

essential first step in reducing analytic costs, it<br />

doesn't offer the kind of benefits visible in<br />

modern application architectures. However, by<br />

disengaging compute from data, the benefits of<br />

application design can now be used for data<br />

analytics, especially given the critical<br />

importance of data for all businesses.<br />

As a result, open data architecture brings forth<br />

many benefits, from flexibility, independence,<br />

and future-proofing to creating new avenues<br />

for gaining valuable business insights. In the<br />

rapidly evolving digital era, embracing open<br />

data architectures is more than a strategic<br />

choice; it's a decisive move towards a more<br />

flexible, scalable, and insightful future.<br />

More info: www.dremio.com<br />

www.storagemagazine.co.uk<br />

@<strong>ST</strong>MagAndAwards <strong>Nov</strong>/<strong>Dec</strong> <strong>2023</strong><br />

<strong>ST</strong>ORAGE<br />

MAGAZINE<br />

17



<strong>ST</strong>RATEGY: TAPE <strong>ST</strong>ORAGE<br />

THE RESURGENCE OF TAPE<br />

MATT NINESLING, SENIOR DIRECTOR TAPE PORTFOLIO, SPECTRA LOGIC, SHARES THREE CURRENT USE<br />

CASES THAT ARE DRIVING ORGANISATIONS TO TAKE A SECOND LOOK AT TAPE TECHNOLOGY<br />


As every aspect of our world becomes<br />

more digitised, data storage has been<br />

elevated from an "afterthought" to a<br />

primary enabler of business, research, social<br />

and financial progress. And as the role of data<br />

storage changes, the technology behind<br />

storage has advanced significantly. Nowhere is<br />

this more evident than in the advancement of<br />

tape technology.<br />

The inherent characteristics of tape - density,<br />

affordability, removability - have always made<br />

tape a top consideration for archiving,<br />

compliance and data protection goals. When<br />

combining these attributes with new<br />

technology, such as S3-compatible object<br />

storage, tape becomes a game changer in<br />

addressing the critical challenges facing<br />

modern businesses and organisations today.<br />

Too often, the discussion has focused on<br />

"either/or" scenarios for storage - disk or tape,<br />

on-premises or cloud. By front-ending disk,<br />

tape and cloud with an object interface, the<br />

conversation can now focus on the attributes<br />

required of storage for any particular job. And<br />

that's driving organisations to take a second<br />

look at tape technology.<br />

ARCHIVES: THE VISION FOR AI WILL<br />

REQUIRE TAPE ACCESS<br />

AI and associated machine learning models<br />

require massive amounts of data in order to<br />

"train" and provide improvements in everything<br />

from research algorithms to line manufacturing<br />

to self-driving cars.<br />

Depending on the level of automation, self-driving<br />

cars will generate between 1.4 and 19<br />

terabytes (TB) of data per hour, as<br />

presented by autonomous driving technology<br />
systems architect Stephan Heinrich at the 2017 Flash Memory<br />

Summit. A single car could produce anywhere<br />

between 380 TB and 5.1 petabytes (PB) of data<br />

in a single year. This is just one example of the<br />

massive amounts of data created by and used<br />

by AI.<br />
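A quick back-of-the-envelope check of those figures, in Python. The roughly 270 annual driving hours is our inference from the quoted per-hour and per-year numbers, not a figure from the presentation itself:<br />

```python
# Data generated per hour of autonomous driving (Heinrich, FMS 2017).
TB_PER_HOUR_LOW, TB_PER_HOUR_HIGH = 1.4, 19.0

# Assumed annual driving time implied by the quoted yearly totals
# (380 TB / 1.4 TB per hour and 5,100 TB / 19 TB per hour both land near 270).
HOURS_PER_YEAR = 270

low_tb = TB_PER_HOUR_LOW * HOURS_PER_YEAR     # ~378 TB per car per year
high_tb = TB_PER_HOUR_HIGH * HOURS_PER_YEAR   # ~5,130 TB, i.e. ~5.1 PB
```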

Because AI is driving virtually every aspect of<br />

business, research, and development, multi-petabyte<br />

archives are becoming organisational<br />

standards. Recent lawsuits over the use of<br />

copyrighted materials for the training of AI<br />

models, as well as defamation litigation<br />

responding to false information generated by<br />

AI chatbots, only highlight the need for this<br />

training data to be preserved for the long term.<br />

These archives must be accessible and<br />

searchable. The introduction of S3-compatible<br />

object-based tape makes today's tape<br />

technology the ideal building block for such<br />

archives. Object-based tape is highly scalable,<br />

searchable and can even be tagged for future<br />

retrieval.<br />
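The tag-then-retrieve workflow that an S3-compatible interface enables can be sketched with a toy in-memory stand-in. A real deployment would issue S3 object-tagging requests against the tape system's endpoint; the object keys and tag names here are purely illustrative.<br />

```python
# Toy in-memory stand-in for an S3-compatible object store fronting tape.
archive = {}  # object key -> {tag name: tag value}

def put_with_tags(key, tags):
    """Archive an object together with searchable metadata tags."""
    archive[key] = dict(tags)

def find_by_tag(name, value):
    """Return keys of archived objects whose tag `name` equals `value`."""
    return [k for k, tags in archive.items() if tags.get(name) == value]

put_with_tags("ai/run-0042/images.tar", {"model": "lane-detect", "year": "2023"})
put_with_tags("ai/run-0043/lidar.tar", {"model": "lane-detect", "year": "2024"})

hits = find_by_tag("year", "2023")   # -> ["ai/run-0042/images.tar"]
```

The point is that the tags live with the archive, so training data written today can still be located years later without knowing which cartridge it landed on.<br />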

Combined with new developments in tape<br />

density, which enable a native storage capacity<br />

of 50TB on a single cartridge, tape not only<br />

maintains its cost-competitive edge against<br />

other storage approaches such as disk and<br />

cloud, it is the unequivocal dominant leader in<br />

affordability.<br />

Additionally, modern tape offerings provide<br />

much greater data integrity and reliability,<br />

incorporating error correction codes and<br />

automated data integrity verification checks to<br />

minimise the risk of data degradation over<br />

time. In the event of catastrophic data loss or<br />

corruption, having archived AI training data on<br />

tape provides a reliable means of data<br />

recovery as the tape is stored offline,<br />

presumably securely, and is less susceptible to<br />

accidental deletions. Archiving AI training data<br />


"Object-based tape is highly scalable, searchable and can even be tagged for future<br />

retrieval. Combined with new developments in tape density, which enable a native<br />

storage capacity of 50TB on a single cartridge, tape not only maintains its cost-competitive<br />

edge against other storage approaches such as disk and cloud, it is the<br />

unequivocal dominant leader in affordability. Additionally, modern tape offerings<br />

provide much greater data integrity and reliability, incorporating error correction codes<br />

and automated data integrity verification checks to minimise the risk of data<br />

degradation over time."<br />

on tape ensures that the data remains intact<br />

and can be successfully retrieved if and when it<br />

is needed.<br />

COMPLIANCE: WITH OPPORTUNITY<br />

COMES RESPONSIBILITY<br />

The new digitised world comes with<br />

tremendous opportunity for advancements in<br />

virtually every sector of every market. It also<br />

comes with possible liabilities and<br />

vulnerabilities. Most industries and<br />

organisations have specific compliance and<br />

regulatory requirements regarding data<br />

retention and archiving.<br />

The geopolitical preservation of records is a<br />

critical aspect of national and international<br />

security, historical documentation, and<br />

governance. Records, ranging from<br />

government documents and historical archives<br />

to sensitive intelligence data, are mission<br />

critical in preserving a nation's history, ensuring<br />

accountability, and safeguarding sensitive<br />

information.<br />

Tape is the ideal choice for ensuring<br />

compliance with stringent regulatory<br />

requirements. Write Once, Read Many<br />

(WORM) tape media allows organisations to<br />

not only preserve data but also prove chain of<br />
custody. Tape technology can be a valuable<br />

tool in this context for the archival and long-term<br />

preservation of records.<br />

Due to the low cost and density of tape<br />

storage, multiple copies of data can be stored<br />

on separate tapes, in separate locations,<br />

ensuring that historical versions are readily<br />

available for recovery in the event of natural<br />

disasters, fires, or other catastrophic events.<br />

Tape technology can help governments and<br />

organisations subject to legal requirements<br />

regarding the preservation of records meet<br />

compliance obligations by providing a secure<br />

and tamper-proof archiving solution.<br />
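The WORM guarantee described above can be illustrated with a toy model. This is a sketch of the semantics only - real WORM tape enforces write-once behaviour in the drive and media, not in application software:<br />

```python
class WormStore:
    """Toy model of WORM (write once, read many) semantics: once a record
    is written, it can be read any number of times but never overwritten."""

    def __init__(self):
        self._records = {}

    def write(self, key, payload):
        if key in self._records:
            raise PermissionError(f"{key} is write-once; overwrite refused")
        self._records[key] = payload

    def read(self, key):
        return self._records[key]

store = WormStore()
store.write("ledger/2023.bin", b"original record")

try:
    store.write("ledger/2023.bin", b"tampered record")
    tamper_blocked = False
except PermissionError:
    tamper_blocked = True  # the original record survives untouched
```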

DATA PROTECTION: THE ULTIMATE<br />

LA<strong>ST</strong> LINE OF DEFENCE<br />

Meeting all compliance and regulatory<br />

requirements becomes meaningless if an<br />

organisation's data is compromised by<br />

ransomware. In today's world of storage, this is<br />

the ultimate vulnerability. The third quarter of<br />

<strong>2023</strong> set a new record with 1,420 cases of<br />

ransomware reported in a single quarter,<br />

according to cyber threat intelligence solutions<br />

provider Cyberint. No industry is exempt:<br />

business services, manufacturing, retail,<br />

finance, insurance, real estate, healthcare and<br />

critical infrastructure have all been attacked.<br />

There are many ways to protect online data,<br />

but even with the most ardent of approaches, if<br />

data is electronically accessible, a ransomware<br />

virus is capable of infecting it. This was recently<br />

demonstrated by the attack on cloud provider,<br />

CloudNordic. Ransomware succeeded in<br />

encrypting the disks on all servers, including<br />

their primary and secondary backup servers.<br />

The majority of its customers lost all data<br />

hosted with the company.<br />

Tape is the ultimate last line of defence<br />

against ransomware attack. Once a tape is<br />

ejected, it is completely offline, creating what is<br />

referred to as an "air gap." Without an<br />

electronic connection, data can't be hacked,<br />

deleted, or encrypted. A successful<br />

ransomware attack is virtually impossible when<br />

tape is employed. Even if an organisation's<br />

storage systems are compromised, including<br />

their online or cloud backups, offline tape<br />

backups are immune to ransomware.<br />

TAKE ANOTHER LOOK<br />

Tape has long been the single most cost-effective<br />

solution for long-term storage of large<br />

data sets and archives. Continued research<br />

and development spanning several decades<br />

has delivered key advancements in data<br />

accessibility, transfer speeds, and storage<br />

capacity. In an era marked by burgeoning data<br />

sets, evolving data governance standards, and<br />

increasing cybersecurity threats, tape's ability to<br />

facilitate secure, rapid storage and retrieval of<br />

petabytes to exabytes of data makes it shine as<br />

a dependable and cost-effective option for<br />

meeting these pressing needs.<br />

More info: www.spectralogic.com<br />


CASE <strong>ST</strong>UDY: HO<strong>ST</strong>ED.NL<br />

"…IT JU<strong>ST</strong> WORKS"<br />

DUTCH IT FIRM HO<strong>ST</strong>ED.NL IS 'BREAKING THE BOUNDARIES OF TRADITIONAL HO<strong>ST</strong>ING<br />

INFRA<strong>ST</strong>RUCTURE' WITH HIGH-PERFORMANCE MANAGED <strong>ST</strong>ORAGE CLU<strong>ST</strong>ERS RUNNING ON<br />

<strong>ST</strong>ORPOOL AND VMWARE<br />


Hosted.nl is a leading Dutch cloud<br />

and managed IT services provider.<br />

With over 14 years on the market,<br />

1000+ customers, and 2500+ servers, the<br />

company is recognised for its high-quality<br />

service and innovative technology solutions.<br />

Hosted.nl offers IaaS solutions, public and<br />

private cloud, containers as a service,<br />

network solutions, and connectivity.<br />

MIXED WORKLOADS<br />

The team at Hosted.nl was looking for an<br />

alternative to replace their old and poorly performing<br />

iSCSI storage. They had used<br />

storage from different vendors in the past,<br />

including Dell EqualLogic, EMC VNXe, and<br />
Tintri.<br />

Hosted.nl was searching for a solution to<br />

support mixed workloads - from shared web<br />

hosting, hosted desktops, and e-mail<br />

hosting (Exchange) to private and hybrid<br />

clouds. Currently, the company is running<br />

workloads on VMware, Hyper-V, KVM, Xen,<br />

etc. Other requirements for the new storage<br />

solution were high performance, scalability,<br />

reliability, and good support.<br />

StorPool has fully replaced the legacy Dell<br />

and EMC storage which Hosted.nl was<br />

using previously - and it works perfectly with<br />

VMware. Hosted.nl currently runs VMware<br />

ESXi 6.0.0 and will be migrating towards<br />

6.7 very soon.<br />

"We do not need to worry about<br />

maintaining the storage platform itself as it<br />

just works. With StorPool we experience<br />

extremely low response times, very high skill,<br />

and knowledge from both the tech and the<br />

business teams. Also, they are very kind<br />

people in real life and that is one unique<br />

selling point," shared Erik Jan Visscher,<br />

Director and Founding Partner of Hosted.nl.<br />

A considerable advantage is that the team<br />

is able to focus on the things they need to<br />

do in their daily operation. There is no more<br />

need to worry about maintaining the storage<br />

platform itself as 'it just works'.<br />

In the past, when Hosted.nl needed more<br />

storage and/or performance, it meant<br />
far more work to scale out their operation,<br />

balance workloads, etc. Now, they just add<br />

drives or nodes, and expand their storage<br />

capacity.<br />

PAY-AS-YOU-GROW<br />

More importantly, they get more<br />

performance. Hosted.nl confirmed that<br />

StorPool's support has never let them down<br />

and is available to help 24/7. Another<br />

advantage which needs to be considered is<br />

the pay-as-you-grow model, which<br />

Hosted.nl finds very interesting and<br />

attractive.<br />

Key benefits of the StorPool solution include:<br />

High performance<br />

Improved scalability and flexibility<br />

Saved time on staff<br />

Downtime reduction<br />

Exceptional team of experts<br />

"We are looking to replace Tintri in the<br />

future as well. StorPool outperforms all the<br />

other storage platforms we have used, not<br />

only performance-wise but also in<br />

scalability, support and redundancy,"<br />

concluded Erik Jan Visscher.<br />

More info: www.storpool.com<br />



The future is here.<br />

Tiered Backup Storage<br />

• Fastest backups<br />

• Fastest restores<br />

• Scalability for fixed-length backup window<br />

• Comprehensive security with ransomware recovery<br />

• Low cost up front and over time<br />

Thank you so much<br />

to all who voted, and<br />

congratulations to our fellow<br />

Storage Awards <strong>2023</strong> winners!<br />

Visit our website to learn more<br />

about ExaGrid’s award-winning<br />

Tiered Backup Storage.<br />




RESEARCH: RANSOMWARE<br />

RANSOMWARE AN ONGOING ISSUE<br />

NEARLY 60% OF COMPANIES ARE 'VERY' TO 'EXTREMELY' CONCERNED ABOUT RANSOMWARE ATTACKS,<br />

ACCORDING TO NEW RESEARCH FROM HORNETSECURITY<br />



RANSOMWARE PROTECTION IS A<br />

NECESSITY<br />

Reassuringly, 93.2% of respondents rank<br />

ransomware protection as 'very' to<br />

'extremely' important in terms of IT<br />

priorities for their organisation, and<br />

87.8% of respondents confirmed they<br />

have a disaster recovery plan in place for<br />

a ransomware attack.<br />

In its annual ransomware survey,<br />

Hornetsecurity revealed that more than<br />

nine in ten (92.5%) businesses are<br />

aware of ransomware's potential for<br />

negative impact, but just 54% of<br />

respondents said their leadership is<br />

'actively involved in conversations and<br />

decision-making' around preventing such<br />

attacks. Four in ten (39.7%) said they<br />

were happy to 'leave it to IT to deal with<br />

the issue'.<br />

Commenting on the findings,<br />

Hornetsecurity CEO Daniel Hofmann<br />

said: "Our annual Ransomware Survey is<br />

a timely reminder that ransomware<br />

protection is key to ongoing success.<br />

Organisations cannot afford to become<br />

victims - ongoing security awareness<br />

training and multi-layered ransomware<br />

protection is critical to ensure there are no insurmountable losses."<br />

However, that leaves roughly one in<br />
eight organisations (12.2%) without a<br />

disaster recovery plan. Of those<br />

companies, more than half cited a 'lack of<br />

resources or time' as the primary reason.<br />

Additionally, one-third of respondents<br />

said a disaster recovery plan is 'not<br />

considered a priority by management'.<br />

CHANGES OVER TIME<br />

This survey has been conducted annually<br />

over the past three years and has<br />

included asking respondents if their<br />

organisation has fallen victim to a<br />

ransomware attack.<br />

Since 2021, Hornetsecurity has found<br />

relatively small changes in the percentage<br />

of respondents saying their organisations<br />

have fallen victim to a ransomware<br />

attack: 21.1% in 2021, 23.9% in 2022,<br />

but a new low of 19.7% in <strong>2023</strong>.<br />

Additionally, companies that reported<br />

paying a ransom are down from 9.1% in<br />

2021 to 6.9% in <strong>2023</strong>.<br />

Some of the data in this survey show<br />

positive results, with a majority of<br />

respondents reporting they understand the<br />


"Although organisations have reported fewer ransomware attacks in <strong>2023</strong>, the threats<br />

haven't necessarily decreased. Cybersecurity awareness among all users remains a<br />

crucial element to further decrease the risk of falling for these threats, especially as<br />

attacks become more sophisticated with new technologies."<br />

importance of protection, and a drop in<br />

ransomware attack victims in <strong>2023</strong>,<br />

showing companies are becoming more<br />

vigilant in their data protection.<br />

Ransomware attacks continue to evolve,<br />

though, so organisations must maintain<br />

this vigilance. In <strong>2023</strong>, 81% of respondents reported receiving end-user training, up from just 71.2% in 2021.<br />

"Although organisations have reported<br />

fewer ransomware attacks in <strong>2023</strong>, the<br />

threats haven't necessarily decreased,"<br />

Hofmann said. "Cybersecurity awareness<br />

among all users remains a crucial element<br />

to further decrease the risk of falling for<br />

these threats, especially as attacks<br />

become more sophisticated with new<br />

technologies."<br />

TOOLS TO COMBAT ATTACKS<br />

The survey also revealed the most used<br />
tools to combat potential ransomware threats:<br />
<br />
 87.8% use endpoint detection software with anti-ransomware capabilities<br />
 84.4% cited 'email filtration and threat analysis'<br />
 22.4% mentioned 'AI-enabled security solutions' as a tool they are now using to combat ransomware within their organisation.<br />
<br />
The most common primary security features used to protect backups from ransomware are:<br />
<br />
 Immutable storage (40.6% of respondents)<br />
 Tight control of user and application permissions (38.3%)<br />
 Air-gapped storage (27.8%).<br />

Given the unpredictable nature of<br />

ransomware attacks, 76.2% of<br />

respondents said their business has<br />

changed the way it backs up its data.<br />

73.6% of respondents who have a<br />

recovery plan in place for their Microsoft<br />

365 data are 'very' to 'extremely' confident<br />

in their chosen solution, while 55.1% of<br />

respondents are 'very' to 'extremely'<br />

confident that their data backups would<br />

be safe from a ransomware attack today.<br />

More info: www.hornetsecurity.com<br />


ROUNDTABLE: FLASH/AI<br />

WOULD WE HAVE TODAY'S AI WITHOUT FLASH?<br />

<strong>ST</strong>ORAGE MAGAZINE GATHERED A SELECTION OF INDU<strong>ST</strong>RY EXPERTS TO DISCUSS HOW FLASH<br />

<strong>ST</strong>ORAGE AND AI HAVE IMPACTED EACH OTHER, AND HOW FLASH PRICES ARE SET TO CHANGE<br />

af<br />

Flash memory was, without doubt, a<br />

ground-breaking technology when it<br />

first entered enterprise data centres<br />

around two decades ago and immediately<br />

began transforming the performance of a<br />

wide range of applications. We wanted to<br />

understand the relationship between flash<br />

and the revolutionary developments that<br />

are now happening in artificial intelligence<br />

(AI), so we assembled a group of experts<br />

and asked them how much impact flash<br />

has had on AI and on the related fields of<br />

analytics, IoT and edge computing.<br />

We also asked our experts how much<br />

those technologies will change the<br />

adoption rate of flash. Because cost is a<br />

driving factor affecting the implementation<br />

of any technology, and flash prices have<br />

tumbled over the last twenty years, we then<br />

asked whether our experts expect flash<br />

prices to continue falling over the next<br />

five years.<br />

A HARD RELATIONSHIP TO DEFINE<br />

The more data that an application needs to<br />

access, the greater the performance boost<br />

delivered by storing data in flash rather<br />

than on spinning disk. Because AI is a<br />

highly data-intensive application, it might<br />

seem reasonable to presume that without<br />

flash, there would be no modern AI.<br />

However more than one member of our<br />

panel questioned that notion.<br />

"Some in this industry argue that we would<br />

not have today's AI without the past<br />

decade's shift to solid state storage. While<br />

that may be true, it's enormously difficult to<br />

prove. AI training consumes enormous<br />

resources, and SSDs have accelerated the<br />

advancement of computing performance<br />

across the board, so AI will have benefited<br />

from this," said Jim Handy, general director<br />

at analyst firm Objective Analysis. He<br />

added: "The same holds true of any<br />

discipline based on advanced computing<br />

technology, whether it's analytics, nuclear<br />

physics, or meteorology."<br />

David Norfolk, practice leader for<br />

development and government at analyst<br />

firm Bloor Research, said: "Insofar as flash<br />

makes storage faster, cheaper, and more<br />

reliable, it enables data-intensive<br />

innovations such as AI/ML, analytics, IoT,<br />

and edge processing. Conversely, these<br />

innovations need more fast, cheap, reliable<br />

storage and I'd expect flash take-up to track<br />

the take-up of these innovations."<br />

Leander Yu, president and CEO of Graid<br />

Technology, said: "Flash memory and all-flash<br />

array storage solutions are all about<br />

performance. The killer apps of AI/ML and<br />

analytics are where customers are investing<br />


"The future price of flash would depend on cost of capital, energy costs,<br />

supply and demand, which is heavily dependent on overall growth of IT<br />

infrastructure. So, if you believe that the world economy is going to grow<br />

and IT infrastructure will grow even faster, then in the next 1-2 years the<br />

price of flash will be increasing. Then fab capacity will catch up and in a<br />

few years price will go back to the slow downward trend." - Boyan Krosnov, StorPool<br />

in their IT infrastructure, and these<br />

workloads demand the performance<br />

delivered by all-flash storage."<br />

A wider view is taken by Peter Donnelly,<br />

director of products at storage networking<br />

vendor ATTO, who said: "I believe that<br />

we're in the middle of a dramatic change in<br />

how and where data is collected and<br />

consumed. This is driving the need for the<br />

disaggregation of the data centre. It's not to<br />

say that data centres will cease to exist, but<br />

they are becoming less structured and more<br />

flexible. This is an important dynamic that is<br />

driving the need for flash memory and flash<br />

storage. How do we access and use data<br />

that is across the country, or even around<br />

the world, in a way that makes it seem like<br />

it's located locally? Flash helps answer that<br />

challenge, and it enables emerging<br />

technologies like AI and data analytics at a<br />

scale that was impossible until now."<br />

CHANGING THE ARCHITECTURE OF<br />

FLASH-POWERED <strong>ST</strong>ORAGE<br />

But even if the impact of flash on AI,<br />

analytics, IoT, and edge processing is<br />

difficult to quantify, flash is certainly a key<br />

element in the IT infrastructure built to<br />

handle those workloads. When it comes to<br />

implementing AI, that infrastructure is about<br />

to receive more attention than it has to<br />

date, according to Randy Kerns, senior<br />

strategist at analyst firm the Futurum<br />

Group.<br />

"I think we are just beginning to see the<br />

importance of the underlying device<br />

technology used for AI/ML," he said.<br />

"Currently the focus has been on the<br />

algorithms and data conditioning from<br />

multiple sources to operate on and build<br />

the training and test data. Rightfully so,<br />

getting the functional aspects working has<br />

been where the attention has been placed.<br />

Now, as this is maturing, the importance of<br />

improving the technology and getting<br />

results faster will bring the technologies for<br />

storage into greater consideration. Some<br />

implementations may be further along than<br />

others, but we will see more importance in<br />

AI/ML and use of flash storage as a given."<br />

The ability of flash to handle small,<br />

random data accesses or IO operations fits<br />

the needs of AI/ML and analytics. "Hard<br />

disk drives are steampunk devices. SSDs<br />

have, as a result of their enormous IOPS<br />

advantages, taken over all workloads that<br />

involve small and/or random transfers.<br />

AI/ML training and analytics involve<br />

randomness in their I/O workloads, and<br />

IoT is dominated by extremely small<br />

transfers, making both early success stories<br />

for all-flash storage systems," said Curtis<br />

Anderson, software architect at Panasas.<br />
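The IOPS gap Anderson describes is easy to put into numbers. The figures below are typical published ballpark values, not from the roundtable: roughly 150 random IOPS for a 7,200rpm hard drive versus roughly 500,000 for an NVMe SSD.<br />

```python
HDD_IOPS = 150        # typical 7,200rpm hard drive, small random reads
SSD_IOPS = 500_000    # typical NVMe SSD, small random reads
N_READS = 1_000_000   # one million random 4 KiB reads

hdd_hours = N_READS / HDD_IOPS / 3600   # ~1.85 hours on the hard drive
ssd_seconds = N_READS / SSD_IOPS        # 2 seconds on the SSD
```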

As well as contributing to the take-up of<br />

all-flash storage systems, the performance<br />

needs of AI/ML are also driving<br />

architectural changes within those systems.<br />

"Architectural considerations around how<br />

data enters and leaves the storage are also<br />

important. This is why traditional HPC<br />

storage is well suited to AI workloads, and<br />

there are many new storage companies<br />

entering the marketplace who are<br />

leveraging flash and NVMe [the storage<br />

protocol used to access flash] to deliver low<br />

latency across the board and eradicate any<br />

potential bottlenecks at the storage layer,"<br />

said Amos Ankrah, solutions specialist at<br />

Boston.<br />

SCALE OF FLASH USAGE VARIES<br />

HUGELY<br />

AI applications such as autonomous driving<br />

and large language models (LLMs) are in<br />

Anderson's words 'poster children' for the<br />

use of huge datasets to train AI models. As<br />

an example, he cites Tesla's use of more<br />

than a staggering 200PB of what the car<br />

maker calls 'hot-tier cache capacity'.<br />

However, Anderson says most organisations<br />

are using far smaller datasets for AI<br />

development. "The vast majority (by count)<br />

of AI/ML projects have (significantly) less<br />

than 100TB of capacity needs," he said.<br />

That is 2,000 times less capacity than<br />

Tesla's hot tier.<br />

Anderson and his colleagues at Panasas<br />

expect that these more typical AI datasets<br />

will grow, but only slowly. That is just as<br />

well, because flash is significantly more<br />

expensive than disk, but its usage is often<br />


"Hard disk drives are steampunk devices. SSDs have, as a result of their enormous<br />

IOPs advantages, taken over all workloads that involve small and/or random<br />

transfers. AI/ML training and analytics involve randomness in their I/O workloads,<br />

and IoT is dominated by extremely small transfers, making both early success stories<br />

for all-flash storage systems."- Curtis Anderson, Panasas<br />

essential for AI training. The gap between<br />

disk and flash performance is even wider<br />

for AI than for other applications, because<br />

of the general random nature of AI data<br />

access.<br />

For decades, storage vendors have<br />

compensated for the relatively low speed<br />

at which disk drives handle semi-random<br />

requests to access data by identifying hot<br />

or frequently-accessed data and storing it<br />

in very fast DRAM-memory read caches.<br />

"Read caching helps a lot when a small<br />

percentage of the data is being accessed<br />

multiple times. AI/ML often doesn't fit<br />
those traditional I/O access patterns, which<br />
forces organisations to take a largely<br />

flash-based approach for many AI/ML<br />

workloads," said Steven Umbehocker,<br />

founder and CEO at OSNexus.<br />
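The effect Umbehocker describes is easy to demonstrate with a toy LRU cache simulation - a sketch, not any vendor's actual caching logic; the block counts and skew ratios below are invented for illustration:

```python
import random
from collections import OrderedDict

def hit_rate(accesses, cache_size):
    """Replay a sequence of block reads against an LRU cache; return the hit fraction."""
    cache = OrderedDict()
    hits = 0
    for block in accesses:
        if block in cache:
            hits += 1
            cache.move_to_end(block)       # mark as most recently used
        else:
            cache[block] = True
            if len(cache) > cache_size:
                cache.popitem(last=False)  # evict the least recently used block
    return hits / len(accesses)

random.seed(7)
n_blocks, cache_size = 10_000, 500

# Traditional workload: 90% of reads go to a small "hot" set of blocks.
hot = n_blocks // 100
skewed = [random.randrange(hot) if random.random() < 0.9
          else random.randrange(n_blocks) for _ in range(100_000)]

# AI/ML-style workload: reads spread uniformly over the whole dataset,
# as when training epochs shuffle their input.
uniform = [random.randrange(n_blocks) for _ in range(100_000)]

print(f"skewed:  {hit_rate(skewed, cache_size):.2f}")   # high -- caching pays off
print(f"uniform: {hit_rate(uniform, cache_size):.2f}")  # roughly cache_size/n_blocks
```

With uniform access the hit rate collapses to about the cache-to-dataset size ratio (5% here), which is why random AI reads push the whole working set onto flash rather than into a read cache.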

PERFORMANCE NOT THE ONLY FLASH VIRTUE

Performance is not the sole advantage that flash offers compared to disk, as SSDs consume less power while also potentially being more reliable and able to withstand challenging environments. "In applications like IoT, edge processing, and TinyML (machine learning at the edge) one of the top design priorities is the ever-increasing drive to decrease power consumption - both dynamic and standby power - while ensuring the highest possible performance. On top of this, for any IoT design, keeping costs down is another huge priority," said Coby Hanoch, CEO and founder of Weebit Nano.

The ability of flash to survive harsh environments is another advantage. "If we mean at the edge - infrastructure at cell towers and other local infrastructure - then solid state storage, particularly SSDs, is a definite enabling technology, since it performs better at extremes of temperature than other storage technologies, such as hard disk drives, which would find these conditions difficult," said Tom Coughlin, president of analyst firm Coughlin Associates and member of the Compute, Memory and Storage Initiative at industry body SNIA.

Roy Illsley, chief analyst at research firm Omdia, highlighted another physical characteristic of flash when he said: "A second aspect worthy of note is that, for edge use-cases, the ability to operate from a small footprint - so that AI inferencing workloads can be deployed in remote locations - means flash is the storage of choice when physical space is a restraining factor."

Dennis Hahn, principal analyst at Omdia, said that flash storage at the edge often sits within hyperconverged infrastructure (HCI). "In use-cases like edge, processing real-time results is often the requirement, so fast flash storage local to the processing servers is necessary. In its research, Omdia has found that these edge systems are frequently HCI systems using SSD devices."

But this does not mean that IoT data is always stored in flash. "Data collection like that of IoT often focuses more on cost, and the data frequently travels over the relatively slow internet. [As a result] bulk storage solutions like HDD are more frequently used. But, ultimately, flash comes into play for its speed in enabling IoT data processing."

Referring to the NOR variant of flash that is embedded in system-on-a-chip processors, Weebit Nano's Hanoch adds: "In devices performing AI or ML at the edge, flash is used not only for code/firmware storage and boot; importantly, flash - and even more so newer types of NVM like ReRAM - is also used to store the neural network weights needed for AI calculations. To support this functionality while keeping cost and power to a minimum, we're seeing designs pushing to more advanced nodes such as 28nm and 22nm, currently the sweet spot for IoT and edge devices. This requires NVM that is embedded in an SoC monolithically, but embedded flash can't scale to 28nm and below, so designers can't integrate it with other functionality on a single die. This is a huge challenge in designing these small, inexpensive, often battery-powered devices."

PRICE GAP BETWEEN DISK AND FLASH TO REMAIN

"Some in this industry argue that we would not have today's AI without the past decade's shift to solid state storage. While that may be true, it's enormously difficult to prove. AI training consumes enormous resources, and SSDs have accelerated the advancement of computing performance across the board, so AI will have benefited from this." - Jim Handy, Objective Analysis

The variant of flash memory that overwhelmingly dominates flash usage is NAND flash. Until the late 90s, NAND flash was a very expensive and rarely used technology. That changed when makers of battery-powered devices such as MP3 players and mobile phones went searching for a data storage medium that consumed less power than miniature disk drives. NAND flash fit the bill, production soared, and prices plummeted. Surprisingly, however, it was not until around 2004 that NAND flash became cheaper than DRAM.

However, the important price comparison has always been between flash and disk. Although the price of flash has been falling for the last twenty years, so has the price of disk drives, when both are measured in dollars per unit of storage capacity. For the last decade the gap between the two has been relatively steady. "SSD $/TB have maintained roughly the same 5x-7x multiplier over HDD $/TB over the last ten years," said Anderson. That estimate of the price difference was echoed by Umbehocker and by Giorgio Regni, CTO at Scality, who both put the per-TB price difference at five-fold.

"We don't think the market is pushing flash vendors very hard to change that in the future," said Anderson. Referring to so-called fabs - the fabrication plants that make flash and other semiconductor chips - Anderson added: "There are only a handful of flash fabs around the world and new ones aren't being built at a rate that will outstrip the growth in the demand for flash." Again, this view was shared by other experts, who pointed to the need to build new fabs to increase global output, the enormous expense of doing so - ranging from hundreds of millions to billions of dollars per fabrication plant - and the years of planning and construction required.

FLASH COST TO CONTINUE FALLING

On a short-term basis, flash prices have a history of dramatic variations. Objective Analysis' Handy said: "During shortages prices typically flatten, but sometimes they increase a little. In very rare cases they increase substantially, like they did in 2018. When the shortage flips to an oversupply there's always an alarmingly rapid price collapse. We had that collapse in the second half of 2022, when prices fell by up to 70%."

Boyan Krosnov, CTO and co-founder at StorPool, outlined the factors that influence long-term price trends for flash. "The future price of flash will depend on the cost of capital, energy costs, and supply and demand, which is heavily dependent on overall growth of IT infrastructure. So, if you believe that the world economy is going to grow and IT infrastructure will grow even faster, then in the next 1-2 years the price of flash will be increasing. Then fab capacity will catch up and in a few years the price will go back to the slow downward trend."

Shawn Meyers, field CTO at Tintri, agrees: "The worldwide economy will be the largest driving factor, outside of new revolutionary breakthroughs in flash manufacturing. Supply chain ripples will follow the bullwhip effect for the foreseeable future." However, between price collapses and price surges, per-TB prices slowly fall, according to Objective Analysis' Handy, who said the price trends are surprisingly predictable and that his company produces the industry's most consistently accurate price forecasts. So how fast does Objective Analysis believe flash prices will fall over the next five years? "From now until mid-2028, the average price decline will be about 15% per annum," Handy said, adding that a possible shortage in mid-to-late 2024 would be followed by oversupply and price collapse in 2026.

However, Regni at Scality predicted an even faster decline in the price of the lowest-cost QLC variant of flash. "Based on roadmaps from hardware and disk manufacturers, we see the cost (measured as $ per terabyte) of high-density (QLC) flash SSDs decreasing dramatically. Data shared with us shows a 60%+ decline between 2022 and 2025," said Regni.

That 60% decline cited by Regni for QLC flash equates to a 26% compound annual reduction from 2022 to 2025, which would be significantly faster than Handy's 15% prediction for overall flash prices over the longer period of 2023 to 2028. Regni added: "While this is a faster decrease than for equivalent high-density HDDs, we still see HDDs maintaining a 5x cost advantage over SSDs in the same time frame." ST
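Both forecasts quoted in this section are compound rates, and the arithmetic can be checked in a few lines (a sketch using only the percentages given above):

```python
# Handy: ~15% per annum decline from now until mid-2028.
handy_annual = 0.15
fall_over_5_years = 1 - (1 - handy_annual) ** 5   # ~0.56, i.e. a 56% total drop

# Regni: a 60% total decline for QLC flash between 2022 and 2025 (three years).
regni_total = 0.60
regni_annual = 1 - (1 - regni_total) ** (1 / 3)   # ~0.26, the 26% figure cited
```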



ANALYSIS: CONTENT GROWTH

IS AI MAKING TECH EVEN LESS ENVIRONMENTALLY FRIENDLY?

TOM DUNNING, CEO OF AD SIGNAL, LOOKS AT THE POTENTIAL ENVIRONMENTAL IMPACT OF RELENTLESS CONTENT GROWTH

The amount of digital data in the world is growing by 23 per cent year on year and, as a result, is quickly becoming a serious environmental issue. Crucially, many people are unaware that this is even an issue at all.

At any point in time the world is only using 20 per cent of its available storage capacity, yet organisations continue to provision more, partly due to concern over limited availability driven by the scarcity of resources. Backup upon backup is created of each item of content so that each organisation in the chain can meet Service Level Agreements (SLAs) and Disaster Recovery (DR) requirements; multiply this by the rapid growth of social content and the number of photos and videos people now take for each shot they use. Streaming services continue to grow in viewership and offer an ever-increasing library of content.

We are collectively stumbling down a path of environmental damage that will grow exponentially unless we take immediate action.

THE DANGER OF RISING DATA

We're living in an era of rapid technological development, and it's unreasonable to expect people and businesses to stop innovating, stop adopting, and stop using technology altogether in order to reduce the carbon emissions caused by rising data. The key, therefore, is to find solutions that can scale the reduction of data's carbon emissions alongside the growth of data content.

Over 3.5 per cent of global CO2 emissions are estimated to be generated by data centres and network traffic. That makes data centres and network traffic together responsible for more CO2 emissions than the global aviation industry (2.1 per cent). This figure is a significant issue, and one made even worse by predictions that data centres will generate 14 per cent of global CO2 emissions by 2040, comparable to the agricultural industry.

Businesses and large organisations in particular are taking data and technology for granted, focusing on how they can leverage it to boost efficiency and generate bigger profits. All the while, data volumes are rising and network traffic is increasing, with little to no thought from the people doing the damage.

Video storage is a particular environmental danger, accounting for an estimated 70 per cent of the CO2 emissions generated by data centres. It's the densest content format that we have and equates to roughly 1.84 per cent of the world's CO2 emissions.

Ultimately there are only three ways to reduce the carbon related to content storage:

1. Reduce file sizes - normally through compression rather than reduced quality
2. Store less - challenging as content grows
3. Store it on more sustainable solutions

In reality, a high volume of large video storage comes from duplicated versions, many of which producers struggle to identify. As a result, content producers and holders can have a huge impact on emissions reduction just by de-duplicating high volumes of video storage. One UK broadcaster is holding 127 versions of the same episode, but only around 20 of these are unique, needed versions.
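Finding byte-identical copies is the simplest form of this de-duplication. The sketch below groups files by content hash; note that commercial tools such as Ad Signal's match content perceptually across different encodes, which plain hashing cannot do, and the function and file names here are illustrative:

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root):
    """Group files under `root` by SHA-256 of their content; return only the
    groups holding more than one path (i.e. byte-identical duplicates)."""
    groups = defaultdict(list)
    for path in sorted(Path(root).rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups[digest].append(path)
    return {d: paths for d, paths in groups.items() if len(paths) > 1}
```

Every group of n identical files lets n-1 copies be replaced by references, which is where the storage and emissions savings come from. (For media-sized files you would hash in chunks rather than read the whole file into memory as this sketch does.)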

The technology to achieve this is already out in the marketplace. Solutions like Ad Signal's Match and Compose products can reduce emissions from 1.84 to 0.5 per cent, for example, while also making it commercially beneficial to do so by reducing the storage and data transfer burden of duplicated videos. Video is the low-hanging fruit; it isn't feasible for industries like aviation to make such a drastic reduction.

HOW 'AI WILL BURN THE WORLD'

The biggest threat to sustainability across the board is AI. AI has seen rapid adoption across the world this year, whether it be businesses utilising large language models (LLMs) such as ChatGPT to drive efficiencies, or generating data to understand content and its potential usage. However, there is an enormous environmental cost associated with AI that's only increasing alongside adoption.

While businesses and people have seen benefits from AI, the components that power it, such as Graphics Processing Units (GPUs), require carbon-heavy materials to produce. Alongside that, the powering of these components and the colossal cooling required significantly multiply the carbon emissions produced by AI.

AI is plotted for a 37.3 per cent Compound Annual Growth Rate (CAGR) by 2030, which we believe to be underestimated. The training, reviewing and retraining process for AI models typically takes numerous iterations before a model is ready for commercialisation. It is this training process that has the biggest compute demand and therefore causes the majority of the environmental damage - damage we should expect to happen in the next 2-3 years, rather than by 2030. Our Match product can reduce the cost and carbon of AI processing on video by more than 75 per cent.

A further damaging factor is the object storage that AI models require: libraries of images, audio and video for training purposes. In this area at least, companies can adopt carbon-efficient deduplication solutions to reduce some of these emissions.

AI IS NOT THE PEG FOR EVERY HOLE

Collectively, we need to understand that the best thing for businesses and for longevity is a healthy environment. That has not been the approach taken by many business decision makers when it comes to AI. IDC forecasts that 90 per cent of new enterprise apps will use AI by 2025. In reality, many of these applications do not need AI (many use it as a gimmick) or could achieve the same benefits with other technology.

While it's impossible to expect everyone to give up AI, especially since it has brought some benefits for businesses, it's those jumping on the AI trend blindly that are contributing to the stark rise in network traffic, data volumes and, subsequently, carbon emissions.

HOW TECH PLAYERS CAN REDUCE EMISSIONS

The biggest and arguably most important step that we can collectively take to reduce the environmental impact of data content growth is raising awareness of the extent of the issue. We all have our own preconceived notions of 'sustainability' but ultimately we're talking about climate change and reducing carbon emissions. The definition of our collective goal as a society is important and, as part of that, it is essential that greenwashing is stopped.

Organisations such as the Digital Sustainability Alliance are playing a crucial role in this, bringing together innovative businesses that are pulling together to make a difference in the field. That starts with universal, transparent monitoring, led by legislation, in order to measure the climate impact of organisations and hold them accountable.

Carbon calculators should be available for all AI products that use non-renewable energy, even if it is carbon offset. Carbon offset does not outright prevent damage: it seeks to mitigate it, and it cannot be the whole solution to our problems, though mitigation is better than nothing.

We also need monitoring for thermal damage not related to power, such as that from immersion cooling. Immersion cooling reduces the power needed for cooling by 30 per cent, but the server heat is still released into the environment, causing damage. Without transparent monitoring services, consumers cannot make informed choices and we are obscuring key information about the impact on our planet.

Between businesses, organisations and education bodies, we can all gather the pieces of the sustainability puzzle, and it's important to aggregate that data together so that we all have a combined understanding.

SUSTAINABLE TECH FOR THE FUTURE

Ultimately, sustainability is a choice. That remains true even through economic uncertainty and budget constraints. There is a plethora of companies out there with sustainable technology solutions, and these companies should be considered when businesses are making tech choices.

A common misconception is that sustainability comes at the expense of cost or performance, but many of the companies developing these sustainable technologies are doing it with an environment-first approach. This not only ensures carbon savings, but also cost savings and performance boosts, making them an obvious tech choice and an easy way to reduce the emissions of rising data.

More info: www.ad-signal.io



MANAGEMENT: DATA MIGRATION

FACING MIGRATION ISSUES HEAD ON

KEVIN WILD, HEAD OF PRESALES AT SYNITI, EXPLAINS HOW TO MANAGE COMPLEX DATA MIGRATION IN THE REAL WORLD

Data migrations are complex and they can be daunting, especially when it comes to moving large amounts of data across enterprise resource planning (ERP) systems. That's not hyperbole; it's a statement borne of experience. I've seen first-hand that there is no such thing as a simple migration, and when you add in multiple source systems or multiple targets, complexity ramps up quickly.

How do we manage that complexity and get to a stage where the outcome is predictable? Is that even possible in the real world, where we're also managing our day-to-day priorities?

THE CHALLENGES

Before we get started, it's important to recognise that no two migrations are the same. Single source to single target migration has its share of challenges, but it's rare to find a large-scale transformation that is a straightforward, single source to single target affair. Instead you're more likely to find multiple source systems or multiple targets, which exponentially increases the complexity. Often, data exists in different formats, meets different standards or, sometimes, may not exist at all. This is a challenge I regularly come across when companies migrate as part of a merger or acquisition.

But although it is true that each migration is different, the solution is universal. A successful migration relies on preparation and putting in the groundwork.

PREPARING FOR MIGRATION

To tackle migration complexity head-on, a solid, repeatable methodology is essential: a strategy that not only addresses the challenges at hand but also ensures a predictable outcome. By 'predictable outcome', I mean a scenario where, on day one after cut-over, the data seamlessly supports the business processes. Anything short of this is a failure.

But it won't happen without data preparation and cleansing: a crucial part of any migration.

Many customers tell me that their data only needs light-touch cleansing and preparation. My experience says otherwise. Oftentimes, these statements are based on gut feeling or blind faith rather than solid fact. Would you be willing to bet millions of dollars on that? That is often what is at stake when migrations fail.

I understand that no company has unlimited time and resources, so it's important to think carefully about how you prepare and to focus on the activities that will have the most impact.

UNDERSTAND YOUR DATA - OBJECTIVELY

Poke your data, prod it, manipulate it. Whatever you do, make sure you are walking into your transformation with eyes open.

Armed with this understanding, identify the gaps and put a plan in place to address them. Most gaps can be addressed during the migration process, but the trickier ones will require time and expertise. Identifying these gaps in advance means you can plan appropriately and make sure activity is completed ahead of the migration - so you don't run out of time and have to compromise on quality.
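A first pass at 'poking the data' can be automated. The sketch below is a minimal, hand-rolled profiler; the field names, validation rules and records are invented for illustration and are not Syniti's tooling or API:

```python
import re
from collections import Counter

def profile(rows, key_field, required_fields, patterns=None):
    """Summarise completeness and validity of a list of record dicts."""
    patterns = patterns or {}
    report = {"rows": len(rows), "missing": Counter(), "invalid": Counter()}
    keys = Counter(r.get(key_field, "") for r in rows)
    report["duplicate_keys"] = sorted(k for k, n in keys.items() if k and n > 1)
    for r in rows:
        for field in required_fields:
            if not r.get(field, "").strip():
                report["missing"][field] += 1     # empty where a value is required
        for field, rx in patterns.items():
            value = r.get(field, "")
            if value and not re.fullmatch(rx, value):
                report["invalid"][field] += 1     # present but malformed
    return report

# An illustrative customer-master extract with typical quality gaps.
rows = [
    {"id": "C001", "name": "Acme GmbH", "country": "DE"},
    {"id": "C001", "name": "Acme GmbH", "country": "DE"},      # duplicate key
    {"id": "C002", "name": "",          "country": "GB"},      # missing name
    {"id": "C003", "name": "Initech",   "country": "Germany"}, # bad ISO code
]
report = profile(rows, "id", ["id", "name"], {"country": r"[A-Z]{2}"})
```

Counts like these turn "our data only needs light-touch cleansing" from gut feeling into a measurable claim.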

ROBUST REPORTING

Establish a reporting process to track how preparation activities are progressing and, more importantly, to identify potential delays as early as possible. Everybody gets sidetracked at some point, and most of the business people involved in the process will also have a day job running at the same time.

So take time at the start of the project to put in place performance indicators and alerts that identify potential delays before they snowball into substantial issues. If it looks like progress is stalling, have a contingency plan to make sure data quality remains a priority in the lead-up to the migration.

JUST ONE THING

With every migration, priorities compete and it can be difficult to give everything the focus it needs. But preparation is key, and this is amplified as you start to layer in the additional complexity of multiple systems, large data volumes, or complex processes. So if you do just one thing differently ahead of your next migration, spend time on preparation for a successful, less stressful process.

More info: www.syniti.com



STRATEGY: DISASTER RECOVERY

THE GROWING NEED FOR APPLICATION RECOVERY PLANNING

SAM WOODCOCK, SENIOR DIRECTOR OF CLOUD STRATEGY, 11:11 SYSTEMS, EXPLAINS THE IMPORTANCE OF ARCHITECTING A DR PLAN FOR APPLICATION RECOVERY IN THE CLOUD

Today, being impacted by a cybersecurity incident is almost inevitable, and it is not a question of if, or even when, but how often an organisation will be attacked. According to Veeam's 2023 Data Protection Trends Report, which surveyed 4,200 business and IT leaders on their IT and data protection strategies and plans, 85% of organisations said they have had at least one ransomware attack in the last 12 months, and 79% of respondents said they have a protection gap.

Additionally, according to the UK's Information Commissioner's Office, one in three data breaches in 2022 was caused by ransomware. Therefore, as well as considering preventative cybersecurity measures, companies also need to think about whether and how they will recover from a malicious attack, how quickly they can recover their systems and applications, and the cost of recovery to the business.

DO YOU HAVE A PLAN?

A fundamental question every organisation should ask itself is: would we survive a cybersecurity attack? Along with: do we know our level of preparedness; do we know how to recover our critical applications; do we understand the risks the organisation faces; and do we regularly test our recovery plans?

Here at 11:11 Systems, we know that recovering from a data-compromising cyber attack requires planning, investment, capabilities, procedures, and much more. Additionally, we understand how important it is for organisations to recognise the difference between traditional disaster recovery - in response to incidents such as wildfires, earthquakes, and extreme weather conditions - and compromised data recovery in the event of a cybersecurity incident.

Unfortunately, as the statistics above highlight, the latter is the more likely and more impactful disaster recovery event, and an interruption to operations caused by a cyber attack can cost businesses an enormous amount, both financially and reputationally.

CYBER AND BACKUP TEAMS MUST BE ALIGNED

Another key finding from the Veeam research was that the vast majority of organisations surveyed had a hybrid environment, with an even split across architectures and workloads in the cloud, in virtual set-ups, and on-premises. The key takeaway here is that modern data protection solutions must provide equitable capabilities across all architectures (physical, virtual, and cloud).

In addition, organisations should plan for workloads moving across clouds and even back on-premises, and data protection strategies should accommodate that fluidity. This means cyber and backup teams must be aligned, and backup must be part of an organisation's wider cybersecurity strategy and integrate with modern systems management.

Interestingly, the research went on to highlight that 37% of organisations that suffered a ransomware attack had a 'no pay' policy - but regardless of policy, 80% of companies paid the ransom anyway. More concerningly, 15 to 20% of those who paid still couldn't recover their data. So, what should an organisation do to ensure that it can recover important applications?

In our cloud-centric world, organisations are creating an incredible number of applications and amount of data to drive their operations. IT teams use complex software programmes and applications that rely on other applications, external services, distributed systems, and various data sources. Application recovery planning helps organisations quickly recover critical data, applications, or systems in the case of an unexpected outage or a cyber incident.

APPLICATION RECOVERY REQUIRES CAREFUL PLANNING

Determining how to protect and recover an application can often be easier than determining how quickly your business needs that application recovered. Establishing the correct recovery time objective (RTO) and recovery point objective (RPO) targets at an application level is a critical part of DR planning.
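One common way to make those targets explicit is a small tier catalogue mapped onto applications. The tiers, numbers, protection methods and application names below are invented for illustration and are not 11:11 Systems' terminology:

```python
# Recovery tiers: how much data loss (RPO) and downtime (RTO) each class tolerates.
tiers = {
    "tier-1": {"rpo_minutes": 5,    "rto_minutes": 60,   "method": "continuous replication"},
    "tier-2": {"rpo_minutes": 60,   "rto_minutes": 240,  "method": "hourly snapshot replication"},
    "tier-3": {"rpo_minutes": 1440, "rto_minutes": 2880, "method": "nightly backup to cloud"},
}

# Each application is classified once, at the application (not server) level.
applications = {
    "order-processing": "tier-1",
    "data-warehouse":   "tier-2",
    "intranet":         "tier-3",
}

def recovery_targets(app):
    """Look up the RPO/RTO targets an application's DR test must be measured against."""
    return tiers[applications[app]]
```

Writing the targets down per application, rather than per server, is what makes later DR testing measurable: a test either met the tier's numbers or it did not.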

It's important to understand best practices for building an application recovery plan for both simple and complex applications - especially those with dependent external software programmes and services.

The key to this is understanding how the different components of the application interact with each other. This involves identifying all the external services and dependencies that the application relies on and understanding how they work together.

CONSIDER COMPATIBILITY AND VERIFICATION

In other words, what technology should it leverage, and where is the best place to bring applications back up and running from a compatibility perspective? Key questions to consider here are:

- Is my hypervisor and its versioning compatible with the cloud solution?
- Is my virtual guest hardware also compatible?
- How can I ensure that the architecture of my VMs is considered and compatible with the cloud?

When considering connectivity, questions to think about include whether you have enough bandwidth to leverage cloud services, as well as how to select the best cloud location for a positive and seamless end-user experience. Likewise, how can you validate that connectivity will allow the organisation to meet its RPO targets?
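One way to sanity-check the bandwidth question is to compare the sustained data change rate against usable link throughput: at steady state, each window's worth of changed data must finish transferring within that same window, or the RPO slips. The sketch below assumes a steady change rate and an illustrative 70% link efficiency figure; both are assumptions to tune for your own environment.

```python
def bandwidth_meets_rpo(change_rate_gb_per_hour: float,
                        link_mbps: float,
                        efficiency: float = 0.7) -> bool:
    """True if the replication link can keep up with the data change rate.

    efficiency discounts the nominal link speed for protocol overhead and
    contention (an assumed figure, not a vendor specification).
    """
    # Convert change rate to megabits per second: GB/h * 8000 Mb/GB / 3600 s.
    required_mbps = change_rate_gb_per_hour * 8000 / 3600
    return link_mbps * efficiency >= required_mbps

# Example: 200 GB of change per hour needs ~444 Mbps sustained, so a
# 1 Gbps link (~700 Mbps usable at 70% efficiency) keeps up.
print(bandwidth_meets_rpo(200, 1000))  # True
```

A burst-heavy workload breaks the steady-state assumption, so measured change-rate peaks, not averages, are the safer input.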

In a complex hybrid environment with many different components, it is also important to consider application dependency and to map out how applications work, how they communicate, and which are dependent on each other. To tackle these challenges, it is essential to understand the application architecture and the dependencies between its different components. This may involve conducting a detailed application analysis and identifying all external software services, systems, and dependencies.
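Once dependencies are identified, they can be captured as a simple graph and turned into a bring-up order, so dependencies are always recovered before the applications that rely on them. A minimal sketch using Python's standard library, with invented component names for illustration:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Illustrative dependency map: each application lists the components it
# depends on, which must be recovered before it can come back up.
deps = {
    "web-frontend": {"app-server"},
    "app-server": {"database", "auth-service"},
    "auth-service": {"database"},
    "database": set(),
}

# static_order() yields a valid bring-up sequence: dependencies first.
recovery_order = list(TopologicalSorter(deps).static_order())
print(recovery_order)
```

A real dependency map would also capture external SaaS and network dependencies that cannot simply be "recovered", but even a partial graph makes the recovery sequence explicit instead of tribal knowledge.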

This type of exercise should be undertaken on a continuous basis, because the situation is dynamic and can change very quickly. A deep understanding of the application and its infrastructure is critical to successful application recovery, as is understanding end-user access and ensuring a seamless user experience.

So, what are the key steps an organisation should take to recover applications from an attack?

1. Identify and isolate the affected systems: As soon as the attack is detected, the first step is to identify the affected systems and isolate them from the rest of the network to prevent further spread of the infection.

2. Assess the damage: The next step is to assess the extent of the damage caused by the attack, including the loss of data and the compromise of critical systems. This assessment will help determine the application recovery strategy.

3. Restore from backups: If you have backups available, you can use them to restore the system to its previous state. To ensure data integrity and system functionality, you should thoroughly test the recovery process.

4. Rebuild affected systems: If backups are unavailable or the data gets corrupted, you must rebuild the affected systems from scratch. This process involves rebuilding the operating system, applications, and data, which can be time-consuming and challenging.

5. Improve security measures: Once the system has been restored or rebuilt, it is essential to improve the security posture to prevent attacks in the future.
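The branch between restoring and rebuilding can be sketched as a simple decision. In this hypothetical sketch, backup_verified stands in for the result of whatever verification your tooling provides (checksum validation, test restores, malware scans); the function and its names are illustrative only.

```python
def plan_recovery(has_backup: bool, backup_verified: bool) -> str:
    """Pick a recovery path: restore from verified backups, else rebuild."""
    if has_backup and backup_verified:
        return "restore"  # restore from backups, then thoroughly test the result
    return "rebuild"      # rebuild OS, applications, and data from scratch

print(plan_recovery(has_backup=True, backup_verified=True))    # restore
print(plan_recovery(has_backup=True, backup_verified=False))   # rebuild
print(plan_recovery(has_backup=False, backup_verified=False))  # rebuild
```

The middle case is the one that catches organisations out: a backup that exists but fails verification (for example, because it was encrypted alongside production data) forces the far slower rebuild path.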

To mitigate these risks, it is critical to have a robust application recovery DR plan in place that includes regular backups, testing, and security measures to prevent such attacks. Having a clear communication plan is also vital, to keep stakeholders informed of the situation and the progress of the recovery. As the statistics in the Veeam Data Protection Trends report highlight, having a DR plan for application recovery in the cloud isn't an option - it's a must.

More info: www.1111systems.com

www.storagemagazine.co.uk | @STMagAndAwards | STORAGE MAGAZINE | Nov/Dec 2023



RESEARCH: CYBER-ATTACKS

RANSOMWARE: ONLY ONE IN SEVEN BUSINESSES RECOVER 100% OF THEIR DATA

TO MITIGATE RANSOMWARE ATTACKS, IT PROFESSIONALS MUST CONSIDER BOTH BUSINESS-RELATED AND INFRASTRUCTURE DATA EQUALLY, SUGGESTS NEW RESEARCH


As the time and cost of securing data through the entirety of the data backup process continue to rise, new methods of protection are emerging to ensure the maximum security of backed-up data. Air-gapping has become a viable solution for these environments, with more than three-quarters of organisations using, testing, or expressing interest in this approach. Because air-gapped backups are stored in volumes that are inaccessible by default and only accessible during protected backup sessions, cyber attackers are prevented from displacing or destroying backup data.

A major new study announced by Zerto has confirmed that ransomware continues to pose a serious threat and is viewed today as one of the top concerns for viability within organisations. Companies are becoming increasingly aware of the damage caused by these attacks and are coming to understand the dire reality of a potential compromise. The research indicates that nearly two-thirds (65%) of respondents consider ransomware to be one of the top three most serious threats to the viability of their organisation.

The study was conducted by ESG and co-sponsored by Zerto. Its findings, published in a new e-book (available via the URL below) titled "2023 Ransomware Preparedness: Lighting the Way to Readiness and Mitigation," show that organisations can lose vital minutes to hours in recovery, resulting in significant and unacceptable consequences for large-scale operations. Combined with evolving techniques and targets designed to motivate payment from victim organisations, this data highlights the crucial need to re-engineer recovery processes for ransomware attacks.

In addition, nearly 60% of respondent organisations report an impact to regulated data, such as personally identifiable information, in successful ransomware attacks. The study also indicates that configuration data faces an increasingly significant risk of compromise, with more than half of respondents indicating this data class was affected by a successful ransomware attack. This shows that attackers understand that striking a company's infrastructure at its core is an effective way to halt production in its tracks. As a result, IT professionals preparing to mitigate ransomware attacks must consider both business-related and infrastructure data equally in their efforts.

Despite the importance of this solution, the response breakdown shows only slightly more than one in four (27%) organisations have deployed it at this point, while 18% are in the process of testing and deploying an air-gapped solution. This confirms that while it is seen as a viable strategy, there is still much work to be done in the market overall to ensure that the vast majority have it in place.

"Given the high frequency of ransomware attacks and the impacts of successful ones such as data and infrastructure loss, many organisations are left with damages that have an effect well beyond IT," commented Christophe Bertrand, practice director at ESG. "Attackers often go beyond valuable data assets by undermining key infrastructure components and exposing significant gaps, including those in the backup infrastructure itself. IT leaders must understand that the nature of the threat goes well beyond just data and focus on protecting and further leveraging their backup and recovery infrastructure to de-risk and minimise business impact through advanced capabilities."

More info: www.zerto.com/page/esg-the-longroad-ahead-to-ransomware-preparedness/



Advancing EPYC Performance and Density with New H13 Systems

New Cloud, AI, and Technical Computing Solutions with 128-Core AMD EPYC 9004 Series Processors and 3D V-Cache Technology

Learn More at www.supermicro.com/Aplus

© Supermicro and Supermicro logo are trademarks of Super Micro Computer, Inc. in the U.S. and/or other countries.

