
NETWORKEXPERT

The high-throughput, low-latency transmission of real-time data feeds is essential to so many aspects of both personal and professional life. In fact, for many, it has become so mission-critical that a specialised streaming platform is required.

K is for Kafka. Apache Kafka is an open source distributed streaming platform developed by the Apache Software Foundation. It is used to connect sensor-based data, operations and people to enable real-time analysis of data. Kafka is designed as a distributed system and its structure makes it highly scalable with fast and reliable performance, offering high throughput for both publishing and subscribing and supporting hundreds of thousands of messages per second.
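Kafka's core abstraction is a partitioned, append-only log: producers publish keyed messages, and consumers read from their own offsets independently. As a rough illustration of that model only — not the real Kafka client API, and with invented names such as `MiniLog` and `turbine-7` — a toy sketch in Python might look like this:

```python
class MiniLog:
    """Toy illustration of Kafka's core abstraction: an append-only,
    partitioned log that decouples publishers from subscribers."""

    def __init__(self, num_partitions=3):
        self.partitions = [[] for _ in range(num_partitions)]

    def publish(self, key, value):
        # Messages with the same key land in the same partition,
        # preserving per-key ordering (Kafka does this by hashing the key).
        p = hash(key) % len(self.partitions)
        self.partitions[p].append((key, value))
        return p, len(self.partitions[p]) - 1  # (partition, offset)

    def consume(self, partition, offset):
        # Consumers track their own offsets, so different subscribers
        # can read the same history at their own pace.
        return self.partitions[partition][offset:]

log = MiniLog()
part, off = log.publish("turbine-7", {"rpm": 1450})
log.publish("turbine-7", {"rpm": 1460})
print(log.consume(part, off))
```

Because the log retains messages rather than deleting them on delivery, a slow subscriber never blocks a fast publisher — which is one reason the model sustains such high message rates.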

Sham Chotai, Chief Technology Officer at GE Power Digital Solutions, says, "Kafka's open source nature makes it incredibly flexible. The source code can be customised, allowing for new collaborations and integration with various software solutions, ultimately ensuring scalable functionality through shifts in various technology architectures."

Focusing on the inexorable adoption of IoT, Sham explains that, "Kafka's capacity to rapidly stream and ingest large data quantities makes it a critical component for the Internet of Things (IoT) and across the Industrial Internet of Things (IIoT) by helping to make sense of the terabytes and exabytes of sensor data. The IIoT in particular relies on Kafka's real-time capabilities to build applications that can effectively monitor equipment, including propellers, turbines and engines, and then predict outages and mechanical issues. Kafka's ability to ingest data at scale allows for batch, micro-batch and real-time processing, so a wind energy farm equipped with sensors that generate billions of data points can, for instance, provide operators with access to terabytes of machine data to improve generation and reduce operational cost."
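To make the micro-batch idea concrete, here is a minimal sketch, assuming a stream of hypothetical wind-turbine power readings (the sensor IDs, values and the `microbatch_averages` helper are all invented for illustration, not drawn from GE's pipeline): the stream is chopped into fixed-size chunks and each chunk is reduced to a per-batch mean, the kind of rollup an operator dashboard might consume downstream of Kafka.

```python
from statistics import mean

def microbatch_averages(readings, batch_size=4):
    """Group a stream of (sensor_id, value) readings into fixed-size
    micro-batches and compute a per-batch mean."""
    batches = []
    for i in range(0, len(readings), batch_size):
        chunk = readings[i:i + batch_size]
        batches.append(round(mean(v for _, v in chunk), 2))
    return batches

# Hypothetical wind-farm power readings in kW
stream = [("wt-1", 980), ("wt-2", 1010), ("wt-1", 995), ("wt-2", 1020),
          ("wt-1", 970), ("wt-2", 1005)]
print(microbatch_averages(stream))  # → [1001.25, 987.5]
```

Batch size is the knob that trades latency for throughput here: a batch of one is effectively real-time processing, while very large batches amortise overhead at the cost of freshness.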

As you can imagine, operations and employee productivity are dramatically improved through the use of this technology, and Sham adds that, "Sensor-to-machine and machine-to-machine interactions require sophisticated networks to handle the high throughput needed for consistent performance. The sheer magnitude of increasing volumes of data gathered by enterprises and smart consumer devices creates a complex challenge for network operators. A lack of advanced infrastructure and protocols vital to the IIoT foundation results in frequently congested networks."

As real-time data pipelines multiply and reliably delivering that data to systems and applications becomes ever more important, the desire to transform or react to those streams will only grow. Sham concludes, "Apache Kafka's interoperability allows it to not only work with various technologies but also assist in swiftly ingesting incoming traffic, so data capture workflows take place and relieve network traffic."

Any permitted network connection carries an element of risk. It therefore follows that the first step in managing this risk is to grant access only to those who have a defined requirement, and then to ensure that their level of access is adequate for that requirement, and no more.

L is for Least Privilege. Privileged accounts are a necessary evil in all aspects of network environments, from infrastructure through operating systems to applications. Brian Chappell, Director of Technical Services at BeyondTrust, points out that, "At every point, we are battling the need for privileged access against securing the environment against malicious acts. It's difficult and complex. Least Privilege is a best practice first posited by Jerome Saltzer, one of the architects of the Multics operating system, in 1973/4. It's the idea that each user and process in the environment will have the least privilege necessary to be productive."
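In code, least privilege usually shows up as deny-by-default authorisation: nothing is permitted unless a role has been explicitly granted it. A minimal sketch, with entirely hypothetical role and permission names (`backup-operator`, `db:read` and so on are inventions for illustration, not any product's API):

```python
# Hypothetical role/permission model: deny by default, and each role
# is granted only the permissions its job actually requires.
ROLE_GRANTS = {
    "backup-operator": {"db:read", "storage:write"},
    "app-service":     {"db:read", "db:write"},
}

def is_allowed(role, permission):
    # Anything not explicitly granted is refused, including unknown roles.
    return permission in ROLE_GRANTS.get(role, set())

print(is_allowed("backup-operator", "db:read"))    # → True
print(is_allowed("backup-operator", "db:write"))   # → False: not needed for backups
print(is_allowed("intern", "db:read"))             # → False: no grants at all
```

The design choice worth noting is the `.get(role, set())` fallback: an unrecognised role receives the empty permission set rather than an error path that might accidentally permit access.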

WWW.NETWORKCOMPUTING.CO.UK @NCMagAndAwards

JANUARY/FEBRUARY 2017 NETWORKcomputing 13
