
9.1 Introduction

that actual implementations of the idea of middleware became widespread, such as Remote Method Invocation (RMI), which depends on Java as its coding language, or the Common Object Request Broker Architecture (CORBA). These middleware advances suited the capabilities and expectations of computing devices in the early 2000s well, a time when computers had turned from single huge machines shared by several people into devices closer to a one-computer-per-person usage ratio.
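To make concrete what this kind of middleware abstracted away, the following is a minimal, illustrative Java RMI sketch; the TemperatureService interface, the readCelsius() method, the registry name "temp" and port 1099 are assumptions chosen for the example, not details from the source. The client invokes the remote object as if it were local, while the RMI runtime handles stub generation, marshalling and network transport.

    import java.rmi.Remote;
    import java.rmi.RemoteException;
    import java.rmi.registry.LocateRegistry;
    import java.rmi.registry.Registry;
    import java.rmi.server.UnicastRemoteObject;

    // Hypothetical remote interface: the only contract the client sees.
    interface TemperatureService extends Remote {
        double readCelsius() throws RemoteException;
    }

    // Server side: implements the interface and binds a stub in the registry.
    class TemperatureServer implements TemperatureService {
        public double readCelsius() {
            return 21.5; // stand-in for a real sensor reading
        }

        public static void main(String[] args) throws Exception {
            TemperatureServer impl = new TemperatureServer();
            // Export the object; the RMI runtime now marshals incoming calls.
            TemperatureService stub =
                    (TemperatureService) UnicastRemoteObject.exportObject(impl, 0);
            Registry registry = LocateRegistry.createRegistry(1099);
            registry.rebind("temp", stub);
            System.out.println("TemperatureService bound");
        }
    }

    // Client side: a remote invocation that reads like a local method call.
    public class TemperatureClient {
        public static void main(String[] args) throws Exception {
            Registry registry = LocateRegistry.getRegistry("localhost", 1099);
            TemperatureService svc = (TemperatureService) registry.lookup("temp");
            System.out.println("Remote reading: " + svc.readCelsius());
        }
    }

The appeal of this model in the desktop era was precisely its transparency: the client code contains no sockets or serialization logic, only an ordinary method call.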

Nevertheless, new developments in information technology were underway. Mark Weiser was the first person to define the term ubiquitous computing, back in 1988, and according to his perspective, tiny electronic devices would be incorporated into the most common objects of a regular daily routine, such as machinery, furniture or wearable garments. These tiny devices would compose a smart grid so pervasive (or, to use another word with similar connotations, ubiquitous) that the whole grid would merge with the area it was deployed onto, used and interacted with unconsciously by human beings. In Mark Weiser's own words, "Ubiquitous computing names the third wave in computing, just now beginning. First were mainframes, each shared by lots of people. Now we are in the personal computing era, person and machine staring uneasily at each other across the desktop. Next comes ubiquitous computing, or the age of calm technology, when technology recedes into the background of our lives" [2, 3]. Another definition from Weiser, quoted by Judy York and Parag C. Pendharkar, deems ubiquitous computing "machines that fit the human environment instead of forcing humans to enter theirs" [4]. What was also being foretold here was a shift from a computer-centric model, where one computer would be used by at least one person, to a user-centric one, with several computers surrounding a single person and providing them with information, services and applications. Thus, the foundations of the Internet of Things (IoT) were established.

This new scenario does not negate the need for a middleware architecture. If anything, it has become even more pressing than before, as the devices and protocols dependent on and linked with the Internet of Things (mobile phones, motes, 6LoWPAN, etc.) raise a plethora of new issues that have hardly ever been tackled in a combined fashion before. It has to be borne in mind, though, that the middleware solutions that once proved useful are clumsy and obsolete in the Internet of Things, mostly because they were conceived and designed at a
