Making use of shotgun-type microphones has improved the directionality of our initial prototype. The use of laser microphones might increase range significantly. With network capabilities we could create a worn antenna array capable of sound localization using time-of-arrival. One can imagine a type of wearable simultaneous localization and mapping (SLAM) system: a fusion of antenna-array sound localization and laser ranging and detection (LADAR). Such a system might use a Bayesian network to estimate object location from the data provided by both the audio and optical sensing systems.

Figure 2. An Aural Antenna module incorporating a lithium-ion polymer battery, a 20 MHz, 8-bit microcontroller, and a vibrotactile motor.

SMA = (s_t + s_{t-1} + ... + s_{t-(k-1)}) / k    (1)

δ = |s_t − SMA|    (2)

If δ is greater than 2^10/10 (10% of the dynamic range of the analog-to-digital converter), then the vibrator is activated at 100% duty cycle until the next sample is processed. This moving average works as an extremely rudimentary adaptive background-noise filter. The vibrating motor is controlled by a MOSFET whose gate is tied to a digital output pin of the ATtiny microcontroller.

Our initial experiments with Haptic Antennae indicated that blindfolded participants readily interpreted the vibrotactile stimulus and associated it with approaching objects. We expect that similar phenomena will be observed in forthcoming experiments with the aural antennae. The device exploits our innate ability to process, in parallel, haptic stimuli applied to the skin or to the vellus hair that covers most areas of our bodies.
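The trigger logic of Equations (1) and (2) can be sketched in C as follows. The window size K and the 10-bit converter width are assumptions (the text gives neither, only that the threshold is 10% of the ADC's dynamic range); on the device itself the return value would set the MOSFET gate pin rather than be returned from a function.

```c
#include <stdlib.h>  /* abs() */

#define K 8                     /* moving-average window length (assumed) */
#define THRESHOLD (1024 / 10)   /* 2^10/10: 10% of an assumed 10-bit ADC range */

static int history[K];          /* circular buffer of recent samples */
static int idx = 0, count = 0;

/* Process one ADC sample s_t. Returns 1 if the vibrator should run at
 * 100% duty cycle until the next sample is processed, 0 otherwise. */
int process_sample(int s)
{
    history[idx] = s;
    idx = (idx + 1) % K;
    if (count < K)
        count++;

    /* Equation (1): simple moving average over the last k samples */
    long sum = 0;
    for (int i = 0; i < count; i++)
        sum += history[i];
    int sma = (int)(sum / count);

    /* Equation (2): deviation of the current sample from the average */
    int delta = abs(s - sma);
    return delta > THRESHOLD;
}
```

Because the average adapts to whatever level the buffer settles at, a steady background signal never fires the trigger; only a sample that departs from the recent average by more than the threshold does.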
Other recent work on electronic travel aids [16], as well as the use of vibrotactile cuing in virtual environments [12], makes use of this phenomenon. Experiments have also documented that strong haptic stimuli can induce a startle reflex [25], which may be useful in emergency situations.

EXTENSIONS
While independent modules may be worn simultaneously, networking them together would greatly enhance the augmentations the devices provide. We are in the process of evaluating low-power wireless chips such as ZigBee for incorporation into the modules. We anticipate that wirelessly linked antennae would be able to work together to produce "rabbit" perceptual illusions of motion between the actuators.

Another extension of this work is in the area of actuation. The "pancake"-style vibration motor we are using (KOTL C1030B028F) has the advantage of being compact, but presents substantial initial friction, which limits its responsiveness. Other researchers have reported on the use of air puffs and acoustic cues to elicit startles [23]. Still others have thoroughly investigated the use of electrical stimulation to provide haptic cues [10].

AS OTHER SPECIES HEAR
We have developed an example of aural antennae which provide haptic feedback. Thinking about haptic devices is often constrained by our experience of our existing senses. We have instead sought to break with this convention by emulating insect perception. Thinking more openly, we can imagine a myriad of new biomimetic ways of seeing the world. Compound eyes and ocelli suggest worn garments bearing thousands of cameras. Mimicking insects' ability to acutely detect subtle vibrations [19], and acting on this information, could extend touch in the manner that optics have extended sight.

ACKNOWLEDGMENTS
The authors would like to thank Tomohiko Hayakawa, Kenichiro Otani, and Alexis Zerroug for their work on early prototypes.

REFERENCES
1. P. Bach-Y-Rita, C. C. Collins, F. A. Saunders, B. White, and L. Scadden.
Vision substitution by tactile image projection. Nature, 221(5184):963–964, March 1969.
2. J. M. Camhi and E. N. Johnson. High-frequency steering maneuvers mediated by tactile cues: antennal wall-following in the cockroach. J Exp Biol, 202(Pt 5):631–643, March 1999.
3. A. Cassinelli, C. Reynolds, and M. Ishikawa. Augmenting spatial awareness with haptic radar. In Wearable Computers, 2006 10th IEEE International Symposium on, pages 61–64, 2006.
4. A. Cassinelli, C. Reynolds, and M. Ishikawa. Haptic radar. In SIGGRAPH '06: ACM SIGGRAPH 2006 Sketches, New York, NY, USA, 2006. ACM.
5. R. Cytowic. Synesthesia: Phenomenology and neuropsychology. Psyche, 2(10):2–10, 1995.
6. R. C. Fitzpatrick and B. L. Day. Probing the human vestibular system with galvanic stimulation. J Appl Physiol, 96(6):2301–2316, June 2004.
7. R. H. Gault. Recent developments in vibro-tactile research. Journal of the Franklin Institute, 221(6):703–719, June 1936.
8. F. A. Geldard and C. E. Sherrick. The cutaneous "rabbit": a perceptual illusion. Science, 178(57):178–179, October 1972.
9. Japan's greatest mysteries: gaffer tape, April 14th 1996. http://www.glumbert.com/media/cattape.
10. H. Kajimoto, N. Kawakami, T. Maeda, and S. Tachi. Tactile feeling display using functional electrical stimulation. In Proceedings of the 9th International Conference on Artificial Reality and Telexistence, 1999.
11. C. Kayser, C. I. Petkov, M. Augath, and N. K. Logothetis. Integration of touch and sound in auditory cortex. Neuron, 48(2):373–384, October 2005.
12. R. W. Lindeman, J. L. Sibert, E. Mendez-Mendez, S. Patil, and D. Phifer. Effectiveness of directional vibrotactile cuing on a building-clearing task. In CHI '05: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 271–280, New York, NY, USA, 2005. ACM Press.
13. T. Maeda, H. Ando, T. Amemiya, N. Nagaya, M. Sugimoto, and M. Inami. Shaking the world: galvanic vestibular stimulation as a novel sensation interface. In SIGGRAPH '05: ACM SIGGRAPH 2005 Emerging Technologies, New York, NY, USA, 2005. ACM.
14. S. J. Norton, M. C. Schultz, C. M. Reed, L. D. Braida, N. I. Durlach, W. M. Rabinowitz, and C. Chomsky. Analytic study of the Tadoma method: Background and preliminary results. J Speech Hear Res, 20(3):574–595, September 1977.
15. S. Perrin, A. Cassinelli, and M. Ishikawa. Laser-based finger tracking system suitable for MOEMS integration. In Proceedings of Image and Vision Computing New Zealand (IVCNZ), pages 131–136, 2003.
16. S. Ram and J. Sharf. The people sensor: A mobility aid for the visually impaired. In International Symposium on Wearable Computers (ISWC), 1998.
17. C. M. Reed, N. I.
Durlach, and L. A. Delhorne. Historical overview of tactile aid research. In Proceedings of the Second International Conference on Tactile Aids, Hearing Aids and Cochlear Implants, 1992.
18. C. Reynolds, A. Cassinelli, and M. Ishikawa. Meta-perception: reflexes and bodies as part of the interface. In CHI '08: CHI '08 Extended Abstracts on Human Factors in Computing Systems, pages 3669–3674, New York, NY, USA, 2008. ACM.
19. D. Robert and M. C. Göpfert. Novel schemes for hearing and orientation in insects. Current Opinion in Neurobiology, 12(6):715–720, December 2002.
20. F. Saunders, W. Hill, and B. Franklin. A wearable tactile sensory aid for profoundly deaf children. Journal of Medical Systems, 5(4):265–270, December 1981.
21. G. Stetten, R. Klatzky, B. Nichol, J. Galeotti, K. Rockot, K. Zawrotny, D. Weiser, N. Sendgikoski, and S. Horvath. Fingersight: Fingertip visual haptic sensing and control. In Haptic, Audio and Visual Environments and Games, 2007. HAVE 2007. IEEE International Workshop on, pages 80–83, 2007.
22. H. Tan and A. Pentland. Tactual displays for wearable computing. Personal and Ubiquitous Computing, 1(4):225–230, December 1997.
23. B. K. Taylor, R. Casto, and M. P. Printz. Dissociation of tactile and acoustic components in air puff startle. Physiology & Behavior, 49(3):527–532, March 1991.
24. B. S. Wilson, C. C. Finley, D. T. Lawson, R. D. Wolford, D. K. Eddington, and W. M. Rabinowitz. Better speech recognition with cochlear implants. Nature, 352(6332):236–238, July 1991.
25. J. S. Yeomans, L. Li, B. W. Scott, and P. W. Frankland. Tactile, acoustic and vestibular systems sum to elicit the startle reflex. Neuroscience & Biobehavioral Reviews, 26(1):1–11, January 2002.