Probabilistic Approach to Planning Collision Free Path of UAV

Dawid Cedrych, Adam Gałuszka, Marcin Pacholczyk, Krzysztof Skrzypczyk, and Aleksander Nawrat

Abstract. In this chapter we present simulation results of an algorithm designed for planning a collision-free path based on probabilistic search. The aim of this study is to design an algorithm for off-line planning of a set of way-points which form the collision-free route of an Unmanned Aerial Vehicle (UAV). The planning process is based on a graph representing the map that contains information about the altitude of the terrain over which the UAV is intended to perform its mission. The size of the graph increases polynomially with the resolution of the map and the number of way-points of the UAV. A probabilistic method of building a representative model of the terrain was applied in order to reduce the complexity of planning the collision-free path. The functioning and efficiency of the proposed approach were introduced in our previous work; here we use a real terrain map and introduce indicators that illustrate the properties of the algorithm.

1 Introduction

In this chapter it is assumed that the UAV is intended to perform autonomously the mission assigned by the operator. During the design of the UAV control system there are many serious and difficult problems to solve. It is enough to

Adam Gałuszka · Krzysztof Skrzypczyk · Marcin Pacholczyk · Dawid Cedrych · Aleksander Nawrat
Silesian University of Technology, Institute of Automatic Control, Akademicka 16, 44-101 Gliwice, Poland
e-mail: {adam.galuszka,krzysztof.skrzypczyk,marcin.pacholczyk,anawrat}@polsl.pl

Aleksander Nawrat
Ośrodek Badawczo-Rozwojowy Urządzeń Mechanicznych "OBRUM" sp. z o.o., ul. Toszecka 102, 44-117 Gliwice, Poland
e-mail: anawrat@obrum.gliwice.pl

A. Nawrat and Z. Kuś (Eds.): Vision Based Systems for UAV Applications, SCI 481, pp. 191–203. DOI: 10.1007/978-3-319-00369-6_12 © Springer International Publishing Switzerland 2013
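The scheme outlined in the abstract, i.e. sampling random way-points over an altitude map, connecting nearby samples whose connecting segments stay over safe terrain, and searching the resulting graph, can be illustrated with a minimal probabilistic-roadmap-style sketch. This is not the authors' implementation; all function names, parameters, and the toy terrain below are assumptions made for illustration only.

```python
import heapq
import math
import random

def plan_prm(altitude, flight_alt, start, goal, n_samples=200, radius=25.0, seed=1):
    """Sketch of probabilistic way-point planning over an altitude grid
    (hypothetical names/parameters): sample collision-free way-points,
    link nearby ones whose straight segment stays below the flight
    altitude, then run Dijkstra on the resulting graph."""
    rows, cols = len(altitude), len(altitude[0])
    rng = random.Random(seed)

    def free(p):
        # A point is safe if the terrain under it lies below flight altitude.
        return altitude[int(p[0])][int(p[1])] < flight_alt

    def segment_free(a, b, step=1.0):
        # Sample the straight segment between two way-points against terrain.
        n = max(2, int(math.dist(a, b) / step))
        return all(free((a[0] + (b[0] - a[0]) * t / n,
                         a[1] + (b[1] - a[1]) * t / n))
                   for t in range(n + 1))

    # Node set: start (index 0), goal (index 1), plus random safe samples.
    nodes = [start, goal]
    while len(nodes) < n_samples + 2:
        p = (rng.uniform(0, rows - 1), rng.uniform(0, cols - 1))
        if free(p):
            nodes.append(p)

    # Edges between nearby nodes whose connecting segment is safe.
    adj = {i: [] for i in range(len(nodes))}
    for i in range(len(nodes)):
        for j in range(i + 1, len(nodes)):
            d = math.dist(nodes[i], nodes[j])
            if d <= radius and segment_free(nodes[i], nodes[j]):
                adj[i].append((j, d))
                adj[j].append((i, d))

    # Dijkstra shortest path from start (0) to goal (1).
    dist, prev, pq = {0: 0.0}, {}, [(0.0, 0)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == 1:
            break
        if d > dist.get(u, math.inf):
            continue
        for v, w in adj[u]:
            if d + w < dist.get(v, math.inf):
                dist[v] = d + w
                prev[v] = u
                heapq.heappush(pq, (d + w, v))
    if 1 not in prev:
        return None  # no collision-free route found with these samples
    path, u = [1], 1
    while u != 0:
        u = prev[u]
        path.append(u)
    return [nodes[i] for i in reversed(path)]

# Toy 60x60 terrain: flat, with a high ridge across the middle broken by a gap.
terrain = [[0.0] * 60 for _ in range(60)]
for c in range(60):
    if not 25 <= c <= 35:
        terrain[30][c] = 500.0
route = plan_prm(terrain, flight_alt=100.0, start=(5.0, 5.0), goal=(55.0, 55.0))
```

Because the samples are random, the returned route varies with the seed and may occasionally be missed entirely; increasing `n_samples` trades planning time for a higher chance of finding a path, which mirrors the complexity-versus-coverage trade-off discussed in the abstract.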