Object Tracking in a Picture during Rapid Camera Movements
Z. Kuś and A. Nawrat

References

[3] Irani, M., Peleg, S.: Improving resolution by image registration. CVGIP: Graphical Models and Image Processing 53, 231–239 (1991)
[4] Chesnaud, C., Réfrégier, P., Boulet, V.: Statistical region snake-based segmentation adapted to different physical noise models. IEEE Trans. Pattern Anal. Mach. Intell. 21, 1145–1157 (1999)
[5] Gordon, N., Ristic, B., Arulampalam, S.: Beyond the Kalman Filter: Particle Filters for Tracking Applications. Artech House, Boston (2004)
[6] Sharp, C., Shakernia, O., Sastry, S.: A Vision System for Landing an Unmanned Aerial Vehicle. In: Proceedings of the 2001 IEEE International Conference on Robotics and Automation, vol. 2, pp. 1720–1727. IEEE, Los Alamitos (2001)
[7] Casbeer, D., Li, S., Beard, R., Mehra, R., McLain, T.: Forest Fire Monitoring with Multiple Small UAVs. Portland, OR (April 2005)
[8] Kuś, Z., Fraś, S.: Helicopter control algorithms from the set orientation to the set geographical location. In: Nawrat, A., Simek, K., Świerniak, A. (eds.) Advanced Technologies for Intelligent Systems of National Border Security. SCI, vol. 440, pp. 3–14. Springer, Heidelberg (2013)
[9] Nawrat, A.: Modelowanie i sterowanie bezzałogowych obiektów latających [Modelling and Control of Unmanned Flying Objects]. Wydawnictwo Politechniki Śląskiej, Gliwice (2009)
[10] Valavanis, K.P. (ed.): Advances in Unmanned Aerial Vehicles. Springer (2007)
[11] Castillo, P., Lozano, R., Dzul, A.E.: Modelling and Control of Mini-Flying Machines. Springer (2005)
[12] Padfield, G.D.: Helicopter Flight Dynamics. Blackwell Science Ltd. (1996)
[13] Manerowski, J.: Identyfikacja modeli dynamiki ruchu sterowanych obiektów latających [Identification of Motion Dynamics Models of Controlled Flying Objects]. WN ASKON, Warszawa (1999)
[14] Kearney, J.K., Thompson, W.B.: Optical flow estimation: An error analysis of gradient-based methods with local optimization. IEEE Transactions on Pattern Analysis and Machine Intelligence 9, 229–243 (1987)
[15] Anandan, P.: A computational framework and an algorithm for the measurement of visual motion. International Journal of Computer Vision 2, 283–310 (1989)
[16] Barman, H., Haglund, L., Knutsson, H., Granlund, G.: Estimation of velocity, acceleration, and disparity in time sequences. In: Proc. IEEE Workshop on Visual Motion, Princeton, NJ, pp. 44–51 (1991)
[17] Bascle, B., Bouthemy, P., Deriche, R., Meyer, F.: Tracking complex primitives in an image sequence. In: Proc. 12th International Conference on Pattern Recognition, Jerusalem, pp. 426–431 (1994)
[18] Burt, P.J., Yen, C., Xu, X.: Local correlation measures for motion analysis: A comparative study. In: Pattern Recognition and Image Processing Conference, Las Vegas, pp. 269–274 (1982)
[19] Burt, P.J., Bergen, J.R., Hingorani, R., Kolczynski, R., Lee, W.A., Leung, A., Lubin, J., Shvayster, H.: Object tracking with a moving camera. In: Proc. IEEE Workshop on Visual Motion, Irvine, pp. 2–12 (1989)
[20] Buxton, B.F., Buxton, H.: Computation of optical flow from the motion of edge features in image sequences. Image and Vision Computing 2(2), 59–75 (1984)
[21] Campani, M., Verri, A.: Computing optical flow from an overconstrained system of linear algebraic equations. In: Proc. 3rd International Conference on Computer Vision, Osaka, pp. 22–26 (1990)
[22] Campani, M., Verri, A.: Motion analysis from first-order properties of optical flow. CVGIP: Image Understanding 56(1), 90–107 (1992)
[23] Carlsson, S.: Information in the geometric structure of retinal flow field. In: Proc. 2nd International Conference on Computer Vision, pp. 629–633 (1988)
[24] Gessing, R.: Control Fundamentals. Silesian University of Technology, Gliwice (2004)
[25] Babiarz, A., Jaskot, K., Koralewicz, P.: The control system for autonomous mobile platform. In: Nawrat, A., Simek, K., Świerniak, A. (eds.) Advanced Technologies for Intelligent Systems of National Border Security. SCI, vol. 440, pp. 15–28. Springer, Heidelberg (2013)
[26] Babiarz, A., Jaskot, K.: The concept of collision-free path planning of UAV objects. In: Nawrat, A., Simek, K., Świerniak, A. (eds.) Advanced Technologies for Intelligent Systems of National Border Security. SCI, vol. 440, pp. 81–94. Springer, Heidelberg (2013)
[27] Jaskot, K., Babiarz, A.: The inertial measurement unit for detection of position. Przegląd Elektrotechniczny 86, 323–333 (2010)
[28] Kostrzewa, D., Josiński, H.: Verification of the search space exploration strategy based on the solutions of the join ordering problem. In: Czachórski, T., Kozielski, S., Stańczyk, U. (eds.) Man-Machine Interactions 2. AISC, vol. 103, pp. 447–455. Springer, Heidelberg (2011)
[29] Demski, P., Mikulski, M., Koteras, R.: Characterization of Hokuyo UTM-30LX laser range finder for an autonomous mobile robot. In: Nawrat, A., Simek, K., Świerniak, A. (eds.) Advanced Technologies for Intelligent Systems of National Border Security. SCI, vol. 440, pp. 143–153. Springer, Heidelberg (2013)
[30] Skorkowski, A., Topór-Kamiński, T.: Analysis of EGNOS-augmented receiver positioning accuracy. Acta Physica Polonica A 122(5), 821–824 (2012)
[31] Daniec, K., Jędrasiak, K., Koteras, R., Nawrat, A.: Embedded micro inertial navigation system. Applied Mechanics and Materials 249–250, 1234–1246 (2013)
[32] Iwaneczko, P., Jędrasiak, K., Daniec, K., Nawrat, A.: A prototype of unmanned aerial vehicle for image acquisition. In: Bolc, L., Tadeusiewicz, R., Chmielewski, L.J., Wojciechowski, K. (eds.) ICCVG 2012. LNCS, vol. 7594, pp. 87–94. Springer, Heidelberg (2012)
[33] Jędrasiak, K., Nawrat, A., Wydmańska, K.: SETh-link the distributed management system for unmanned mobile vehicles. In: Nawrat, A., Simek, K., Świerniak, A. (eds.) Advanced Technologies for Intelligent Systems of National Border Security. SCI, vol. 440, pp. 247–256. Springer, Heidelberg (2013)