
Handbook of Propagation Effects for Vehicular and Personal Mobile Satellite Systems

10.2 General Methodology

The method consists of the following steps: (1) images seen through a fisheye lens are photographed at potential user locations; (2) clear, shadowed, or blocked path states are extracted as a function of look angle; (3) path states are combined for single- or multiple-satellite scenarios with frequency-appropriate statistical fade models to predict fade distributions and diversity performance as a function of elevation angle. Measurements of optical brightness along the line-of-sight path have demonstrated a statistical link between fading and optical intensity [Vogel and Hong, 1988]. The images are acquired with a 35-mm camera with a fisheye lens having a full 90° field of view in elevation and 360° in azimuth. The camera may be mounted on a vehicle, hand carried, or fixed on a tripod. A compass is used to align the top of each frame towards north. The resulting slides are scanned into a personal computer and subsequently analyzed.
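Step (2) above can be sketched in code. The following Python fragment is a minimal illustration, not the handbook's method: it assumes a skyline given as one elevation value per degree of azimuth, and it uses a fixed elevation margin to separate "shadowed" from "clear" paths. That margin is a hypothetical simplification; the actual classification in this chapter is based on optical gray levels in the image, not on a geometric margin.

```python
import numpy as np

def path_state(skyline_deg, sat_azimuth_deg, sat_elevation_deg,
               shadow_margin_deg=5.0):
    """Classify the path toward a satellite as 'clear', 'shadowed', or
    'blocked' by comparing its elevation with the skyline elevation at
    the satellite's azimuth (azimuth relative to north at index 0).

    shadow_margin_deg is an illustrative assumption: paths clearing the
    skyline by less than this margin are treated as shadowed.
    """
    skyline_at_az = skyline_deg[int(round(sat_azimuth_deg)) % 360]
    if sat_elevation_deg > skyline_at_az + shadow_margin_deg:
        return "clear"
    elif sat_elevation_deg > skyline_at_az:
        return "shadowed"  # near-grazing path, e.g. through tree canopy
    else:
        return "blocked"

# Synthetic skyline: buildings up to 40 deg in the southern half of the
# sky, open horizon elsewhere.
skyline = np.zeros(360)
skyline[90:270] = 40.0
print(path_state(skyline, sat_azimuth_deg=180, sat_elevation_deg=60))  # clear
print(path_state(skyline, sat_azimuth_deg=180, sat_elevation_deg=20))  # blocked
```

Repeating this classification over all look angles of interest yields the per-location path-state data that step (3) combines with the statistical fade models.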

Figure 10-1 shows an example of a fisheye-lens image taken at an urban location in Austin, Texas, where the zenith is at the center [Akturan and Vogel, 1995]. The 24-bit color image is reduced to an 8-bit gray scale and unwrapped from a circular image (zenith at center) to a rectangular one with the zenith at the top. An image histogram is constructed from the gray-scale values of the 32,400 pixels (360° of azimuth times 90° of elevation, where each pixel is 1° by 1°). The second derivative of the image histogram, after filtering, is used to estimate the gray-level threshold separating sky and obstacles [Bovik, 1991]. A binary sky/no-sky image is then produced and used to calculate, for each azimuth, the elevation angle at which the sky becomes visible (i.e., the skyline). The skyline corresponding to the unwrapped image of Figure 10-1 is shown in Figure 10-2, with the abscissa representing the azimuth (relative to north at 0°) and the ordinate the corresponding elevation angle.

Figure 10-1: Example of a fisheye-lens image in Austin, Texas
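The final skyline-extraction step can be sketched as follows. This is a minimal illustration, not the handbook's implementation: it assumes the binary sky/no-sky image has already been thresholded and unwrapped to a 90-row-by-360-column array with the zenith at the top, and the `extract_skyline` function and the synthetic test scene are hypothetical.

```python
import numpy as np

def extract_skyline(sky_binary):
    """sky_binary: (90, 360) boolean array, True where a 1-by-1 degree
    pixel is sky; row 0 is the zenith, row 89 touches the horizon, and
    columns are azimuth degrees relative to north.

    Returns a length-360 array: for each azimuth, the elevation angle
    (deg) below which the sky is no longer visible, i.e. the skyline.
    """
    n_elev, n_az = sky_binary.shape
    skyline = np.full(n_az, float(n_elev))  # fully blocked column -> 90 deg
    for az in range(n_az):
        sky_rows = np.nonzero(sky_binary[:, az])[0]
        if sky_rows.size:
            # Elevation of the bottom edge of the lowest sky pixel.
            skyline[az] = n_elev - (sky_rows.max() + 1)
    return skyline

# Synthetic scene: open sky everywhere except a 30-degree-high building
# spanning azimuths 100..199 (it fills the lowest 30 elevation rows).
img = np.ones((90, 360), dtype=bool)
img[60:, 100:200] = False
sl = extract_skyline(img)
print(sl[0], sl[150])  # 0.0 over the open horizon, 30.0 at the building
```

Plotting such an array against azimuth gives a skyline profile of the same kind as Figure 10-2.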
