4. Method: Dealing with water using image analysis
4.1. Theory
Photogrammetry is a 'line-of-sight' survey method, meaning that it can only be used to derive an elevation for those objects or landforms visible on the image. In the same way as shadows or vegetation can hide the true topography (e.g. Lane, 1994), turbid water masks the submerged bed. For these areas, image analysis is used to derive an empirical relationship between water colour and water depth from which depths can be estimated, a method previously used successfully by Winterbottom and Gilvear (1997) and Gilvear et al. (1998). This study takes this methodology one step further by combining the estimated water depth data with a modelled water surface to give fully three-dimensional topography from inundated areas.
Bathymetric mapping of lakes and oceans from aerial photographs is well established. Rivers, however, are optically more complex. In particular, the colour of river water is modified by three factors not usually of concern in lakes or oceans: (i) particulate matter that rivers maintain in suspension (but which tends to settle out in lakes and oceans); (ii) the visibility of the banks and bed, a consequence of the comparatively shallow nature of rivers; and (iii) areas of white water, which change the optical properties of rivers through the spectrally non-selective scattering caused by air bubbles entrained in rapids and riffles (Davies-Colley et al., 1993).
On aerial photographs of turbid river water, colour can be thought of as determined by the relative proportions of light reflected from the submerged bed and from within the water column itself. The range of colours is bounded by zero water depth at one extreme (where all light is reflected from the bed) and, at the other, the depth beyond which all light is reflected from within the water column (the maximum depth that can be predicted). Between these two end points the relationship between water colour and water depth may be determined empirically, with water depth varying approximately linearly with the natural logarithm of water colour (Lyzenga, 1981).
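In its simplest form the resulting relationship can be written as a linear function of log-transformed brightness (a generic form following Lyzenga, 1981; the coefficients are fitted empirically and are not those reported later in this section):

$$ d = a_0 + \sum_i a_i \ln(X_i) $$

where $d$ is water depth, $X_i$ is the brightness of image band $i$ and $a_0$, $a_i$ are regression coefficients.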
4.2. Method
Colour on digital aerial photography is determined by the relative importance of the red, green and blue (RGB) bands in each pixel. The RGB values of wetted channels were obtained by classifying the imagery into 'wet', 'dry' and 'vegetated' areas (Figure 5) and then taking the (x,y) position and RGB values of those pixels that fell within a wetted channel. The RGB values were then transformed using a natural logarithm to linearise the relationship between water colour and water depth (Lyzenga, 1981). The transformed RGB values were used as the independent variables in a multivariate analysis relating water colour to water depth.
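The workflow described above can be summarised in a short script. This is a minimal sketch only: the array names, the use of scikit-learn and the +1 offset inside the logarithm are illustrative assumptions, not part of the original analysis.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def fit_colour_depth_model(rgb, depth):
    """Fit a multivariate linear model of water depth against ln(RGB).

    rgb   : (n, 3) array of R, G, B values sampled from 'wet' pixels
    depth : (n,) array of surveyed water depths (m) at the same points
    """
    # Natural-log transform linearises the colour-depth relationship
    # (Lyzenga, 1981); the +1 offset simply avoids ln(0) for dark pixels.
    log_rgb = np.log(rgb.astype(float) + 1.0)
    return LinearRegression().fit(log_rgb, depth)

def predict_depth(model, rgb):
    """Estimate water depth (m) from the RGB values of wet pixels."""
    return model.predict(np.log(rgb.astype(float) + 1.0))
```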
To calibrate and validate the models, water depth data sets were collected concurrently with the photography by National Institute of Water and Atmospheric Research (NIWA) and Environment Canterbury (EC) field teams (Hicks et al., 1999). The ground survey of wetted channels involved both total station and roving GPS survey, and provided information on water depth for a large proportion of the study reach. The total station survey was continuous, allowing around 1200 points to be collected each day. The GPS survey used a Trimble RTK system, which gave real-time depths to an accuracy of a few centimetres; this was mounted on a quad bike for water depths of up to about 0.6 m, and hand-held for water depths of up to 1.0 m. In deeper, navigable channels, initially a jet boat and later a wooden kayak fitted with GPS and an echo-sounder provided a large number of water depth measurements. The data for each epoch were divided into two equal sets to act as calibration and independent check data-sets respectively.
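The two-way split into calibration and check data can be reproduced with a simple partition; a minimal sketch, in which the array names and the random (rather than systematic) split are assumptions:

```python
import numpy as np

def split_calibration_check(rgb, depth, seed=0):
    """Divide the surveyed points into two equal halves."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(depth))
    half = len(depth) // 2
    cal, chk = idx[:half], idx[half:]
    return (rgb[cal], depth[cal]), (rgb[chk], depth[chk])
```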
The aerial photographs of the study reach were geo-referenced using the photo-control points established for the photogrammetric survey. The imagery was rectified rather than ortho-rectified because the DEMs were generated from the high-resolution black-and-white scans, whereas colour imagery was required for the empirical colour-depth relationship. This was justified by the low vertical relief of the riverbed, which reduces the need to account for relief displacement. For example, given the scale and size of the aerial photographs used, a feature 1 m high would be displaced laterally by less than one pixel (1 m), even if it were on the extreme edge of the image.
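The sub-pixel displacement argument follows from the standard relief-displacement relation (generic symbols; the flying height and radial distance of the photography are not restated here):

$$ d = \frac{r\,h}{H} $$

where $d$ is the radial displacement of the image point, $r$ its radial distance from the nadir point, $h$ the feature's height above the datum and $H$ the flying height above the datum. Substituting the scale and flying height of the photography used here yields the sub-pixel displacement quoted above.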
Water depths were assigned to specific pixels using a simple automated matching routine. This took each surveyed water depth in turn, and calculated the average RGB values of all pixels that fell within a specified maximum search radius. The maximum search radius was set to 0.5 m (the object space pixel size) as this gives a search diameter equal to the best (x,y) accuracy that can be expected from the imagery.
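A minimal sketch of such a matching step, assuming the rectified photograph is held as a north-up georeferenced array; the function and variable names are illustrative, not the original routine:

```python
import numpy as np

PIXEL_SIZE = 0.5      # object-space pixel size (m), as quoted above
SEARCH_RADIUS = 0.5   # maximum search radius (m)

def mean_rgb_near(image, origin_xy, x, y):
    """Average RGB of all pixels whose centres lie within SEARCH_RADIUS
    of the surveyed point (x, y). `origin_xy` is the map coordinate of
    the image's top-left corner."""
    x0, y0 = origin_xy
    col = (x - x0) / PIXEL_SIZE                 # fractional column of the point
    row = (y0 - y) / PIXEL_SIZE                 # fractional row (north-up image)
    r = int(np.ceil(SEARCH_RADIUS / PIXEL_SIZE))
    rows = np.arange(int(row) - r, int(row) + r + 1)
    cols = np.arange(int(col) - r, int(col) + r + 1)
    cc, rr = np.meshgrid(cols, rows)
    # Distance from the surveyed point to each candidate pixel centre (m)
    dist = np.hypot((cc + 0.5 - col) * PIXEL_SIZE, (rr + 0.5 - row) * PIXEL_SIZE)
    keep = (dist <= SEARCH_RADIUS) & (rr >= 0) & (cc >= 0) \
           & (rr < image.shape[0]) & (cc < image.shape[1])
    return image[rr[keep], cc[keep], :3].mean(axis=0)
```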
4.3. Results
The quality of depth predictions was assessed using mean error (ME) and standard deviation of error (SDE) as compared with the independent 'check' data-set (Table 1).
Table 1: The results of the multivariate analyses between water colour and water depth for each epoch.
Epoch | 'Best' predictors | Number of check points | ME (cm) | SDE (cm) | R² (%) |
Feb 1999 | RB | 12996 | -1.5 | 19.9 | 48 |
March 1999 | RB | 1025 | -0.8 | 19.2 | 61 |
Feb 2000 | RG | 14303 | +0.2 | 16.8 | 57 |
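For reference, the two error statistics reported in Table 1 can be computed as follows (a minimal sketch; the array names are assumptions):

```python
import numpy as np

def error_stats(predicted, observed):
    """Mean error (bias) and standard deviation of error for the check data."""
    err = predicted - observed
    return err.mean(), err.std(ddof=1)
```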
Despite the relatively low R² values (around half of the variance in water depth is explained), the standard deviations of error (SDE) are quite encouraging given the range of water depths (0 to about 2 m). The variation in the 'best' predictors is thought to reflect changes in the turbidity of the water between epochs. Using the derived expressions, water depth maps could then be calculated (Figure 6).
Figure 6: Water depth map for the Waimakariri study reach, February 1999.
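Finally, a hedged sketch of how a fitted model can be applied to every 'wet' pixel to produce a depth map such as Figure 6; the classification mask, the model object and the log transform are carried over from the earlier sketches and are illustrative assumptions:

```python
import numpy as np

def depth_map(image, wet_mask, model):
    """Estimate depth for every wet pixel of the rectified photograph.

    image    : (rows, cols, 3) RGB array
    wet_mask : (rows, cols) boolean mask from the wet/dry/vegetated classification
    model    : fitted regression with a .predict() method (see earlier sketch)
    """
    depths = np.full(wet_mask.shape, np.nan)      # NaN outside wetted channels
    log_rgb = np.log(image[wet_mask].astype(float) + 1.0)
    depths[wet_mask] = model.predict(log_rgb)
    return depths
```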