
    Dynamic monitoring of urban expansion based on an object-oriented approach

    Chunyang He; Jing Li; *JinShui Zhang; Yaozhong Pan; YunHao Chen

    Key Laboratory of Environmental Change and Natural Disaster,
    Ministry of Education of China, Beijing Normal University;
    College of Resources Science & Technology,
    Beijing Normal University, the People's Republic of China, 100875

    * Corresponding author: zhangjsh@ires.cn

    Abstract—In this paper, a new object-based change detection approach is developed. The approach consists of three steps: (1) producing multi-scale objects from multi-temporal remote sensing images by combining spectrum, texture and context information; (2) extracting potential change objects by comparing the attributes of shape, structure, texture, etc. of each object; (3) determining the changed objects and detecting the urban expansion area with the help of in-situ investigation. When the object-based approach was applied to urban expansion detection in Haidian District, Beijing, China with the support of two Landsat Thematic Mapper (TM) scenes from 1997 and 2004, satisfactory results were obtained. The overall accuracy is about 80.3% and Kappa about 0.607, which is more accurate than post-classification change detection. The newly developed object-based change detection approach has the advantage of reducing the error accumulation of single-date image classification and of being, to some extent, independent of radiometric correction.

    Keywords—change detection; object-oriented; similarity; remote sensing; texture; land use/cover

    I. INTRODUCTION

    Timely and accurate change detection of the earth's surface features is extremely important for understanding relationships and interactions between human and natural phenomena in order to promote better decision making [1]. Urban land is the most sensitive part of land use/cover change, and obtaining urban land use/cover change information is important for urban decision-making and sustainable development. Current change detection approaches with remotely sensed data can generally be grouped into two types: the spectral classification-based approach and the pixel-by-pixel radiometric comparison approach. The former has the obvious limitation of cumulative error in the image classification of each individual date, and the latter needs strict radiometric correction, which has become an obstacle to their wide application in urban expansion detection.

    Object-oriented change detection extracts the change information from two temporal remotely sensed datasets based on the object unit, using texture, structure, etc. There are some studies on object-oriented change detection. Ecological theory, in particular hierarchy theory, predicts that changes in landscape spatial pattern depend on the spatial and temporal scales at which they are assessed [2]. Volker Walter segments the image using a GIS database to obtain the change information [3]. Andrea S. Laliberte applied object-oriented image analysis to mapping shrub encroachment from 1937 to 2003 in southern New Mexico [4]. L. Bruzzone brought forward an adaptive parcel-based technique for unsupervised change detection. Ola Hall takes a multiscale object-specific approach to digital change detection. All of the above have had some success in change detection with the object-oriented approach [5], whereas such applications either need prior knowledge, which leads to error accumulation, or average the spectral information over the object unit, which destroys the spatial information.

    Given these shortcomings of traditional land use/cover change detection and of the object-oriented approach, both of which are influenced by the sensor and the weather, this paper makes full use of the stability of object texture between two temporal remotely sensed images. A new urban change detection approach that computes the object texture similarity between the two images at different times is developed to extract the urban/non-urban change information accurately. It makes up for the shortcomings of post-classification, which produces error accumulation, and of the present object-oriented approaches, which destroy the spatial information.

    II. OBJECT-ORIENTED CHANGE DETECTION

    An object, extracted from remotely sensed images, is a pixel collection that contains spectral and spatial properties. Let us consider two co-registered multispectral images, X1 and X2, acquired over the same area at two different times, t1 and t2. Objects A and B are extracted from X1 and X2, respectively. Let C be defined as the shared object of the two images, which satisfies the following condition:

        C = A ∩ B    (1)

    R_C^t1 = (f_1^t1; f_2^t1; …; f_v^t1) is the property collection of object A, where f_v^t1 is the v-th property of object A, such as spectrum, texture, etc. R_C^t2 = (f_1^t2; f_2^t2; …; f_v^t2) is the property collection of object B, where f_v^t2 is the v-th property of object B. S is the relationship between the property collections of the shared object C in the two images:

        S = S(R_C^t1, R_C^t2)    (2)

    S determines, according to some rules, whether the object C has changed or not.

    The research is supported by the National Hi-Tech Research and Development Program of China (No. 2003AA131080).
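    The shared-object condition C = A ∩ B can be illustrated in code. The sketch below is our own, not from the paper: it assumes each date's segmentation is given as an integer label map, and treats every maximal region whose pixels carry the same pair of segment ids at t1 and t2 as one shared object.

```python
import numpy as np

def shared_objects(labels_t1: np.ndarray, labels_t2: np.ndarray) -> np.ndarray:
    """Label map of the shared objects C = A ∩ B of two segmentations.

    Hypothetical illustration: each input is an integer label map (one
    segment id per pixel); pixels with the same (id_t1, id_t2) pair
    form one shared object.
    """
    assert labels_t1.shape == labels_t2.shape
    # Encode each (t1, t2) id pair as a single integer key.
    pairs = labels_t1.astype(np.int64) * (int(labels_t2.max()) + 1) + labels_t2
    # Renumber the keys compactly so shared objects get ids 0..K-1.
    _, inverse = np.unique(pairs, return_inverse=True)
    return inverse.reshape(labels_t1.shape)
```

    For example, two objects at t1 crossed with two objects at t2 can yield up to four shared objects, each of which is then tested for change.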

    III. FLOW OF OBJECT-ORIENTED URBAN CHANGE DETECTION

    The object-oriented urban change detection approach computes the similarity of the shared objects of the two co-registered images in order to determine, based on the multiscale object unit, whether each object has changed or not. The flow is illustrated in Fig. 1: data pre-processing; urban/non-urban object extraction; potential changed objects extraction; texture extraction with the optimum window size; objects' similarity computation; changed object extraction.

    Figure 1. Flow Chart of Object-Oriented Change Detection

    A. Extract the urban/non-urban objects

    Multiscale objects are extracted from the two images at different times according to spectrum, texture, structure and context. In this study we focus on urban and non-urban land types. Some land types such as water and vegetation are merged into non-urban, while the others are merged into urban. The urban/non-urban objects are extracted from the two images by an object-oriented supervised classification method.

    B. Extract the potential change objects

    There are two parts in urban change detection: urban changing into non-urban and non-urban changing into urban. From the first step, the two result images from which the urban/non-urban objects are extracted are combined by a transfer image operation. There are four change types: urban to urban, non-urban to non-urban, urban to non-urban, and non-urban to urban. The last two are defined as the potential changed types, while the former two, which are classified accurately, are considered unchanged. The potential changed objects are the emphasis of the urban object-oriented change detection.

    C. Extract the texture information with the optimum window size

    Texture, which is the pixel frequency of the images, is the synthesis of the objects' shape, size, shadow and hue, and reflects the local pixels' gray value and hue rules. The texture property of the same land type remains stable over time, which is why it is usually adopted in change analysis of images at different times. The texture property is related to the pixel window size, and texture is extracted with different pixel windows from images of different resolutions, so experiments are necessary to determine the optimum window size for extracting the texture information.

    D. Compute the potential change objects' similarity

    After the potential changed objects have been extracted and their texture computed (steps B and C above), the objects' similarities are computed from the potential objects' properties, including texture and structure. There are two modes for computing the objects' similarity: the distance method and the correlation method.

    (a) Distance method. Similarity is computed from the pixels' gray values, statistical values and property values. Generally, when the distance is large, the similarity is small. The absolute difference, average absolute difference and squared difference are usually used in the distance computation.

    (b) Correlation method. Similarity is computed from the angle between the two image vectors. The normalized multiplied correlation and the correlation coefficient are usually used. Let X_ij belong to object A and Y_ij belong to object B; then the similarity can be depicted as follows:

        S(A,B) = Σ_ij (X_ij · Y_ij) / sqrt(Σ_ij X_ij^2 · Σ_ij Y_ij^2)    (3)

    X_ij and Y_ij are the gray values of the two images, respectively, or some statistical value such as the mean, variance or peak, or a single pixel's change property value such as a texture value, grade value or blur property value.

    The objects' similarity is computed from spatial information, and it differs when different indices are used. Let us define the optimum spatial information indices as those that can separate the changed/unchanged objects adequately. The segregative degree of changed/unchanged objects, described as follows, is defined in order to separate the changed/unchanged objects:

        f(c,uc) = |Mean_c − Mean_uc| / (Var_c + Var_uc)    (4)

    f(c,uc) is the segregative degree of changed/unchanged objects. Mean_c and Mean_uc are the mean values of the changed and unchanged objects' similarity collections, respectively. Var_c and Var_uc are the variances of the changed and unchanged objects' similarities. The higher f(c,uc) is, the more different the changed and unchanged objects are, so the changed objects can be distinguished from the unchanged objects more effectively; conversely, a lower f(c,uc) cannot distinguish them effectively.
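    Formulas (3) and (4) translate directly into code. The following is a minimal sketch under our own naming (the paper prescribes only the formulas, not an implementation); the inputs are the flattened texture/gray values of a shared object at the two dates, and the similarity collections of known changed and unchanged sample objects.

```python
import numpy as np

def similarity(x: np.ndarray, y: np.ndarray) -> float:
    """Formula (3): S(A, B) = sum(X*Y) / sqrt(sum(X^2) * sum(Y^2))."""
    return float(np.sum(x * y) / np.sqrt(np.sum(x ** 2) * np.sum(y ** 2)))

def segregative_degree(s_changed, s_unchanged) -> float:
    """Formula (4): |Mean_c - Mean_uc| / (Var_c + Var_uc).

    Inputs are the similarity collections of known changed and unchanged
    sample objects; a higher value means better separability.
    """
    s_c = np.asarray(s_changed, dtype=float)
    s_uc = np.asarray(s_unchanged, dtype=float)
    return float(abs(s_c.mean() - s_uc.mean()) / (s_c.var() + s_uc.var()))
```

    Selecting the optimum texture window size, as done in Section IV.D, then amounts to evaluating segregative_degree over the candidate window sizes and keeping the maximizer.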

    E. Set the threshold to extract the changed objects

    The difference of texture between urban and non-urban is large, so the texture properties change greatly during the process of urban to non-urban or non-urban to urban. The extracted changed objects satisfy the following:

        O = 1 (S > T);  O = 0 (S < T)    (5)

    S is the similarity degree. T is the threshold, which is the boundary between changed and unchanged objects. If S is greater than T, the object has changed.

    IV. CHANGE DETECTION EXPERIMENTS IN HAIDIAN DISTRICT

    The study area is located to the west and northwest of Beijing. It is the center of scientific research of the capital and an important base of vegetable production, and urbanization there is intense. Obtaining urban change information is important for urban planning decision-making.

    In this study, cloud-free Landsat TM (path/row 123/32) images acquired on 1997-05-16 and 2004-05-19 are used. The pixel size is 30 m × 30 m. TM3, TM4 and TM5 contain the most abundant vegetation information, so these three bands are selected to extract the urban change information. After image pre-processing, extracting the urban/non-urban objects, determining the potential change objects, extracting the texture information with the optimum window size, computing the potential objects' similarity and setting the threshold to extract the changed objects, the urban/non-urban changed objects of Haidian District from 1997 to 2004 are obtained and compared with post-classification.

    A. Data pre-processing

    The data pre-processing includes geometric correction and study area extraction. Firstly, geometric correction using a second-order polynomial and a bilinear interpolation algorithm is applied to the 1997 image based on the 1999 image. Then the 2004 image is corrected based on the 1997 image. After examination, the RMS error is less than one pixel. Finally, the study area is extracted from the corrected images.

    B. Extract the urban and non-urban objects

    In this study, eCognition 3.0, the first object-oriented software, is adopted to extract the multiscale objects from the two images. The spectral parameter is set to 0.8 and the shape parameter to 0.2; the latter includes smoothness and density parameters, set to 0.9 and 0.1 respectively. A homogeneous object contains at least 20 pixels, which avoids producing minute objects and satisfies the basic unit of urban/non-urban change. Because the transformation between urban and non-urban is the emphasis, built-up land, streets, etc. are grouped into urban, while forest, grass, rice fields and gardens are grouped into non-urban.

    C. Extract the potential changed objects

    Overlaying the two maps by map algebra produces four land types: urban, urban to non-urban, non-urban to urban, and non-urban. The non-urban to urban and urban to non-urban types are regarded as the potential changed objects.

    D. Extract the texture information with the optimum window size

    In this study, the objects' similarities are computed using the texture property. At present, the gray co-occurrence matrix brought forward by Haralick is widely used. The variance gray co-occurrence matrices are extracted with ENVI 4.0. Twenty typical changed and unchanged objects are selected to compute their similarities in order to obtain the optimum window size, under which changed objects can be distinguished from unchanged objects effectively.

    The mean and variance values of the twenty changed and unchanged objects' similarities are computed according to formula (3), and the segregative degrees of the changed and unchanged objects under different window sizes are computed with formula (4). The segregative degree under the 3×3 window size is the highest, about 1.3360. We can draw the conclusion that the optimum window size for computing the objects' similarity from TM images is 3×3.

    E. Combine the objects to extract the changed information

    During the object-oriented change detection, the object shared by the two images is defined as the minimum unit. As illustrated in Fig. 2, the green parcels are the common parts of the red polygons (1997) and the blue polygons (2004), which are the shared objects of the two images.

    Figure 2. The Shared Objects of the Two Images

    The crossing point of the changed/unchanged similarity curves is set as the threshold, which is about 0.1157 [6]. The urban/non-urban objects' changed information is then extracted (Fig. 3).
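    The decision rule of formula (5) is a simple threshold test over the per-object similarities. A minimal sketch (the function name and array representation are our assumptions), with the threshold defaulting to the 0.1157 found for the Haidian experiment:

```python
import numpy as np

def changed_mask(similarities: np.ndarray, threshold: float = 0.1157) -> np.ndarray:
    """Formula (5): O = 1 if S > T (changed), else O = 0 (unchanged).

    `similarities` holds one similarity value S per shared object;
    the default threshold is the crossing-point value from the paper.
    """
    return (similarities > threshold).astype(np.uint8)
```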

    Figure 3. Result of the Object-Oriented Change Detection

    V. ACCURACY ASSESSMENT

    300 check points are selected by equalized random sampling in order to evaluate the object-oriented urban change detection. The results are illustrated in Table I and Table II. The accuracy of the object-oriented approach, with an overall accuracy of about 80.3% and Kappa of about 0.607, is higher than that of post-classification, which shows that object-oriented change detection can obtain the change information effectively.

    TABLE I. ACCURACY OF THE OBJECT-ORIENTED CHANGE DETECTION

                          Validated data
                  Changed   Unchanged   Total   Production accuracy   Loss error
    Changed         135        15        150          90.0%             10.0%
    Unchanged        44       106        150          70.7%             29.3%
    Total           179       121        300
    Production     75.4%     87.6%
    accuracy
    Loss error     24.6%     12.4%
    Overall accuracy = 80.3%    Kappa = 0.607

    TABLE II. ACCURACY OF THE POST-CLASSIFICATION CHANGE DETECTION

                          Validated data
                  Changed   Unchanged   Total   Production accuracy   Loss error
    Changed         128        22        150          85.3%             14.7%
    Unchanged        56        94        150          62.7%             37.3%
    Total           184       116        300
    Production     69.6%     81.0%
    accuracy
    Loss error     30.4%     19.0%
    Overall accuracy = 74.0%    Kappa = 0.480

    VI. CONCLUSIONS AND DISCUSSIONS

    In this study, urban land use change information is extracted with texture similarity using the object-oriented approach. The results show that the change detection approach can extract the urban/non-urban change information effectively.

    Firstly, the urban/non-urban change information is extracted by means of texture similarity, which avoids strict radiometric correction. Because the texture property of the same land type does not change over time and has strong stability, it can be applied to change analysis of multitemporal images. The method can also be applied to change information extraction from images acquired by different sensors.

    Secondly, the accuracy of the object-oriented approach, whose overall accuracy is about 80.3% and Kappa about 0.607, is higher than that of post-classification, whose overall accuracy is about 74.0% and Kappa about 0.480. The object-oriented change detection approach integrates multiple data sources, including spectrum, texture and shape, to extract the urban/non-urban objects, and is therefore more accurate than post-classification, which extracts the urban/non-urban information from spectral information only. Moreover, the potential changed objects' similarity, which further determines whether a potential changed object has changed or not, can reduce error accumulation to some degree, while error accumulation is the key factor that reduces the accuracy of post-classification change detection.

    Finally, the accuracy of the changed pixels' extraction reaches 70.7%, which shows that the objects' similarity can separate changed/unchanged objects effectively with the object as the basic unit.

    There are some aspects for further study: (1) applying multiple data sources, including texture, structure and shape, to compute the objects' similarity so as to separate the changed/unchanged objects more effectively; (2) looking for an effective method to determine the similarity threshold.

    REFERENCES

    [1] IIASA. Modeling Land-use and Land-Cover Change in Europe and Northern Asia. 1999 Research Plan, 1998.
    [2] Allen, T.F.H., Starr, T.B., 1982. Hierarchy: Perspectives for Ecological Complexity. University of Chicago Press, Chicago, 310 pp.
    [3] Volker Walter. Object-based classification of remote sensing data for change detection. ISPRS Journal of Photogrammetry & Remote Sensing, 2004 (58): 225-238.
    [4] Andrea S. Laliberte, Albert Rango, Kris M. Havstad. Object-oriented image analysis for mapping shrub encroachment from 1937 to 2003 in southern New Mexico. Remote Sensing of Environment, 2004 (93): 198-210.
    [5] Ola Hall, Geoffrey J. Hay. A Multiscale Object-Specific Approach to Digital Change Detection. International Journal of Applied Earth Observation and Geoinformation, Article in Press.
    [6] Kittler, J., Illingworth, J., 1986. Minimum error thresholding. Pattern Recognition 19(1), 41-47.
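    The overall accuracy and Kappa values reported in Tables I and II follow directly from the 2×2 confusion matrices. A minimal sketch (the helper name is ours) that reproduces the Table I figures from its counts:

```python
import numpy as np

def overall_and_kappa(cm: np.ndarray):
    """Overall accuracy and Cohen's Kappa from a confusion matrix
    (rows: detected class, columns: validated class)."""
    n = cm.sum()
    po = np.trace(cm) / n                        # observed agreement
    pe = (cm.sum(0) * cm.sum(1)).sum() / n ** 2  # chance agreement
    return float(po), float((po - pe) / (1 - pe))

# Table I counts: detected changed/unchanged vs. validated changed/unchanged.
acc, kappa = overall_and_kappa(np.array([[135, 15], [44, 106]]))
# acc ≈ 0.803 and kappa ≈ 0.607, matching the values reported in Table I.
```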
