Search Results

2 results

        1.
        2017.10 · Free for subscribing (authenticated) institutions and individual members
        As a non-parametric data mining method, decision tree classification has performed well in many applications. Model complexity increases as the algorithm grows the tree that encodes the decision rules. While the added complexity improves training accuracy, it degrades generalization to unseen data, a phenomenon known as overfitting. Pruning has been introduced to avoid overfitting: it improves generalization and reduces model complexity. Although various pruning methods have been proposed, selecting the best method, or balancing complexity against generalization through pruning, is not a simple problem. In this paper, we survey pruning methods and analyze them to suggest the optimal approach for applications.
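        A minimal sketch of one common pruning strategy, cost-complexity pruning, is shown below using scikit-learn; the library, dataset, and parameter sweep are illustrative assumptions, not the specific methods compared in the paper.

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Illustrative data; any tabular classification set works the same way.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Grow a full tree, then get the sequence of effective alphas for pruning.
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X_tr, y_tr)

# Larger ccp_alpha prunes more aggressively: complexity drops, generalization
# usually improves up to a point, after which the tree underfits.
for alpha in path.ccp_alphas[:: max(1, len(path.ccp_alphas) // 5)]:
    clf = DecisionTreeClassifier(random_state=0, ccp_alpha=alpha).fit(X_tr, y_tr)
    print(f"alpha={alpha:.4f} leaves={clf.get_n_leaves()} "
          f"train={clf.score(X_tr, y_tr):.3f} test={clf.score(X_te, y_te):.3f}")

        Comparing train and test scores across the alpha sequence makes the complexity/generalization trade-off that pruning manages directly visible.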
        2.
        2016.02 · KCI-listed · Service discontinued (access restricted)
        Tree canopy is a valuable component of the urban ecosystem. The purpose of this study was to classify urban tree canopy (UTC) from high-resolution imagery using object-oriented classification (OOC), which was applied to distinguish the different land cover types. With an urban canopy mapping system based on OOC and Decision Tree Classification (DTC), site mapping was carried out by merging the spectral data of the high-resolution imagery. This methodological approach achieved high classification accuracy in distinguishing small patches and continuous UTC boundaries in the high-resolution imagery. For shadow removal, decision tree classification with environmental variables such as a brightness channel and band combinations worked effectively. The proposed methodology can be used for the assessment and restoration of fragmented urban ecosystems and offers high classification accuracy for distinguishing UTC components in urban landscape areas.
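        As a rough illustration of the per-pixel DTC step only (the OOC segmentation is not reproduced), the sketch below trains a decision tree on synthetic spectral bands plus derived brightness and NDVI features; all data, class labels, and thresholds are placeholders rather than the study's actual variables.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# Synthetic per-pixel reflectances: red, green, blue, near-infrared.
bands = rng.uniform(0.0, 1.0, size=(2000, 4))
brightness = bands.mean(axis=1, keepdims=True)  # simple brightness channel
ndvi = (bands[:, 3:4] - bands[:, 0:1]) / (bands[:, 3:4] + bands[:, 0:1] + 1e-6)
X = np.hstack([bands, brightness, ndvi])

# Toy reference labels: 0 = canopy, 1 = shadow, 2 = other surfaces.
y = np.where(ndvi.ravel() > 0.2, 0, np.where(brightness.ravel() < 0.3, 1, 2))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", round(tree.score(X_te, y_te), 3))

        In practice the features would come from segmented image objects and field-verified labels; the point of the sketch is only that brightness-type variables give the tree a usable split for separating shadow from canopy.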