PURPOSES : This study aims to suggest how to utilize "standby data" of shared mobility, which does not contain personal information, and to examine whether the items of existing shared-mobility operation analyses can be similarly derived from such "standby data".
METHODS : An existing Personal Mobility (PM) traffic pattern analysis was performed by identifying the user (User ID) and the user's route in a time frame. In this study, the PM traffic pattern analysis focuses on a vehicle (ID of the standby vehicle) and its standby location. We examined whether the items derived from the User ID-based traffic pattern analysis could also be derived from the standby Vehicle ID-based analysis.
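The Vehicle ID-based approach can be illustrated with a minimal sketch (the record schema, field order, and function names below are our own assumptions, not the study's actual pipeline): consecutive standby periods of the same vehicle bound an inferred trip, and the trip distance is necessarily the straight-line distance between the two standby locations.

```python
import math
from collections import defaultdict

def haversine_km(lat1, lon1, lat2, lon2):
    # great-circle ("straight-line" over the sphere) distance in km
    R = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def trips_from_standby(records):
    # records: (vehicle_id, idle_start_s, idle_end_s, lat, lon), one row per
    # standby period (hypothetical schema). A trip is inferred between the end
    # of one standby period and the start of the next for the same vehicle.
    by_vehicle = defaultdict(list)
    for vid, t0, t1, lat, lon in records:
        by_vehicle[vid].append((t0, t1, lat, lon))
    trips = []
    for vid, periods in by_vehicle.items():
        periods.sort()
        for (s0, e0, la0, lo0), (s1, e1, la1, lo1) in zip(periods, periods[1:]):
            km = haversine_km(la0, lo0, la1, lo1)   # straight-line distance only
            hours = (s1 - e0) / 3600.0              # time between standby periods
            trips.append({"vehicle": vid, "km": km, "hours": hours,
                          "kmh": km / hours if hours > 0 else 0.0})
    return trips
```

From the resulting trip records, traffic volume by time slot, peak time, and the average travel time, distance, and speed can be aggregated without any user identifier.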
RESULTS : The analysis showed that all five items (traffic volume by time slot, peak time, average travel time, average travel distance, and average travel speed) of the existing User ID-based PM travel analysis could be derived similarly using the standby Vehicle ID-based PM traffic analysis. However, a disadvantage is that the average travel distance is calculated as a straight-line distance. It seems possible to overcome this limitation by correcting the average travel distance through linkage analysis with road network data. However, it is not possible to derive the instantaneous maximum speed or acceleration/deceleration.
CONCLUSIONS : In an era in which various means of transportation are being introduced, data sharing is not preferred because of legal issues. Consequently, it is difficult to understand the use of new means of transportation and to formulate new policies. To address this, data sharing can be activated based on standby data that contains no personal information.
Recently, advances in GPS-based location-collection technology and the explosive growth of GPS-equipped devices such as smartphones have enabled the real-time collection of enormous volumes of data on the geographic positions of moving objects such as people, vehicles, ships, and aircraft. These data hold significant academic and practical value related to the movement of objects. Data-mining methods for analyzing such data have been developing in parallel, and researchers have used trajectory data to explore movement phenomena in cities and the relationships among the places that constitute them, thereby proposing solutions to various urban problems. Because trajectories can track the movement of diverse objects, their applications and purposes are equally diverse, and they are widely used in fields such as urban planning, transportation, behavioral ecology, public safety, anomaly and violation detection, and surveillance. In particular, recent advances in data-mining methodologies and deep-learning techniques have produced meaningful research results by integrating various analytical methods into trajectory data analysis, which calls for a systematic review. Against this background, this study classified approximately 150 domestic and international studies that used trajectory data by application field and methodology, and analyzed recent trends by application field and by trajectory-data analysis methodology. The results are expected to serve as a resource for exploring methodologies applicable to trajectory data, finding concrete cases of trajectory data analysis, and deriving application services based on trajectory data.
PURPOSES : The actual service life of repair methods applied to cement concrete pavement is analyzed based on de-icing agent usage.
METHODS : Highway PMS data pertaining to de-icing agent usage are classified into three grades: low (1~5 ton/lane/year), medium (5~8 ton/lane/year), and high (greater than 8 ton/lane/year). The repair methods considered include diamond grinding, patching, joint repair, partial depth repair, and asphalt overlay on five major highways. The service life of each repair method is analyzed based on the usage level of the de-icing agent.
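The three usage grades can be encoded as a simple classification (a sketch only; the handling of the boundary values at exactly 5 and 8 ton/lane/year is our assumption, as the stated ranges overlap at those points):

```python
def deicing_grade(tons_per_lane_year):
    # Classify annual de-icing agent usage per lane (ton/lane/year)
    # into the three grades used in the analysis.
    if tons_per_lane_year <= 5:
        return "low"      # 1~5 ton/lane/year
    elif tons_per_lane_year <= 8:
        return "medium"   # 5~8 ton/lane/year
    else:
        return "high"     # greater than 8 ton/lane/year
```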
RESULTS : The service lives of the applied repair methods are much shorter than expected. It is confirmed that the service lives afforded by the diamond grinding, patching, and joint repair methods are not significantly affected by the use of de-icing agents, whereas those afforded by the asphalt overlay and partial depth repair methods are affected significantly. The service lives afforded by the asphalt overlay and partial depth repair methods decrease at high usage levels of the de-icing agent (greater than 8 ton/lane/year).
CONCLUSIONS : Among the repair methods considered, the service life afforded by partial depth repair and asphalt overlay is affected significantly by the amount of de-icing agent used. Additionally, the differences between the expected and actual analyzed service lives should be considered in the next-generation maintenance strategy for cement concrete pavements.
PURPOSES : This study analyzes the service life of the repair methods of jointed plain concrete pavement (JPCP) on expressways in Korea using PMS data.
METHODS : The Korea Expressway Corporation PMS data acquired from five major expressways in Korea were used for the analysis. The service lives of the repair methods were considered for two different cases: 1) the previous repair had been entirely repaired again, by the same or another method, because of its damage; and 2) the current repair was still in use.
RESULTS : The service lives of D/G and section repair were shown to be at least 30% and 50% shorter than expected, respectively. Joint sealing and crack sealing exhibited service lives similar to those expected. The mill-and-asphalt-overlay method showed an approximately 30% longer service life; this might be because some damage to the asphalt overlay is typically neglected until subsequent maintenance and repair. When multiple repairs were applied in series to an identical pavement section, the service lives of repairs on previously damaged sections became even shorter compared with their first application.
CONCLUSIONS : It was found that the analyzed service lives of the most important repair methods did not reach the expected service lives, and that the service life of the same repair method becomes shorter when applied to previously repaired concrete pavement sections. These shorter service lives should be seriously considered in future JPCP repair strategy development.
In this study, the field applicability of system identification techniques for monitoring the wind-induced vibration of a high-rise building was evaluated. The ambient and strong-wind responses of an actual 63-story RC building with an outrigger-belt wall lateral force resisting system were monitored, and its vibration characteristics were identified using frequency domain decomposition (FDD), the random decrement technique (RDT), and subspace system identification (SSI). The building has a square plan, and the frequencies of its two lateral modes were very similar. All of the identification techniques could identify the modal properties of the structure not only under strong external forces such as typhoons but also under ambient micro-vibrations. The field applicability assessment confirmed that FDD was the fastest in computation and that RDT had the simplest programming procedure.
We are creating all-sky diffuse maps from the AKARI mid-infrared survey data in two photometric bands centered at wavelengths of 9 and 18 μm. The AKARI mid-infrared diffuse maps achieve higher spatial resolution and higher sensitivity than the IRAS maps. In particular, the 9 μm data are a unique resource as an all-sky tracer of the emission of polycyclic aromatic hydrocarbons (PAHs). However, the original data suffer from many artifacts; thus, we have been developing correction methods. Among them, we have recently improved the corrections for the non-linearity and the reset anomaly of the detector response. These corrections successfully reduce the artifact level down to 0.1 MJy sr⁻¹ on average, which is essential for discussing faint extended emission (e.g., the Galactic PAH emission). We have also made progress in subtracting the scattered light caused in the camera optics. We plan to release the improved diffuse maps to the public within a year.
Many edge detection methods based on horizontal and vertical derivatives have been introduced to provide intuitive information about the horizontal distribution of a subsurface anomalous body. Understanding the characteristics of each edge detection method is important for selecting an optimized method. In order to compare the characteristics of the individual methods, this study applied each method to synthetic magnetic data created using homogeneous prisms with different sizes, numbers of bodies, and spacings between them. Seven edge detection methods were comprehensively and quantitatively analyzed: the total horizontal derivative (HD), the vertical derivative (VD), the 3D analytic signal (AS), the tilt derivative (TD), the theta map (TM), the horizontal derivative of the tilt angle (HTD), and the normalized total horizontal derivative (NHD). HD and VD showed good performance on average for a single-body model, but failed to detect multiple bodies. AS traced the edge of a single-body model comparatively well, but it was unable to detect an angulated corner and multiple bodies at the same time. TD and TM performed well in delineating the edges of shallower and larger bodies, but showed relatively poor performance for deeper and smaller bodies. In contrast, they had a significant advantage in detecting the edges of multiple bodies. HTD showed poor performance in tracing close bodies because it was sensitive to interference effects. NHD showed great performance with an appropriately chosen window size.
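For illustration, the first three quantities can be sketched with NumPy (a generic sketch under our own notation, not the authors' implementation; the vertical derivative is computed in the wavenumber domain, and the tilt derivative follows as TD = arctan(VD / HD)):

```python
import numpy as np

def edge_filters(field, dx, dy):
    """Compute HD, VD, and TD for a gridded potential-field anomaly.

    field : 2-D array of the anomaly on a regular grid
    dx, dy : grid spacings along the x (columns) and y (rows) axes
    """
    # horizontal derivatives by finite differences
    dT_dy, dT_dx = np.gradient(field, dy, dx)
    HD = np.hypot(dT_dx, dT_dy)                 # total horizontal derivative

    # vertical derivative: multiply by |k| in the wavenumber domain
    ny, nx = field.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
    KX, KY = np.meshgrid(kx, ky)
    K = np.sqrt(KX**2 + KY**2)
    VD = np.real(np.fft.ifft2(np.fft.fft2(field) * K))

    TD = np.arctan2(VD, HD)                     # tilt derivative
    return HD, VD, TD
```

The remaining filters (AS, TM, HTD, NHD) are algebraic combinations of these same derivatives.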
We have created new catalogues of the AKARI/IRC 2−24 μm North Ecliptic Pole Deep survey using new methods of image analysis. In the new catalogues, the number of false detections decreased by a factor of 10 and the number of objects detected in multiple bands increased by more than 1,500 compared with the previous work. In these proceedings, the new image-analysis methods and the performance of the new catalogues are described.
We present a photometric catalog of infrared (IR) sources based on the North Ecliptic Pole Wide field (NEP-Wide) survey of AKARI, which covered a 5.4 deg² circular area centered on the NEP. The catalog contains about 115,000 sources detected in the 9 IRC filter bands, comprehensively covering the wavelength range from 2 to 24 μm. This is a band-merged catalog that includes all of the photometry results from the supplementary optical data as well as the IRC bands. To validate a source at a given IRC band, we searched for counterparts in the other bands; the band-merging was done based on this cross-matching of sources among the filter bands. NIR sources without a counterpart in any other band were finally excluded to avoid false objects.
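The cross-matching step can be sketched as a nearest-neighbour search within a matching radius (a simplified, brute-force illustration of the general technique; the function name and the small-angle separation formula are our own, not the catalog pipeline's):

```python
import numpy as np

def cross_match(ra1, dec1, ra2, dec2, radius_arcsec):
    """Match each source in list 1 to its nearest neighbour in list 2.

    ra*, dec* are in degrees; a pair (i, j) is kept only if the angular
    separation is within radius_arcsec. Uses the small-angle (flat-sky)
    approximation, adequate for a few-arcsec matching radius.
    """
    ra2, dec2 = np.asarray(ra2, float), np.asarray(dec2, float)
    matches = []
    for i, (r, d) in enumerate(zip(ra1, dec1)):
        dra = (ra2 - r) * np.cos(np.radians(d))   # correct RA offset for dec
        ddec = dec2 - d
        sep = np.hypot(dra, ddec) * 3600.0        # degrees -> arcsec
        j = int(np.argmin(sep))
        if sep[j] <= radius_arcsec:
            matches.append((i, j))
    return matches
```

For catalogs of this size, a spatial index (e.g., a k-d tree) would replace the inner brute-force loop, but the matching criterion is the same.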
The North Ecliptic Pole (NEP) Wide survey covered about 5.4 deg², a nearly circular area centered on the NEP, using nine passbands of the InfraRed Camera (IRC). We present the photometric properties of the data sets and the nature of the sources detected in this field. The number of detected sources varied with the filter band: about 109,000 sources in the NIR, about 20,000 in the MIR-S, and about 16,000 in the MIR-L channel. The 5σ detection limits are about 21 mag in the NIR bands and 19.5-18.5 mag in the MIR bands in terms of AB magnitude. The 50% completeness levels are about 19.8 mag at 3 μm, 18.6 mag at 9 μm, and 18 mag at 18 μm (in AB magnitude). In order to validate the detected sources, all of them were confirmed by matching tests against the other bands. The 'star-like' sources, defined by high stellarity and a magnitude cut from the optical ancillary data, appear statistically to have a high probability of being stars. The nature of the various types of extragalactic sources in this field is discussed using color-color diagrams of the NIR and MIR bands, with the redshift tracks of galaxies providing useful guidelines.
When we measure a source signal in the presence of a background rate that has been independently measured, the usual approach is to obtain an estimate of the background rate by observing an empty part of the sky, and an estimate of the source-plus-background rate by observing the region where a source signal is expected. The source signal rate is then estimated by subtracting the background rate from the source-plus-background rate. However, when the rates or their observation times are small, this procedure can lead to negative estimates of the source signal rate, even though the true rate cannot be negative. By applying the Bayesian approach, we solve this problem and prove that the most probable value of the source signal rate is zero when the observed total count is smaller than the expected background count. It is also shown that the results from the conventional method are consistent with the most probable value obtained from the Bayesian approach when the source signal is large or the observation time is long enough.
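The key result can be sketched in a few lines (a minimal illustration assuming a flat prior on the non-negative source counts and an exactly known expected background; the function name is ours):

```python
def source_posterior_mode(n_total, b_expected):
    # Posterior for the source counts s under a flat prior with s >= 0:
    #   p(s | n) ∝ (s + b)^n * exp(-s)
    # Setting d/ds log p = n/(s + b) - 1 = 0 gives s = n - b,
    # so the mode is clipped to zero whenever n <= b -- the Bayesian
    # analogue of the conventional subtraction, without negative estimates.
    return max(0.0, n_total - b_expected)
```

For large counts the mode reduces to the conventional estimate n − b, matching the consistency statement above.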
We are often faced with the task of estimating the amplitude of a source signal in the presence of a background. In the simplest case, the background can be taken as flat with unknown magnitude B, and the source signal of interest is assumed to be the amplitude A of a peak of known shape and position. We present a robust method to find the most probable values of A and B by applying the one-dimensional Newton-Raphson method. In deriving the formula, we adopted Bayesian statistics and assumed a Poisson distribution so that the results could be applied to the analysis of very weak signals, as observed by FIMS (Far-ultraviolet IMaging Spectrograph).
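A one-dimensional Newton-Raphson update for the amplitude at fixed background can be sketched as follows (a generic Poisson-likelihood sketch under our own notation, not the FIMS pipeline; in practice the A and B updates would be alternated):

```python
import numpy as np

def fit_amplitude(counts, shape, background, a0=1.0, iters=50):
    """Most probable amplitude A for counts n_i ~ Poisson(A*f_i + B).

    Maximizes L(A) = sum_i [n_i ln(A f_i + B) - (A f_i + B)]
    by 1-D Newton-Raphson on dL/dA, with B held fixed.
    """
    n = np.asarray(counts, float)
    f = np.asarray(shape, float)
    a = a0
    for _ in range(iters):
        mu = a * f + background
        g = np.sum(n * f / mu - f)        # dL/dA
        h = -np.sum(n * f**2 / mu**2)     # d2L/dA2 (negative: L is concave)
        step = g / h
        a = max(a - step, 0.0)            # keep the amplitude non-negative
        if abs(step) < 1e-12:
            break
    return a
```

Because the Poisson log-likelihood is concave in A, the iteration converges rapidly from almost any positive starting value.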
The Beijing-Arizona-Taipei-Connecticut (BATC) survey is a long-term project to map the spectral energy distributions (SEDs) of various objects using 15 intermediate-band filters, aiming to cover about 450 square degrees of the northern sky. The SED information, combined with image-structure information, is used to classify objects into several stellar and galaxy categories as well as QSO candidates. In this paper, we present a preliminary setup of the robust data-reduction procedure recently developed at NCU, and briefly discuss the general classification scheme, redshift estimation, and automatic detection of variable objects.