Frontiers of Information Technology & Electronic Engineering

ISSN 2095-9184

2015, Vol. 16, No. 6. Published 2015-06-05

Distributed coordination in multi-agent systems: a graph Laplacian perspective
Zhi-min HAN, Zhi-yun LIN, Min-yue FU, Zhi-yong CHEN
Frontiers of Information Technology & Electronic Engineering. 2015, 16 (6): 429-448.  
https://doi.org/10.1631/FITEE.1500118

Abstract   PDF (532KB)

This paper reviews the main results and progress in distributed multi-agent coordination from a graph Laplacian perspective. Distributed multi-agent coordination has been a very active subject studied extensively by the systems and control community in recent decades, covering distributed consensus, formation control, sensor localization, and distributed optimization. The aim of this paper is to provide both a comprehensive survey of the existing literature on distributed multi-agent coordination and a new perspective in terms of the graph Laplacian for categorizing the fundamental mechanisms of distributed coordination. For different types of graph Laplacians, we summarize their inherent coordination features and specific research issues. The paper also highlights several promising research directions, along with some open problems deemed important for future study.
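The core mechanism behind Laplacian-based consensus can be illustrated with a minimal sketch. The graph, step size, and initial states below are assumptions chosen for illustration; only the standard dynamics x' = -Lx come from the consensus literature the paper surveys:

```python
import numpy as np

# Hypothetical 4-agent path graph (an illustrative assumption);
# L = D - A is the graph Laplacian of the interaction graph.
A = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
L = np.diag(A.sum(axis=1)) - A

x = np.array([1.0, 3.0, 5.0, 7.0])   # initial agent states
dt = 0.05                            # Euler step for x' = -L x
for _ in range(2000):
    x = x - dt * (L @ x)             # each agent moves toward its neighbors

print(x)   # all four states converge to the initial average, 4.0
```

On any connected undirected graph these dynamics drive all states to the average of the initial values, which is why the Laplacian's second-smallest eigenvalue governs the convergence speed.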

A sampling method based on URL clustering for fast web accessibility evaluation
Meng-ni ZHANG, Can WANG, Jia-jun BU, Zhi YU, Yu ZHOU, Chun CHEN
Frontiers of Information Technology & Electronic Engineering. 2015, 16 (6): 449-456.  
https://doi.org/10.1631/FITEE.1400377

Abstract   PDF (559KB)

When evaluating the accessibility of a large website, we rely on sampling methods to reduce the cost of evaluation. This may lead to a biased evaluation when the distribution of checkpoint violations in a website is skewed and the selected samples do not represent the entire website well. To improve sampling quality, stratified sampling methods first cluster the web pages in a site and then draw samples from each cluster. In existing stratified sampling methods, however, all the pages in a website need to be analyzed for clustering, incurring huge I/O and computation costs. To address this issue, we propose a novel page sampling method based on URL clustering for web accessibility evaluation, namely URLSamp. Using only URL information for stratified page sampling, URLSamp scales efficiently to large websites. Meanwhile, by exploiting similarities in URL patterns, URLSamp clusters pages by their generating scripts and can thus effectively detect accessibility problems arising from web page templates. We use a dataset of 45 websites to validate our method. Experimental results show that URLSamp is both effective and efficient for web accessibility evaluation.
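As a rough illustration of the idea (not the paper's algorithm), URL-based stratification can be sketched as follows; the pattern rule and the example URLs are assumptions made for the sketch:

```python
import random
import re
from collections import defaultdict

def url_pattern(url):
    # Crude pattern key (an assumption, not URLSamp's actual rule):
    # drop the query string and collapse digit runs, so pages generated
    # by the same server-side script tend to share one key.
    path = url.split('?', 1)[0]
    return re.sub(r'\d+', 'N', path)

def stratified_sample(urls, per_cluster=1, seed=0):
    # Group URLs by pattern, then draw a few pages from each cluster,
    # so every page template is represented in the evaluation sample.
    rng = random.Random(seed)
    clusters = defaultdict(list)
    for u in urls:
        clusters[url_pattern(u)].append(u)
    sample = []
    for pages in clusters.values():
        k = min(per_cluster, len(pages))
        sample.extend(rng.sample(pages, k))
    return sample

urls = ["http://ex.com/news/101", "http://ex.com/news/102",
        "http://ex.com/news/103", "http://ex.com/about"]
print(stratified_sample(urls))   # one page per cluster: a news page + /about
```

Because clustering uses only the URL strings, no page content needs to be downloaded or parsed before sampling, which is the source of the I/O savings the abstract describes.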

Topic modeling for large-scale text data
Xi-ming LI, Ji-hong OUYANG
Frontiers of Information Technology & Electronic Engineering. 2015, 16 (6): 457-465.  
https://doi.org/10.1631/FITEE.1400352

Abstract   PDF (430KB)

This paper develops a novel online algorithm, namely moving average stochastic variational inference (MASVI), which applies the results obtained in previous iterations to smooth out noisy natural gradients. We analyze the convergence property of the proposed algorithm and conduct a set of experiments on two large-scale collections that contain millions of documents. Experimental results indicate that, compared with the stochastic variational inference (SVI) and SGRLD algorithms, our algorithm achieves a faster convergence rate and better performance.
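The smoothing idea, averaging recent noisy gradients before each update, can be sketched in a stripped-down form. This is a scalar gradient-ascent stand-in for the MASVI idea, not the paper's method, which operates on natural gradients of a variational objective; window size and learning rate here are arbitrary assumptions:

```python
import numpy as np

def smoothed_ascent(noisy_grads, window=5, lr=0.1, x0=0.0):
    # Moving-average smoothing of a noisy gradient stream: each update
    # uses the mean of the last `window` gradients instead of only the
    # latest one, damping the variance of the updates.
    x, buf, traj = x0, [], []
    for g in noisy_grads:
        buf.append(g)
        if len(buf) > window:
            buf.pop(0)
        x += lr * np.mean(buf)
        traj.append(x)
    return traj

# With a constant gradient of 1.0, smoothing changes nothing:
# ten steps of size lr move x from 0.0 to 1.0, as plain ascent would.
print(smoothed_ascent([1.0] * 10)[-1])
```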

AGCD: a robust periodicity analysis method based on approximate greatest common divisor
Juan YU, Pei-zhong LU
Frontiers of Information Technology & Electronic Engineering. 2015, 16 (6): 466-473.  
https://doi.org/10.1631/FITEE.1400345

Abstract   PDF (362KB)

Periodicity is one of the most common phenomena in the physical world. The problem of periodicity analysis (or period detection) is a research topic in several areas, such as signal processing and data mining. Period detection is nevertheless very challenging, owing to the sparsity and noisiness of observational datasets of periodic events. This paper focuses on the problem of period detection from sparse and noisy observational datasets. To solve the problem, a novel method based on the approximate greatest common divisor (AGCD) is proposed. The proposed method is robust to sparseness and noise, and is efficient. Moreover, unlike most existing methods, it needs no prior knowledge of the rough range of the period. To evaluate the accuracy and efficiency of the proposed method, comprehensive experiments on synthetic data are conducted. Experimental results show that our method yields highly accurate results with small datasets, is more robust to sparseness and noise, and is less sensitive to the magnitude of the period than the compared methods.
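The approximate-GCD intuition can be sketched with a brute-force scan; this is an illustrative stand-in, not the paper's efficient algorithm, and the search range, step, and tolerance are arbitrary assumptions:

```python
import numpy as np

def agcd_period(timestamps, p_min=1.0, p_max=50.0, step=0.01, tol=0.1):
    # Brute-force sketch of the approximate-GCD idea: scan candidate
    # periods from large to small and return the first one that nearly
    # divides every inter-event gap. Missing observations only make
    # gaps larger multiples of the period, so sparsity is tolerated.
    gaps = np.diff(np.sort(np.asarray(timestamps, dtype=float)))
    n_steps = int(round((p_max - p_min) / step))
    for k in range(n_steps + 1):
        p = p_max - k * step
        r = np.mod(gaps, p)
        err = np.minimum(r, p - r).sum()   # distance to nearest multiples
        if err < tol:
            return p
    return None

# Events at multiples of 3 with several observations missing:
print(agcd_period([0.0, 3.0, 6.0, 12.0, 21.0]))   # close to 3.0
```

Scanning from large to small matters: every small enough value trivially "almost divides" the gaps, so the largest candidate passing the tolerance is the meaningful one.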

Using hybrid models to predict blood pressure reactivity to unsupported back based on anthropometric characteristics
Gurmanik KAUR, Ajat Shatru ARORA, Vijender Kumar JAIN
Frontiers of Information Technology & Electronic Engineering. 2015, 16 (6): 474-485.  
https://doi.org/10.1631/FITEE.1400295

Abstract   PDF (558KB)

Accurate blood pressure (BP) measurement is essential in epidemiological studies, screening programmes, and research studies as well as in clinical practice for the early detection and prevention of high BP-related risks such as coronary heart disease, stroke, and kidney failure. Posture of the participant plays a vital role in accurate measurement of BP. Guidelines on measurement of BP contain recommendations on the position of the back of the participants by advising that they should sit with supported back to avoid spuriously high readings. In this work, principal component analysis (PCA) is fused with forward stepwise regression (SWR), artificial neural network (ANN), adaptive neuro-fuzzy inference system (ANFIS), and the least squares support vector machine (LS-SVM) model for the prediction of BP reactivity to an unsupported back in normotensive and hypertensive participants. PCA is used to remove multi-collinearity among anthropometric predictor variables and to select a subset of components, termed ‘principal components’ (PCs), from the original dataset. The selected PCs are fed into the proposed models for modeling and testing. The evaluation of the performance of the constructed models, using appropriate statistical indices, shows clearly that a PCA-based LS-SVM (PCA-LS-SVM) model is a promising approach for the prediction of BP reactivity in comparison to others. This assessment demonstrates the importance and advantages posed by hybrid models for the prediction of variables in biomedical research studies.
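A minimal sketch of the PCA-plus-regression pipeline follows, with ridge-style least squares standing in for LS-SVM (a linear-kernel LS-SVM reduces to regularized least squares); the synthetic data and all parameters are assumptions, and the paper's ANN and ANFIS variants are not reproduced:

```python
import numpy as np

def pca_ls_fit(X, y, n_components, lam=1e-3):
    # PCA to remove multi-collinearity among predictors, followed by a
    # regularized linear least-squares fit on the retained components.
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    W = Vt[:n_components].T                    # projection onto top PCs
    Z = (X - mu) @ W
    Zb = np.hstack([Z, np.ones((len(Z), 1))])  # add a bias column
    beta = np.linalg.solve(Zb.T @ Zb + lam * np.eye(Zb.shape[1]), Zb.T @ y)
    return mu, W, beta

def pca_ls_predict(model, X):
    mu, W, beta = model
    return np.hstack([(X - mu) @ W, np.ones((len(X), 1))]) @ beta

# Synthetic check: recover a known linear relationship
# (stand-in for anthropometric predictors and BP reactivity).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
y = X @ np.array([1.0, 2.0, 0.0, 0.0]) + 3.0
model = pca_ls_fit(X, y, n_components=4)
print(np.max(np.abs(pca_ls_predict(model, X) - y)))  # near zero
```

Because the PCs are orthogonal, the regression step no longer suffers from the collinearity among the raw predictors, which is the point of the hybrid design.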

Fast removal of ocular artifacts from electroencephalogram signals using spatial constraint independent component analysis based recursive least squares in brain-computer interface
Bang-hua YANG, Liang-fei HE, Lin LIN, Qian WANG
Frontiers of Information Technology & Electronic Engineering. 2015, 16 (6): 486-496.  
https://doi.org/10.1631/FITEE.1400299

Abstract   PDF (780KB)

Ocular artifacts (OAs) are the main interfering signals in electroencephalogram (EEG) measurements. An adaptive filter based on reference signals from an electrooculogram (EOG) can reduce ocular interference, but collecting EOG signals during a long-term EEG recording is inconvenient and uncomfortable for the subject. To remove ocular artifacts from EEG in brain-computer interfaces (BCIs), a method named spatial constraint independent component analysis based recursive least squares (SCICA-RLS) is proposed. The method consists of two stages. In the first stage, independent component analysis (ICA) decomposes multiple EEG channels into an equal number of independent components (ICs), and ocular ICs are identified by an automatic artifact detection method based on kurtosis. Empirical mode decomposition (EMD) is then employed to remove any cerebral activity from the identified ocular ICs, yielding exact artifact ICs. In the second stage, SCICA uses the exact artifact ICs obtained in the first stage as a constraint to extract artifact ICs, called spatial constraint ICs (SC-ICs), from the given EEG signal. The RLS-based adaptive filter then uses the SC-ICs as reference signals to reduce interference, which avoids the need for parallel EOG recordings. In addition, the proposed method computes quickly because SCICA need not identify all ICs as ICA does. Based on EEG data recorded from seven subjects, the new approach achieves average classification accuracies 3.3% and 12.6% higher than those of the standard ICA and raw EEG, respectively, and reduces computation time by 83.5% and 83.8% compared with the standard ICA and ICA-RLS, respectively, demonstrating better and faster OA reduction.
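The RLS filtering stage can be sketched for a single scalar reference channel; in the paper the references are the SC-ICs extracted by SCICA, while here a synthetic noise reference and sinusoidal "brain" signal stand in, and the forgetting factor and initialization are assumptions:

```python
import numpy as np

def rls_clean(contaminated, reference, lam=0.999, delta=100.0):
    # Scalar recursive-least-squares filter: estimate how much of the
    # reference (artifact) signal is present in the contaminated one,
    # and subtract that estimate sample by sample.
    w, P = 0.0, delta
    out = np.empty_like(contaminated)
    for n, (d, x) in enumerate(zip(contaminated, reference)):
        k = P * x / (lam + x * P * x)   # RLS gain
        e = d - w * x                   # a-priori error
        w += k * e                      # weight update
        P = (P - k * x * P) / lam
        out[n] = d - w * x              # cleaned sample
    return out

rng = np.random.default_rng(1)
t = np.arange(2000)
brain = np.sin(2 * np.pi * t / 100)        # stand-in cerebral signal
artifact_ref = rng.normal(size=t.size)     # stand-in artifact reference
eeg = brain + 0.8 * artifact_ref           # contaminated recording
cleaned = rls_clean(eeg, artifact_ref)
print(np.mean(np.abs(cleaned[-500:] - brain[-500:])))  # small once converged
```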

A combined modulated feedback and temperature compensation approach to improve bias drift of a closed-loop MEMS capacitive accelerometer
Ming-jun MA, Zhong-he JIN, Hui-jie ZHU
Frontiers of Information Technology & Electronic Engineering. 2015, 16 (6): 497-510.  
https://doi.org/10.1631/FITEE.1400349

Abstract   PDF (1370KB)

The bias drift of a micro-electro-mechanical systems (MEMS) accelerometer suffers from 1/f noise and the temperature effect. For large-volume applications, the bias drift urgently needs to be improved. Conventional methods often cannot address the 1/f noise and the temperature effect in one architecture. In this paper, a combined modification of the closed-loop architecture is proposed to minimize the bias drift. A modulated feedback approach isolates the 1/f noise that exists in the conventional direct feedback approach. A common mode signal is then created and added into the closed loop, on the basis of the modulated feedback architecture, to compensate for the temperature drift. With the combined approach, the bias instability is improved to less than 13 μg, and the drift in the Allan variance result is reduced to 17 μg at an integration time of 100 s. The temperature coefficient is reduced from 4.68 to 0.1 mg/°C. The combined approach could be useful for many other closed-loop accelerometers.

An improved low-complexity sum-product decoding algorithm for low-density parity-check codes
Michaelraj Kingston ROBERTS, Ramesh JAYABALAN
Frontiers of Information Technology & Electronic Engineering. 2015, 16 (6): 511-518.  
https://doi.org/10.1631/FITEE.1400269

Abstract   PDF (504KB)

In this paper, an improved low-complexity sum-product decoding algorithm is presented for low-density parity-check (LDPC) codes. In the proposed algorithm, the computational complexity is reduced by using a fast Fourier transform (FFT) with a time shift in the check node process, while the decoding performance is improved by using an optimized integer constant in the variable node process. Simulation results show that the proposed algorithm achieves an overall coding gain improvement ranging from 0.04 to 0.46 dB. Moreover, compared with the sum-product algorithm (SPA), the proposed decoding algorithm reduces the total number of arithmetic operations required for decoding by 42%-67%.
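For context, the exact sum-product check-node rule can be sketched in the log-likelihood-ratio domain; for binary codes this "tanh rule" is mathematically equivalent to the probability-domain convolution that an FFT-based check node computes, though the paper's time-shift and integer-constant optimizations are not reproduced here:

```python
import numpy as np

def check_node_update(incoming_llrs):
    # Exact SPA check-node rule: each outgoing LLR combines the tanh
    # transforms of all *other* incoming edge messages, enforcing the
    # even-parity constraint of the check node.
    t = np.tanh(np.asarray(incoming_llrs, dtype=float) / 2.0)
    out = np.empty_like(t)
    for i in range(t.size):
        prod = np.prod(np.delete(t, i))   # product over all other edges
        out[i] = 2.0 * np.arctanh(prod)
    return out

# With two incoming messages, each outgoing message simply mirrors the
# other edge's LLR, including its sign.
print(check_node_update([2.0, 2.0]))
print(check_node_update([2.0, -2.0]))
```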

8 articles