Mechanics, Materials Science & Engineering, July 2017 – ISSN 2412-5954
Random Sparse Sampling and Equal Intervals Bregman High-Resolution Signal Reconstruction

Guojun Qin 1, Jingfang Wang 2

1 – School of Mechanical Engineering, Hunan International Economics University, Changsha, China
2 – School of Electrical & Information Engineering, Hunan International Economics University, Changsha, China

DOI 10.2412/mmse.74.23.960
Keywords: compressed sensing, random sampling, incoherent measurement matrix, sparse sampling, Bregman reconstruction.
ABSTRACT. Compressed sensing (CS) is a new signal processing method in which sampling and reconstruction are designed to take full advantage of the sparse structure of the signal in a transform domain. It consists of three elements: a sparsifying matrix, an incoherent measurement matrix and a reconstruction algorithm. In the compressed sensing framework, the sampling rate is no longer determined by the bandwidth of the signal but by the structure and information content of the signal. In this paper, a random observation matrix in the complex domain is designed, and equally spaced samples are projected onto an arbitrary set of random samples, i.e., sparse random sampling. The signal is then successfully recovered by a Bregman algorithm. The signal is described in the transform space, and a theoretical framework is established with this new signal description and processing. Provided that no essential information is lost, the signal can be sampled at a rate far below that required by the Nyquist sampling theorem and still be recovered completely with high probability. Sampling and reconstruction of sparse signals are simulated in the time and frequency domains, and the influence of signal length, number of measurements, sparsity level and SNR on the reconstruction error is analyzed.
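For illustration, a minimal numerical sketch of this measurement-and-recovery pipeline is given below in Python/NumPy. The matrix sizes, spike amplitudes, threshold mu and step size are ad-hoc choices for a toy example rather than the parameters used in this paper, and the linearized Bregman iteration is used only as a simple stand-in for the Bregman solver discussed here.

# Illustrative sketch only, not the authors' exact construction: take M < N
# random projections of a K-sparse signal through a complex random measurement
# matrix and recover it with a linearized Bregman iteration.
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 256, 64, 8                          # signal length, measurements, sparsity

# K-sparse test signal with spike amplitudes in +-[1, 2]
x = np.zeros(N)
support = rng.choice(N, K, replace=False)
x[support] = rng.choice([-1.0, 1.0], K) * (1.0 + rng.random(K))

# complex random (incoherent) observation matrix and compressed measurements
A = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2 * M)
b = A @ x

def shrink(v, mu):
    # complex soft-thresholding (proximal operator of mu * ||.||_1)
    mag = np.abs(v)
    return np.maximum(mag - mu, 0.0) * v / np.maximum(mag, 1e-12)

# linearized Bregman iteration: solves  min mu*||u||_1 + 0.5*||u||_2^2  s.t. A u = b;
# when mu is several times the largest signal amplitude, the result is close to
# the l1 (basis pursuit) solution and hence, with high probability, to x itself
mu = 10.0                                     # ad-hoc threshold for this toy problem
step = 1.0 / np.linalg.norm(A, 2) ** 2        # dual gradient step, below 2 / ||A||_2^2
v = np.zeros(N, dtype=complex)
u = np.zeros(N, dtype=complex)
for _ in range(20000):
    u = shrink(v, mu)
    v += step * (A.conj().T @ (b - A @ u))

print("relative reconstruction error: %.3e"
      % (np.linalg.norm(u.real - x) / np.linalg.norm(x)))

In this parametrization, mu trades off fidelity to the pure l1 solution against convergence speed: a larger mu gives a sparser, more accurate solution but needs more iterations.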
Introduction. The sampling theorem is the law governing the sampling process; it describes the relationship between the sampling frequency and the frequency spectrum of the signal. It was first proposed in 1928 by Nyquist, the American telecommunications engineer, and was clearly stated and formally established in 1948 by Shannon, the founder of information theory, so in many references it is called the Shannon sampling theorem. The theorem dominates almost all signal/image acquisition, processing, storage and transmission: when the highest frequency of a signal is ω_m, the signal can be perfectly reconstructed as long as it is sampled at a rate of at least 2ω_m. This condition is sufficient but not always necessary: if exactly two samples per period are taken of the highest-frequency component, the condition is both necessary and sufficient, while every lower-frequency component then collects more than two samples per period, for which the condition is sufficient but not necessary. It follows that Shannon sampling is not optimal: it does not exploit the structural characteristics of the signal itself, namely sparsity (relative to the signal length, only a very few coefficients are non-zero and the rest are zero), so only a small part of the collected data is actually useful. That part is kept, while most of the rest must be discarded, wasting a great deal of storage. In the Nyquist-style codec, the encoder first samples the signal, then transforms all the samples and encodes the amplitude and position of the important coefficients, and finally stores or transmits the coded values; decoding is simply the reverse process, and the signal is restored by decompression and the inverse transform, as sketched below.
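As a toy illustration of this sample-then-compress codec (the sampling rate, tone frequencies and the number of retained coefficients below are assumptions chosen for the example, not values from this paper), the following Python/NumPy fragment samples a three-tone signal at the full Nyquist rate, transforms it, keeps only a handful of coefficients together with their positions, and reconstructs the signal almost exactly with the inverse transform, showing how much of the acquired data is redundant.

# Toy sample-then-compress codec: acquire at the full rate, transform,
# keep only the few significant coefficients (value + position), then
# reconstruct by the inverse transform.
import numpy as np

fs, n = 1024, 1024                           # sampling rate (Hz) and record length
t = np.arange(n) / fs
x = (np.sin(2 * np.pi * 50 * t)
     + 0.5 * np.sin(2 * np.pi * 120 * t)
     + 0.2 * np.sin(2 * np.pi * 300 * t))

X = np.fft.fft(x)                            # "encode": transform all n samples
keep = 8                                     # only a handful of coefficients matter
idx = np.argsort(np.abs(X))[-keep:]          # positions of the largest coefficients
X_kept = np.zeros_like(X)
X_kept[idx] = X[idx]                         # store amplitude + position, drop the rest

x_rec = np.real(np.fft.ifft(X_kept))         # "decode": inverse transform
print("kept %d of %d coefficients, relative error %.3e"
      % (keep, n, np.linalg.norm(x - x_rec) / np.linalg.norm(x)))

Although n samples are acquired and transformed, only a few transform coefficients actually carry the content; compressed sensing aims to avoid acquiring the redundant samples in the first place.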
© 2017 The Authors. Published by Magnolithe GmbH. This is an open access article under the CC BY-NC-ND license http://creativecommons.org/licenses/by-nc-nd/4.0/