Fast motion deblurring using sensor-aided motion trajectory estimation.

Lee E, Chae E, Cheong H, Paik J - ScientificWorldJournal (2014)

Affiliation: Department of Image, Chung-Ang University, Seoul 156-756, Republic of Korea.

ABSTRACT
This paper presents an image deblurring algorithm that removes motion blur using motion-trajectory analysis and local statistics derived from inertial sensors. The proposed method estimates the point spread function (PSF) of the motion blur by accumulating reweighted projections of the trajectory. The motion-blurred image is then adaptively restored using the estimated PSF and a spatially varying activity map, which reduces both restoration artifacts and noise amplification. Experimental results demonstrate that the proposed method outperforms existing PSF-estimation-based motion deconvolution methods in terms of both objective and subjective performance measures. The proposed algorithm can be employed in various imaging devices because its efficient implementation requires no iterative computational structure.

Figure 2: Motion trajectory according to the camera translation.

Mentions: In estimating the size and shape of a motion PSF, only the relative position of the camera is needed, because the PSF is the sum of reflected intensities from the first to the last position of the camera motion, as described in (2). Each camera position is projected onto the image plane and can be expressed using a planar homography as

\[ [x_k, y_k, 1]^T = C \left( R_k + \frac{1}{d}\, t_k n_v^T \right) C^{-1} [x_0, y_0, 1]^T, \tag{4} \]

where \(C\) is the camera intrinsic matrix, \(R_k\) the rotation matrix, \(d\) the scene depth, \(t_k\) the translation vector, and \(n_v\) the normal vector to the image plane. The relationship between the motion trajectory and the camera translation is shown in Figure 2, where the motion trajectory \(\Delta m_t\) in the image plane is computed as

\[ \Delta m_t = \frac{l_f}{d}\, \Delta t_c, \tag{5} \]

where \(l_f\) and \(\Delta t_c\) denote the focal length and the translation of the camera, respectively. If the scene depth is assumed to be much larger than the focal length, \(\Delta m_t\) can be neglected. For this reason, the camera translation does not affect the motion PSF under a large scene depth, and (4) simplifies to

\[ [x_k, y_k, 1]^T = C R_i C^{-1} [x_0, y_0, 1]^T. \tag{6} \]

The camera coordinate system is assumed to be aligned with the world coordinate system, whose origin lies on the optical axis of the camera. In this case, the camera matrix \(C\) is determined by the focal length \(l_f\) as

\[ C = \begin{bmatrix} l_f & 0 & 0 \\ 0 & l_f & 0 \\ 0 & 0 & 1 \end{bmatrix}. \tag{7} \]

Using the small-angle approximation [17] and assuming space-invariant motion blur, the rotation matrix is computed as

\[ R = \begin{bmatrix} 1 & 0 & \omega_{iy} \\ 0 & 1 & -\omega_{ix} \\ -\omega_{iy} & \omega_{ix} & 1 \end{bmatrix}, \tag{8} \]

where \(\omega_{ix}\) and \(\omega_{iy}\) represent the \(i\)th angular velocities around the \(x\) and \(y\) axes, respectively. Since \(l_f \tan\omega \approx l_f\, \omega\) for very small \(\omega\), the projection in (6) can be expressed, consistently with (8), as

\[ [x_k, y_k, 1]^T = \begin{bmatrix} 1 & 0 & l_f\, \omega_{iy} \\ 0 & 1 & -l_f\, \omega_{ix} \\ 0 & 0 & 1 \end{bmatrix} [x_0, y_0, 1]^T. \tag{9} \]

In this work, we use gyro data to estimate the angular velocities of the camera motion, as shown in Figure 3, and correspondingly compute the projected positions in the image plane. Under ideal conditions, the projected trajectory equals the PSF of the camera motion. In practice, however, gyro data are noisy; more specifically, noisy gyro data result in erroneous matching between the projected position in the image plane and the true PSF sample. For robust PSF estimation from noisy gyro data, we assume that each point on the projected trajectory has a Gaussian distribution, so that the projected trajectory becomes a sum of Gaussian distributions,

\[ h(m, n) = \frac{1}{K_G} \sum_{k=1}^{K} G(m - x_k,\, n - y_k), \tag{10} \]

where \(G\) is a two-dimensional Gaussian distribution and \(K_G\) is the normalization constant. As a result, the PSF of the camera motion becomes the accumulation of the trajectory reweighted by the Gaussian distribution, as shown in Figure 4(a). The Gaussian distribution is estimated by analyzing the gyro data of a fixed camera, as shown in Figure 4(b).
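As a concrete illustration of (9) and (10), here is a minimal sketch in Python/NumPy. It is not the authors' implementation: the function name, the array layout, the integration of gyro rates into angles, and the centering of the trajectory in the kernel support are assumptions made for the example, and the Gaussian width sigma stands in for the fixed-camera noise fit of Figure 4(b).

    import numpy as np

    def estimate_motion_psf(gyro_rates, dt, focal_length, psf_size=65, sigma=1.0):
        """Accumulate a motion-blur PSF from gyro readings, following (9)-(10).

        gyro_rates   -- (N, 2) angular velocities (rad/s) about the x and y axes,
                        sampled over the exposure (hypothetical input layout)
        dt           -- gyro sampling interval in seconds
        focal_length -- focal length l_f, expressed in pixels
        sigma        -- width of the reweighting Gaussian; in the paper this is
                        estimated from fixed-camera gyro data (Figure 4(b))
        """
        # Integrate angular rates into cumulative rotation angles.
        angles = np.cumsum(gyro_rates * dt, axis=0)   # columns: [theta_x, theta_y]

        # Eq. (9): project each rotation onto the image plane,
        # (x_k, y_k) = (l_f * theta_y, -l_f * theta_x), relative to (x_0, y_0).
        xs = focal_length * angles[:, 1]
        ys = -focal_length * angles[:, 0]

        # Center the trajectory inside the PSF support (a choice of ours).
        xs = xs - xs.mean() + psf_size // 2
        ys = ys - ys.mean() + psf_size // 2

        # Eq. (10): place a 2-D Gaussian at every projected sample, so the PSF
        # is the Gaussian-reweighted accumulation of the trajectory.
        m, n = np.meshgrid(np.arange(psf_size), np.arange(psf_size), indexing="ij")
        psf = np.zeros((psf_size, psf_size))
        for xk, yk in zip(xs, ys):
            psf += np.exp(-((m - yk) ** 2 + (n - xk) ** 2) / (2.0 * sigma ** 2))

        return psf / psf.sum()                        # the 1/K_G normalization

A call such as estimate_motion_psf(rates, dt=1/200, focal_length=1400.0) would yield a normalized kernel suitable for a non-blind deconvolution step; the paper's adaptive restoration with the spatially varying activity map is a separate stage not sketched here.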

