Example-based Super Resolution Text Image Reconstruction Using Image Observation Model 


Vol. 17, No. 4, pp. 295-302, Aug. 2010
DOI: 10.3745/KIPSTB.2010.17.4.295


  Abstract

Example-based super resolution (EBSR) reconstructs high-resolution images by learning patch-wise correspondences between high-resolution and low-resolution training images, and it can produce a high-resolution image from a single low-resolution input. However, when it is applied to a text image whose font type and size differ from those of the training images, it often produces significant noise. The primary reason is that, in the patch matching step of the reconstruction process, input patches can be matched to inappropriate high-resolution patches in the patch dictionary. In this paper, we propose a new patch matching method to overcome this problem. Using an image observation model, it preserves the correlation between the input and the output images and thereby effectively suppresses the spurious noise caused by inappropriately matched patches. This not only improves the quality of the output image but also allows the system to use a large dictionary containing a variety of font types and sizes, which significantly improves adaptability to variation in font type and size. In experiments, the proposed method outperformed conventional methods in the reconstruction of multi-font and multi-size text images. Moreover, it improved recognition accuracy from 88.58% to 93.54%, which confirms the practical effect of the proposed method on recognition performance.
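To make the patch-matching idea concrete, the following is a minimal, illustrative sketch of example-based super resolution in which candidate high-resolution patches are scored through an assumed observation model (box-filter downsampling) so that only patches consistent with the observed low-resolution input are selected. The patch size, scale factor, degradation model, and all function names here are assumptions for demonstration, not the authors' exact algorithm or dictionary construction.

```python
# Illustrative sketch: example-based SR with an observation-model
# consistency check in the patch matching step (assumed parameters).
import numpy as np

PATCH = 4   # low-resolution patch size (assumed)
SCALE = 2   # magnification factor (assumed)

def degrade(hr_patch):
    """Assumed image observation model: 2x2 box averaging standing in
    for the blur + downsampling that maps an HR patch to an LR patch."""
    h, w = hr_patch.shape
    return hr_patch.reshape(h // SCALE, SCALE, w // SCALE, SCALE).mean(axis=(1, 3))

def build_dictionary(hr_images):
    """Collect (LR patch, HR patch) pairs from high-resolution training images."""
    lr_patches, hr_patches = [], []
    hp = PATCH * SCALE
    for img in hr_images:
        for y in range(0, img.shape[0] - hp + 1, hp):
            for x in range(0, img.shape[1] - hp + 1, hp):
                hr = img[y:y + hp, x:x + hp]
                lr_patches.append(degrade(hr).ravel())
                hr_patches.append(hr)
    return np.array(lr_patches), hr_patches

def reconstruct(lr_image, lr_dict, hr_dict):
    """For each input LR patch, select the dictionary HR patch whose
    degraded version best matches the observed input, which suppresses
    HR patches that are inconsistent with the low-resolution observation."""
    H, W = lr_image.shape
    out = np.zeros((H * SCALE, W * SCALE))
    for y in range(0, H - PATCH + 1, PATCH):
        for x in range(0, W - PATCH + 1, PATCH):
            lr = lr_image[y:y + PATCH, x:x + PATCH].ravel()
            # Distance is measured in the LR domain, i.e. between the input
            # patch and the degraded dictionary HR patches.
            best = int(np.argmin(np.sum((lr_dict - lr) ** 2, axis=1)))
            out[y * SCALE:(y + PATCH) * SCALE,
                x * SCALE:(x + PATCH) * SCALE] = hr_dict[best]
    return out
```

In this sketch, enlarging the dictionary with patches from many font types and sizes does not destabilize the result, because the observation-model check discards candidates whose degraded appearance disagrees with the input; this mirrors the role the paper assigns to the image observation model.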



  Cite this article

[IEEE Style]

G. R. Park and I. J. Kim, "Example-based Super Resolution Text Image Reconstruction Using Image Observation Model," The KIPS Transactions: Part B, vol. 17, no. 4, pp. 295-302, 2010. DOI: 10.3745/KIPSTB.2010.17.4.295.

[ACM Style]

Gyu Ro Park and In Jung Kim. 2010. Example-based Super Resolution Text Image Reconstruction Using Image Observation Model. The KIPS Transactions: Part B, 17, 4 (2010), 295-302. DOI: 10.3745/KIPSTB.2010.17.4.295.