No-reference blur assessment method based on gradient and saliency

Journal of Southeast University (English Edition) [ISSN: 1003-7985; CN: 32-1325/N]

Volume:
37
Issue:
2
Page:
184-191
Research Field:
Information and Communication Engineering
Publishing date:
2021-06-20

Info

Title:
No-reference blur assessment method based on gradient and saliency
Author(s):
Jia Huizhen1, Lei Chucong1, Wang Tonghan1, Li Tan1, Wu Jiasong2, Li Guang1, He Jianfeng1, Shu Huazhong2
1Jiangxi Engineering Laboratory on Radioactive Geoscience and Big Data Technology, East China University of Technology, Nanchang 330013, China
2Laboratory of Image Science and Technology, Southeast University, Nanjing 210096, China
Keywords:
no-reference image quality assessment; reblurring effect; gradient similarity; saliency
CLC number:
TN911.73
DOI:
10.3969/j.issn.1003-7985.2021.02.008
Abstract:
To evaluate the quality of blurred images effectively, this study proposes a no-reference blur assessment method based on gradient distortion measurement and salient region maps. First, a Gaussian low-pass filter is used to construct a reference image by re-blurring the given image, and the gradient similarity between the image and this re-blurred reference is computed to obtain a gradient distortion measurement map that reflects even small changes in textures and details. Second, a saliency model is used to compute image saliency: an adaptive method determines a salient threshold specific to the blurred image, and the image is binarized with this threshold to yield the salient region map. Block-wise visual saliency then serves as the weight for pooling the gradient distortion map into the final quality score. Experimental results on the Laboratory for Image and Video Engineering (LIVE) database, the categorical image quality (CSIQ) database, and the camera image database (CID2013) demonstrate that the proposed method correlates well with human judgment while keeping computational complexity relatively low.
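
As a rough illustration of the pipeline summarized above, the following Python sketch re-blurs an image with a Gaussian low-pass filter, measures the gradient similarity between the input and its re-blurred copy, and pools the result with a saliency-based block weight. The parameter values (sigma, block size, the stabilizing constant c), the local-contrast stand-in for the saliency model, and the adaptive global-mean threshold are illustrative assumptions, not the authors' settings.

import numpy as np
from scipy.ndimage import gaussian_filter, sobel, uniform_filter

def gradient_magnitude(img):
    # Gradient magnitude from horizontal and vertical Sobel responses.
    gx = sobel(img, axis=1, mode="reflect")
    gy = sobel(img, axis=0, mode="reflect")
    return np.hypot(gx, gy)

def blur_score(img, sigma=3.0, block=16, c=1e-4):
    # Higher scores mean the re-blurred copy resembles the input,
    # i.e. the input image is already blurry.
    img = np.asarray(img, dtype=np.float64)
    # Step 1: pseudo-reference obtained by Gaussian low-pass filtering (re-blurring).
    ref = gaussian_filter(img, sigma)
    # Step 2: point-wise gradient similarity between the image and its re-blurred copy.
    g1, g2 = gradient_magnitude(img), gradient_magnitude(ref)
    sim = (2.0 * g1 * g2 + c) / (g1**2 + g2**2 + c)
    # Step 3: stand-in saliency from local contrast, binarized with an adaptive
    # (global-mean) threshold; the paper itself uses a dedicated saliency model.
    sal = np.abs(img - uniform_filter(img, size=block))
    sal_mask = (sal > sal.mean()).astype(np.float64)
    # Step 4: block-wise pooling, weighting each block's mean similarity by the
    # fraction of salient pixels it contains.
    h = img.shape[0] - img.shape[0] % block
    w = img.shape[1] - img.shape[1] % block
    sim_b = sim[:h, :w].reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    wgt_b = sal_mask[:h, :w].reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    return float((sim_b * wgt_b).sum() / (wgt_b.sum() + 1e-12))

On progressively blurred versions of the same grayscale image, blur_score should increase, since re-blurring changes an already blurry image very little; this monotonic behavior is what the re-blurring idea relies on.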

Memo

Biography: Jia Huizhen (1983—), female, Ph.D., lecturer, hzjianlg@126.com.
Foundation items: The National Natural Science Foundation of China (No. 61762004, 61762005), the National Key Research and Development Program (No. 2018YFB1702700), the Science and Technology Project Funded by the Education Department of Jiangxi Province, China (No. GJJ200702, GJJ200746), the Open Fund Project of Jiangxi Engineering Laboratory on Radioactive Geoscience and Big Data Technology (No. JETRCNGDSS201901, JELRGBDT202001, JELRGBDT202003).
Citation: Jia Huizhen, Lei Chucong, Wang Tonghan, et al. No-reference blur assessment method based on gradient and saliency [J]. Journal of Southeast University (English Edition), 2021, 37(2): 184-191. DOI: 10.3969/j.issn.1003-7985.2021.02.008.
Last Update: 2021-06-20