AHEAD OF PRINT
Journal details
License
Format
Journal
eISSN
2444-8656
First published
01 Jan 2016
Publication frequency
2 times per year
Languages
English
Access type: Open Access

Optimization of Color Matching Technology in Cultural Industry by Fractional Differential Equations

Published online: 15 Jul 2022
Volume & Issue: AHEAD OF PRINT
Pages: -
Received: 13 Jan 2022
Accepted: 31 Mar 2022
Abstract

This article combines fractional differential theory with the total variation method and applies it to image color matching in the cultural industry. We propose a new image color-matching denoising model based on fractional partial differential equations. The model achieves simultaneous denoising in the time direction and the spatial plane. Experiments show that the fractional partial differential equation method has advantages over integer-order partial differential equations in denoising and in reducing staircase effects. It can effectively improve the contrast and clarity of image color matching in the cultural industry.

Keywords

MSC 2010

Introduction

Under weather conditions such as rain, fog, and snow, atmospheric scattering is severe, so acquired natural-scene images have poor color and contrast. This is not conducive to the extraction of image features. Most vision applications, including surveillance, tracking, intelligent navigation, and intelligent vehicles, need to extract image features fully [1]. This brings great difficulties to the normal operation of outdoor machine vision systems. Therefore, effectively removing weather effects from images is of great significance for improving the reliability and robustness of visual systems.

In recent years, defogging restoration of natural-scene images has gradually become a research hotspot in image processing and computer vision. Current restoration algorithms fall mainly into two categories. The first is based on image enhancement, for example the histogram equalization technique based on a moving template in the literature [2]. Since image quality degrades exponentially with the distance from the scene point to the imaging sensor, enhancement techniques that assume a constant scene depth cannot defog and restore the fogged image well. If the precise scene depth and atmospheric condition information were known, the color and contrast of the ideal image could easily be restored. However, degraded images acquired under realistic conditions carry no additional calibration information about depth of field or atmospheric conditions. This lack of information brings great uncertainty to the restoration work: defogging image recovery is an ill-posed inverse problem. In this paper, with the help of the physical model of fog formation, energy optimization models for global and local defogging of outdoor images are established [3].

Analysis of color and contrast degradation in foggy weather

This section uses two atmospheric scattering physical models to analyze the color and contrast of foggy images. Both models will be used for defogging and restoring degraded images, and they form the basis of the algorithm in this article. The color of a scene point captured by a color camera in foggy weather is a linear combination of the sky color and the scene point's own color. Its mathematical expression is

$$E = pD + qA, \quad p = I e^{-\beta d}, \quad q = E_\infty\left(1 - e^{-\beta d}\right) \tag{1}$$

where E is the color vector (R, G, B)^T of the scene point observed on a foggy day, I is the color of the scene point under good weather conditions, and the unit vector A indicates the direction of the sky color in the R-G-B color space; βd is called the optical depth [4]. This model assumes that the atmospheric scattering coefficient β is not selective with respect to the wavelength of light, i.e., visible light of any wavelength is scattered equally. The model can be used for the restoration of color vector images. The monochromatic atmospheric scattering model is

$$E = I e^{-\beta d} + E_\infty\left(1 - e^{-\beta d}\right) \tag{2}$$

where E∞ is the brightness of the sky, I is the brightness of the scene point under good weather conditions, and β, d have the same meaning as in the color model. This model can be used to restore grayscale images. For convenience of description, we make the following definitions: for a pixel p(x, y), its gray values on sunny and foggy days are I(x, y) and E(x, y), respectively. The neighborhood of a point p(x, y) is denoted N_R(p) = {(i, j) : |x − i| ≤ R, |y − j| ≤ R}, where R is a parameter that determines the size of the neighborhood [5]. We now analyze the color and contrast of foggy scene points based on the atmospheric scattering model. From the physical model of atmospheric scattering we know:

$$I = \left[E - E_\infty\left(1 - e^{-\beta d}\right)\right] e^{\beta d}$$
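The paper gives no code; as an illustrative sketch of the monochromatic model (2), the following Python function (its name and the sample brightness values are our own assumptions, not the authors') shows how a clear-day brightness is pulled toward the sky brightness as the optical depth βd grows:

```python
import numpy as np

def fog_image(I, E_inf, beta, d):
    """Monochromatic atmospheric scattering model:
    E = I * exp(-beta * d) + E_inf * (1 - exp(-beta * d))."""
    t = np.exp(-beta * np.asarray(d, dtype=float))   # transmission e^{-beta d}
    return np.asarray(I, dtype=float) * t + E_inf * (1.0 - t)

# A scene point of clear-day brightness 200 seen through fog of optical
# depth beta*d = 1 (sky brightness 255) is pulled toward the sky value.
foggy = fog_image(200.0, 255.0, beta=0.5, d=2.0)
```

At d = 0 the model returns the clear-day brightness unchanged, and as d grows the output tends to the sky brightness E∞, matching the limits of equation (2).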
It can be seen that the color degradation of a scene point is exponentially related to the scene depth d at that point. We take N₁(p) as an example to analyze the change in contrast, assuming that the optical depths of the scene points in the neighborhood are all the same. The contrast cc of the area N₁(p) on a sunny day is defined as

$$cc = \sqrt{\frac{\sum_{(i,j)\in N_1(p)} \left(I(i,j) - \bar I\right)^2}{\left|N_1(p)\right|}}, \quad \bar I = \frac{\sum_{(i,j)\in N_1(p)} I(i,j)}{\left|N_1(p)\right|}$$

where |N₁(p)| is the cardinality of the set N₁(p), i.e., the number of elements it contains. The contrast fc of the area N₁(p) on a foggy day is

$$fc = \sqrt{\frac{\sum_{(i,j)\in N_1(p)} \left(E(i,j) - \bar E\right)^2}{\left|N_1(p)\right|}}, \quad \bar E = \frac{\sum_{(i,j)\in N_1(p)} E(i,j)}{\left|N_1(p)\right|}$$

From equation (2) it can be proved that the two obey

$$fc = e^{-\beta d}\sqrt{\frac{\sum_{(i,j)\in N_1(p)} \left(I(i,j) - \bar I\right)^2}{\left|N_1(p)\right|}} = e^{-\beta d}\,cc$$
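The relation fc = e^{−βd}·cc follows because fogging a neighborhood of constant optical depth is an affine map of gray values, which scales the RMS deviation by e^{−βd}. This can be checked numerically with a small sketch (the patch values and parameters below are arbitrary examples, not from the paper):

```python
import numpy as np

def rms_contrast(patch):
    """RMS contrast of a neighborhood: sqrt(mean((v - mean(v))^2))."""
    v = np.asarray(patch, dtype=float)
    return float(np.sqrt(np.mean((v - v.mean()) ** 2)))

rng = np.random.default_rng(0)
I_patch = rng.uniform(50.0, 200.0, size=(5, 5))   # clear-day neighborhood N1(p)
beta_d = 1.2                                      # shared optical depth beta*d
E_inf = 255.0
# Fog the patch via E = I e^{-beta d} + E_inf (1 - e^{-beta d})
E_patch = I_patch * np.exp(-beta_d) + E_inf * (1.0 - np.exp(-beta_d))

cc = rms_contrast(I_patch)   # sunny-day contrast
fc = rms_contrast(E_patch)   # foggy-day contrast: equals e^{-beta d} * cc
```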

On a foggy day, color and contrast degradation are exponentially related to the depth of field of the scene point. Therefore, traditional color and contrast enhancement methods that assume a uniform scene depth do not fully use this prior knowledge of degradation and cannot remove bad-weather effects from the degraded image [6]. Because the amount of information available for recovery is insufficient, the problem carries uncertainty. Based on the restoration algorithm built on the atmospheric scattering physical model described above, we use simple additional information to construct a partial differential equation relating scene depth and image gradient to restore the ideal image.

PDE-based dehazing and restoration of degraded images

In this section, an energy functional involving scene depth and image gradient is constructed based on the physical model of fog formation, and defogging image restoration is formalized as minimizing this energy functional. We propose global and local dehazing models based on partial differential equations [7]. A simple interactive operation by the user estimates the depth of field of scene points and obtains the sky color, which eliminates the uncertainty in the restoration and realizes defogging recovery from only a single degraded image. The interaction is performed through a visual interface, which makes the algorithm simple and easy to operate. At the same time, local corrections can be made to improve the recovery result of the global defogging model [8].

Global defogging recovery model

This section formalizes the defogging recovery problem as solving a partial differential equation relating scene depth and image gradient. Taking the gradient of equation (2) gives

$$\nabla E = \nabla I\, e^{-\beta d} - \beta I e^{-\beta d}\nabla d + \beta E_\infty e^{-\beta d}\nabla d$$

which simplifies to

$$\nabla I = \nabla E\, e^{\beta d} + \beta\left(I - E_\infty\right)\nabla d \tag{3}$$

For the three color channels R, G, B the gradient ∇I = (∇I_R, ∇I_G, ∇I_B)^T is computed separately. Where the depth of field of the scene changes smoothly, the term β(I − E∞)∇d is small compared with ∇E e^{βd}. Therefore, the gradient field of the ideal image can be approximated as

$$\nabla I \approx \nabla E\, e^{\beta d} \tag{4}$$

and the ideal image is recovered by minimizing the energy functional

$$I^* = \arg\min_I \iint_\Omega \left\|\nabla I - \nabla E\, e^{\beta d}\right\|^2 dp(x, y) \tag{5}$$

with the boundary condition I|∂Ω = [E − E∞(1 − e^{−βd})]e^{βd}. The corresponding Euler–Lagrange equation yields the partial differential equation

$$\begin{cases}\Delta I = \operatorname{div}\left(\nabla E\, e^{\beta d}\right)\\ I|_{\partial\Omega} = \left[E - E_\infty\left(1 - e^{-\beta d}\right)\right] e^{\beta d}\end{cases} \tag{6}$$

where Δ = ∂²/∂x² + ∂²/∂y² is the Laplacian operator and div is the divergence operator. Equation (6) is the global defogging recovery model; solving it under the boundary conditions restores the original image I globally. We use the finite difference method to solve this partial differential equation numerically.
The Gauss–Seidel successive over-relaxation iteration is

$$\begin{cases}\bar I = \frac{1}{4}\left(I_{i+1,j}^{(k)} + I_{i,j+1}^{(k)} + I_{i-1,j}^{(k+1)} + I_{i,j-1}^{(k+1)} - \left[\operatorname{div}\left(\nabla E\, e^{\beta d}\right)\right]_{(i,j)}\right), & (i, j) \in \Omega\\ I_{i,j}^{(k+1)} = w\bar I + (1 - w) I_{i,j}^{(k)}\end{cases}$$

The optimal relaxation factor is

$$w = \frac{2}{1 + \sqrt{1 - \left(\frac{\cos\frac{\pi}{m} + \cos\frac{\pi}{n}}{2}\right)^2}}$$

where m, n are the width and height of the image I, respectively. The discrete difference form of [div(∇E e^{βd})]_{(i,j)} is

$$\left[\operatorname{div}\left(\nabla E\, e^{\beta d}\right)\right]_{(i,j)} \approx e^{\beta d(i+1,j)}E(i+1,j) + e^{\beta d(i-1,j)}E(i-1,j) + e^{\beta d(i,j+1)}E(i,j+1) + e^{\beta d(i,j-1)}E(i,j-1) - 4 e^{\beta d(i,j)}E(i,j)$$

The boundary condition is discretized as I(i, j)|∂Ω = [E(i, j) − E∞(1 − e^{−βd(i,j)})]e^{βd(i,j)}, (i, j) ∈ ∂Ω.
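The SOR scheme above can be sketched in a few lines of Python. This is a simplified illustration, not the authors' implementation: it solves ΔI = rhs on a unit grid with fixed Dirichlet boundary values, where in the paper rhs would be the discrete div(∇E e^{βd}); the sanity check uses a linear (harmonic) boundary function so the exact solution is known.

```python
import numpy as np

def sor_poisson(rhs, boundary, iters=800, w=None):
    """Solve Delta(I) = rhs on the grid interior with Dirichlet boundary
    values, using Gauss-Seidel successive over-relaxation (5-point stencil)."""
    m, n = rhs.shape
    if w is None:  # optimal relaxation factor from the formula above
        rho = (np.cos(np.pi / m) + np.cos(np.pi / n)) / 2.0
        w = 2.0 / (1.0 + np.sqrt(1.0 - rho ** 2))
    I = boundary.astype(float).copy()  # boundary fixed, interior = initial guess
    for _ in range(iters):
        for i in range(1, m - 1):
            for j in range(1, n - 1):
                I_bar = 0.25 * (I[i + 1, j] + I[i, j + 1]
                                + I[i - 1, j] + I[i, j - 1] - rhs[i, j])
                I[i, j] = w * I_bar + (1.0 - w) * I[i, j]
    return I

# Sanity check: with rhs = 0, the Laplace solution matching a linear
# boundary function must reproduce that linear function everywhere.
m = n = 12
y, x = np.mgrid[0:m, 0:n]
exact = (x + 2.0 * y).astype(float)
b = np.zeros((m, n))
b[0, :], b[-1, :], b[:, 0], b[:, -1] = exact[0, :], exact[-1, :], exact[:, 0], exact[:, -1]
I_sol = sor_poisson(np.zeros((m, n)), b)
```

For image-sized grids the per-pixel Python loop would be replaced by a vectorized or compiled sweep, but the update rule is exactly the two-line iteration given above.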

Elimination of uncertainty in defogging restoration

The image defogging restoration problem has been transformed into solving the partial differential equation in formula (6). However, the depth of field d and the sky color E∞ of the scene points in the equation are still unknown, which brings uncertainty to the recovery work. Sensors with a limited dynamic range (8 bits per pixel) are not sensitive to small changes in atmospheric conditions, so effective restoration of degraded images does not require very accurate scene depth information. Under normal circumstances, the ideal image can be effectively restored using only the changing trend of the scene depth [9]. The user performs the interactive operation through a visual interface, which is simple and easy to operate. The specific steps are as follows:

1. Select a sky area in the degraded image to obtain the brightness of the sky. If the degraded image is a color vector image, the sky color vector and color direction are obtained instead.

2. Select the approximate position of the vanishing point of the degraded image along the direction of increasing depth of field. The depth of field of a scene point is inversely related to the image-pixel distance from the scene point to the vanishing point. Then input approximate maximum and minimum scene depths dmax and dmin and interpolate the depth of field of the scene points. Either linear or non-linear interpolation can be chosen; this article adopts linear interpolation, with the formula d = dmax − a(dmax − dmin), where a ∈ [0, 1] is the normalized image-pixel distance from the pixel to the vanishing point. When a = 0, d = dmax; when a = 1, d = dmin.
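The interpolation step can be sketched as follows (a minimal illustration; the function name, the Euclidean distance metric, and the normalization by the largest distance in the image are our assumptions about details the paper leaves open):

```python
import numpy as np

def interpolate_depth(h, w, vanishing_point, d_min, d_max):
    """Linear scene-depth interpolation d = d_max - a * (d_max - d_min),
    where a in [0, 1] is the pixel distance to the vanishing point,
    normalised by the largest such distance in the image."""
    ys, xs = np.mgrid[0:h, 0:w]
    vy, vx = vanishing_point
    dist = np.hypot(ys - vy, xs - vx)
    a = dist / dist.max()          # a = 0 at the vanishing point
    return d_max - a * (d_max - d_min)

# Vanishing point at the top-centre of a 10x10 image
depth = interpolate_depth(10, 10, vanishing_point=(0, 5), d_min=1.0, d_max=50.0)
```

The depth map equals dmax at the vanishing point (a = 0) and falls to dmin at the pixel farthest from it (a = 1), matching the boundary cases stated above.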

So far, the depth of field d and sky color E∞ of each pixel have been obtained. Usually, within a few kilometers, the atmospheric conditions can be considered homogeneous; that is, the β value of each pixel in the image remains unchanged. The scene optical depths of different degraded images are shown in Figure 1. This scene depth and atmospheric condition information eliminates the uncertainty in the restoration work. The gradual change of color from blue to deep red in Figure 1 represents the gradual increase of the optical depth of the scene points.

Figure 1

The optical depth of the scene with different degraded images generated by interpolation

Solving equation (6) under the boundary conditions recovers the ideal image I globally. Different β values correspond to different fog concentrations, which degrade the image to different degrees. By adjusting the optical depth βd of each pixel in the image and re-solving equation (6), the recovery results can be corrected as a whole, improving the dehazing effect (see Figure 2).

Figure 2

Global defogging restoration of degraded images

The global defogging restoration model works well for images with small changes in scene depth. When the scene depth changes greatly, equation (4) is no longer a good approximation to the ideal image gradient, and the scene depth gradient term β(I − E∞)∇d cannot be ignored.

Local dehazing recovery model

In the global defogging model, the term β(I − E∞)∇d in equation (3) is assumed to be zero. When the depth of the scene points changes greatly, this depth-of-field gradient term cannot be ignored [10], and the gradient of the ideal image should be taken as ∇I = ∇E e^{βd} + β(I − E∞)∇d. Different scene optical depths βd estimated by the global defogging model produce different restoration effects. When the estimated optical depth βd is less than the true optical depth (βd)*, the image cannot be completely defogged and its contrast remains low; overestimation causes color distortion in the restored image.

To improve the global defogging result, the user can select a local area Ω_L with a poor recovery effect and perform local defogging recovery there. Let the boundary of the local area be ∂Ω_L. The variational problem of the local defogging model is defined as [11]

$$I^* = \arg\min_I \iint_{\Omega_L} \left\|\nabla I - \nabla E\, e^{\beta d} - \beta\left(I - E_\infty\right)\nabla d\right\|^2 dp(x, y) \tag{7}$$

where the integration area is now Ω_L and the boundary condition is I|∂Ω_L = I_g, with I_g the current global restoration result on the boundary ∂Ω_L. Usually the local area selected by the user is relatively small, so solving the equation realizes local defogging recovery quickly. Another important feature of the local defogging model is that the user can decide how to adjust the optical depth βd of the image according to the result of the global restoration.

The image contrast degradation in fog is exponentially related to the optical depth of the scene point. Therefore, the optical depth of a local area can be increased when the contrast of the globally restored image is still low, and reduced otherwise. In this paper, the optical depth βd (abbreviated O_p) of each scene point p in the local area is corrected by

$$O_p^\prime = \left[1 + \lambda\,\exp\left(-\frac{\left\|p - p_0 + k\right\|^2}{2\sigma^2}\right)\right] O_p \tag{8}$$

where p₀ is the center pixel coordinate of the local area. According to formula (8), the optical depth of each pixel p in the area is adjusted from O_p to O′_p.
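The Gaussian correction of formula (8) can be sketched as follows (an illustrative implementation; the function name and the test values are assumptions, and the offset k is kept as a scalar parameter exactly as it appears in the formula):

```python
import numpy as np

def correct_optical_depth(O, center, lam, sigma, k=0.0):
    """Local correction O'_p = [1 + lam * exp(-||p - p0 + k||^2 / (2 sigma^2))] * O_p.
    lam > 0 raises the optical-depth estimate near the centre p0
    (stronger defogging there); lam < 0 lowers it."""
    h, w = O.shape
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = center
    r2 = (ys - cy + k) ** 2 + (xs - cx + k) ** 2   # squared offset distance
    gain = 1.0 + lam * np.exp(-r2 / (2.0 * sigma ** 2))
    return gain * O

O = np.ones((9, 9))   # uniform optical-depth map beta*d as a toy input
O2 = correct_optical_depth(O, center=(4, 4), lam=0.5, sigma=2.0)
```

The gain peaks at 1 + λ at the center p₀ and decays to 1 away from it, so the correction blends smoothly into the untouched surroundings.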

Simulation results and analysis

The specific process of the image defogging restoration algorithm (see Figure 3) is as follows:

1. Obtain the scene depth and sky color of the degraded image.
2. Apply the global model to dehaze and restore.
3. Correct the optical depth of areas where the global restoration effect is not satisfactory and solve the local defogging model [12].

Figure 3

Image defogging recovery process

The following applies the algorithm of this paper to restore a degraded foggy image; the experimental results are shown in Figure 4. By gradually modifying the β value, the defogging recovery results can be revised as a whole, improving the dehazing effect.

Figure 4

Global defogging restoration of degraded images

Figure 4 shows that if the local defogging model continues to be used, the contrast of the local area of the image can be further improved.

Conclusion

This paper formalizes defogging recovery as solving partial differential equations relating scene depth and image gradient, achieving defogging recovery from only one image. Local corrections can be made to improve the defogging effect and are smoothly integrated into the global defogging recovery result until the application requirements are met. In this way, defogging recovery of a single degraded image is achieved more effectively.


References

Du, Q., Li, Y. & Pan, L. Wheelchair Size and Material Application in Human-machine System Model. Applied Mathematics and Nonlinear Sciences, 2021, 6(2): 7–18. doi:10.2478/amns.2021.1.00009

Çitil, H. Investigation of A Fuzzy Problem by the Fuzzy Laplace Transform. Applied Mathematics and Nonlinear Sciences, 2019, 4(2): 407–416. doi:10.2478/AMNS.2019.2.00039

Dua, M., Suthar, A., Garg, A. & Garg, V. An ILM-cosine transform-based improved approach to image encryption. Complex & Intelligent Systems, 2021, 7(1): 327–343. doi:10.1007/s40747-020-00201-z

Pankaj, S. & Dua, M. A novel ToCC map and two-level scrambling-based medical image encryption technique. Network Modeling Analysis in Health Informatics and Bioinformatics, 2021, 10(1): 1–19. doi:10.1007/s13721-021-00324-4

Ruthotto, L. & Haber, E. Deep neural networks motivated by partial differential equations. Journal of Mathematical Imaging and Vision, 2020, 62(3): 352–364. doi:10.1007/s10851-019-00903-1

Kwon, H., Cordaro, A., Sounas, D., Polman, A. & Alù, A. Dual-polarization analog 2D image processing with nonlocal metasurfaces. ACS Photonics, 2020, 7(7): 1799–1805. doi:10.1021/acsphotonics.0c00473

Sun, H., Hou, M., Yang, Y., Zhang, T., Weng, F. & Han, F. Solving partial differential equation based on Bernstein neural network and extreme learning machine algorithm. Neural Processing Letters, 2019, 50(2): 1153–1172. doi:10.1007/s11063-018-9911-8

Monga, V., Li, Y. & Eldar, Y. C. Algorithm unrolling: Interpretable, efficient deep learning for signal and image processing. IEEE Signal Processing Magazine, 2021, 38(2): 18–44. doi:10.1109/MSP.2020.3016905

Prasath, S. & Thanh, D. N. Structure tensor adaptive total variation for image restoration. Turkish Journal of Electrical Engineering & Computer Sciences, 2019, 27(2): 1147–1156. doi:10.3906/elk-1802-76

Al-Zhour, Z., Al-Mutairi, N., Alrawajeh, F. & Alkhasawneh, R. Series solutions for the Laguerre and Lane-Emden fractional differential equations in the sense of conformable fractional derivative. Alexandria Engineering Journal, 2019, 58(4): 1413–1420. doi:10.1016/j.aej.2019.11.012

Scheerlinck, C., Barnes, N. & Mahony, R. Asynchronous spatial image convolutions for event cameras. IEEE Robotics and Automation Letters, 2019, 4(2): 816–822. doi:10.1109/LRA.2019.2893427

Adam, T. & Paramesran, R. Image denoising using combined higher order non-convex total variation with overlapping group sparsity. Multidimensional Systems and Signal Processing, 2019, 30(1): 503–527. doi:10.1007/s11045-018-0567-3
