
1 Introduction

Among the many image restoration methods, patch-based approaches [1,2,3] have proved effective, for example those built on log likelihood (log L) probability [4] and maximum a posteriori (MAP) estimation [5]. The Markov random field (MRF) [6] is usually applied to the whole image directly, in which case the log L probability and the MAP objective are difficult to compute accurately. General optimization frameworks based on patch priors have therefore been widely adopted, the most representative being the Fields of Experts (FoE) framework [7].

Expected patch log likelihood (EPLL) [8] is another such optimization framework, using Gaussian mixture priors [9] learned with a Gaussian mixture model (GMM) [10, 11]. We observe that the GMM model uses a single global parameter \( \uplambda \) rather than locally adaptive parameters, so the denoising performance differs across image regions. To avoid this effect, we propose a novel GMM model with local constraints. We use a set of parameters \( \uplambda_{i} \), and a \( \uplambda \) that satisfies the constraint for one region need not serve the others. Certainly, a different choice of the regularization parameter [12] may give better results in some region, but no single parameter can improve all regions at the same time. In the Lagrange multiplier formulation, we need several Lagrange multipliers to impose locally that the noise variance equals \( \updelta^{2} \). The parameter therefore becomes spatially adaptive.

2 Proposed Method

2.1 Background of Expected Patch Log Likelihood

Expected patch log likelihood (EPLL) is a general optimization framework based on patch priors for image restoration. Given a natural image \( u \) and a known prior \( p \), EPLL is defined as:

$$ EPLL\,(u) = \log \,p(u) = \sum\limits_{i} {\log \,p(P_{i} u)} $$
(1)

where \( P_{i} \) denotes the operator extracting image patch \( u_{i} \) from image \( u \). The patch prior used in the joint conditional density of the EPLL is a Gaussian mixture model (GMM) given by:

$$ p(u) = \prod\limits_{i = 1}^{N} {\sum\limits_{j = 1}^{K} {\pi_{j} N(u_{i} \,|\,\mu_{j} ,\Sigma _{j} )} } $$
(2)

where \( \pi_{j} \) are the mixing coefficients, and \( \mu_{j} \) and \( \Sigma_{j} \) are the corresponding means and covariance matrices.
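As a concrete illustration, the patch prior in Eq. (2) can be evaluated numerically. The sketch below (NumPy, with hypothetical array shapes) computes \( \log p \) for a batch of vectorized patches using the log-sum-exp trick for numerical stability; it is a minimal illustration, not the training code used in the paper.

```python
import numpy as np

def gmm_log_likelihood(patches, pis, mus, covs):
    """log p(u_i) = log sum_j pi_j N(u_i | mu_j, Sigma_j), per patch.

    patches: (n, d) vectorized patches; pis: (K,) mixing coefficients;
    mus: (K, d) means; covs: (K, d, d) covariance matrices.
    """
    n, d = patches.shape
    K = len(pis)
    comp = np.empty((n, K))
    for j in range(K):
        diff = patches - mus[j]
        _, logdet = np.linalg.slogdet(covs[j])
        # Mahalanobis term via a linear solve (avoids explicit inversion)
        maha = np.sum(diff * np.linalg.solve(covs[j], diff.T).T, axis=1)
        comp[:, j] = np.log(pis[j]) - 0.5 * (d * np.log(2 * np.pi) + logdet + maha)
    # log-sum-exp over the K mixture components
    m = comp.max(axis=1, keepdims=True)
    return (m + np.log(np.sum(np.exp(comp - m), axis=1, keepdims=True))).ravel()
```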

Given a noisy image \( u_{0} \), the degradation model can be expressed by the fidelity term \( \left\| {u - u_{0} } \right\|^{2} \). To achieve good restoration results, we should maximize the log likelihood (log L) probability of the image patches while keeping \( u \) and \( u_{0} \) as consistent as possible. The EPLL model based on prior \( p \) is therefore:

$$ \mathop {\hbox{min} }\limits_{u} \left\{ {\frac{\lambda }{2}\left\| {u - u_{0} } \right\|^{2} - \sum\limits_{i} {\log \,p(P_{i} } u)} \right\} $$
(3)

where \( \uplambda \) is a regularization parameter. The equation can be solved by half quadratic splitting [8], which introduces a set of auxiliary variables \( z_{i} \) and changes the cost function into the following form:

$$ \mathop {\hbox{min} }\limits_{{u,\{ z_{i} \} }} \left\{ {\frac{\lambda }{2}\left\| {u - u_{0} } \right\|^{2} + \sum\limits_{i} {\left\{ {\frac{\beta }{2}\left\| {P_{i} u - z_{i} } \right\|^{2} - \log \,p(z_{i} )} \right\}} } \right\} $$
(4)

where \( \upbeta \) is a penalty parameter, typically set large enough that the solution of (4) is close to that of (3). Formula (4) can then be minimized by alternately updating \( z_{i} \) and \( u \).
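To make the splitting concrete, here is a minimal toy sketch. It assumes single-pixel "patches" and a simple quadratic prior \( -\log p(z) = \left\| z \right\|^{2}/2 \) in place of the learned GMM, so that both subproblems have closed forms; \( \beta \) is increased over the outer sweeps, as is common with half quadratic splitting.

```python
import numpy as np

def hqs_denoise(u0, lam, betas):
    """Half quadratic splitting with a toy prior -log p(z) = ||z||^2 / 2.

    Patches are single pixels, so the z-step and u-step are both closed-form.
    betas: increasing penalty schedule for the splitting term.
    """
    u = u0.copy()
    for beta in betas:
        # z-step: min_z  (beta/2)(u - z)^2 + z^2/2   ->   z = beta/(beta+1) * u
        z = beta / (beta + 1.0) * u
        # u-step: min_u  (lam/2)(u - u0)^2 + (beta/2)(u - z)^2
        u = (lam * u0 + beta * z) / (lam + beta)
    return u
```

With a large fidelity weight `lam` the result stays close to `u0`, while the prior term shrinks values toward zero; sweeping `betas` upward progressively ties `z` to `u`, mirroring how the solution of (4) approaches that of (3).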

2.2 Proposed Method with Local Constraints

Suppose that \( \left\{ {O_{1} , \ldots ,O_{r} } \right\} \) is a partition of the image. Given \( \uplambda_{1} , \ldots ,\uplambda_{r} \), we consider the following question:

Do there exist values \( (\uplambda_{1} , \ldots ,\uplambda_{r} ) \) satisfying the local constraints [13, 14]

$$ \frac{1}{{|O_{i} |}}\left\| {u - u_{0} } \right\|_{O_{i}}^{2} = \delta^{2} ,\quad \forall i = 1, \ldots ,r. $$
(5)
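Checking constraint (5) for a candidate restoration amounts to computing a per-region mean squared residual. A small sketch (assuming the partition is encoded as an integer label image, a representation not specified in the paper):

```python
import numpy as np

def region_residuals(u, u0, labels):
    """Q_{O_i}(u) = (1/|O_i|) * ||u - u0||^2 restricted to region O_i.

    labels: integer array, same shape as the images, giving each pixel's region.
    Returns one mean squared residual per region.
    """
    r = labels.max() + 1
    return np.array([np.mean((u[labels == i] - u0[labels == i]) ** 2)
                     for i in range(r)])
```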

We propose to solve the following constrained problem:

$$ \begin{aligned} & \mathop {\hbox{max} }\limits_{u} \sum\limits_{i} {\log \,p(P_{i} u)} \\ & s.t.\;\left\| {u - u_{0} } \right\|_{O_{i}}^{2} = \delta^{2} \left| {O_{i} } \right|,\quad \forall i = 1, \ldots ,r \\ \end{aligned} $$
(6)

If this question is answered in the affirmative, the solution of (6) gives a solution of the problem:

$$ \quad \mathop {\hbox{min} }\limits_{u} \left\{ {\frac{\lambda (x)}{2}\left\| {u - u_{0} } \right\|^{2} - \sum\limits_{i} {\log \,p(P_{i} } u)} \right\} $$
(7)

where \( \uplambda (x) = \sum\nolimits_{i = 1}^{r} \uplambda_{i} \chi_{O_{i}} (x) \) and \( \chi_{O_{i}} \) is the indicator function of region \( O_{i} \). For simplicity we write \( \vec{\uplambda } = \left\{ \uplambda_{1} , \ldots ,\uplambda_{r} \right\} \), and \( \vec{\uplambda } \ge 0 \) if \( \uplambda_{i} \ge 0 \) for all \( i = 1, \ldots ,r \). The novel model that we propose is an extension of the GMM model in which the parameter \( \uplambda \) takes different values in different regions. To solve (7) we use the same numerical approach as for (3). Equation (7) is equivalently transformed into the following function:

$$ \mathop {\hbox{min} }\limits_{{u,\{ z_{i} \} }} \left\{ {\frac{\lambda (x)}{2}\left\| {u - u_{0} } \right\|^{2} + \sum\limits_{i} {\left\{ {\frac{\beta }{2}\left\| {R_{i} u - z_{i} } \right\|^{2} - \log \,p(z_{i} )} \right\}} } \right\} $$
(8)

To solve (8), we first choose the most likely Gaussian component \( j_{max} \) for each patch \( R_{i} u \). Equation (8) is then minimized by alternately updating \( z_{i} \) and \( u \):

$$ z_{i}^{n + 1} = (\Sigma _{{j_{\hbox{max} } }} + \frac{1}{\beta }I)^{ - 1} \cdot (\Sigma _{{j_{\hbox{max} } }} R_{i} u^{n} + \frac{1}{\beta }\mu_{{j_{\hbox{max} } }} ) $$
(9)
$$ u^{n + 1} = u^{n} + \Delta t[\lambda (x)(u_{0} - u^{n} ) - \sum\limits_{i} {\beta R_{i}^{T} } (R_{i} u^{n} - z_{i}^{n} )] $$
(10)

where \( I \) is the identity matrix and \( \Delta t \) is the time step. In practice, the parameters \( \lambda_{i} \) are updated with Uzawa's method [15]. In summary, the algorithm can be implemented as follows:

  1. Step 1.

    Input the corrupted image \( u_{0} \), the model parameters \( \beta ,\Delta t \), and the iteration stopping tolerance \( \varepsilon \);

  2. Step 2.

    Choose the most likely Gaussian component \( j_{max} \) for each patch \( R_{i} u \);

  3. Step 3.

    Initialize the values \( \lambda_{i} > 0 \) small enough that

    $$ Q_{O_{i}} \left( {u^{\lambda } } \right) = \frac{1}{{\left| {O_{i} } \right|}}\left\| {u^{\lambda } - u_{0} } \right\|_{O_{i}}^{2} > \delta^{2} ,\forall \,i = 1, \ldots ,r; $$
  4. Step 4.

    For each set of values \( \lambda_{i} > 0 \), alternately update (9) and (10) until the asymptotic state \( u^{\lambda } \) is reached;

  5. Step 5.

    For each \( i \in \left\{ {1, \ldots , r} \right\} \), recompute \( \uplambda_{i} = { \hbox{max} }(\uplambda_{i} + \rho (Q_{O_{i}} (u^{\lambda } ) - \delta^{2} ),0) \) (with \( \rho > 0 \) small enough);

  6. Step 6.

    Iterate Steps 4–5 until the \( \uplambda_{i} \) satisfy the stopping criterion.
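The steps above can be sketched end to end. The toy version below uses single-pixel patches and a single zero-mean Gaussian component of variance \( \sigma_{p}^{2} \) in place of the learned GMM (so update (9) reduces to a scalar shrinkage), and illustrates the alternation of (9)–(10) inside Uzawa updates of the \( \lambda_{i} \); the parameter values are illustrative, not those of the paper.

```python
import numpy as np

def local_epll_denoise(u0, labels, delta, sigma_p=1.0, beta=8.0,
                       dt=0.1, rho=0.5, n_uzawa=20, n_inner=30):
    """Toy version of Steps 1-6 with single-pixel patches.

    labels: integer region map encoding the partition {O_1, ..., O_r}.
    delta: target per-region noise standard deviation.
    """
    r = labels.max() + 1
    lam = np.full(r, 1e-3)                 # Step 3: small initial lambda_i
    u = u0.copy()
    for _ in range(n_uzawa):               # Steps 4-6
        lam_map = lam[labels]              # lambda(x), piecewise constant
        for _ in range(n_inner):           # Step 4: alternate (9) and (10)
            # Eq. (9), scalar case: (Sigma + I/beta)^{-1} Sigma * u
            z = (sigma_p ** 2 / (sigma_p ** 2 + 1.0 / beta)) * u
            # Eq. (10): explicit gradient step with spatial lambda(x)
            u = u + dt * (lam_map * (u0 - u) - beta * (u - z))
        for i in range(r):                 # Step 5: Uzawa multiplier update
            Q = np.mean((u[labels == i] - u0[labels == i]) ** 2)
            lam[i] = max(lam[i] + rho * (Q - delta ** 2), 0.0)
    return u, lam
```

Regions whose residual \( Q_{O_{i}} \) exceeds \( \delta^{2} \) see their \( \lambda_{i} \) grow, pulling the estimate back toward \( u_{0} \) there, which is exactly the spatially adaptive behaviour the method aims for.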

3 Implementation and Experiment Results

In our experiments, we compare the proposed method with the original EPLL on image denoising. A GMM with 200 mixture components is learned from \( 2 \times 10^{6} \) image patches sampled from the Berkeley database. The test images are corrupted with zero-mean Gaussian noise of standard deviation \( \updelta = 25 \).

Figures 1 and 2 show the denoising performance of EPLL with Gaussian mixture priors and of our method on the Test1 image (No. 37073) and the Test2 image (No. 103070), respectively. Our proposed method outperforms the original EPLL because, by the Lagrange multiplier method, the local constraints are equivalent to a fidelity term with spatially adaptive parameters. For a quantitative comparison, Table 1 shows that the peak signal-to-noise ratio (PSNR) of our method is also higher than that of the original EPLL.
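The PSNR values in Table 1 follow the standard definition; for reference, a minimal implementation (assuming 8-bit images with peak value 255):

```python
import numpy as np

def psnr(clean, denoised, peak=255.0):
    """Peak signal-to-noise ratio in dB: 10 log10(peak^2 / MSE)."""
    mse = np.mean((np.asarray(clean, float) - np.asarray(denoised, float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)
```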

Fig. 1. Denoising results on Test1 image

Fig. 2. Denoising results on Test2 image

Table 1. The PSNR results of different denoising models

4 Conclusions

Image priors play a vital role in image restoration tasks. In this paper, we study a Gaussian mixture model with local constraints. We construct an adaptive regularization parameter coupled to the local entropy of the image; it varies across image regions, and each \( \uplambda \) corresponds to one region. The numerical results show that, compared with the original EPLL algorithm with a fixed regularization parameter, our proposed method achieves satisfying denoising results.