Monday, May 1, 2017

Finding the balance

Filter training

It's been a while since my last blog post. In the meantime, I've made quite some progress in several areas of the demosaicking algorithm.

Most notably, I managed to implement an algorithm to train the filters on real reference images. This is one of the factors that allowed reducing the filter size from the initially used 31x31 filters to the 11x11 and 9x9 filters used now. I also found that different training images can have a substantial impact on the shape of the resulting filters.
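
To illustrate the idea, here is a minimal, hypothetical sketch in Python/NumPy (not the code used in the darktable fork): a linear reconstruction filter can be fitted by least squares so that, applied to the CFA neighbourhood of each pixel of a mosaicked reference image, it best predicts the known value of the channel to be reconstructed. In practice one filter would be trained per position within the 6x6 X-Trans period and per output channel; the names and the 11x11 size below are only illustrative.

import numpy as np

def train_filter(mosaic, target, size=11):
    """Fit one size x size linear filter by least squares.

    mosaic -- 2-D float array, the CFA signal of a mosaicked reference image
    target -- 2-D float array, the true values of the channel to reconstruct
    """
    r = size // 2
    h, w = mosaic.shape
    neighbourhoods, values = [], []
    for y in range(r, h - r):
        for x in range(r, w - r):
            neighbourhoods.append(mosaic[y - r:y + r + 1, x - r:x + r + 1].ravel())
            values.append(target[y, x])
    coeffs, *_ = np.linalg.lstsq(np.asarray(neighbourhoods),
                                 np.asarray(values), rcond=None)
    return coeffs.reshape(size, size)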

Furthermore, I added additional lowpass filtering in the chroma domain.
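
To make "lowpass filtering in the chroma domain" concrete, here is a rough sketch (again hypothetical Python/NumPy, not the actual implementation): split the image into a luma estimate and per-channel chroma differences, blur only the chroma, and recombine.

import numpy as np
from scipy.ndimage import gaussian_filter

def chroma_lowpass(rgb, sigma=2.0):
    """Blur only the chroma part of an RGB image (values assumed in [0, 1])."""
    luma = rgb.mean(axis=2)                      # crude luma estimate
    chroma = rgb - luma[..., None]               # per-channel chroma differences
    chroma = gaussian_filter(chroma, sigma=(sigma, sigma, 0))  # no blur across channels
    return np.clip(luma[..., None] + chroma, 0.0, 1.0)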

Lots and lots of experiments, simulations (calculating the CPSNR after mosaicking and subsequently demosaicking reference images), and pixel-peeping led to the present setup.
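
For reference, CPSNR is the usual PSNR with the mean squared error taken jointly over all three colour channels of the reference image and the demosaicked result. A small sketch (the peak value and dtype handling are assumptions):

import numpy as np

def cpsnr(reference, reconstructed, peak=1.0):
    """Colour PSNR: MSE computed jointly over all three channels."""
    diff = reference.astype(np.float64) - reconstructed.astype(np.float64)
    return 10.0 * np.log10(peak ** 2 / np.mean(diff ** 2))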

And even the present setup is not absolutely perfect...

X-Trans triangle of challenges

The reason is that the X-Trans colour filter array makes reconstruction of chroma information a rather challenging task. In the most common approaches, like the Markesteijn algorithm, areas of high spatial frequency in luminance cause plenty of false colour artifacts.

The most straightforward way of dealing with these artifacts is some kind of lowpass filtering, especially in the chroma domain. However, by the time the most pronounced artifacts are suppressed, the image can look dull; in the most extreme cases it may start to look nearly like a black-and-white image. In any case, desaturation will occur, especially in areas of colour transition.

My initial approaches with rather wide 31x31 filters suppressed false colour artifacts efficiently without causing too much desaturation. As a positive side effect, they also showed very good results for high-ISO images. However, these first approaches suffered from quite pronounced colour bleeding.

         [Figure: the X-Trans triangle of challenges ("... can't have it all ..."), with false colour artifacts, desaturation, and colour bleeding at its three corners]
  1. The Markesteijn algorithm, especially in its one-pass variant, is a typical example of an extreme approach, sitting in the lower left corner of the above triangle. There is practically no colour bleeding and there are no subdued colours, but as a consequence there are plenty of false colour artifacts.
  2. Any application of massive lowpass filtering leads to the situation depicted at the top corner of the above triangle: the image looks dull and desaturated, especially in areas of colour transition, and in the most extreme cases it approaches a black-and-white image.
  3. The initial approaches described in this blog represent the situation found in the lower right corner of the above triangle. The rather wide 31x31 filters looked like a good remedy against false colour artifacts at first sight, but in the end they caused rather unacceptable colour bleeding.
So my present setup forms a compromise. There are still false colour artifacts, but to a much lesser degree; any residual aliasing is usually so weak that just a little bit of bilateral filtering can remove it. The colour response in areas of colour transition is slightly subdued, but this appears to be well tolerable. A tiny bit of colour bleeding is still present, but at normal viewing distances it does not really constitute a problem.
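
As an illustration of the kind of bilateral filtering meant here (a hypothetical sketch, not the filter actually used in the fork): smoothing a chroma plane with weights that depend on both spatial distance and luma difference averages away isolated false colours while respecting edges.

import numpy as np

def bilateral_chroma(chroma, luma, radius=2, sigma_s=2.0, sigma_r=0.05):
    """Edge-aware smoothing of one chroma plane; the range weight comes
    from luma, so chroma is averaged only across pixels of similar luma."""
    h, w = chroma.shape
    out = np.empty_like(chroma)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys ** 2 + xs ** 2) / (2.0 * sigma_s ** 2))
    pad_c = np.pad(chroma, radius, mode='reflect')
    pad_l = np.pad(luma, radius, mode='reflect')
    for y in range(h):
        for x in range(w):
            c = pad_c[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            l = pad_l[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            rng = np.exp(-((l - luma[y, x]) ** 2) / (2.0 * sigma_r ** 2))
            weights = spatial * rng
            out[y, x] = np.sum(weights * c) / np.sum(weights)
    return out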

If you want to try

As usual, you can find the latest changes in the GitHub repository of the darktable fork.

Many thanks

My special thanks go to François Guerraz for pointing me to the quick select algorithm and for adding an extra menu entry to the GUI of the darktable fork. Furthermore, I'd like to thank J. Liles for plenty of testing and feedback.
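
For readers unfamiliar with it, quick select finds the k-th smallest element of a set (for example a median) in expected linear time, without sorting everything. A generic sketch, independent of how it is actually used in the fork:

import random

def quickselect(values, k):
    """Return the k-th smallest element (0-based) in expected linear time."""
    values = list(values)
    while True:
        pivot = random.choice(values)
        lows = [v for v in values if v < pivot]
        pivots = [v for v in values if v == pivot]
        if k < len(lows):
            values = lows
        elif k < len(lows) + len(pivots):
            return pivot
        else:
            k -= len(lows) + len(pivots)
            values = [v for v in values if v > pivot]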

Next steps

I do not think that the visual quality of the algorithm as such can be improved much further; any improvement in one area will likely lead to a deterioration in another. As a next step, I'll look into performance improvements, because the algorithm is computationally intensive.