
I want to apply an unsharp mask the way Adobe Photoshop does.
I know this answer, but the result is not as sharp as Photoshop's.

Photoshop has three parameters in the Smart Sharpen dialog: Amount, Radius, and Reduce Noise. I want to implement all of them.
[screenshot: Photoshop's Smart Sharpen dialog]

This is the code I wrote, based on various sources on SO.
The intermediate stages ("blurred", "unsharpMask", "highContrast") look good, but the final stage ("retval") does not.

Where am I going wrong, and what should I improve?

Is it possible to improve the following algorithm in terms of performance?

#include "opencv2/opencv.hpp"
#include "fstream"
#include "iostream"
#include <chrono>

using namespace std;
using namespace cv;

// from https://docs.opencv.org/3.4/d3/dc1/tutorial_basic_linear_transform.html
void increaseContrast(Mat img, Mat* dst, int amountPercent)
{
    *dst = img.clone();
    double alpha = amountPercent / 100.0;
    *dst *= alpha;
}

// from https://stackoverflow.com/a/596243/7206675
float luminanceAsPercent(Vec3b color)
{
    return (0.2126 * color[2]) + (0.7152 * color[1]) + (0.0722 * color[0]);
}

// from https://stackoverflow.com/a/2938365/7206675
Mat usm(Mat original, int radius, int amountPercent, int threshold)
{
    // copy original for our return value
    Mat retval = original.clone();

    // create the blurred copy
    Mat blurred;
    cv::GaussianBlur(original, blurred, cv::Size(0, 0), radius);

    cv::imshow("blurred", blurred);
    waitKey();

    // subtract blurred from original, pixel-by-pixel to make unsharp mask
    Mat unsharpMask;
    cv::subtract(original, blurred, unsharpMask);

    cv::imshow("unsharpMask", unsharpMask);
    waitKey();

    Mat highContrast;
    increaseContrast(original, &highContrast, amountPercent);

    cv::imshow("highContrast", highContrast);
    waitKey();

    // assuming row-major ordering
    for (int row = 0; row < original.rows; row++) 
    {
        for (int col = 0; col < original.cols; col++) 
        {
            Vec3b origColor = original.at<Vec3b>(row, col);
            Vec3b contrastColor = highContrast.at<Vec3b>(row, col);

            Vec3b difference = contrastColor - origColor;
            float percent = luminanceAsPercent(unsharpMask.at<Vec3b>(row, col));

            Vec3b delta = difference * percent;

            if (*(uchar*)&delta > threshold) {
                retval.at<Vec3b>(row, col) += delta;
                //retval.at<Vec3b>(row, col) = contrastColor;
            }
        }
    }

    return retval;
}

int main(int argc, char* argv[])
{
    if (argc < 2) exit(1);
    Mat mat = imread(argv[1]);
    mat = usm(mat, 4, 110, 66);
    imshow("usm", mat);
    waitKey();
    //imwrite("USM.png", mat);
}

Original image: [image]

Blurred stage – seemingly good: [image]

unsharpMask stage – seemingly good: [image]

highContrast stage – seemingly good: [image]

Result stage of my code – looks bad: [image]

Result from Photoshop – excellent: [image]

2 Answers


  1. First of all, judging by the artefacts that Photoshop left on the borders of the petals, I’d say that it applies the mask by using a weighted sum between the original image and the mask, as in the answer you tried first.
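
    For reference, that weighted-sum application is the classic cv::addWeighted one-liner. A minimal sketch (assuming image, radius and amount are already defined; this is not a claim about Photoshop's internals):

    cv::Mat blurred, sharpened;
    cv::GaussianBlur(image, blurred, cv::Size(0, 0), radius);
    // sharpened = image + amount * (image - blurred), computed as one weighted sum
    cv::addWeighted(image, 1.0 + amount, blurred, -amount, 0.0, sharpened);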

    I modified your code to implement this scheme and tried to tweak the parameters to get as close as possible to the Photoshop result, but I couldn't do so without creating a lot of noise. I won't try to guess exactly what Photoshop is doing (I would definitely like to know), but I found that its result is fairly reproducible by applying some filter to the mask to reduce the noise. The algorithm scheme would be:

    blurred = blur(image, Radius)
    mask = image - blurred
    mask = some_filter(mask)
    sharpened = (mask < Threshold) ? image : image + Amount * mask
    

    I implemented this and tried basic filters (median blur, mean filter, etc.) on the mask; this is the kind of result I can get:
    [comparison plot: original vs. Photoshop vs. current]

    which is a bit noisier than the Photoshop image but, in my opinion, close enough to what you wanted.

    On another note, it will of course depend on what you use the filter for, but I think the settings you used in Photoshop are too strong (there are big overshoots near the petal borders). The following is enough to get an image that looks nice to the naked eye, with limited overshoot:
    [comparison plot: original vs. Photoshop vs. current]

    Finally, here is the code I used to generate the two images above:

    #include <opencv2/opencv.hpp>
    #include <iostream>
    
    using namespace std;
    using namespace cv;
    
    Mat usm(Mat original, float radius, float amount, float threshold)
    {
        // work using floating point images to avoid overflows
        cv::Mat input;
        original.convertTo(input, CV_32FC3);
    
        // copy original for our return value
        Mat retbuf = input.clone();
    
        // create the blurred copy
        Mat blurred;
        cv::GaussianBlur(input, blurred, cv::Size(0, 0), radius);
    
        // subtract blurred from original, pixel-by-pixel to make unsharp mask
        Mat unsharpMask;
        cv::subtract(input, blurred, unsharpMask);
        
        // --- filter on the mask ---
        
        //cv::medianBlur(unsharpMask, unsharpMask, 3);
        cv::blur(unsharpMask, unsharpMask, {3,3});
        
        // --- end filter ---
    
        // apply mask to image
        for (int row = 0; row < original.rows; row++) 
        {
            for (int col = 0; col < original.cols; col++) 
            {
                Vec3f origColor = input.at<Vec3f>(row, col);
                Vec3f difference = unsharpMask.at<Vec3f>(row, col);
    
                if(cv::norm(difference) >= threshold) {
                    retbuf.at<Vec3f>(row, col) = origColor + amount * difference;
                }
            }
        }
    
        // convert back to unsigned char
        cv::Mat ret;
        retbuf.convertTo(ret, CV_8UC3);
    
        return ret;
    }
    
    int main(int argc, char* argv[])
    {
        if (argc < 3) exit(1);
        Mat original = imread(argv[1]);
        Mat expected = imread(argv[2]);
        
        // closer to Photoshop
        Mat current = usm(original, 0.8, 12., 1.);
        
        // better settings (in my opinion)
        //Mat current = usm(original, 2., 1., 3.);
        
        cv::imwrite("current.png", current);
        
        // comparison plot
        cv::Rect crop(127, 505, 163, 120);
        cv::Mat crops[3];
        cv::resize(original(crop), crops[0], {0,0}, 4, 4, cv::INTER_NEAREST);
        cv::resize(expected(crop), crops[1], {0,0}, 4, 4, cv::INTER_NEAREST);
        cv::resize( current(crop), crops[2], {0,0}, 4, 4, cv::INTER_NEAREST);
        
        char const* texts[] = {"original", "photoshop", "current"};
        
        cv::Mat plot = cv::Mat::zeros(120 * 4, 163 * 4 * 3, CV_8UC3);
        for(int i = 0; i < 3; ++i) {
            cv::Rect region(163 * 4 * i, 0, 163 * 4, 120 * 4);
            crops[i].copyTo(plot(region));
            cv::putText(plot, texts[i], region.tl() + cv::Point{5,40}, 
                cv::FONT_HERSHEY_SIMPLEX, 1.5, CV_RGB(255, 0, 0), 2.0);
        }
        
        cv::imwrite("plot.png", plot); 
    }
    
  2. Here's my attempt at 'smart' unsharp masking. The result isn't very good, but I'm posting it anyway. The Wikipedia article on unsharp masking has details about smart sharpening.

    Several things I did differently:

    • Convert BGR to the Lab color space and apply the enhancement to the brightness channel (a minimal sketch of this step on its own follows the list)
    • Use an edge map to apply the enhancement only to edge regions
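
    As a minimal sketch of the first point on its own (a hypothetical helper; the name and parameters are placeholders, and the edge map is left out), plain unsharp masking restricted to the Lab brightness channel could look like this:

    #include <opencv2/opencv.hpp>

    cv::Mat sharpen_lab_luminance(const cv::Mat& bgr, double sigma, double amount)
    {
        cv::Mat lab, channels[3], blurred, merged, out;
        cv::cvtColor(bgr, lab, cv::COLOR_BGR2Lab);
        cv::split(lab, channels);

        // unsharp mask on L only: L' = (1 + amount) * L - amount * blur(L)
        cv::GaussianBlur(channels[0], blurred, cv::Size(0, 0), sigma);
        cv::addWeighted(channels[0], 1.0 + amount, blurred, -amount, 0.0, channels[0]);

        cv::merge(channels, 3, merged);
        cv::cvtColor(merged, out, cv::COLOR_Lab2BGR);
        return out;
    }

    The full code below then restricts this enhancement to a Canny edge map.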

    Original: [image]

    Enhanced (sigma=2, amount=3, low=0.3, high=0.8, w=2): [image]

    Edge map (low=0.3, high=0.8, w=2): [image]

    #include "opencv2/core.hpp"
    #include "opencv2/imgproc.hpp"
    #include "opencv2/highgui.hpp"
    #include <string>  // for std::stod / std::stoi
    
    cv::Mat not_so_smart_sharpen(
            const cv::Mat& bgr,
            double sigma,
            double amount,
            double canny_low_threshold_weight,
            double canny_high_threshold_weight,
            int edge_weight)
    {
        cv::Mat enhanced_bgr, lab, enhanced_lab, channel[3], blurred, difference, bw, kernel, edges;
    
        // convert to Lab
        cv::cvtColor(bgr, lab, cv::ColorConversionCodes::COLOR_BGR2Lab);
        // perform the enhancement on the brightness component
        cv::split(lab, channel);
        cv::Mat& brightness = channel[0];
        // smoothing for unsharp masking
        cv::GaussianBlur(brightness, blurred, cv::Size(0, 0), sigma);
        difference = brightness - blurred;
        // calculate an edge map. I'll use Otsu threshold as the basis
        double thresh = cv::threshold(brightness, bw, 0, 255, cv::ThresholdTypes::THRESH_BINARY | cv::ThresholdTypes::THRESH_OTSU);
        cv::Canny(brightness, edges, thresh * canny_low_threshold_weight, thresh * canny_high_threshold_weight);
        // control edge thickness. use edge_weight=0 to use Canny edges unaltered
        cv::dilate(edges, edges, kernel, cv::Point(-1, -1), edge_weight);
        // unsharp masking on the edges
        cv::add(brightness, difference * amount, brightness, edges);
        // use the enhanced brightness channel
        cv::merge(channel, 3, enhanced_lab);
        // convert to BGR
        cv::cvtColor(enhanced_lab, enhanced_bgr, cv::ColorConversionCodes::COLOR_Lab2BGR);
    
    //  cv::imshow("edges", edges);
    //  cv::imshow("difference", difference * amount);
    //  cv::imshow("original", bgr);
    //  cv::imshow("enhanced", enhanced_bgr);
    //  cv::waitKey(0);
    
        return enhanced_bgr;
    }
    
    int main(int argc, char *argv[])
    {
        double sigma = std::stod(argv[1]);
        double amount = std::stod(argv[2]);
        double low = std::stod(argv[3]);
        double high = std::stod(argv[4]);
        int w = std::stoi(argv[5]);
    
        cv::Mat bgr = cv::imread("flower.jpg");
    
        cv::Mat enhanced = not_so_smart_sharpen(bgr, sigma, amount, low, high, w);
    
        cv::imshow("original", bgr);
        cv::imshow("enhanced", enhanced);
        cv::waitKey(0);
    
        return 0;
    }
    