21 changes: 14 additions & 7 deletions NetCore/Resize.cs
@@ -48,7 +48,7 @@ public ImageSharpSize ResizeImageSharp()
 {
     using (var image = new ImageSharpImage(Width, Height))
     {
-        image.Mutate(i => i.Resize(ResizedWidth, ResizedHeight));
+        image.Mutate(i => i.Resize(ResizedWidth, ResizedHeight, KnownResamplers.Bicubic));
Owner:

I'm a little uncomfortable with this and I'd prefer to hear @JimBobSquarePants's opinion on this line change. I'm all in favor of comparing oranges to oranges, but the images we're comparing will have some aesthetic differences in the end even if we use bicubic everywhere. Benchmark numbers are not fully meaningful unless you also consider the visual quality of the results, which is harder to quantify. It's therefore possible that by forcing every benchmark to use bicubic we're just degrading performance without bringing significantly better picture quality. I'm hesitant to take this change unless there's evidence that it makes image quality more homogeneous across libraries.

@antonfirsov (Oct 15, 2019):

Bicubic is the default for ImageSharp, so there is no actual change on this code line. This explains why ImageSharp before/after values remained unchanged.
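
In code terms (an illustrative sketch reusing the benchmark's identifiers):

    // The Resize overload without a resampler falls back to the library
    // default, so for ImageSharp this line change is a no-op:
    image.Mutate(i => i.Resize(ResizedWidth, ResizedHeight));
    // ...behaves the same as the explicit form:
    // image.Mutate(i => i.Resize(ResizedWidth, ResizedHeight, KnownResamplers.Bicubic));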

On the other hand, we need to investigate whether the current comparison is fair. If output quality differs, or if not all libraries can handle certain cases, that should be pointed out in the benchmark result comments or in the blog post. For example: do all libraries implement premultiplication to prevent alpha bleeding?
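
To illustrate the alpha-bleeding point, here is a minimal sketch of premultiplication around a resample (illustrative only, not any particular library's internals):

    // Resampling premultiplied RGBA stops the RGB values hidden in fully
    // transparent pixels from bleeding into their opaque neighbours.
    struct Rgba { public float R, G, B, A; }

    static Rgba Premultiply(Rgba p) =>
        new Rgba { R = p.R * p.A, G = p.G * p.A, B = p.B * p.A, A = p.A };

    static Rgba Unpremultiply(Rgba p) =>
        p.A == 0f ? p : new Rgba { R = p.R / p.A, G = p.G / p.A, B = p.B / p.A, A = p.A };

    // Conceptually: resized = Unpremultiply(Resample(Premultiply(source)))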

@JimBobSquarePants is planning to have a look soon and come back with more specific details.

Contributor:

> Benchmark numbers are not fully meaningful unless you also consider the visual quality of the results, which is harder to quantify.

I couldn't agree more. Comparisons should consider far more than raw speed: output size, quality, and correctness, for example.

Here's a test image that demonstrates issues with several of the libraries. Only MagicScaler, ImageSharp, and System.Drawing get it right.

Note: I couldn't even get FreeImage to save an input PNG to JPEG with the current code, so I'd consider that benchmark useless.

[Images: the "kaboom" input test image, followed by the resized outputs from MagicScaler, NetVips, SkiaSharpBitmap, SkiaSharpCanvas, SystemDrawing, ImageSharp, and MagickNET.]

Contributor:

Forgot to say, the System.Drawing code applies some sort of sharpening via InterpolationMode.HighQualityBicubic. It also doesn't do chroma subsampling, so its image quality at 75 will appear sharper than many of the others.
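
For context, that System.Drawing path looks roughly like this (a sketch of the common pattern, not the exact benchmark code):

    using System.Drawing;
    using System.Drawing.Drawing2D;

    static Bitmap ResizeHighQuality(Image source, int width, int height)
    {
        var resized = new Bitmap(width, height);
        using (var graphics = Graphics.FromImage(resized))
        {
            // HighQualityBicubic prefilters the source, which reads as mild
            // sharpening next to a plain bicubic kernel.
            graphics.InterpolationMode = InterpolationMode.HighQualityBicubic;
            graphics.DrawImage(source, 0, 0, width, height);
        }
        return resized;
    }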

Comment:

I think most of the differences here are due to the background colour.

This test PNG has transparent pixels in a checkerboard pattern, so when you make a JPG thumbnail, the colour you get depends on what background colour you set. NetVips defaults to black (like skia I guess), so it does look rather dark. If you set a white background, or thumbnail to PNG, NetVips and MagickNET look the same (I suppose skia would too).
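
For example, flattening onto white before the JPEG save should line the libraries up (a sketch with NetVips; file names and sizes are placeholders):

    // Flatten transparency onto a white background before JPEG encoding, so
    // the thumbnail's background no longer depends on a library default.
    using var thumb = NetVips.Image.Thumbnail("kaboom.png", 150);
    using var flat = thumb.Flatten(background: new double[] { 255, 255, 255 });
    flat.WriteToFile("kaboom-white.jpg");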

NetVips has an option for linear light downsampling, but I don't think this particular test image is a great way to show that. It seems to have a surprise hidden in the RGB of the transparent pixels, so I think this image is probably for testing premultiplication.
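
That option maps to the thumbnail "linear" flag; roughly (the exact NetVips parameter name is an assumption here):

    // Downsample in linear light rather than in the encoded sRGB space.
    using var thumb = NetVips.Image.Thumbnail("input.png", 150, linear: true);
    thumb.WriteToFile("thumb.jpg");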

         return image.Size();
     }
 }
@@ -59,6 +59,8 @@ public MagickGeometry MagickResize()
     var size = new MagickGeometry(ResizedWidth, ResizedHeight);
     using (var image = new MagickImage(MagickColor.FromRgba(0, 0, 0, 0), Width, Height))
     {
+        image.FilterType = FilterType.Cubic;
+
         image.Resize(size);
         return size;
     }
@@ -85,7 +87,8 @@ public Size MagicScalerResize()
     {
         Width = ResizedWidth,
         Height = ResizedHeight,
-        Sharpen = false
+        Sharpen = false,
+        Interpolation = InterpolationSettings.Cubic
     };

     using (var pixels = new TestPatternPixelSource(Width, Height, PixelFormats.Bgr24bpp))
@@ -133,13 +136,17 @@ public SKSize SkiaBitmapResizeBenchmark()

 public (int width, int height) NetVipsResize()
 {
     // Scaling calculations
-    const double xFactor = (double)Width / ResizedWidth;
-    const double yFactor = (double)Height / ResizedHeight;
+    const double xFactor = (double)ResizedWidth / Width;
+    const double yFactor = (double)ResizedHeight / Height;

-    using (var original = NetVips.Image.Black(Width, Height).CopyMemory())
-    using (var resized = original.Reduce(xFactor, yFactor, kernel: Enums.Kernel.Cubic).CopyMemory())
+    using (var original = NetVips.Image.Black(Width, Height))
+    using (var resized = original.Resize(xFactor, vscale: yFactor, kernel: Enums.Kernel.Cubic))
     {
-        return (resized.Width, resized.Height);
+        // libvips is "lazy" and will not process pixels
+        // until you write to an output file, buffer or memory
+        var _ = resized.WriteToMemory();
+
+        return (resized.Width, resized.Height);
     }
 }