From: Stefan Heinzmann (stefan_heinzmann_at_[hidden])
Date: 2006-10-31 16:32:36


Lubomir Bourdev wrote:
> Matt Gruenke wrote:
>> For example, a program or library that handles encoding/decoding of
>> MPEG-4 video (non-studio profiles) has to deal with as many as 6
>> variants of YUV, 4 transfer functions, and two different scales of
>> sample values (without getting into things like n-bit profile). In
>> addition to that, professional video production systems will also
>> have to deal with a variety of linear, non-linear, and log-scale
>> RGB formats. Add RGBA, and you also have to deal with whether Alpha
>> is premultiplied. Combined with a few different channel orderings
>> and data layouts, I fear the result is such a multiplicity of
>> combinations that the core purpose of GIL's color space construct
>> would be defeated.
>
> Hopefully my description above addresses all these examples. You don't
> need to create a custom color space for every combination of
> possibilities. These variations, which are mostly orthogonal to each
> other, are best addressed in different GIL abstractions, which are also
> orthogonal, such as custom channels and channel algorithms, custom
> pixels, pixel references and iterators, custom color conversion objects,
> views, etc.

Maybe it's just me, but I find extending GIL to support something like
the v210 QuickTime format quite challenging (I don't want to imply that
this is GIL's fault). This is a 10-bit YUV 4:2:2 format that stores 6
pixels in 16 bytes. It appears to me that supporting it would touch on
many concepts and corners of GIL, as it would require a new pixel
storage format, a color space, component subsampling, and maybe more.
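
To make it concrete, here is a rough sketch (plain C++, not GIL code) of
what unpacking one 16-byte v210 group into its twelve 10-bit samples
looks like; the sample ordering in the trailing comment is my reading of
the format description and may well be off:

#include <boost/cstdint.hpp>
#include <cstring>

// Unpack one 16-byte v210 group: four little-endian 32-bit words, each
// holding three 10-bit samples in bits 0-9, 10-19 and 20-29. With 4:2:2
// subsampling the group covers 6 pixels (6 Y plus 3 Cb and 3 Cr samples).
void unpack_v210_group(const unsigned char* src, boost::uint16_t out[12])
{
    for (int w = 0; w < 4; ++w)
    {
        boost::uint32_t word;
        std::memcpy(&word, src + 4 * w, 4);   // assumes a little-endian host
        out[3 * w + 0] =  word        & 0x3FF;
        out[3 * w + 1] = (word >> 10) & 0x3FF;
        out[3 * w + 2] = (word >> 20) & 0x3FF;
    }
    // Sample order within the group, as far as I can tell:
    // Cb0 Y0 Cr0 | Y1 Cb2 Y2 | Cr2 Y3 Cb4 | Y4 Cr4 Y5
}

Mapping something like this onto GIL's pixel references and iterators,
where a single "pixel" would have to proxy into a shared 16-byte group,
is the part I find hard to picture.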

I believe it would aid understanding if you could give at least a road
map of what needs to be done to support this properly (a fully coded
example would probably require considerable effort).

Cheers
Stefan