Discussion:
[RFC v2] surface crop & scale protocol extension
Pekka Paalanen
2013-11-08 11:33:22 UTC
Hi all,

this is the v2 of the crop and scale extension, as RFC.

The v1 was in:
http://lists.freedesktop.org/archives/wayland-devel/2013-April/008927.html

Based on v1, Jonny Lamb has been working on a Weston implementation,
starting from the protocol side, and working towards the renderer
implementations. That work is still very much in progress.


Introduction:

The primary use case for the crop & scale extension is for hardware
accelerated, zero-copy video display. Hardware video decoders produce
complete frames or ultimately wl_buffers that are attached to
a wl_surface. The crop & scale extension allows the client to
request the compositor to crop and scale the frame so it fits the
client's purposes. Optimally, a compositor implements this by
programming a hardware overlay unit to do the requested cropping and
scaling. The major benefit is that the CPU never has to even see the
video pixels.

Probably a common case is to have a video in a sub-surface, so that
window decorations and overlaid GUI elements can be in other
(sub-)surfaces, and avoid touching the pixels in the video frame
buffers. Video as a browser element will need cropping when the element
is partially off-view. Scaling should be useful in general, e.g.
showing a video scaled up (or down) but still in a window.

However, I also see that crop & scale can be used to present videos
with non-square pixels in fullscreen (e.g. without sub-surfaces). Crop
& scale is needed in this case, because the fullscreening methods in
wl_shell do not allow changing the aspect ratio, or cropping parts of
the video out to fill the whole output area when video and output
aspect ratios differ. The fullscreening methods support only adding
black borders, but not scale-to-fill-and-crop.


Changes since v1:

The changes I have made are very small, and can be seen in patch form
at:
http://cgit.collabora.com/git/user/pq/weston.git/log/?h=clipscale-wip

The changes are:
- improve wording, add missing details
- take buffer_scale into account
- rewrite the coordinate transformations more clearly

In the end, I did not get much else out of the discussions of v1.

I think some people do not like the structure of the protocol
additions, namely adding yet another global factory interface and a
sub-interface to wl_surface. If the consensus really is to move this
into a single wl_surface request, that is fine.

But, to me this would not make sense as a request in wl_subsurface. The
crop & scale state is supposed to be driven by the application
component that produces the wl_buffers and attaches them to a
wl_surface. The wl_subsurface interface OTOH is to be used by the
parent component, for e.g. setting the sub-surface position. The
situation is similar to a compositor vs. clients: clients define the
surface size and content, compositor the position. Also, the crop &
scale state is not well suited to be "forced on" by a parent software
component, as it changes how e.g. input event coordinates relate to the
wl_buffer contents. Finally, there is the fullscreen video use case I
described above.

The interface names are still so-and-so; I haven't come up with
anything better. wl_scaler is pretty ok, but I feel something better
is needed for wl_surface_scaler. How does wl_viewport sound?

A minor benefit of keeping crop & scale as a separate interface is that
it could be made optional, a per-backend or renderer thing.


For your reviewing pleasure, here is the v2:

<?xml version="1.0" encoding="UTF-8"?>
<protocol name="scaler">

<copyright>
Copyright © 2013 Collabora, Ltd.

Permission to use, copy, modify, distribute, and sell this
software and its documentation for any purpose is hereby granted
without fee, provided that the above copyright notice appear in
all copies and that both that copyright notice and this permission
notice appear in supporting documentation, and that the name of
the copyright holders not be used in advertising or publicity
pertaining to distribution of the software without specific,
written prior permission. The copyright holders make no
representations about the suitability of this software for any
purpose. It is provided "as is" without express or implied
warranty.

THE COPYRIGHT HOLDERS DISCLAIM ALL WARRANTIES WITH REGARD TO THIS
SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS, IN NO EVENT SHALL THE COPYRIGHT HOLDERS BE LIABLE FOR ANY
SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN
AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION,
ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF
THIS SOFTWARE.
</copyright>

<interface name="wl_scaler" version="1">
<description summary="surface cropping and scaling">
The global interface exposing surface cropping and scaling
capabilities is used to instantiate an interface extension for a
wl_surface object. This extended interface will then allow
cropping and scaling the surface contents, effectively
disconnecting the direct relationship between the buffer and the
surface size.
</description>

<request name="destroy" type="destructor">
<description summary="unbind from the cropping and scaling interface">
Informs the server that the client will not be using this
protocol object anymore. This does not affect any other objects,
wl_surface_scaler objects included.
</description>
</request>

<enum name="error">
<entry name="scaler_exists" value="0"
summary="the surface already has a scaler object associated"/>
</enum>

<request name="get_surface_scaler">
<description summary="extend surface interface for crop and scale">
Instantiate an interface extension for the given wl_surface to
crop and scale its content. If the given wl_surface already has
a wl_surface_scaler object associated, the scaler_exists
protocol error is raised.
</description>

<arg name="id" type="new_id" interface="wl_surface_scaler"
summary="the new scaler interface id"/>
<arg name="surface" type="object" interface="wl_surface"
summary="the surface"/>
</request>
</interface>

<interface name="wl_surface_scaler" version="1">
<description summary="crop and scale interface to a wl_surface">
An additional interface to a wl_surface object, which allows the
client to specify the cropping and scaling of the surface
contents.

This interface allows the client to define the source rectangle (src_x,
src_y, src_width, src_height) from where to take the wl_buffer
contents, and scale that to destination size (dst_width,
dst_height). This state is double-buffered, and is applied on the
next wl_surface.commit.

Before the first set request, the wl_surface still behaves as if
there was no crop and scale state. That is, no scaling is applied,
and the surface size is as defined in wl_surface.attach.

The crop and scale state causes the surface size to become
dst_width, dst_height. This overrides whatever the attached
wl_buffer size is, unless the wl_buffer is NULL. If the wl_buffer is
NULL, the surface has no content and therefore no size.

The coordinate transformations from buffer pixel coordinates up to
the surface-local coordinates happen in the following order:
1. buffer_transform (wl_surface.set_buffer_transform)
2. buffer_scale (wl_surface.set_buffer_scale)
3. crop and scale (wl_surface_scaler.set)
This means that the source rectangle coordinates of crop and scale
are given in the coordinates after the buffer transform and scale,
i.e. in the coordinates that would be the surface-local coordinates
if the crop and scale was not applied.

If the source rectangle is partially or completely outside of the
wl_buffer, then the surface contents are undefined (not void), and
the surface size is still dst_width, dst_height.

The x, y arguments of wl_surface.attach are applied as normal to
the surface. They indicate how many pixels to remove from the
surface size from the left and the top. In other words, they are
still in the surface-local coordinate system, just like dst_width
and dst_height are.

If the wl_surface associated with the wl_surface_scaler is
destroyed, the wl_surface_scaler object becomes inert.

If the wl_surface_scaler object is destroyed, the crop and scale
state is removed from the wl_surface. The change will be applied
on the next wl_surface.commit.
</description>

<request name="destroy" type="destructor">
<description summary="remove scaling and cropping from the surface">
The associated wl_surface's crop and scale state is removed.
The change is applied on the next wl_surface.commit.
</description>
</request>

<enum name="error">
<entry name="bad_value" value="0"
summary="negative values in width or height"/>
</enum>

<request name="set">
<description summary="set the crop and scale state">
Set the crop and scale state of the associated wl_surface. See
wl_surface_scaler for the description, and relation to the
wl_buffer size.

If any of src_width, src_height, dst_width, and dst_height is
negative, the bad_value protocol error is raised.

The crop and scale state is double-buffered state, and will be
applied on the next wl_surface.commit.

Arguments dst_x and dst_y do not exist here, use the x and y
arguments to wl_surface.attach. The x, y, dst_width, and dst_height
define the surface-local coordinate system irrespective of the
attached wl_buffer size.
</description>

<arg name="src_x" type="fixed" summary="source rectangle x"/>
<arg name="src_y" type="fixed" summary="source rectangle y"/>
<arg name="src_width" type="fixed" summary="source rectangle width"/>
<arg name="src_height" type="fixed" summary="source rectangle height"/>
<arg name="dst_width" type="int" summary="surface width"/>
<arg name="dst_height" type="int" summary="surface height"/>
</request>

</interface>
</protocol>


Thanks,
pq
Bill Spitzak
2013-11-08 18:59:07 UTC
Pekka Paalanen wrote:
> Hi all,
>
> this is the v2 of the crop and scale extension, as RFC.

I get the impression that the result of crop+scale is supposed to be
exactly the same as though the client made a second buffer of the scale
size, scaled the crop region from the first buffer to this second
buffer, then attached it with the normal wayland mechanism. Correct?

So that compositors are allowed to only keep the cropped area of a
buffer, there will have to be limitations on changing the crop+scale
without also doing another attach. Maybe it does not work unless there
is an attach in the same commit, or you might even require the attach to
be after the crop+scale and before the commit.

The big problem I see with this api is that if buffer_scale is not one,
the client is unable to specify the crop or scale rectangles in pixels.
However this is a general problem with buffer_scale everywhere.
(actually you seem to be using fixed, not integers, so it is possible if
buffer_scale is a power of 2). I would change the crop to be in pixels.
The scale rectangle requires fixing or replacing the buffer scale mechanism.

You need to define what happens if the crop region extends outside the
buffer. I think the edge pixels will have to be replicated to fill,
since using transparency here will turn an opaque buffer into one with
transparency, and produce an anti-aliased transparent edge, both of
which may defeat the ability to use hardware scaling.

I think there may also have to be rules about whether filters are
allowed to sample outside the crop region (there are things other than
box filters, you know). For security reasons this must not be allowed
outside the actual buffer, so that adjacent memory does not leak into
the displayed image, but it could be left undefined for pixels between
the crop and the buffer edge.
Rafael Antognolli
2013-11-09 01:00:28 UTC
On Fri, Nov 08, 2013 at 10:59:07AM -0800, Bill Spitzak wrote:
>
>
> Pekka Paalanen wrote:
> >Hi all,
> >
> >this is the v2 of the crop and scale extension, as RFC.
>
> I get the impression that the result of crop+scale is supposed to be exactly
> the same as though the client made a second buffer of the scale size, scaled
> the crop region from the first buffer to this second buffer, then attached
> it with the normal wayland mechanism. Correct?

From what I understood, the visual result might be that, but it's not
what should go on inside the renderer.

> So that compositors are allowed to only keep the cropped area of a buffer,
> there will have to be limitations on changing the crop+scale without also
> doing another attach. Maybe it does not work unless there is an attach in
> the same commit, or you might even require the attach to be after the
> crop+scale and before the commit.

IMHO the compositor would keep the entire buffer, just like it already
does. So when a buffer is attached to a surface, in the case of the gl
renderer, it would get entirely uploaded as a texture, and just in the
end when the texture is going to be rendered on the screen, only the
cropped area would be presented (in a scaled version, if that's the
case). This would allow changing the crop & scale parameters without
the need for a new attach.

Unless I'm wrong regarding the "buffer being entirely uploaded to the
compositor", but that's how I was implementing it.

> The big problem I see with this api is that if buffer_scale is not one, the
> client is unable to specify the crop or scale rectangles in pixels. However
> this is a general problem with buffer_scale everywhere. (actually you seem
> to be using fixed, not integers, so it is possible if buffer_scale is a
> power of 2). I would change the crop to be in pixels. The scale rectangle
> requires fixing or replacing the buffer scale mechanism.
>
> You need to define what happens if the crop region extends outside the
> buffer. I think the edge pixels will have to be replicated to fill, since
> using transparency here will turn an opaque buffer into one with
> transparency, and produce an anti-aliased transparent edge, both of which
> may defeat the ability to use hardware scaling.

It's defined as "undefined/dirty" pixels, isn't it? I think that's good
enough for now, at least.

>
> I think there may also have to be rules about whether filters are allowed to
> sample outside the crop region (there are things other than box filters, you
> know). For security reasons this must not be allowed outside the actual
> buffer, so that adjacent memory does not leak into the displayed image, but
> it could be left undefined for pixels between the crop and the buffer edge.
Bill Spitzak
2013-11-09 02:04:33 UTC
Rafael Antognolli wrote:
> On Fri, Nov 08, 2013 at 10:59:07AM -0800, Bill Spitzak wrote:
>>
>> Pekka Paalanen wrote:
>>> Hi all,
>>>
>>> this is the v2 of the crop and scale extension, as RFC.
>> I get the impression that the result of crop+scale is supposed to be exactly
>> the same as though the client made a second buffer of the scale size, scaled
>> the crop region from the first buffer to this second buffer, then attached
>> it with the normal wayland mechanism. Correct?
>
> From what I understood, the visual result might be that, but it's not
> what should go on inside the renderer.

Sounds right. My main concern was that the scale width+height completely
replaces all uses of the buffer width+height in the visible compositor
api (the buffer width+height is needed to figure out where the memory
for the crop region is, but should be ignored otherwise).

>> So that compositors are allowed to only keep the cropped area of a buffer,
>
> IMHO the compositor would keep the entire buffer, just like it already
> does.

I'm probably using shm too much but I thought it would be a savings if
the compositor could upload only the crop region to an EGL texture from
an shm buffer, and then immediately release the buffer.

I suppose it could upload the entire thing but then scaling the
subrectangle either requires a copy to another texture, or what I think
is an unrequired EGL extension. Scaling the entire thing but scissoring
the output does not produce the correct filtering at the edges.

>> You need to define what happens if the crop region extends outside the
>> buffer.
>
> It's defined as "undefined/dirty" pixels, isn't it? I think that's good
> enough for now, at least.

I think making the crop larger than the buffer will be useful,
especially to fit your video into a slightly-different shaped window
(this will happen if the scale factor is not an integer). It may be
useful for letterboxing as well, but that will require the client to add
black pixels on the edges so the user does not just see the edge of the
video smeared to fill the area.
Pekka Paalanen
2013-11-10 08:40:25 UTC
On Fri, 08 Nov 2013 18:04:33 -0800
Bill Spitzak <spitzak at gmail.com> wrote:

>
>
> Rafael Antognolli wrote:
> > On Fri, Nov 08, 2013 at 10:59:07AM -0800, Bill Spitzak wrote:
> >>
> >> Pekka Paalanen wrote:
> >>> Hi all,
> >>>
> >>> this is the v2 of the crop and scale extension, as RFC.
> >> I get the impression that the result of crop+scale is supposed to be exactly
> >> the same as though the client made a second buffer of the scale size, scaled
> >> the crop region from the first buffer to this second buffer, then attached
> >> it with the normal wayland mechanism. Correct?
> >
> > From what I understood, the visual result might be that, but it's not
> > what should go on inside the renderer.
>
> Sounds right. My main concern was that the scale width+height completely
> replaces all uses of the buffer width+height in the visible compositor
> api (the buffer width+height is needed to figure out where the memory
> for the crop region is, but should be ignored otherwise).

This thread is about protocol. It says nothing about Weston's
internal implementation details.

> >> So that compositors are allowed to only keep the cropped area of a buffer,
> >
> > IMHO the compositor would keep the entire buffer, just like it already
> > does.
>
> I'm probably using shm too much but I thought it would be a savings if
> the compositor could upload only the crop region to an EGL texture from
> an shm buffer, and then immediately release the buffer.
>
> I suppose it could upload the entire thing but then scaling the
> subrectangle either requires a copy to another texture, or what I think
> is an unrequired EGL extension. Scaling the entire thing but scissoring
> the output does not produce the correct filtering at the edges.

No, there is no copy or EGL extension needed.

> >> You need to define what happens if the crop region extends outside the
> >> buffer.
> >
> > It's defined as "undefined/dirty" pixels, isn't it? I think that's good
> > enough for now, at least.
>
> I think making the crop larger than the buffer will be useful,
> especially to fit your video into a slightly-different shaped window
> (this will happen if the scale factor is not an integer). It may be
> useful for letterboxing as well, but that will require the client to add
> black pixels on the edges so the user does not just see the edge of the
> video smeared to fill the area.

No, sampling outside of a buffer is a non-fatal client mistake.

- pq
Bill Spitzak
2013-11-11 18:10:04 UTC
Pekka Paalanen wrote:

>> Sounds right. My main concern was that the scale width+height completely
>> replaces all uses of the buffer width+height in the visible compositor
>> api (the buffer width+height is needed to figure out where the memory
>> for the crop region is, but should be ignored otherwise).
>
> This thread is about protocol. It says nothing about Weston's
> internal implementation details.

My question *is* about the protocol.

Let me try again: to a client, after it sends a scale of a 100x100
buffer to 200x200, is there any indication of that original 100x100 size
in any protocol events or requests afterwards, or does it work precisely
the same as though they sent an unscaled 200x200 buffer?
Pekka Paalanen
2013-11-12 15:00:04 UTC
On Mon, 11 Nov 2013 10:10:04 -0800
Bill Spitzak <spitzak at gmail.com> wrote:

>
>
> Pekka Paalanen wrote:
>
> >> Sounds right. My main concern was that the scale width+height completely
> >> replaces all uses of the buffer width+height in the visible compositor
> >> api (the buffer width+height is needed to figure out where the memory
> >> for the crop region is, but should be ignored otherwise).
> >
> > This thread is about protocol. It says nothing about Weston's
> > internal implementation details.
>
> My question *is* about the protocol.
>
> Let me try again: to a client, after it sends a scale of a 100x100
> buffer to 200x200, is there any indication of that original 100x100 size
> in any protocol events or requests afterwards, or does it work precisely
> the same as though they sent an unscaled 200x200 buffer?

All protocol that deals with surfaces (input, sub-surfaces, shell,
etc...) works in surface coordinates.

Of course both client and compositor will know the true size of the
buffer, but the size of buffer is generally not used in the
protocol beyond determining the surface size or setting the
buffer-to-surface coordinate mapping parameters.


Thanks,
pq
Pekka Paalanen
2013-11-10 09:02:09 UTC
On Fri, 8 Nov 2013 23:00:28 -0200
Rafael Antognolli <rafael.antognolli at intel.com> wrote:

> On Fri, Nov 08, 2013 at 10:59:07AM -0800, Bill Spitzak wrote:
> >
> >
> > Pekka Paalanen wrote:
> > >Hi all,
> > >
> > >this is the v2 of the crop and scale extension, as RFC.
> >
> > I get the impression that the result of crop+scale is supposed to be exactly
> > the same as though the client made a second buffer of the scale size, scaled
> > the crop region from the first buffer to this second buffer, then attached
> > it with the normal wayland mechanism. Correct?
>
> From what I understood, the visual result might be that, but it's not
> what should go on inside the renderer.

Right.

> > So that compositors are allowed to only keep the cropped area of a buffer,
> > there will have to be limitations on changing the crop+scale without also
> > doing another attach. Maybe it does not work unless there is an attach in
> > the same commit, or you might even require the attach to be after the
> > crop+scale and before the commit.
>
> IMHO the compositor would keep the entire buffer, just like it already
> does. So when a buffer is attached to a surface, in the case of the gl
> renderer, it would get entirely uploaded as a texture, and just in the
> end when the texture is going to be rendered on the screen, only the
> cropped area would be presented (in a scaled version, if that's the
> case). This would allow changing the crop & scale parameters without
> the need for a new attach.

Yes, that is my opinion about it, too.

The crop & scale extension is usually used with hardware
(EGL-based) wl_buffers, so there is no copy to begin with.

> Unless I'm wrong regarding the "buffer being entirely uploaded to the
> compositor", but that's how I was implementing it.
>
> > The big problem I see with this api is that if buffer_scale is not one, the
> > client is unable to specify the crop or scale rectangles in pixels. However
> > this is a general problem with buffer_scale everywhere. (actually you seem
> > to be using fixed, not integers, so it is possible if buffer_scale is a
> > power of 2). I would change the crop to be in pixels. The scale rectangle
> > requires fixing or replacing the buffer scale mechanism.

Clients will always specify surface content in blocks of
buffer_scale x buffer_scale pixels. That is how it was before, and
that is how the crop & scale extension uses it.

In other words, everything is still in surface coordinate units,
just like before.

> > You need to define what happens if the crop region extends outside the
> > buffer. I think the edge pixels will have to be replicated to fill, since
> > using transparency here will turn an opaque buffer into one with
> > transparency, and produce an anti-aliased transparent edge, both of which
> > may defeat the ability to use hardware scaling.
>
> It's defined as "undefined/dirty" pixels, isn't it? I think that's good
> enough for now, at least.

Yes. The compositor is free to do whatever is easiest. Software
compositing might just ignore the part that is outside of the
buffer, and a GL renderer may simply not care and produce whatever
the GL texture repeat mode happens to yield. Also "hall-of-mirrors"
rendering artifacts (a type of Doom rendering glitch) are allowed.
It is the client's fault for not providing proper content.

Not specifying anything particular for the undefined pixels allows
compositors to use the overlay hardware in a simple way, and not
require jumping through hoops to realize e.g. black fill, which
would be part of the surface but likely not realizable with the one
hw overlay unit.

> > I think there may also have to be rules about whether filters are allowed to
> > sample outside the crop region (there are things other than box filters, you
> > know). For security reasons this must not be allowed outside the actual
> > buffer, so that adjacent memory does not leak into the displayed image, but
> > it could be left undefined for pixels between the crop and the buffer edge.

I think that is implied, that it is ok to sample from outside of the
source rectangle, as long as it is inside the buffer. The Wayland
protocol always assumes that buffers are complete and fully filled.


Thanks,
pq
Bill Spitzak
2013-11-11 18:24:33 UTC
Pekka Paalanen wrote:

> Clients will always specify surface content in blocks of
> buffer_scale x buffer_scale pixels. That is how it was before, and
> that is how the crop & scale extension uses it.
>
> In other words, everything is still in surface coordinate units,
> just like before.

Yes I understand this.

IMHO this design is incorrect however. A client cannot take full
advantage of all possible surface sizes because they must be multiples
of the buffer scale.

In any case this really is a problem with the buffer_scale api. I think
it will be fixed once somebody tries to write a serious application that
takes advantage of a hi-res screen.
Pekka Paalanen
2013-11-12 15:09:28 UTC
On Mon, 11 Nov 2013 10:24:33 -0800
Bill Spitzak <spitzak at gmail.com> wrote:

> Pekka Paalanen wrote:
>
> > Clients will always specify surface content in blocks of
> > buffer_scale x buffer_scale pixels. That is how it was before, and
> > that is how the crop & scale extension uses it.
> >
> > In other words, everything is still in surface coordinate units,
> > just like before.
>
> Yes I understand this.
>
> IMHO this design is incorrect however. A client cannot take full
> advantage of all possible surface sizes because they must be multiples
> of the buffer scale.
>
> In any case this really is a problem with the buffer_scale api. I think
> it will be fixed once somebody tries to write a serious application that
> takes advantage of a hi-res screen.

The surface size has never been limited to multiples of buffer_scale.
The *buffer* size is.

Note, that when we talk about surface size, it is always given in the
surface coordinate units, unless said otherwise.

You are probably referring to the view size on the output, that is the
size of the surface drawn on a particular output, measured in output
pixels. That size will always be a multiple of output_scale, regardless
of buffer_scale, because the surface size is in integers.

We are all just waiting for the use case where this is not sufficient.
- pq
Bill Spitzak
2013-11-12 21:38:46 UTC
Pekka Paalanen wrote:

> The surface size has never been limited to multiples of buffer_scale.
> The *buffer* size is.
>
> Note, that when we talk about surface size, it is always given in the
> surface coordinate units, unless said otherwise.
>
> You are probably referring to the view size on the output, that is the
> size of the surface drawn on a particular output, measured in output
> pixels. That size will always be a multiple of output_scale, regardless
> of buffer_scale, because the surface size is in integers.

I am assuming the only buffer_scale other than 1 will be to set it equal
to an output_scale. Since the numbers are equal, it is therefore true
that neither the buffer nor the output surface size can be given in pixels,
only in multiples of the buffer_scale/output_scale.

> We are all just waiting for the use case where this is not sufficient.

Primarily because smooth resizing will require using transparent pixels
to fill in the odd-sized edges, thus losing the ability to use opaque
surfaces, and complicating the api to drawing libraries.

I am also worried that clients will accidentally throw away
higher-precision pointing information by rounding all event coordinates
to pixels.
Antognolli, Rafael
2013-11-09 02:26:18 UTC
Sending again to the list, as my message apparently didn't get delivered.
________________________________________
From: Rafael Antognolli [rafael.antognolli at intel.com]
Sent: Friday, November 08, 2013 10:52 PM
To: Pekka Paalanen
Cc: wayland-devel@; Kristian Høgsberg; jonny.lamb at collabora.co.uk; Daniel Stone
Subject: Re: [RFC v2] surface crop & scale protocol extension

Hi Pekka, following are my comments:

On Fri, Nov 08, 2013 at 01:33:22PM +0200, Pekka Paalanen wrote:
> Hi all,
>
> this is the v2 of the crop and scale extension, as RFC.
>
> The v1 was in:
> http://lists.freedesktop.org/archives/wayland-devel/2013-April/008927.html
>
> Based on v1, Jonny Lamb has been working on a Weston implementation,
> starting from the protocol side, and working towards the renderer
> implementations. That work is still very much in progress.
>
>
> Introduction:
>
> The primary use case for the crop & scale extension is for hardware
> accelerated, zero-copy video display. Hardware video decoders produce
> complete frames or ultimately wl_buffers that are attached to
> a wl_surface. The crop & scale extension allows the client to
> request the compositor to crop and scale the frame so it fits the
> client's purposes. Optimally, a compositor implements this by
> programming a hardware overlay unit to do the requested cropping and
> scaling. The major benefit is that the CPU never has to even see the
> video pixels.
>
> Probably a common case is to have a video in a sub-surface, so that
> window decorations and overlaid GUI elements can be in other
> (sub-)surfaces, and avoid touching the pixels in the video frame
> buffers. Video as a browser element will need cropping when the element
> is partially off-view. Scaling should be useful in general, e.g.
> showing a video scaled up (or down) but still in a window.
>
> However, I also see that crop & scale can be used to present videos
> with non-square pixels in fullscreen (e.g. without sub-surfaces). Crop
> & scale is needed in this case, because the fullscreening methods in
> wl_shell do not allow changing the aspect ratio, or cropping parts of
> the video out to fill the whole output area when video and output
> aspect ratios differ. The fullscreening methods support only adding
> black borders, but not scale-to-fill-and-crop.

So, assuming we are cropping/scaling the main surface of a given
application that, for some reason, has some visible subsurfaces inside
it. It is not expected that any crop/scale applies to the child
subsurfaces, right? It should all be handled by the application in
question, I assume.

>
> Changes since v1:
>
> The changes I have made are very small, and can be seen in patch form
> at:
> http://cgit.collabora.com/git/user/pq/weston.git/log/?h=clipscale-wip
>
> The changes are:
> - improve wording, add missing details
> - take buffer_scale into account
> - rewrite the coordinate transformations more clearly
>
> In the end, I did not get much else out from the discussions of v1.
>
> I think some people do not like the structure of the protocol
> additions, namely adding yet another global factory interface and a
> sub-interface to wl_surface. If the consensus really is to move this
> into a single wl_surface request, that is fine.
>
> But, to me this would not make sense as a request in wl_subsurface. The
> crop & scale state is supposed to be driven by the application
> component that produces the wl_buffers and attaches them to a
> wl_surface. The wl_subsurface interface OTOH is to be used by the
> parent component, e.g. for setting the sub-surface position. The
> situation is similar to a compositor vs. clients: clients define the
> surface size and content, compositor the position. Also, the crop &

I don't agree with this. I think the crop & scale state should be driven
by the parent component that is composing the scene, maybe attaching the
wl_buffer, but not necessarily the one that is producing the wl_buffer.

I'm talking from EFL point of view: the decoder part of a video player
application would produce the wl_buffers, with their respective sizes,
while Evas (the canvas library, that controls the scene graph) would
position, scale and crop the surface as required to fit the scene.

> scale state is not well suited to be "forced on" by a parent software
> component, as it changes how e.g. input event coordinates relate to the
> wl_buffer contents. Finally, there is the fullscreen video use case I

Maybe these input event coordinates should be translated too, by the
same parent software component that is forcing the crop/scale state on
the subsurface.

> described above.
>
> The interface names are still so-so, I haven't come up with
> anything better. wl_scaler is pretty ok, but I feel something better
> would be in place for wl_surface_scaler. How would wl_viewport
> sound?

+1 for wl_viewport.

> A minor benefit of keeping crop & scale as a separate interface is that
> it could be made optional, a per-backend or renderer thing.

Really cool.

So, in summary, I'm not against using wl_scaler (or wl_viewport), nor
adding it directly to wl_surface if necessary. But I do think that
putting it into wl_subsurface would work just fine too, except for the
fullscreen, without-subsurfaces video player case.

All the other use cases that I have in mind, also discussed with other
guys from EFL, are related to subsurfaces too.

Just one comment about the protocol description below.

> For your reviewing pleasure, here is the v2:
>
> <?xml version="1.0" encoding="UTF-8"?>
> <protocol name="scaler">
>
> <copyright>
> Copyright © 2013 Collabora, Ltd.
>
> Permission to use, copy, modify, distribute, and sell this
> software and its documentation for any purpose is hereby granted
> without fee, provided that the above copyright notice appear in
> all copies and that both that copyright notice and this permission
> notice appear in supporting documentation, and that the name of
> the copyright holders not be used in advertising or publicity
> pertaining to distribution of the software without specific,
> written prior permission. The copyright holders make no
> representations about the suitability of this software for any
> purpose. It is provided "as is" without express or implied
> warranty.
>
> THE COPYRIGHT HOLDERS DISCLAIM ALL WARRANTIES WITH REGARD TO THIS
> SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
> FITNESS, IN NO EVENT SHALL THE COPYRIGHT HOLDERS BE LIABLE FOR ANY
> SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
> WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN
> AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION,
> ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF
> THIS SOFTWARE.
> </copyright>
>
> <interface name="wl_scaler" version="1">
> <description summary="surface cropping and scaling">
> The global interface exposing surface cropping and scaling
> capabilities is used to instantiate an interface extension for a
> wl_surface object. This extended interface will then allow
> cropping and scaling the surface contents, effectively
> disconnecting the direct relationship between the buffer and the
> surface size.
> </description>
>
> <request name="destroy" type="destructor">
> <description summary="unbind from the cropping and scaling interface">
> Informs the server that the client will not be using this
> protocol object anymore. This does not affect any other objects,
> wl_surface_scaler objects included.
> </description>
> </request>
>
> <enum name="error">
> <entry name="scaler_exists" value="0"
> summary="the surface already has a scaler object associated"/>
> </enum>
>
> <request name="get_surface_scaler">
> <description summary="extend surface interface for crop and scale">
> Instantiate an interface extension for the given wl_surface to
> crop and scale its content. If the given wl_surface already has
> a wl_surface_scaler object associated, the scaler_exists
> protocol error is raised.
> </description>
>
> <arg name="id" type="new_id" interface="wl_surface_scaler"
> summary="the new scaler interface id"/>
> <arg name="surface" type="object" interface="wl_surface"
> summary="the surface"/>
> </request>
> </interface>
>
> <interface name="wl_surface_scaler" version="1">
> <description summary="crop and scale interface to a wl_surface">
> An additional interface to a wl_surface object, which allows the
> client to specify the cropping and scaling of the surface
> contents.
>
> This interface allows the client to define the source rectangle
> (src_x, src_y, src_width, src_height) from where to take the wl_buffer
> contents, and scale that to destination size (dst_width,
> dst_height). This state is double-buffered, and is applied on the
> next wl_surface.commit.
>
> Before the first set request, the wl_surface still behaves as if
> there was no crop and scale state. That is, no scaling is applied,
> and the surface size is as defined in wl_surface.attach.
>
> The crop and scale state causes the surface size to become
> dst_width, dst_height. This overrides whatever the attached
> wl_buffer size is, unless the wl_buffer is NULL. If the wl_buffer is
> NULL, the surface has no content and therefore no size.
>
> The coordinate transformations from buffer pixel coordinates up to
> the surface-local coordinates happen in the following order:
> 1. buffer_transform (wl_surface.set_buffer_transform)
> 2. buffer_scale (wl_surface.set_buffer_scale)
> 3. crop and scale (wl_surface_scaler.set)
> This means that the source rectangle coordinates of crop and scale
> are given in the coordinates after the buffer transform and scale,
> i.e. in the coordinates that would be the surface-local coordinates
> if the crop and scale was not applied.
>
> If the source rectangle is partially or completely outside of the
> wl_buffer, then the surface contents are undefined (not void), and
> the surface size is still dst_width, dst_height.
>
> The x, y arguments of wl_surface.attach are applied as normal to
> the surface. They indicate how many pixels to remove from the
> surface size from the left and the top. In other words, they are
> still in the surface-local coordinate system, just like dst_width
> and dst_height are.

How many pixels to remove from the surface size? Does it still mean
that the surface is just going to be moved, or will it affect the
surface size? Sorry, it might just be a misunderstanding on my part.

> If the wl_surface associated with the wl_surface_scaler is
> destroyed, the wl_surface_scaler object becomes inert.
>
> If the wl_surface_scaler object is destroyed, the crop and scale
> state is removed from the wl_surface. The change will be applied
> on the next wl_surface.commit.
> </description>
>
> <request name="destroy" type="destructor">
> <description summary="remove scaling and cropping from the surface">
> The associated wl_surface's crop and scale state is removed.
> The change is applied on the next wl_surface.commit.
> </description>
> </request>
>
> <enum name="error">
> <entry name="bad_value" value="0"
> summary="negative values in width or height"/>
> </enum>
>
> <request name="set">
> <description summary="set the crop and scale state">
> Set the crop and scale state of the associated wl_surface. See
> wl_surface_scaler for the description, and relation to the
> wl_buffer size.
>
> If any of src_width, src_height, dst_width, and dst_height is
> negative, the bad_value protocol error is raised.
>
> The crop and scale state is double-buffered state, and will be
> applied on the next wl_surface.commit.
>
> Arguments dst_x and dst_y do not exist here, use the x and y
> arguments to wl_surface.attach. The x, y, dst_width, and dst_height
> define the surface-local coordinate system irrespective of the
> attached wl_buffer size.
> </description>
>
> <arg name="src_x" type="fixed" summary="source rectangle x"/>
> <arg name="src_y" type="fixed" summary="source rectangle y"/>
> <arg name="src_width" type="fixed" summary="source rectangle width"/>
> <arg name="src_height" type="fixed" summary="source rectangle height"/>
> <arg name="dst_width" type="int" summary="surface width"/>
> <arg name="dst_height" type="int" summary="surface height"/>
> </request>
>
> </interface>
> </protocol>

Regards,
Rafael
Jason Ekstrand
2013-11-09 04:33:18 UTC
Permalink
Pekka,
I've got just a couple of general comments below.

On Nov 8, 2013 8:46 AM, "Pekka Paalanen" <ppaalanen at gmail.com> wrote:
>
> Hi all,
>
> this is the v2 of the crop and scale extension, as RFC.
>
> The v1 was in:
> http://lists.freedesktop.org/archives/wayland-devel/2013-April/008927.html
>
> Based on v1, Jonny Lamb has been working on a Weston implementation,
> starting from the protocol side, and working towards the renderer
> implementations. That work is still very much in progress.
>
>
> Introduction:
>
> The primary use case for the crop & scale extension is for hardware
> accelerated, zero-copy video display. Hardware video decoders produce
> complete frames or ultimately wl_buffers that are attached to
> a wl_surface. The crop & scale extension allows the client to
> request the compositor to crop and scale the frame so it fits the
> client's purposes. Optimally, a compositor implements this by
> programming a hardware overlay unit to do the requested cropping and
> scaling. The major benefit is that the CPU never has to even see the
> video pixels.
>
> Probably a common case is to have a video in a sub-surface, so that
> window decorations and overlaid GUI elements can be in other
> (sub-)surfaces, and avoid touching the pixels in the video frame
> buffers. Video as a browser element will need cropping when the element
> is partially off-view. Scaling should be useful in general, e.g.
> showing a video scaled up (or down) but still in a window.
>
> However, I also see that crop & scale can be used to present videos
> with non-square pixels in fullscreen (e.g. without sub-surfaces). Crop
> & scale is needed in this case, because the fullscreening methods in
> wl_shell do not allow changing the aspect ratio, or cropping parts of
> the video out to fill the whole output area when video and output
> aspect ratios differ. The fullscreening methods support only adding
> black borders, but not scale-to-fill-and-crop.
>
>
> Changes since v1:
>
> The changes I have made are very small, and can be seen in patch form
> at:
> http://cgit.collabora.com/git/user/pq/weston.git/log/?h=clipscale-wip
>
> The changes are:
> - improve wording, add missing details
> - take buffer_scale into account
> - rewrite the coordinate transformations more clearly
>
> In the end, I did not get much else out from the discussions of v1.
>
> I think some people do not like the structure of the protocol
> additions, namely adding yet another global factory interface and a
> sub-interface to wl_surface. If the consensus really is to move this
> into a single wl_surface request, that is fine.
>
> But, to me this would not make sense as a request in wl_subsurface. The
> crop & scale state is supposed to be driven by the application
> component that produces the wl_buffers and attaches them to a
> wl_surface. The wl_subsurface interface OTOH is to be used by the
> parent component, for e.g. setting the sub-surface position. The
> situation is similar to a compositor vs. clients: clients define the
> surface size and content, compositor the position. Also, the crop &
> scale state is not well suited to be "forced on" by a parent software
> component, as it changes how e.g. input event coordinates relate to the
> wl_buffer contents. Finally, there is the fullscreen video use case I
> described above.

In the design of both the subsurfaces and crop&scale, I fear that you may
be trying too hard to avoid out-of-band communication between components
with little to no benefit. I don't want to start a bikeshedding landslide
here and you are free to disagree. However, I'm concerned that we're
overcomplicating things without gaining anything.

In some simple use-cases, you will be able to write one component that
runs on a subsurface and another that uses it, with the two components
basically unaware of each other beyond the initial handshake. However,
in practice I feel that what is more likely to happen is that the
primary toolkit will handle everything. If this is the case, it makes
no difference whether it's separate or inside wl_subsurface, except
that the toolkit has more extensions to wrangle.

Concerning full-screen video: After thinking about it a bit more, I think
this is a failure of wl_subsurface more than a need for scaling regular
surfaces. I think the primary issue there is that wl_subsurface provides
no real way to make a surface that, itself, is empty but its subsurfaces
have content. If this were possible, then it would be no problem putting
crop&scale in the wl_subsurface protocol. It's worth noting that this same
issue makes putting a video (from an external source) in a SHM subsurface
frame really awkward. See also this e-mail from April:

http://lists.freedesktop.org/archives/wayland-devel/2013-April/008875.html

Hope it's useful. Thanks,
--Jason Ekstrand

Pekka Paalanen
2013-11-10 09:38:56 UTC
Permalink
On Fri, 8 Nov 2013 22:33:18 -0600
Jason Ekstrand <jason at jlekstrand.net> wrote:

> Pekka,
> I've got just a couple of general comments below.
>
> On Nov 8, 2013 8:46 AM, "Pekka Paalanen" <ppaalanen at gmail.com> wrote:
> >
> > [...]
> >
> > I think some people do not like the structure of the protocol
> > additions, namely adding yet another global factory interface and a
> > sub-interface to wl_surface. If the consensus really is to move this
> > into a single wl_surface request, that is fine.
> >
> > But, to me this would not make sense as a request in wl_subsurface. The
> > crop & scale state is supposed to be driven by the application
> > component that produces the wl_buffers and attaches them to a
> > wl_surface. The wl_subsurface interface OTOH is to be used by the
> > parent component, for e.g. setting the sub-surface position. The
> > situation is similar to a compositor vs. clients: clients define the
> > surface size and content, compositor the position. Also, the crop &
> > scale state is not well suited to be "forced on" by a parent software
> > component, as it changes how e.g. input event coordinates relate to the
> > wl_buffer contents. Finally, there is the fullscreen video use case I
> > described above.
>
> In the design of both the subsurfaces and crop&scale, I fear that you may
> be trying too hard to avoid out-of-band communication between components
> with little to no benefit. I don't want to start a bikeshedding landslide
> here and you are free to disagree. However, I'm concerned that we're
> overcomplicating things without gaining anything.

I don't really see how excluding wl_subsurface as a possible host
interface for crop & scale would complicate things. Using
wl_subsurface would only restrict the possible use cases without
any gain I can see.

> In some simple use-cases, you will be able to write one component that
> runs on a subsurface and another that uses it, with the two components
> basically unaware of each other beyond the initial handshake. However,
> in practice I feel that what is more likely to happen is that the
> primary toolkit will handle everything. If this is the case, it makes
> no difference whether it's separate or inside wl_subsurface, except
> that the toolkit has more extensions to wrangle.

There has to be communication after the initial hand-shake anyway,
at least for resizing. The thing I am trying to avoid is to have to
synchronize the two components (which may run in different threads)
for every frame in the steady state. Yes, that complicated the
sub-surface protocol, but crop & scale is largely unrelated and
agnostic, unless it is made part of wl_subsurface.

Maybe some of the GStreamer video sink or Clutter experts can comment
here. I seem to recall hearing about use cases where two different
toolkits are being used in the same client.

> Concerning full-screen video: After thinking about it a bit more, I think
> this is a failure of wl_subsurface more than a need for scaling regular
> surfaces. I think the primary issue there is that wl_subsurface provides
> no real way to make a surface that, itself, is empty but its subsurfaces
> have content. If this were possible, then it would be no problem putting
> crop&scale in the wl_subsurface protocol. It's worth noting that this same
> issue makes putting a video (from an external source) in a SHM subsurface
> frame really awkward. See also this e-mail from April:
>
> http://lists.freedesktop.org/archives/wayland-devel/2013-April/008875.html
>

Actually, crop & scale *not* being in wl_subsurface allows you to
have your invisible main surface! (If you really insist.) Make a
1x1 wl_shm buffer, make the pixel in it (0,0,0,0), and use crop &
scale to define the size of the main surface: you get an invisible
normal wl_surface of the size you want, without wasting memory, to
be used as you like.

Personally I don't like that use case, but it is possible.

As for the video player with hw-accelerated video and shm
decorations, how about keeping e.g. the window title side
sub-surface always as the main surface, and video in a sub-surface?
When fullscreening, the title decoration surface would be
completely occluded by the video sub-surface. Then, when shell
scales the window to fill the output, the only visible surface will
be the video sub-surface. It should also trivially allow use of hw
overlays in the compositor. Would that work?

Sure, you'd need input to the video sub-surface, but that is
quite solvable. Any invisible surfaces covering everything might
prevent the use of hw overlays.


Thanks,
pq
Pekka Paalanen
2013-11-11 09:33:09 UTC
Permalink
Argh, reply-to-all had a bad list address. Sorry for double-posting
to your personal copies.
- pq

On Mon, 11 Nov 2013 11:13:48 +0200
Pekka Paalanen <ppaalanen at gmail.com> wrote:

> Hi Rafael,
>
> btw. if anyone thinks any of my replies should be added to the protocol
> specification to clarify it, let me know. If I don't say I'll add or
> change something in the spec, then I probably won't.
>
>
> On Fri, 8 Nov 2013 22:52:45 -0200
> Rafael Antognolli <rafael.antognolli at intel.com> wrote:
>
> > Hi Pekka, following are my comments:
> >
> > On Fri, Nov 08, 2013 at 01:33:22PM +0200, Pekka Paalanen wrote:
> > > [...]
> >
> > So, assume we are cropping/scaling the main surface of a given
> > application that, for some reason, has some visible subsurfaces inside
> > it. Crop/scale is not expected to apply to the child subsurfaces,
> > right? It should all be handled by the application in question, I
> > assume.
>
> Yes, crop & scale is defined for the buffer, not the surface, so it does
> not affect anything particular about surfaces, like clipping of
> (sub-)surfaces. It certainly does not affect other wl_surfaces in any
> way, than one where it is set.
>
> The crop & scale state is simply a transformation applied to the
> buffer, when it becomes the surface content.
>
> > > Changes since v1:
> > >
> > > The changes I have made are very small, and can be seen in patch form
> > > at:
> > > http://cgit.collabora.com/git/user/pq/weston.git/log/?h=clipscale-wip
> > >
> > > The changes are:
> > > - improve wording, add missing details
> > > - take buffer_scale into account
> > > - rewrite the coordinate transformations more clearly
> > >
> > > In the end, I did not get much else out of the discussions of v1.
> > >
> > > I think some people do not like the structure of the protocol
> > > additions, namely adding yet another global factory interface and a
> > > sub-interface to wl_surface. If the consensus really is to move this
> > > into a single wl_surface request, that is fine.
> > >
> > > But, to me this would not make sense as a request in wl_subsurface. The
> > > crop & scale state is supposed to be driven by the application
> > > component that produces the wl_buffers and attaches them to a
> > > wl_surface. The wl_subsurface interface OTOH is to be used by the
> > > parent component, for e.g. setting the sub-surface position. The
> > > situation is similar to a compositor vs. clients: clients define the
> > > surface size and content, compositor the position. Also, the crop &
> >
> > I don't agree with this. I think the crop & scale state should be driven
> > by the parent component that is composing the scene, maybe attaching the
> > wl_buffer, but not necessarily the one that is producing the wl_buffer.
>
> Bad choice of words from me: with producer I mean the one who attaches
> to wl_surface, i.e. the one who uses the wl_surface methods.
>
> > I'm talking from EFL point of view: the decoder part of a video player
> > application would produce the wl_buffers, with their respective sizes,
> > while Evas (the canvas library, that controls the scene graph) would
> > position, scale and crop the surface as required to fit the scene.
>
> Yeah, that's one way to do it. The other way I have been thinking of,
> is that the parent toolkit notifies the component that drives the
> wl_surface about crop & scale changes, just like for resizes.
>
> Think about toolkit agnostic library components, like perhaps a
> gstreamer video sink.
>
> > > scale state is not well suited to be "forced on" by a parent software
> > > component, as it changes how e.g. input event coordinates relate to the
> > > wl_buffer contents. Finally, there is the fullscreen video use case I
> >
> > Maybe these input event coordinates should be translated too, by the
> > same parent software component that is forcing the crop/scale state on
> > the subsurface.
>
> That would introduce one more coordinate system into the protocol. We
> right now have surface coordinates, where practically everything in the
> protocol is given, and buffer coordinates that clients use to draw.
> Using modified input coordinates would add an input coordinate system
> somewhere in between. I think that would complicate the use of the
> protocol, and it then overlooks the opposite case of the parent toolkit
> handling the sub-surface's input events.
>
> > > described above.
> > >
> > > The interface names are still so-and-so, I haven't come up with
> > > anything better. wl_scaler is pretty ok, but I feel something better
> > > would be in place for wl_surface_scaler. How would wl_viewport
> > > sound?
> >
> > +1 for wl_viewport.
> >
> > > A minor benefit of keeping crop & scale as a separate interface is that
> > > it could be made optional, a per-backend or renderer thing.
> >
> > Really cool.
> >
> > So, in summary, I'm not against using wl_scaler (or wl_viewport), nor
> > adding it directly to wl_surface if necessary. But I do think that
> > putting it into wl_subsurface would work just fine too, except from the
> > fullscreen, without-subsurface, video player case.
> >
> > All the other use cases that I have in mind, also discussed with other
> > guys from EFL, are related to subsurfaces too.
> >
> > Just one comment about the protocol description below.
>
> The video player switching to/from fullscreen question is a really hard
> one, and I still do not have a good solution for that.
>
> > > For your reviewing pleasure, here is the v2:
> > >
> > > <?xml version="1.0" encoding="UTF-8"?>
> > > <protocol name="scaler">
> > >
> > > <copyright>
> > > Copyright © 2013 Collabora, Ltd.
> > >
> > > Permission to use, copy, modify, distribute, and sell this
> > > software and its documentation for any purpose is hereby granted
> > > without fee, provided that the above copyright notice appear in
> > > all copies and that both that copyright notice and this permission
> > > notice appear in supporting documentation, and that the name of
> > > the copyright holders not be used in advertising or publicity
> > > pertaining to distribution of the software without specific,
> > > written prior permission. The copyright holders make no
> > > representations about the suitability of this software for any
> > > purpose. It is provided "as is" without express or implied
> > > warranty.
> > >
> > > THE COPYRIGHT HOLDERS DISCLAIM ALL WARRANTIES WITH REGARD TO THIS
> > > SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
> > > FITNESS, IN NO EVENT SHALL THE COPYRIGHT HOLDERS BE LIABLE FOR ANY
> > > SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
> > > WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN
> > > AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION,
> > > ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF
> > > THIS SOFTWARE.
> > > </copyright>
> > >
> > > <interface name="wl_scaler" version="1">
> > > <description summary="surface cropping and scaling">
> > > The global interface exposing surface cropping and scaling
> > > capabilities is used to instantiate an interface extension for a
> > > wl_surface object. This extended interface will then allow
> > > cropping and scaling the surface contents, effectively
> > > disconnecting the direct relationship between the buffer and the
> > > surface size.
> > > </description>
> > >
> > > <request name="destroy" type="destructor">
> > > <description summary="unbind from the cropping and scaling interface">
> > > Informs the server that the client will not be using this
> > > protocol object anymore. This does not affect any other objects,
> > > wl_surface_scaler objects included.
> > > </description>
> > > </request>
> > >
> > > <enum name="error">
> > > <entry name="scaler_exists" value="0"
> > > summary="the surface already has a scaler object associated"/>
> > > </enum>
> > >
> > > <request name="get_surface_scaler">
> > > <description summary="extend surface interface for crop and scale">
> > > Instantiate an interface extension for the given wl_surface to
> > > crop and scale its content. If the given wl_surface already has
> > > a wl_surface_scaler object associated, the scaler_exists
> > > protocol error is raised.
> > > </description>
> > >
> > > <arg name="id" type="new_id" interface="wl_surface_scaler"
> > > summary="the new scaler interface id"/>
> > > <arg name="surface" type="object" interface="wl_surface"
> > > summary="the surface"/>
> > > </request>
> > > </interface>
> > >
> > > <interface name="wl_surface_scaler" version="1">
> > > <description summary="crop and scale interface to a wl_surface">
> > > An additional interface to a wl_surface object, which allows the
> > > client to specify the cropping and scaling of the surface
> > > contents.
> > >
> > > This interface allows defining the source rectangle (src_x,
> > > src_y, src_width, src_height) from where to take the wl_buffer
> > > contents, and scale that to destination size (dst_width,
> > > dst_height). This state is double-buffered, and is applied on the
> > > next wl_surface.commit.
> > >
> > > Before the first set request, the wl_surface still behaves as if
> > > there was no crop and scale state. That is, no scaling is applied,
> > > and the surface size is as defined in wl_surface.attach.
> > >
> > > The crop and scale state causes the surface size to become
> > > dst_width, dst_height. This overrides whatever the attached
> > > wl_buffer size is, unless the wl_buffer is NULL. If the wl_buffer is
> > > NULL, the surface has no content and therefore no size.
> > >
> > > The coordinate transformations from buffer pixel coordinates up to
> > > the surface-local coordinates happen in the following order:
> > > 1. buffer_transform (wl_surface.set_buffer_transform)
> > > 2. buffer_scale (wl_surface.set_buffer_scale)
> > > 3. crop and scale (wl_surface_scaler.set)
> > > This means that the source rectangle coordinates of crop and scale
> > > are given in the coordinates after the buffer transform and scale,
> > > i.e. in the coordinates that would be the surface-local coordinates
> > > if the crop and scale was not applied.
> > >
> > > If the source rectangle is partially or completely outside of the
> > > wl_buffer, then the surface contents are undefined (not void), and
> > > the surface size is still dst_width, dst_height.
> > >
> > > The x, y arguments of wl_surface.attach are applied as normal to
> > > the surface. They indicate how many pixels to remove from the
> > > surface size from the left and the top. In other words, they are
> > > still in the surface-local coordinate system, just like dst_width
> > > and dst_height are.
> >
> > How many pixels to remove from the surface size? Does it still mean that
> > the surface is just going to be moved, or will it affect the surface
> > size? Sorry, but it might be just a misunderstanding from me.
>
> This is exactly the same definition as what is in wl_surface.attach,
> but in different words. "Removed" means compared to the previous
> surface position and size. So yes, it is still just the move.
>
> Would it be clearer, if it was written like this:
>
> "They indicate how many pixels to remove from the surface from the left
> and the top. How many pixels are added or removed from the bottom and
> the right are determined by the combination of x, y, and dst_width and
> dst_height. In other words..."
>
> Or should I just remove that extra explanation as confusing?
>
>
> Thanks,
> pq
>
> > > If the wl_surface associated with the wl_surface_scaler is
> > > destroyed, the wl_surface_scaler object becomes inert.
> > >
> > > If the wl_surface_scaler object is destroyed, the crop and scale
> > > state is removed from the wl_surface. The change will be applied
> > > on the next wl_surface.commit.
> > > </description>
> > >
> > > <request name="destroy" type="destructor">
> > > <description summary="remove scaling and cropping from the surface">
> > > The associated wl_surface's crop and scale state is removed.
> > > The change is applied on the next wl_surface.commit.
> > > </description>
> > > </request>
> > >
> > > <enum name="error">
> > > <entry name="bad_value" value="0"
> > > summary="negative values in width or height"/>
> > > </enum>
> > >
> > > <request name="set">
> > > <description summary="set the crop and scale state">
> > > Set the crop and scale state of the associated wl_surface. See
> > > wl_surface_scaler for the description, and relation to the
> > > wl_buffer size.
> > >
> > > If any of src_width, src_height, dst_width, and dst_height is
> > > negative, the bad_value protocol error is raised.
> > >
> > > The crop and scale state is double-buffered state, and will be
> > > applied on the next wl_surface.commit.
> > >
> > > Arguments dst_x and dst_y do not exist here, use the x and y
> > > arguments to wl_surface.attach. The x, y, dst_width, and dst_height
> > > define the surface-local coordinate system irrespective of the
> > > attached wl_buffer size.
> > > </description>
> > >
> > > <arg name="src_x" type="fixed" summary="source rectangle x"/>
> > > <arg name="src_y" type="fixed" summary="source rectangle y"/>
> > > <arg name="src_width" type="fixed" summary="source rectangle width"/>
> > > <arg name="src_height" type="fixed" summary="source rectangle height"/>
> > > <arg name="dst_width" type="int" summary="surface width"/>
> > > <arg name="dst_height" type="int" summary="surface height"/>
> > > </request>
> > >
> > > </interface>
> > > </protocol>
> >
> > Regards,
> > Rafael
>
Pekka Paalanen
2013-11-11 13:19:06 UTC
Permalink
Bill Spitzak
2013-11-11 18:47:47 UTC
Permalink
Pekka Paalanen wrote:

> Or would you rather have src_x, src_y, src_width, src_height always
> defined in buffer coordinates, and then buffer_transform and
> buffer_scale (with their requirement of size being a
> multiple of buffer_scale) applied to dst_width and dst_height instead?
>
> In the latter case, the surface size would be affected by all of
> buffer_transform, buffer_scale, dst_width and dst_height.

The source rectangle *must* be in buffer pixels. Putting it in
buffer_scale units makes absolutely no sense, as the buffer_scale is in
effect ignored for this surface, overridden entirely by the scaling.

It does not matter whether buffer_transform is applied first, as it does
not move where integers are.

Also if integers are used it is impossible to scale an image that is not
a multiple of buffer_scale in both dimensions. Using fixed for the
source rectangle works around this problem but is very misleading.

At the moment it sounds like the destination has to be in buffer_scale
units, rather than in pixels, in order to match the rest of the Wayland
api. I think this is a mistake but it should be fixed everywhere, not
just here.

> In the end, I see no other difference between the transformation orders
> than convenience: calculations you might be able to avoid, some lines
> of code you can skip writing. The same piece of code has to control both
> the wl_surface interface and the wl_surface_scaler interface anyway.

In theory yes, but unfortunately in the real world there is finite
precision in our representation of numbers, making the differences
relevant and important.
Pekka Paalanen
2013-11-12 09:00:53 UTC
Permalink
On Mon, 11 Nov 2013 10:47:47 -0800
Bill Spitzak <spitzak at gmail.com> wrote:

> Pekka Paalanen wrote:
>
> > Or would you rather have src_x, src_y, src_width, src_height always
> > defined in buffer coordinates, and then buffer_transform and
> > buffer_scale (with their requirement of size being a
> > multiple of buffer_scale) applied to dst_width and dst_height instead?
> >
> > In the latter case, the surface size would be affected by all of
> > buffer_transform, buffer_scale, dst_width and dst_height.
>
> The source rectangle *must* be in buffer pixels. Putting it in
> buffer_scale units makes absolutely no sense, as the buffer_scale is in
> effect ignored for this surface, overridden entirely by the scaling.

That means that dst_width and dst_height will be required to be in
multiples of buffer_scale.

In the original proposal however, a client can simply choose to set
buffer_scale=1 regardless of output_scale, and so use buffer-pixel
units in the src rectangle (although buffer_transform will still affect
it). You understood right, buffer_scale is in effect ignored.

> It does not matter whether buffer_transform is applied first, as it does
> not move where integers are.

But it does matter? Buffer_transform rotates in 90-deg steps, and can
do a flip. Using a 20x20 area at 0,0 from a 100x100 buffer, if
buffer_transform is applied before crop & scale then buffer_transform
will decide from which corner it actually is.

If you were talking only about avoiding fractional units, then yes,
buffer_transform always keeps integers as integers.

> Also if integers are used it is impossible to scale an image that is not
> a multiple of buffer_scale in both dimensions. Using fixed for the
> source rectangle works around this problem but is very misleading.

That can be remedied by simply choosing buffer_scale=1 in the original
proposal to make the src image always a multiple of buffer_scale.

I agree, that using wl_fixed_t does not solve this problem. Setting
buffer_scale=1 does.

> At the moment it sounds like the destination has to be in buffer_scale
> units, rather than in pixels, in order to match the rest of the Wayland
> api. I think this is a mistake but it should be fixed everywhere, not
> just here.

In the original proposal, dst_width and dst_height are in surface
coordinate units and not restricted to multiples of buffer_scale. They
are just any positive integers.

In the "src rect in buffer pixel units" suggestion, dst_width and
dst_height must be multiples of buffer_scale, so that they can be
divided by buffer_scale to get the surface dimensions in surface
coordinate units.

(Surface coordinate units and coordinate frame are the one used by
almost all of the Wayland protocols: input, child surface
positioning, ...)

> > In the end, I see no other difference between the transformation orders
> > than convenience: calculations you might be able to avoid, some lines
> > of code you can skip writing. The same piece of code has to control both
> > the wl_surface interface and the wl_surface_scaler interface anyway.
>
> In theory yes, but unfortunately in the real world there is finite
> precision in our representation of numbers, making the differences
> relevant and important.

The only place that may require rounding is encoding the information
into the protocol requests. Everything else can be computed exactly by
using ratios of integers, given exact input values.

***

There is a third option: specify that when crop & scale state is set,
buffer_scale gets ignored. I really don't know if this would be better or
worse for future-proofing, but a new extension that disables state that
was used in the past feels a bit strange here. And disabling
buffer_scale would not be explicitly visible in the protocol, it would
be a rule that you just have to know, if you set crop & scale.

In this particular case, we have seemingly equal impacts whether to go
with the original proposal or disable buffer_scale. In the original
proposal clients can effectively disable buffer_scale by setting it to
one (which is the default value anyway, so no need to explicitly set
it).

Therefore I would not propose the third option, but I'm mentioning it
for completeness.


In my opinion, buffer_scale is not an issue either way. The issue is
buffer_transform. Does buffer_transform apply to the buffer, or to the
cut-out rectangle?

Even the name "buffer_transform" implies that it applies to the buffer,
just like it originally does.


Thanks,
pq
Bill Spitzak
2013-11-13 00:14:36 UTC
Permalink
Pekka Paalanen wrote:

>> The source rectangle *must* be in buffer pixels. Putting it in
>> buffer_scale units makes absolutely no sense, as the buffer_scale is in
>> effect ignored for this surface, overridden entirely by the scaling.
>
> That means that dst_width and dst_height will be required to be in
> multiples of buffer_scale.

I am rather confused about this and possibly misunderstanding what
Wayland is doing. Here is an example of what I think the current design is:

Imagine there is an output_scale of 3 (chosen to not be a power of 2). A
client is aware of this and wishes to draw a full-resolution display
that is 2000x2000 pixels and make a subwindow that scales a 512x512
picture to fill a 1000x1000 pixel square.

I think what that client does on Wayland under this proposal is this
(please correct if wrong):

- It creates a 2000x2000 pixel buffer and draws everything including the
1000x1000 frame into it. The surface is given a buffer_scale of 1/3 (or
3 depending on how buffer_scale is defined, I am unclear on this).

- This is sent to the compositor. The compositor scales the buffer by
1/3 to get "surface coordinates" and then scales it up by 3 to output
space, but these cancel out to the identity. In "surface coordinates"
the surface is 666.666 units square and the frame is 333.333 units square.

- The client also creates the 512x512 image for the subsurface and will
use the scaling api. It sets the source rectangle to 0,0,512,512 pixels.
If buffer_scale affects this the client is forced to set it to a power
of 2, since 512 is not a multiple of 3.

- The destination is set in surface units and thus must be a rectangle
333.333 on each side, and may have a fractional origin as well. This is
done by sending the nearest possible fixed values and assuming the
compositor will round to the correct pixel. Buffer_scale of the
subsurface and parent have no effect.

My proposal is:

- The source rectangle is always in buffer pixels so it can be 512 no
matter what the buffer_scale is set to.

- The buffer_scale *does* apply to the destination rectangle. The client
would set the subsurface buffer_scale to 1/3 just like the parent (or it
is inherited from the parent), and then the destination rectangle would
be 1000 square, and the corner would be integers as well.

- (unrelated) The origin of the buffer attach is in buffer pixels so a
surface can be translated by integer numbers of pixels.

> In my opinion, buffer_scale is not an issue either way. The issue is
> buffer_transform. Does buffer_transform apply to the buffer, or to the
> cut-out rectangle?

I personally would like all coordinates to be in actual buffer pixels,
before the buffer_transform. However I think too much existing Wayland
api, in particular the buffer width + height, are in the after
coordinates and consistency is better.
Pekka Paalanen
2013-11-13 09:54:38 UTC
Permalink
On Tue, 12 Nov 2013 16:14:36 -0800
Bill Spitzak <spitzak at gmail.com> wrote:

> Pekka Paalanen wrote:
>
> >> The source rectangle *must* be in buffer pixels. Putting it in
> >> buffer_scale units makes absolutely no sense, as the buffer_scale is in
> >> effect ignored for this surface, overridden entirely by the scaling.
> >
> > That means that dst_width and dst_height will be required to be in
> > multiples of buffer_scale.
>
> I am rather confused about this and possibly misunderstanding what
> Wayland is doing. Here is an example of what I think the current design is:
>
> Imagine there is an output_scale of 3 (chosen to not be a power of 2). A
> client is aware of this and wishes to draw a full-resolution display
> that is 2000x2000 pixels and make a subwindow that scales a 512x512
> picture to fill a 1000x1000 pixel square.

Is your whole issue based on the premise, that the output resolution is
not a multiple of output_scale?

Just like you deduced, it won't work, but that has nothing to do
with the crop & scale state.

Alexander, did you have any ideas for the case when someone sets
output_scale so that output resolution is not divisible by it?


Thanks,
pq
Alexander Larsson
2013-11-13 10:30:19 UTC
Permalink
On ons, 2013-11-13 at 11:54 +0200, Pekka Paalanen wrote:
> On Tue, 12 Nov 2013 16:14:36 -0800
> Bill Spitzak <spitzak at gmail.com> wrote:
>
> > Pekka Paalanen wrote:
> >
> > >> The source rectangle *must* be in buffer pixels. Putting it in
> > >> buffer_scale units makes absolutely no sense, as the buffer_scale is in
> > >> effect ignored for this surface, overridden entirely by the scaling.
> > >
> > > That means that dst_width and dst_height will be required to be in
> > > multiples of buffer_scale.
> >
> > I am rather confused about this and possibly misunderstanding what
> > Wayland is doing. Here is an example of what I think the current design is:
> >
> > Imagine there is an output_scale of 3 (chosen to not be a power of 2). A
> > client is aware of this and wishes to draw a full-resolution display
> > that is 2000x2000 pixels and make a subwindow that scales a 512x512
> > picture to fill a 1000x1000 pixel square.
>
> Is your whole issue based on the premise, that the output resolution is
> not a multiple of output_scale?
>
> Just like you deduced, it won't work. It not working has nothing to do
> with crop & scale state.
>
> Alexander, did you have any ideas for the case when someone sets
> output_scale so that output resolution is not divisible by it?

The output scale is a hint to the client what kind of pre-scaling it
should apply to the buffer to best match the real output. There is no
real guarantee that this is an exact match, because the compositor may
be projecting it on a 3d-rotated spherical monitor or whatever. It is
*commonly* the case that we have a regular rectangular output where the
output scale is an integer divisor of the real output resolution such
that the buffer can be used unscaled in the scanout framebuffer.

I.e. the case of a 2560x1600 native display is typically run with an
output scale of 2, which gives a maximized window a surface coordinate
space of 1280x800.

However, a compositor is also free to do something else, for instance
like OSX it can allow using a 1440x900 global coordinate space, while
saying output scale is 2. This would result in the client sending
2880x1800 buffers for fullscreen windows that the compositor will have
to downscale (by a non-integer scaling factor) to draw to the
framebuffer.
Pekka Paalanen
2013-11-14 08:29:52 UTC
Permalink
On Wed, 13 Nov 2013 11:30:19 +0100
Alexander Larsson <alexl at redhat.com> wrote:

> On ons, 2013-11-13 at 11:54 +0200, Pekka Paalanen wrote:
> > On Tue, 12 Nov 2013 16:14:36 -0800
> > Bill Spitzak <spitzak at gmail.com> wrote:
> >
> > > Pekka Paalanen wrote:
> > >
> > > >> The source rectangle *must* be in buffer pixels. Putting it in
> > > >> buffer_scale units makes absolutely no sense, as the buffer_scale is in
> > > >> effect ignored for this surface, overridden entirely by the scaling.
> > > >
> > > > That means that dst_width and dst_height will be required to be in
> > > > multiples of buffer_scale.
> > >
> > > I am rather confused about this and possibly misunderstanding what
> > > Wayland is doing. Here is an example of what I think the current design is:
> > >
> > > Imagine there is an output_scale of 3 (chosen to not be a power of 2). A
> > > client is aware of this and wishes to draw a full-resolution display
> > > that is 2000x2000 pixels and make a subwindow that scales a 512x512
> > > picture to fill a 1000x1000 pixel square.
> >
> > Is your whole issue based on the premise, that the output resolution is
> > not a multiple of output_scale?
> >
> > Just like you deduced, it won't work. It not working has nothing to do
> > with crop & scale state.
> >
> > Alexander, did you have any ideas for the case when someone sets
> > output_scale so that output resolution is not divisible by it?
>
> The output scale is a hint to the client what kind of pre-scaling it
> should apply to the buffer to best match the real output. There is no
> real guarantee that this is an exact match, because the compositor may
> be projecting it on a 3d-rotated spherical monitor or whatever. It is
> *commonly* the case that we have a regular rectangular output where the
> output scale is an integer divisor of the real output resolution such
> that the buffer can be used unscaled in the scanout framebuffer.
>
> I.e. the case of a 2560x1600 native display is typically run with an
> output scale of 2, which gives a maximized window a surface coordinate
> space of 1280x800.
>
> However, a compositor is also free to do something else, for instance
> like OSX it can allow using a 1440x900 global coordinate space, while
> saying output scale is 2. This would result in the client sending
> 2880x1800 buffers for fullscreen windows that the compositor will have
> to downscale (by a non-integer scaling factor) to draw to the
> framebuffer.

Right, but I am specifically thinking about the case where directly
scanning out a client is (or should be) possible. The issue is the
following:

Assume an output resolution 2000x2000, with output_scale=3. That means,
that there is no surface size in integers that would result in exactly
2000x2000 output area. You'd either have surface size 666x666, which
becomes output 1998x1998, or 667x667 which becomes 2001x2001.

We require, that the buffer size for a surface is a multiple of
buffer_scale. If buffer_scale is set to 3 as intended, the client
cannot create a legal buffer, that would be the size of the output in
output pixels.

In fact, regardless of what buffer_scale the client chooses, it cannot
cover the output exactly with the surface, since we have the surface
coordinate system in between and assume that surface sizes are always
integers.

***
Originally I had written more flow of thoughts here, but I need to
think this more; is it actually a problem, or is it possible for a
compositor to just do what is intended rather than following the
coordinate specifications to the letter and nitpicking about a pixel or
two, when a client actually uses that 2000x2000 buffer resulting in
some fractional surface size.


Thanks,
pq
Bill Spitzak
2013-11-13 20:26:41 UTC
Permalink
Pekka Paalanen wrote:

> Is your whole issue based on the premise, that the output resolution is
> not a multiple of output_scale?

I think there is some serious confusion here. Everything here is a
multiple of everything else.

My client is attempting to obey the output_scale on the assumption that
this advertised output_scale will produce the best image. It sees an
output_scale of 3 and assumes the reason is that the pixels on that
output are 3x smaller (1/9 the area) of "standard" pixels.

The client says "If I want to produce the best hi-resolution image, I
need to use a buffer that is 3x larger in each direction and use a
buffer_scale of 1/3".

Then the client says "I want to use a subsurface and the crop+scale api
to blow this 512x512 image up to cover exactly 1024x1024 pixels in my
main buffer" (on the assumption that this is also 1024x1024 pixels on
the output).

However instead of sending a 1024x1024 square to the compositor for the
dst area, it has to send a 341.3333333 square using fixed-point. This
requires everybody to agree on rounding rules, and is misleading because
the crop+scale only will work for integer numbers of pixels.

It also has to set the buffer_scale of the subsurface to 1, otherwise it
is impossible to specify a 512x512 source rectangle because that is not
using fixed point.

Please explain if I have gotten this analysis wrong.

My recommendation: buffer_scale is ignored for the crop rectangle (which
should then be in the same space as the buffer width and height values,
which I think is "after" the buffer_transform, not before it). That
allows buffer_scale to be applied to the dst rectangle. The client would
then set the buffer_scale of the subsurface to 1/3 and can set the dst
rectangle to 1024.
Pekka Paalanen
2013-11-14 09:28:55 UTC
Permalink
On Wed, 13 Nov 2013 12:26:41 -0800
Bill Spitzak <spitzak at gmail.com> wrote:

> Pekka Paalanen wrote:
>
> > Is your whole issue based on the premise, that the output resolution is
> > not a multiple of output_scale?
>
> I think there is some serious confusion here. Everything here is a
> multiple of everything else.
>
> My client is attempting to obey the output_scale on the assumption that
> this advertised output_scale will produce the best image. It sees an
> output_scale of 3 and assumes the reason is that the pixels on that
> output are 3x smaller (1/9 the area) of "standard" pixels.

Ok, let's assume output_scale=3.

> The client says "If I want to produce the best hi-resolution image, I
> need to use a buffer that is 3x larger in each direction and use a
> buffer_scale of 1/3".

That is indicated by setting buffer_scale=3 in the protocol.

> Then the client says "I want to use a subsurface and the crop+scale api
> to blow this 512x512 image up to cover exactly 1024x1024 pixels in my
> main buffer" (on the assumption that this is also 1024x1024 pixels on
> the output).
>
> However instead of sending a 1024x1024 square to the compositor for the
> dst area, it has to send a 341.3333333 square using fixed-point. This
> requires everybody to agree on rounding rules, and is misleading because
> the crop+scale only will work for integer numbers of pixels.

Yes, you cannot use non-integer surface sizes. You cannot express a
surface size, that would result in a 1024x1024 area in output pixels,
because 1024 is not divisible by output_scale=3.

This limitation exists also before crop & scale. Crop & scale cannot
remove this limitation, because the whole Wayland protocol has been
built on surface coordinates and especially surface sizes are integers.


Thanks,
pq
Bill Spitzak
2013-11-14 20:27:22 UTC
Permalink
Pekka Paalanen wrote:

> Yes, you cannot use non-integer surface sizes. You cannot express a
> surface size that would result in a 1024x1024 area in output pixels,
> because 1024 is not divisible by output_scale=3.

Thanks. That is indeed exactly the problem I am concerned about.

> This limitation exists also before crop & scale. Crop & scale cannot
> remove this limitation, because the whole Wayland protocol has been
> built on surface coordinates and especially surface sizes are integers.

I thought the proposed crop+scale used fixed point to specify the
destination rectangle, which is a solution. I don't like it but I
thought it was done on purpose for this.

The buffer size sent with attach is also in buffer pixels. So a client
can make a surface any size. But because the offset is in "surface
coordinates" they cannot resize the origin corner except in multiples of
buffer_scale. However they *can* resize the opposite corner in pixels.
This seems pretty inconsistent and it would be nice to fix it too.
Pekka Paalanen
2013-11-15 08:27:16 UTC
Permalink
On Thu, 14 Nov 2013 12:27:22 -0800
Bill Spitzak <spitzak at gmail.com> wrote:

> Pekka Paalanen wrote:
>
> > Yes, you cannot use non-integer surface sizes. You cannot express a
> > surface size that would result in a 1024x1024 area in output pixels,
> > because 1024 is not divisible by output_scale=3.
>
> Thanks. That is indeed exactly the problem I am concerned about.
>
> > This limitation exists also before crop & scale. Crop & scale cannot
> > remove this limitation, because the whole Wayland protocol has been
> > built on surface coordinates and especially surface sizes are integers.
>
> I thought the proposed crop+scale used fixed point to specify the
> destination rectangle, which is a solution. I don't like it but I
> thought it was done on purpose for this.

If you read the proposal, it does not, precisely because it was written
so that the destination rectangle size becomes the surface size
directly, and hence must be in integers.

The source rectangle is in wl_fixed_t.

> The buffer size sent with attach is also in buffer pixels. So a client
> can make a surface any size. But because the offset is in "surface
> coordinates" they cannot resize the origin corner except in multiples of
> buffer_scale. However they *can* resize the opposite corner in pixels.
> This seems pretty inconsistent and it would be nice to fix it too.

Buffer size is a property of the wl_buffer; it is not explicitly
sent on attach. You are right about the offset.

Clients cannot "resize the opposite corner" in the way you
assume, because the protocol requires that the buffer size is a
multiple of buffer_scale, so that it always results in a surface size in
integers.

Buffer size must always be a multiple of buffer_scale. That was defined
when buffer_scale was added to the protocol.

So, there are two different cases for surface size:
1. Crop & scale not used: surface size is determined from buffer size
via buffer_transform and buffer_scale.
2. Crop & scale is used: surface size is directly dst_width,dst_height.

In both cases, surface size is guaranteed to be in integers.

Surface size is always in integers. Nothing I have proposed, or in the
current protocol, allows otherwise.

- pq