This may have been asked before, but I wasn't able to find anything about it, so here goes. The player (obviously) uses the display device. Is there any way for a user program, or perhaps the kernel, to control a small portion of the display in parallel, with the player using the rest of it? That is, a certain portion of the display would be clipped off so that a user program could output something there.

My gut feeling is "no" due to the synchronization and fast refresh of the display, but I thought I'd ask if there were any creative ways to implement this. I somehow doubt that sharing of the display device is allowed, but if it were, it would open up a lot of cool ideas for user programs.


See the attachment for a simple kernel modification that allows other applications to share the display. It should work with the V1.02 kernel source tree.

The idea here is that your application opens /dev/display with special flags; after that you can blit to the screen buffer through the display queue as usual, and the kernel will mix the two frame buffers together before actually blitting the data to the screen. The attached example takes the lowest eight pixel rows from your screen buffer and mixes them with the player's screen buffer. The player's screen buffer gets roughly scaled down from 32 pixel rows to 24.

Here's a simple guide:

Open /dev/display from your application:
m_fd = open( "/dev/display", O_RDWR | O_SYNC );

Blit your stuff to the screen (to the display queue):
ioctl( m_fd, _IO( 'd', 6 ) );

Enable/disable display overlaying:
ioctl( m_fd, _IOW( 'd', 12, int ), &m_iOverlayType ); (m_iOverlayType being 0 or 1)

Kim



Attachments
27378-empeg_display_diff.txt