This is a first attempt at using the new gen4asm language; the composite
programs for the mask-picture cases, with and without component alpha (CA),
will be finished later.
Signed-off-by: Keith Packard <keithp@neko.keithp.com>
This makes the CRTCs always run in gamma-enabled mode, rather than having
flaky logic for switching modes. Also, it should clear up issues with the LUTs
being uninitialized when outputs are first brought up.
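Initializing the LUTs amounts to loading a linear (identity) ramp; a minimal
sketch, not the driver's actual code:

    #include <stdint.h>

    /* Fill 256-entry, 16-bit-per-channel gamma tables with a linear ramp so
     * newly enabled outputs start with sane LUT contents. */
    static void
    init_linear_gamma(uint16_t red[256], uint16_t green[256], uint16_t blue[256])
    {
        for (int i = 0; i < 256; i++) {
            uint16_t v = (uint16_t)((i << 8) | i);   /* replicate 8 bits to 16 */
            red[i] = green[i] = blue[i] = v;
        }
    }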
Add relative and absolute position configuration code, using per-output
monitor sections. Options include (see the example configuration below):

    PreferredMode       selects a preferred mode for this output by name.
    Position            absolute position, x and y in a single string.
    Below, RightOf,
    Above, LeftOf       relative position; the argument names another monitor.
    Enable, Disable     force the monitor to be disabled by setting Enable
                        to "no" or Disable to "yes".
    MinClock, MaxClock  set the valid clock range.
Monitor sections can also include sync ranges, physical size and mode lines
as documented in xorg.conf(5).
Monitors are associated with outputs through options in the Device section:
Option "monitor-VGA" "My VGA Monitor"
The output named 'VGA' will then use the monitor section "My VGA Monitor".
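A minimal sketch of how these pieces fit together in xorg.conf (the driver
name, monitor identifiers, positions and clock values below are illustrative,
not taken from this change):

    Section "Monitor"
        Identifier "My LVDS Panel"
        Option     "Position" "0 0"
    EndSection

    Section "Monitor"
        Identifier "My VGA Monitor"
        Option     "PreferredMode" "1024x768"
        Option     "RightOf"       "My LVDS Panel"
        Option     "MinClock"      "25.0"
        Option     "MaxClock"      "200.0"
    EndSection

    Section "Device"
        Identifier "Intel Graphics"
        Driver     "intel"
        Option     "monitor-LVDS"  "My LVDS Panel"
        Option     "monitor-VGA"   "My VGA Monitor"
    EndSection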
Autodetect libdrm version, disable new memory manager on older libraries.
Move the new M_T_ defines from i830.h to i830_xf86Crtc.h. Add the system
headers needed to declare the functions used. Use i830PipeSetBase at the end
of the mode setting code to set DSP*BASE and flush changes. Don't duplicate
the PipeSetBase call from the screen init function. Make the initial RandR
configuration code usable on older versions of the extension so the server
doesn't start in a panning mode.
Use xfree instead of free in i830_tv.c.
Move output connection status detection from RandR code up to ProbeModes so
it is done before mode sets are built. Otherwise, the mode building code
will elide all modes the first time through as it ignores outputs that are
disconnected.
Most get_modes functions fetch EDID data; make sure that any
EDID changes are used in the ProbeModes filtering of default modes.
Otherwise, stale EDID data will be used.
Allow outputs to advertise support for interlaced and double scan modes;
prune such modes from the default mode list when outputs do not support them.
Limit the effect of sync ranges so that sync ranges found via EDID will not
eliminate modes explicitly added by the user. Limit the default sync range to
eliminating only default modes, not configured or EDID modes.
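Roughly, the intended pruning policy looks like the sketch below; the types
and names are invented for illustration, while the driver itself operates on
the server's mode lists and output structures:

    #define MODE_INTERLACE   (1 << 0)
    #define MODE_DBLSCAN     (1 << 1)

    enum mode_origin  { MODE_DEFAULT, MODE_EDID, MODE_USER };
    enum range_origin { RANGE_DEFAULT, RANGE_EDID, RANGE_CONFIG };

    struct sketch_mode {
        unsigned flags;              /* MODE_INTERLACE, MODE_DBLSCAN */
        enum mode_origin origin;     /* where the mode came from */
        double vrefresh_hz;
    };

    struct sketch_output {
        int supports_interlace;
        int supports_dblscan;
        double min_vrefresh, max_vrefresh;
        enum range_origin range_origin;  /* where the sync range came from */
    };

    /* Return non-zero if the mode should be pruned from the output's list. */
    static int
    prune_mode(const struct sketch_output *out, const struct sketch_mode *m)
    {
        if ((m->flags & MODE_INTERLACE) && !out->supports_interlace)
            return 1;
        if ((m->flags & MODE_DBLSCAN) && !out->supports_dblscan)
            return 1;

        if (m->vrefresh_hz < out->min_vrefresh ||
            m->vrefresh_hz > out->max_vrefresh) {
            switch (out->range_origin) {
            case RANGE_CONFIG:   /* assumed: explicit config ranges prune anything */
                return 1;
            case RANGE_EDID:     /* EDID ranges never prune user modes */
                return m->origin != MODE_USER;
            case RANGE_DEFAULT:  /* default range prunes only default modes */
                return m->origin == MODE_DEFAULT;
            }
        }
        return 0;
    }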
The Belinea 10 15 55 monitor reports a preferred mode of 640x350 when in fact
it wants a 1024x768 mode @ 60Hz. Add an EDID quirk that selects the largest
mode, preferring the one closest to 60Hz among equally sized modes.
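The quirk's selection heuristic amounts to something like the sketch below
(the mode structure and function are illustrative stand-ins, not the driver's
actual code):

    #include <math.h>

    struct sketch_mode {
        int width, height;
        double refresh_hz;
    };

    /* Pick the largest mode; among equal sizes, prefer the refresh rate
     * closest to 60Hz.  The result replaces the bogus EDID preference. */
    static const struct sketch_mode *
    pick_largest_mode(const struct sketch_mode *modes, int n)
    {
        const struct sketch_mode *best = NULL;
        for (int i = 0; i < n; i++) {
            const struct sketch_mode *m = &modes[i];
            long area = (long) m->width * m->height;
            long best_area = best ? (long) best->width * best->height : -1;
            if (!best || area > best_area ||
                (area == best_area &&
                 fabs(m->refresh_hz - 60.0) < fabs(best->refresh_hz - 60.0)))
                best = m;
        }
        return best;
    }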
With modern monitors and increased XV and EXA memory requirements, these small
limits were resulting in DRI and other initialization failures because we
wouldn't allow them enough memory. Instead, allow each piece of the system
(DRI, EXA, XAA, etc) to request as much memory as it wants, and choose the
actual videoRam to be used for laying out the memory afterwards.
With this change, in the absence of a VideoRam option, 32MB will be allocated
for textures.
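As a rough sketch of the sizing policy (the function and parameter names
below are invented; the driver's actual accounting covers more allocations):

    #include <stdint.h>

    #define MB(x) ((uint64_t)(x) << 20)

    /* Sum what each subsystem asked for and derive videoRam from that,
     * instead of forcing everything to fit a small fixed limit. */
    static uint64_t
    choose_video_ram(uint64_t front_buffer, uint64_t exa_request,
                     uint64_t dri_request, uint64_t aperture_size,
                     uint64_t configured_video_ram)
    {
        if (configured_video_ram)        /* an explicit VideoRam option wins */
            return configured_video_ram;

        uint64_t total = front_buffer + exa_request + dri_request;
        total += MB(32);                 /* default texture allocation */

        if (total > aperture_size)       /* never exceed what can be mapped */
            total = aperture_size;
        return total;
    }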
Add the modelines specified in the per-output monitor section, along with all
of the default modes, to the mode list for each output. Prune that list
against the specified sync limits and virtual sizes, then sort it by
preferred/size/refresh.
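The preferred/size/refresh ordering can be pictured as a qsort comparison like
the sketch below (the mode structure is a stand-in, not the server's actual
mode record):

    #include <stdlib.h>

    struct sketch_mode {
        int preferred;          /* non-zero if marked preferred */
        int width, height;
        double refresh_hz;
    };

    static int
    compare_modes(const void *a, const void *b)
    {
        const struct sketch_mode *ma = a, *mb = b;

        if (ma->preferred != mb->preferred)     /* preferred modes first */
            return mb->preferred - ma->preferred;
        if (ma->width != mb->width)             /* then larger sizes */
            return mb->width - ma->width;
        if (ma->height != mb->height)
            return mb->height - ma->height;
        if (ma->refresh_hz != mb->refresh_hz)   /* then higher refresh */
            return mb->refresh_hz > ma->refresh_hz ? 1 : -1;
        return 0;
    }

    /* Usage: qsort(modes, n, sizeof modes[0], compare_modes); */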
Currently, when the backing pixmap is not in the framebuffer, we just return
BadAlloc rather than drawing garbage to the front buffer. This can be fixed
with EXA.
The documentation states that GPIOB is (generally) used for devices on DVOA
on the motherboard, which appears to be the case on the laptop we have with
LVDS on the motherboard.
This patch is probably not entirely accurate, as there was apparently an LVDS
DVO card sold that could be put in desktop machines, which would likely be on
GPIOE like other ADD cards. Given that we couldn't find one of these cards for
purchase, I'm not worrying about it.