GNOME, KDE get their kicks from XFree86
Hans Blix diplomacy manual
Leading GNOME and KDE developers have borrowed a leaf from the Hans Blix diplomacy manual in a joint statement on the XFree86 schism.
Last week a leading XFree86 developer, Keith Packard, was expelled in an acrimonious dispute which put the project's fundamental goals and architecture in the spotlight.
Sixteen GNOME and KDE developers, including Miguel de Icaza and Dirk Mueller, have issued a statement supporting XFree86 that declines to comment on the Packard episode. Frustrated by what he cites as slow development progress and an unaccountable board, Packard was soliciting help for a breakaway project.
However, the diplomatic language does include a very delicately worded call:
"We would like to have a frequently-released, robust, stable, open source implementation of these APIs, specifications, and features."
Register readers have contributed their thoughts on the future of X11.
Reader Allen Crider points out that removing the network capabilities would deprive us of the ability to use cheap (and silent) terminals:
"People who want to take X Windows out of the socket model are generally ill-informed. For instance, my own computer (running an XP 1800+) can generate 45,021 socket packets per second.
"Also, Xterminals are NOT slow (unless you are 3D gaming). I always use one. My noisy XP 1800 is in the closet doing all those server things, and I use a homemade Xterminal on a 10base100 net. It feels like I'm on a dual-CPU machine.
"Xterminals are a very good idea if people think they want Linux in the workplace. PCs, with their disk drives and autonomous operating systems, are a ridiculous headache. The only reason Xterminals lost their market in the olden days was that their price exceeded that of standalone PCs. Try it out for yourself sometime. As long as you have a couple of Linux boxes sitting around, you can run any X program on one machine and have the display show up on another."
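Crider's suggestion is easy to try. A minimal sketch, assuming two Linux boxes on the same network (the hostnames `desk` and `closet` are placeholders, with the X display running on `desk`):

```shell
# On desk: allow X clients from closet.
# (Coarse-grained; xauth cookies are the safer mechanism.)
xhost +closet

# On closet: point clients at desk's display and run any X program.
DISPLAY=desk:0 xclock &

# Alternatively, from desk, tunnel the X protocol over ssh:
# xclock runs on closet but displays on desk, with no xhost grant needed.
ssh -X closet xclock
```

ssh's `-X` forwarding sets `DISPLAY` and handles authentication automatically, which is why it has largely replaced the open `xhost` grant.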
"Shifting the problem, not solving it"
A graphics developer who says he's "firmly on the fence on this one" has many good observations.
"The Wexelblat article is right in many respects.
"The remote stuff is less of an issue, as he says, but that's all transparent and slow anyway because it's over the network. X can actually run in two modes now, direct rendering and networked, and it must support both. As he says, networked display is largely unused for desktop stuff, and when it is used it's SLOW, because your graphics bus becomes the network. It's the old "thin client" philosophy, support for which is built into every single X server.
"As for supporting a new card like the Radeon, I can't help thinking that Packard is making too much of this. There's the DRI (direct rendering infrastructure) that pretty much lets you plug in your own 3D support.
"The 2D support is less of an issue, as several levels of support are possible and there's plenty of code you can lift. All the stuff to transport calls over the network is already implemented and would wind up calling through the DRI. So updating the DRI implementation for a new card is, it seems to me, akin to what he's talking about anyway.
"Graphics is naturally more complex and difficult, so I think he'd just be shifting the problem, not solving it. Instead of the DRI you'd end up with some other direct rendering fubar, and you haven't really moved the problem with regard to the 3D implementations. You've got to implement OpenGL, provide a common interface, and provide X API support at least natively on the desktop. So you eliminate the indirect rendering networked stuff? How does that help anyone?
"That code is just lying there, already written; it's just a network protocol "transceiver", for want of a better word. It took people ages just to fix the OpenGL ABI on Linux so software would be portable between drivers. This is standards work; it's boring, it's mundane, but you can't throw it away.
"You're stuck with that, and it's a GOOD THING. So what are you eliminating? Some network baggage you largely don't have to look at, but may not like for aesthetic reasons? In reality, implementing 3D stuff can be done at the DRI level; adding new stuff takes messing around with specs and protocols, and that's possibly a good thing in a heterogeneous hardware environment, but lacks the purity some hackers crave."
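The direct-versus-indirect split the developer describes is visible from any GLX client: a context requested as direct will silently fall back to indirect when the display is remote. A minimal C sketch (compile with `-lX11 -lGL`; error handling trimmed), using the standard glXIsDirect call:

```c
#include <stdio.h>
#include <GL/glx.h>

int main(void) {
    /* Open the display named by $DISPLAY -- local or remote. */
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (!vi) { fprintf(stderr, "no suitable visual\n"); return 1; }

    /* Request a direct (DRI) context; the server may fall back
       to indirect rendering, e.g. when the display is networked. */
    GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);

    printf("rendering is %s\n",
           glXIsDirect(dpy, ctx) ? "direct (DRI)"
                                 : "indirect (over the X protocol)");

    glXDestroyContext(dpy, ctx);
    XCloseDisplay(dpy);
    return 0;
}
```

Run locally this typically reports a direct context; run with `DISPLAY` pointed at another machine, the same binary gets an indirect one, which is exactly the dual-mode support the developer says X must carry.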
Thanks for your comments. One more: an anonymous poster at gnome.org suggests that GNOME and KDE should assume responsibility for the project. Along with Apple, they're the main customers, and therefore rely the most on this implementation of X11.
So, would this be a bad thing? ®