Thursday, March 23. 2006
Playing with hardware sensors in linux
Yesterday I played around a bit with what hardware sensors can do and how to access them in linux.
hddtemp
The first, and quite trivial, tool I tried was hddtemp. You don't need to do anything further: just install it and run
hddtemp /dev/hda
(Assuming your harddisk is hda, which is usually the case)
It supports a bunch of harddisks by default, and if it doesn't know your HD, it tries to access it with some default values. Extending the harddisk database seems to be trivial; I already sent a patch for my HD. The output looks like this:
/dev/hda: SAMSUNG MP0804H: 46°C
lm_sensors
lm_sensors is a bunch of drivers and tools to use the hardware sensors on motherboards. As you probably have no idea what chips your motherboard has, lm_sensors brings a tool called sensors-detect to help you. The way to go is to enable everything (except debugging, which you usually don't need) in the kernel sections i2c and hardware monitoring as modules and let sensors-detect do the work.
Basically, pressing return all the time should be okay. At the end, it'll tell you which kernel-modules are useful for your system.
After that, running sensors shows something like this:
max6657-i2c-0-4c
Adapter: SMBus I801 adapter at 1100
M/B Temp: +40°C (low = -65°C, high = +127°C)
CPU Temp: +36.6°C (low = +35.1°C, high = +72.2°C)
M/B Crit: +110°C (hyst = +100°C)
CPU Crit: +110°C (hyst = +100°C)
Well, not that useful, but interesting to know that I have at least 3 temperature-sensors in my laptop.
Update: As noted by Joshua Jackson in the comments, with smartctl /dev/hda (from smartmontools) you get the temperature and much more information about your HD.
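smartctl reports the drive temperature as SMART attribute 194 (Temperature_Celsius), with the current value in the last column. A small awk filter can pull out just that value; the sketch below runs against a sample line mimicking typical smartmontools output, since real output varies by drive.

```shell
# A sample line mimicking smartmontools output; real output varies by drive.
SAMPLE="194 Temperature_Celsius 0x0022 113 100 000 Old_age Always - 46"
TEMP=$(echo "$SAMPLE" | awk '/Temperature_Celsius/ {print $NF}')
echo "${TEMP}°C"
# On a real system, the same filter applies to live output:
# smartctl -a /dev/hda | awk '/Temperature_Celsius/ {print $NF}'
```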
Thursday, March 9. 2006
Reverse engineering onlinetvrecorder
onlinetvrecorder, a service that lets you record broadcasts from some german television stations, provides its files in the .otrkey format, which can be decoded using their binary otrdecoder tool, provided you have requested the recording in advance.
As there is no information on how the format and the authentication work, I took a deeper look at it.
Getting the key
A network sniffer shows that the authentication is very simple: the client just requests the key over plain HTTP. The URL is
http://www.onlinetvrecorder.com/uncrypt.php?email=[email]&pass=[pass]&filename=[file]
(filename is the .wmv name without the .otrkey suffix)
Inside that file is an ASCII/hex-encoded 128-bit number. That very much looks like a key.
This already gives us the possibility to download the key manually and, if we want to re-decode some movie (because we lost the wmv, or because we want to decode a file before it's completely downloaded so we can already start watching the recording), save the key on a local webserver as uncrypt.php, point the hostname to 127.0.0.1 and restart otrdecoder.
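The manual fetch described above can be sketched like this. The email, password and filename below are placeholders, not real account data; the key-length check at the end reflects the 128-bit/32-hex-digit observation.

```shell
# Placeholders -- substitute your own account data and recording name.
EMAIL="user@example.com"
PASS="secret"
FILE="recording.wmv"
URL="http://www.onlinetvrecorder.com/uncrypt.php?email=${EMAIL}&pass=${PASS}&filename=${FILE}"
echo "$URL"
# To actually fetch the key file:
# wget -O uncrypt.php "$URL"
# The response should contain the key as 32 hex digits (128 bit):
KEY="00112233445566778899aabbccddeeff"   # example value, not a real key
test "${#KEY}" -eq 32 && echo "key length ok"
```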
The cryptography
From what I have found out so far, the file is encrypted with some sort of Blowfish. The encrypted and decrypted files are exactly the same size, which means there is no IV and a Blowfish variant that does no padding.
The best result I got so far was using mcrypt in ECB mode:
mcrypt -d -a blowfish-compat -s 16 -o hex -b --noiv -m ecb --nodelete -f [keyfile] [file]
This decrypts the first 256 bytes correctly; after that it seems to mix things up (the correct decryption continues at byte 512). From what I read in Schneier's »Applied Cryptography« (1996), there is an ECB variant using ciphertext stealing that avoids padding, but I found no easy-to-use implementation of it.
A command-line cryptography tool that supports more options than mcrypt would be handy here.
Posted by Hanno Böck in Computer culture, Cryptography, English, Linux at 21:17 | Comments (4294967295) | Trackbacks (2)
Monday, February 27. 2006
Fosdem 2006 notes
As there was only very limited internet access at Fosdem, I didn't find the time to blog live, so here are my collected impressions.
Keynote by Richard Stallman about software patents. I already knew this talk, so it wasn't that interesting. I also think there weren't many people in the room who had to be persuaded to resist software patents, so they should probably have chosen a more »visionary« topic for RMS to talk about. After that, an interesting talk about the GPL v3 (also by RMS). I asked a question about the problem that GPL v2-only code can't be mixed with GPL v3 code; he asked me to discuss this with him by email, which I will do.
There were two talks about Xgl: one by Matthias Hopf explaining what xgl is and what problems they face, with a compiz presentation (with the well-known whooo effect), and one by Zack Rusin titled »Why Xgl is not the answer«. It was very interesting to hear the pros and cons of Xgl. I don't have a real opinion on it (I don't feel I understand the technical details well enough), but we should probably keep an eye on the different futures X has (Xgl and aiglx at the moment).
Another very interesting talk in the X room: Stephane Marchesin is working on reverse-engineering nvidia chipsets and intends to write a free driver for them. It's at a very early stage (at the moment basically just finding out how the chips work); let's wish him every success (see nouveau, his first, not-yet-working sources).
Suse gave out free (as in beer) t-shirts, so don't be amazed if you see me with a suse t-shirt running around ;-)
There were some other more or less interesting talks. Overall, the presentations are the highlight of fosdem; you'll hardly find another event with so many interesting, high-level talks about open source and free software.
Pictures will be here as soon as I find time to upload them.
Thursday, February 23. 2006
FOSDEM trip

Tomorrow I'll start my trip to Brussels and hopefully will also find some time to visit the city.
Maybe I'll give a small presentation of the xgl-overlay. If you are also there, this is the chance to meet me in reality and perhaps exchange some pgp keys or such.
Thursday, February 16. 2006
Copyleft film about New Orleans after hurricane Katrina
As the german news site heise reports, there's a new copyleft film, a documentary about New Orleans half a year after hurricane Katrina.
Their website dropping knowledge seems to be down at the moment. I'll post a review as soon as I've got the film and found time to watch it (this may take some days, because I've got university exams next week).
Update: As Netzpolitik reports, this film is not really copyleft; it's released under a cc-by-nc-nd (creative commons attribution noncommercial no derivatives) license. Besides that, the page's javascript has problems with konqueror (and I keep asking myself why this obvious connection between free culture and free software seems to be so difficult).
Posted by Hanno Böck in Copyright, English, Movies, Politics at 12:33 | Comments (0) | Trackbacks (0)
Wednesday, February 15. 2006
Rant: Printing with cups
Okay: if you regularly read my blog, you know that I'm a linux addict and free software fan. I really like my linux; I'm much more comfortable with it than in the bad old days when I used that other system from that Redmond company. I have the strong belief that free software is the better concept and will succeed in the long run. Just to make clear: this is one of those very rare occasions where I rant about linux.
So let's start: today I wanted to print some slides from a university lecture. They were in landscape format, and to save paper (52 pages), I wanted to print four of them on one page. A simple task, one should think.
Started kpdf, clicked on print. As my cups was configured, I could select my printer, go to its options and find a »4 pages on 1« option, so it seemed fine. Clicked on Print. Waited. Waited. Nothing happened.
Webbrowser, localhost:631, no printing jobs. No errors. Nothing.
Looked at the logfile (this is the point where any common user would be stuck). Nothing that helped, just a note to change the loglevel to debug. Did that. Restarted cups. Re-sent the page. The logfile showed a segfault in a gs command. Damn, why can't the interface just tell me that?
From the little knowledge I have about linux printing, I knew that there are various implementations of ghostscript. Looked into portage, found three, replaced ghostscript-esp with ghostscript-afpl. Restarted cups.
Tried to print, and my printer actually did something. Well, the result looked interesting: the third page was in the upper left corner with about a third of the fourth page beside it. Besides that, it was far smaller than it should be, nearly unreadable.
Ok, there are some other pdf-viewers out there. Tried kghostview. Print, select 4 pages option, etc. Printer started doing something.
The result was really interesting: The pages were printed white on white.
Next try, evince. As evince is a pretty new gnome tool, it sticks to the gnome guidelines: fewer config dialogs, fewer features. It simply has no option to print four pages on one. Oh, and should I mention that evince crashed when I wanted to close the printing dialog?
Gave up. Will read it on the screen.
Conclusion: free software has had some great successes in the last years. Today we have systems that can compete with commercial ones in many areas of common usage. Other areas, though, are really horrible. Printing is one of them.
If we really want to compete on the desktop, we need to get such basic tasks to »just work«.
Monday, February 13. 2006
amaroK 1.4 with moodbar

The most visible new feature is the so-called moodbar, which tries to color the »mood« of a track. Okay, it uses a hell of a lot of CPU and I doubt it's very useful, but it looks really funky.
Gentooers: emerge sync, add media-sound/amarok to your package.unmask, add use-flag exscalibar, enjoy!
Posted by Hanno Böck in Computer culture, English, Gentoo, Linux, Music at 20:51 | Comments (13) | Trackbacks (0)
More free music
The AStA of the University of Marburg is starting its second Open Music Contest.
Like last year, they'll collect submissions from musicians released under a creative commons license and choose the best to be presented at a concert and released on a sampler. Last year's sampler is available for download in ogg vorbis format.
Nice project for more free music and open standards.
Friday, February 10. 2006
I'm so famous ;-)
Since my xgl-overlay, this happens all the time (from #xorg on freenode):
<...> OMG IT'S HANNO!!!! *BOW DOWN
<...> hello! i read your blog!
<...> hahaha i'm a big fan
<...> ok!
Thursday, February 9. 2006
Xgl and compiz overlay update

I've now created a new one based on the latest code changes in mesa and xgl, together with the new opengl window and composite manager compiz.
I'm releasing it although it's not really working for me at the moment.
I can run Xgl with compiz on my Radeon 9200 card with the free xorg driver, but I have redraw problems, so I don't know if the effects work at the moment. If you have better experiences, please post them here (I'm especially interested in whether it works better with other cards, e.g. nvidia ones).
Short Howto:
- Download latest xgl-overlay-xxxx.tar.bz2, unpack it, point PORTDIR_OVERLAY to it
- emerge mesa glitz xgl compiz
- Run Xgl :1 -ac -accel glx:pbuffer -accel xv:pbuffer (ati) or Xgl :1 -ac -accel glx:pbuffer -accel xv (nvidia)
- Run DISPLAY=:1 compiz decoration, DISPLAY=:1 gnome-window-decorator and DISPLAY=:1 xterm (or something else)
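The Xgl invocation above differs between the two drivers only in the xv accel flag; a tiny helper can pick the right argument string. Choosing "ati" below is just an example, and the flags are copied from the howto above.

```shell
# Pick the Xgl accel flags for your driver (from the commands above).
DRIVER="ati"    # or "nvidia"
case "$DRIVER" in
  ati)    XGL_ARGS=":1 -ac -accel glx:pbuffer -accel xv:pbuffer" ;;
  nvidia) XGL_ARGS=":1 -ac -accel glx:pbuffer -accel xv" ;;
esac
echo "Xgl $XGL_ARGS"
# Then: Xgl $XGL_ARGS &   and start clients with DISPLAY=:1, e.g. xterm
```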
If you just wanna see how it looks, here is the Novell announcement with some videos.
Friday, February 3. 2006
Internet Explorer 7 and CSS - at least a bit better

Acid2 in IE7
As I've been interested in webdesign for quite a long time, I was often worried that you couldn't use a bunch of the possibilities of modern HTML code, because the near-monopoly browser doesn't support them. Microsoft had not done any development on the IE for several years; only when Firefox started to endanger the IE's market share did they decide to change their mind.
Now I had the chance to take a first look at the IE7 on a friend's laptop. I won't comment on the interface improvements, because I wouldn't seriously suggest that anyone use this browser. I'm only interested from a webdesigner's perspective, because I know that despite all the buzz about Firefox, a lot of people still use the »default« on their system without thinking about it and will probably continue to do so.
For a while now I have maintained a bunch of html/css test cases with examples from my webdesign work that don't work in some browsers.
A quick note: IE7 behaves just as crappily as before if your documents have an HTML 4 doctype declaration. So to check the new css features, declare XHTML 1.1 in your doctype and IE7 will use a »strict mode« to render your pages.
The features I was missing most in the past are finally implemented: Transparent PNGs, fixed-positioned objects and defining objects through their left and right distance.
Besides that, there are still a bunch of features missing: min-height, max-height, empty-cells, just to name some. In my test cases it couldn't compete with any other browser, although those may not be representative. If you have suggestions for enhancements to the test cases, feel free to mail me or post them.
My blog looks quite okay, apart from a minor bug with the top black line; maybe I'll investigate this further (well, maybe not, maybe you just shouldn't use IE to view this page ;-) ). Acid2 looks as crappy as before.
So as a conclusion, IE7 has fixed the gravest issues, but it is far away from fully implementing CSS2. If the free browsers want to enlarge their lead over IE, well, I'm still waiting for the first browser to call itself CSS2-complete.
I hope this can be a chance to take a step forward in webdesign. For the future, the CSS3 previews look very promising; I hope it won't take as many years as with CSS2 until browser developers take care of it.
Friday, January 27. 2006
Web 0.1 - Channel 4, IT Crowd
Channel 4, a UK television channel, has a new series called IT Crowd. As they are very modern and as the series is about an IT company, they may have thought:
»We've heard of this bleeding-edge thing called internet. Maybe we should do something about that.«
And here is what they did: they provided an obscure mix of javascript and flash to play an embedded wmv file (which doesn't work in my konqueror, although I have the appropriate plugins installed). Julian wrote about it and was able to extract the download URL. WMV9, so no chance without win32codecs atm.
More and more tv stations provide some stuff online, and this is really fine. It could be more, it could be better quality, there should be more freely licensed stuff etc., but still, it's a step in the right direction. But hey, proprietary file formats embedded in a proprietary player is not how the web should look in 2006. RSS feeds are made for stuff like that. Why can't they just use them? We have a bunch of formats that can be played on nearly every platform (and, not to forget, I'd always prefer an mpeg-/patent-free format like ogg theora).
Sidenote: recently I wrote to the german tv magazine Monitor, which provides its files as real streams, asking why they couldn't provide RSS with other formats. Their answer was that for copyright reasons people must not be able to download the files ...
... with their Internet Explorer, that is. If you come across an rtsp/mms/whatever stream and want to download it, mplayer is your friend. mplayer -dumpstream [url] has fetched every stream I ever wanted to download.
Posted by Hanno Böck in Copyright, English, Linux, Movies, Webdesign at 22:45 | Comments (2) | Trackbacks (0)
Wednesday, January 18. 2006
Firefox implementing spy-feature
As reported on several news pages, Firefox is going to implement a »ping« attribute, a new »feature« for links that sends a ping out to some defined URL. This leaves a very bad taste.
What I always liked in the free software world was that not every app is sending »something« to »someone« on the net, as is quite common with Windows apps. I remember that in the last days when I still used Windows (98) on a regular basis, I had one of those »personal firewalls« installed. Apps that had absolutely nothing to do with the net wanted to connect to their home server, and apps where I had selected »no internet connection« still tried to do »something« online (the last one, for example, was winamp).
Now, I know that the design of the world wide web is really privacy-unfriendly. Yes, you can filter out a lot of things with tools like privoxy, but in the end your only option is to disable everything (javascript, image loading from foreign servers, cookies) and lose the ability to use a bunch of web services. The browser can't do much about this, as this is how the web is designed.
But still, I think this firefox »feature« is a big mistake. Free software applications especially should be much more cautious about their users' privacy. I can't see much use in it for the user, but I can think of a bunch of possibilities to misuse it.
We are always crying about the »evil ones«, the spyware producers out there; just remember the recent buzz about the iTunes spyware functionality. That's perfectly right. But in the end, we need to do better in free software and provide an alternative. I hope the firefox developers rethink this and remove, or at least disable by default, these website pings before their next release.
Friday, January 13. 2006
Howto install xgl with glxcompmgr and fancy effects
Warning: by following the instructions below, you are probably replacing some base libs of your system (mesa, glitz). This can and will seriously break your whole system if you don't know what you're doing. You will likely face different problems than I did, so be prepared to play around yourself if you try this.
Ok, now for the fun part. Xgl is experimental code for the next generation of X servers, with rendering completely done in OpenGL. Recently, David Reveman presented some major updates to the xgl code. The instructions should be generic, so you can do this on any distribution; however, if you're using Gentoo, it'll be much easier, because you can get my xgl-overlay containing ebuilds for everything you need (get xgl-overlay-xxxx.tar.bz2 from here, I'll put up updated versions if necessary). Non-Gentooers should also fetch this tarball, because it contains all patches you'll need to follow the instructions.
I did this with an ATI Radeon 9200; so far this only works with the proprietary drivers from ATI or Nvidia. I haven't tried it with Nvidia, but it should work mostly the same way.
- First you need a system based on the modular X version, that means Xorg 7.0. Gentoo users read this, others don't ask me, ask your distribution.
- Now configure your X to use the fglrx-driver (ATI owners) or the nvidia-driver with Direct Rendering activated (check with glxinfo). I'm not going into detail, there are enough instructions for that out there.
- Get a cvs version of glitz and replace your local glitz installation with it (Gentooers: use the ebuild from the overlay). I didn't face any problems on my normal system by doing this, but your mileage may vary.
- Get a cvs version of mesa. David Reveman has posted some patches with the xgl release that you'll need to apply (mesa-glx-x11-get-drawable-attribs-fix-1.diff, mesa-glx-x11-render-texture-5.diff, r200-copy-pixels-1.patch). The render-texture patch needs some constants that were defined nowhere on my system. I've written an ugly workaround (mesa-glx-x11-glxproto-defines.diff); this is probably not the correct/nice way to do it, but at least it works. After applying those four patches, compile and install. Gentooers use my ebuild as before, it already contains all patches. As with glitz, I had no problems on my normal system using mesa-cvs, but don't count on it.
- Fetch the kdrive/xserver-cvs. Configure with
./configure --enable-xglserver --enable-glx --with-mesa-source=[point to your mesa-cvs-tree]
Gentooers use the xgl-ebuild (this contains an ugly hack for the mesa cvs tree, which assumes that you've built mesa before and the mesa-cvs tarball lies in your distfiles. Better solutions welcome.)
- Get glxcompmgr from xorg cvs. autogen.sh failed for me due to an error in plugins/Makefile.am; apply my patch (glxcompmgr-makefile-am-fix.diff). Gentooers use the ebuild ;-)
- Now you've got everything installed. You can start
Xgl :1 -ac -accel xv:pbuffer -accel glx:pbuffer (ATI users)
or
Xgl :1 -ac -accel xv -accel glx:pbuffer (Nvidia users)
- Now start some apps in it. My experience was that it crashes less often with Gnome stuff; konsole and xterm completely crashed xgl. I managed to run complete gnome and kde sessions. glxcompmgr doesn't really work with some windowmanagers (e.g. icewm), but kwin and metacity at least work. To start something inside the xgl, do something like:
DISPLAY=:1 metacity
- Now for glxcompmgr. This is a bit complicated, because you'll need to run glxcompmgr with the libGL from mesa/xorg, while your xserver and the xgl running on it need the libGL from the proprietary driver. My suggestion is running a terminal with LD_LIBRARY_PATH set, e.g.
LD_LIBRARY_PATH=/usr/lib/opengl/xorg-x11/lib/ DISPLAY=:1 gnome-terminal
Then, inside xgl and that terminal, you can check with glxinfo whether GLX_MESA_render_texture is listed in the GLX extensions. It is not enough if it's only listed in the server glx extensions! If that is the case, you probably didn't point it to the mesa libGL correctly.
- Now run glxcompmgr, e.g. with the wobbly and shadow plugins:
glxcompmgr wobbly shadow
glxcompmgr contains a bunch more plugins, but for most of them I failed to figure out how to start the actual effect (e.g. cube). Play around with it, have fun.
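The LD_LIBRARY_PATH trick from the steps above can be assembled into a single client command line. A small sketch; the libdir is the Gentoo path mentioned in the text and will differ on other distributions.

```shell
# Build the client command so compmgr clients pick up Mesa's libGL
# instead of the proprietary one. Gentoo path; adjust for your distro.
MESA_LIBDIR="/usr/lib/opengl/xorg-x11/lib/"
CMD="LD_LIBRARY_PATH=$MESA_LIBDIR DISPLAY=:1 glxcompmgr wobbly shadow"
echo "$CMD"
# Run glxinfo the same way first and check that GLX_MESA_render_texture
# shows up in the client GLX extensions, as described above.
```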
Problems so far:
- Crashes all the time.
- Keyboard sometimes doesn't work, not deterministic.
- Most effects (e. g. cube, expose, zoom) not running yet.
Update:
I just found a problem some people had according to a portage bug. I've put a fixed portage-ebuild into the overlay (this may lead to problems if portage-devs decide to release an update called 2.1_pre3-r2).