Sunday, August 21. 2011
The sad state of the Linux Desktop
Some days ago it was reported that Microsoft declared that it no longer considers Linux on the desktop a threat to its business. Now I usually wouldn't care that much about what Microsoft says, but in this case I think they're very right – and therefore I wonder why this hasn't raised any discussion in the free software community (at least I haven't seen one – if it has and I missed it, please provide links in the comments). So I'd like to make a start.
A few years ago, I remember being pretty optimistic about a Linux-based desktop (and I think many shared my views). With advantages like a large number of high-quality applications available for free and a proven record of being much more resilient against security threats, it seemed like just a matter of time. I had the impression that development was often going in the right direction; to name one example, freedesktop.org was just starting to unify the different Linux desktop environments and to create standards so that KDE applications work better under GNOME and vice versa.
Today, my impression is that everything is in a pretty sad state. Don't get me wrong: free software plays an important role on the desktop – and that's really good. Major web browsers are based on free software, and applications like VLC are very successful. But the basis – the operating system – is usually a non-free one.
I was recently looking at netbooks. Some years ago, Asus came out with the Eee PC, a small and cheap laptop that ran Linux by default – one year later they provided a version with Windows as an alternative. Today, you won't find a single netbook with Linux as the default OS. And more often than not in recent years I've read that public authorities trying to switch to Linux have failed.
I think I've made my point: the Linux desktop is in a sad state. I'd like to discuss why this is the case and how we (the free software community) can change it. I won't claim to have the definitive answer; I think it's a mix of things. I'd like to start with some points:
- Some people seem to see desktop environments more as a playground for creative ideas than as something other people want to use in a stable way on a daily basis. This is pretty much true for KDE 4 – the KDE team abandoned a well-working desktop environment, KDE 3.5, for something that isn't stable even today and suffers from a lot of regressions. They constantly invent new things like Akonadi and make them mandatory even for people who don't care about them – I seriously have no idea what it does, except throw strange error messages at me. I switched to GNOME, but what I hear about GNOME 3 doesn't make me feel it's much better there (I haven't tested it yet, and I hope that, unlike the KDE team, GNOME learns from this and supports 2.x until version 3 works equally well). I think Ubuntu's experiments with the Unity desktop go in the same direction: we found something cool, we'll use it, and we don't care that we'll piss off a bunch of our users. In contrast, I have the impression that what I named above – the idea that we can integrate different desktop environments better through standards – isn't seen as being as important as it used to be. (I know this part may provoke flames; I hope that won't overshadow the other points I made.)
- The driver problem. I still find it to be one of the biggest obstacles, and it hasn't changed a bit in years. You just can't buy a piece of hardware and use it. It usually is “somehow possible”, but the default is that it requires a lot of extra geeky work that the average user will never manage. I think there's no easy solution to that, as it would require cooperation from hardware vendors (and with the diminishing importance of the Linux desktop this is likely getting harder). But a lot of the problems are also self-made. In 2006, Eric Raymond wrote an essay about how crappy CUPS is – I don't think it has improved since then. How often have I read Ubuntu bug reports that go like this: “My printer worked in version [last version], but it doesn't work in [current version]” - “Me too.” - “Me too.” - “Me too” - and no reply from any developer. One point this shares with the one above is the lack of care about regressions, which I think should be a top priority, but obviously many in the free software community don't seem to think so. (If you don't know the word: something is called a regression if it worked in an older version of a piece of software but no longer works in the current version.)
- The market around us has changed. Back then, we were faced with a “Windows or nothing” situation we wanted to change to a “Windows or Linux” situation. Today, we're faced with “Windows or MacOS X”. Sure, MacOS existed back then, but it only got a relevant market share in recent years (and many current or former free software developers use MacOS X now). Competition makes products better, so Windows today is not Windows back then. Our competitors just got better.
- The desktop is losing share. This point is often made, with mobile phones, tablets, gaming consoles and other devices taking over tasks that were done on desktop computers in the past. This is certainly true to some degree, but I think it's also often overestimated. Desktop computers still play an important role, and I'm sure they will continue to do so for a long time. The discussion of how free software performs on other devices (and how free Android really is) is an interesting one too, but I won't go into it for now, as I want to talk about the desktop here.
Okay, I've started the discussion; I'd like others to join. Please remember: it's not my goal to flame or to blame anyone – my goal is to discuss how we can make the Linux desktop successful again.
Friday, August 12. 2011
OpenLeaks doing strange things with SSL
OpenLeaks is a planned platform similar to WikiLeaks, founded by ex-WikiLeaks member Daniel Domscheit-Berg. It was announced a while back, and a beta is currently being presented in cooperation with the newspaper taz during the Chaos Communication Camp (where I am right now).
I had a short look and found some things noteworthy:
The page is SSL-only; any connection attempt over http is redirected to https. When I opened the page in Firefox, I got a message that the certificate is not valid. That's obviously bad, although most people probably won't see this message.
What is wrong here is that an intermediate certificate is missing – we have a so-called transvalid certificate (the term "transvalid" has been used for this by the EFF SSL Observatory project). Firefox includes the root certificate from Go Daddy, but the site's certificate is signed by another certificate, which in turn is signed by that root certificate. To make this work, the server has to send the so-called intermediate certificate along when an SSL connection is opened.
The reason most people won't see this warning – and why it probably went unnoticed – is that browsers remember intermediate certificates. Anyone who has ever visited a webpage that uses the Go Daddy intermediate certificate won't see the warning. I saw it because I usually don't use Firefox, so mine had a rather fresh configuration.
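To illustrate the problem, here is a minimal sketch of how a client that only knows the root CAs – and has no cached intermediates – reacts to such a server. The hostname is a placeholder, and the exact error text depends on the TLS library; the point is simply that without the intermediate certificate the chain cannot be built and verification fails.

```python
import socket
import ssl

# Minimal sketch: verify a server's certificate chain against the system
# root store only. The hostname below is a placeholder. If the server
# fails to send the intermediate certificate, chain building fails and
# verification raises an error (typically "unable to get local issuer
# certificate").
host = "example.org"

ctx = ssl.create_default_context()  # loads root CAs, knows no intermediates

try:
    with socket.create_connection((host, 443), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            print("certificate chain verified, protocol:", tls.version())
except ssl.SSLCertVerificationError as err:
    print("verification failed:", err.verify_message)
```

A browser with a cached intermediate behaves like a client that already has the missing certificate on disk, which is exactly why the misconfiguration is so easy to overlook.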
There was another thing that bothered me: at the top of the page there's a line "Before submitting anything verify that the fingerprints of the SSL certificate match!" followed by a SHA-1 certificate fingerprint. Besides the fact that it's English text on a German page, this is a rather pointless suggestion. Checking a certificate fingerprint against one you received over a connection secured by that very certificate proves nothing – an attacker who can forge the certificate can just as well forge the fingerprint shown on the page. For a fingerprint check to make sense, the fingerprint has to come through a different channel. Beyond that, nowhere is it explained how a user should perform such a check or what a fingerprint is at all. I doubt this is of any help to the target audience of a whistleblower platform – it will probably only confuse people.
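For comparison, a meaningful fingerprint check could look roughly like the following sketch: fetch the certificate the server presents, compute its SHA-1 fingerprint locally, and compare the result against a fingerprint obtained over a separate channel (on paper, over the phone, from a different network). The hostname is again a placeholder.

```python
import hashlib
import ssl

# Minimal sketch of a fingerprint check. The hostname is a placeholder.
# The fingerprint printed here must be compared against one obtained over
# a *different* channel, not against a value shown on the same HTTPS page.
host = "example.org"

pem = ssl.get_server_certificate((host, 443))   # certificate as presented by the server
der = ssl.PEM_cert_to_DER_cert(pem)             # fingerprints are computed over the DER encoding
sha1 = hashlib.sha1(der).hexdigest().upper()

# print in the usual colon-separated notation, e.g. "AB:CD:..."
print(":".join(sha1[i:i + 2] for i in range(0, len(sha1), 2)))
```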
Both issues give me the impression that the people who designed OpenLeaks don't really know how SSL works - and that's not a good sign.
Posted by Hanno Böck in Computer culture, Cryptography, English, Security at 17:26