Categories
Technology

Linux Crisis?

It even runs on some washing machines

Linux – it’s like a shark or an iceberg. Most of it’s below the surface but it’s moving fast and you really ought to know about it.

The majority of people, if they’re aware of it at all, probably think of Linux as the non-commercial alternative to Windows or Mac favoured by people who use computers less to get things done than because they actually enjoy it. Which is a shame in a way, as it gives a wholly wrong impression of its significance. Linux is so much more than an operating system for nerds. Indeed you probably use it yourself, every day. Each time you visit a website the chances are good that you’re talking to a computer running Linux. Smart devices in your home like satellite boxes and DVRs, even TVs now, use Linux. If you have an Android phone, that’s based on Linux. Governments are adopting it, and it is far and away the favourite operating system for the world’s most powerful supercomputers.

So it should be surprising that on the workplace desktop – still the biggest, most visible, and most lucrative sector of the computer market – it runs a very distant third. How come?

Linux conquers the supercomputer (Author/Wikimedia/Top500.org)

Counterintuitively, because it’s free. After all, the PC is not ruled by Windows and Mac OS because they’re cheap. Rather, it’s because people can make money out of them. Huge ecosystems of supporting industries have grown around these computing platforms – software, hardware, services, training, maintenance, publishing – in no small part because there was a key partner there offering support and leadership.

How do you make money out of a system no one owns? With whom do you form a partnership? How can you be sure, when no one’s in charge, that it’s going to develop in a direction that will suit your business? It’s tricky.

Unless of course you take a leadership role yourself. It seems the best way to make money from Open Source Software like Linux is often to step up and be that key player. This is what Google did with Android, making it the world’s most popular phone OS. Or Red Hat, whose version of Linux is one of the most successful systems in the server sector. Each brought their own business model; Google of course is ultimately selling advertising, Red Hat its expertise and support.

If anyone is going to turn Linux into a household – and an office – name it is surely Canonical. This is the power behind Ubuntu, the most popular and user-friendly desktop version of Linux so far. And their vision doesn’t stop at the desktop – nothing worthy of the name could these days. In recent weeks they’ve launched Ubuntu editions for tablets, phones and TVs. It seems they plan to have devices running their software in every major market sector.

Ubuntu TV – It’s a lot like Windows Media Center except for the giving money to Microsoft part.

It’s an extraordinary ambition, and if they can pull it off then Canonical/Ubuntu will be up there with the big girls, sitting proudly alongside Google, Apple, and Microsoft. But can such a fabulous commercial edifice really be built on open foundations? So much of Linux is being developed by people who work for competing organisations – or who aren’t being paid to do it by anybody.

Indeed community disenchantment may already be starting to show. Ubuntu is no longer flavour of the month. Once hugely popular with the sort of Linux user who doesn’t actually want to reinvent the wheel but just needs something that can be installed and maintained with the minimum of fuss, Ubuntu – and its slightly geekier sisters Kubuntu and Xubuntu – drove all before it. No longer; the flavour now is Mint.

This is a different Linux variant (or ‘distro’, to use the jargon). Indeed it’s very much a variant of Ubuntu, just with Canonical’s more commercial ideas stripped out. (This ‘forking’ is perfectly legal in the OSS world – in fact it’s the whole idea. Ubuntu itself is based on the well-respected Debian distro.) Mint’s popularity though was given a huge boost when Canonical introduced their ironically-named Unity interface.

Some of the resentment of this was silly. Unlike Windows or Mac where the graphical user interface is part and parcel of the system, much of the beauty of Linux is that you can choose – even create – your own. For some however it’s a cause. There have long been two main Linux desktop camps: Gnome and KDE. Ubuntu had been on the Gnome side, so its defection – to a third camp of its own invention yet – was seen by many as betrayal.

Ubuntu’s Unity desktop, with its great big finger-friendly icons (Photo credit: Wikipedia)

More seriously though, Unity is very clearly a touch-orientated interface. As the name suggests, it’s meant to be similar on all types of device. Rather like Windows 8, this makes it less efficient – or, to put it another way, more annoying – for users stuck with an old-fashioned mouse. And as Linux tablets barely even exist yet, that means pretty much all of them. For the first time, Canonical were allowing their commercial vision to degrade the user experience.

But that was as nothing compared to the next change. A feature of Unity is that you can find a file or application by typing its name in a search box. If what you type isn’t on the computer, the newest version of Ubuntu continues the search on the Web – specifically, on Amazon.com.

“Crochet Patterns not found. Do you want to purchase Crotchless Pants?”

This unasked-for advertising feels a bit like an invasion of privacy. The building of it right into the operating system feels a lot like a kick in the teeth to the non-commercial ethos that engendered Open Source.

Making money in itself is not the problem. Google, Red Hat, even that old devil IBM make a lot of money out of Linux. Where I think Canonical sail close to the wind is in identifying themselves with Linux more closely than any company before. Independent computing creatives will resent it deeply if they come to be perceived as dupes – or worse, minions – of a commercial giant.

There are two questions here really. The first is whether Canonical/Ubuntu can maintain the goodwill of the wider Open Source Software community. The second is whether they can realise their vision without it. Perhaps they can, but I think it would be a minor tragedy if they did.

One thing over which there’s no question though: Open Source Software can continue without Ubuntu.

____________________________________________________________________________

Next time I’ll talk about why you should try Linux for yourself: Because it’s fascinating, informative, educational – and could save you heaps of money.

Categories
Politics Technology

Tax The Rich – Bill Gates

Little known fact: Microsoft were raised by a pack of wild Commodore home computers

With deficits the way they are, the rich are going to have to pay more. Unfortunately, almost everyone’s going to have to pay more, and it should fall more heavily on the rich… Just raising taxes on the rich won’t solve the crisis, but it seems reasonable to people – and there’s plenty of room to do that without creating disincentives or distortions.
– Bill Gates

I always did like Bill Gates.

No I mean it. In fact I liked Microsoft – at least, more than most people I know. Now OK, a lot of that was just my perverse nature. You were meant to hate Microsoft with the burning passion of a thousand suns, so I had to see the other side.

But there is another side. Yes it’s true that Microsoft took advantage of ideas pioneered by Apple (and others, including IBM). It’s true that they leveraged their strategic market position to gain ever greater dominance. But I’m convinced that the world would be a poorer place without Microsoft and its vision of getting a personal computer into every office and home. Others thought big, but not that big.

Sure, I would have preferred if they’d never become a virtual monopoly. Monopolies are always unhealthy and unfair. But the need to easily transfer data between organisations, alongside huge economies of scale in manufacture, maintenance, and training, meant that office computing was a monopoly waiting to happen. We are fortunate I think that it was not won by a business like IBM or Apple, who would have wanted to make both hardware and software. That would have been a far more total and stultifying monopoly.

Microsoft’s approach was to make only the key software, and encourage an ecosystem of hardware makers, application developers and services around that. It was an innovative business model that Apple and others learned a lot from. And though the ‘Wintel treadmill’ of ever-more-capable hardware inspiring ever-more-demanding software seemed endless, it meant that powerful computers quickly became cheap and commonplace, laying the path that brought the Internet into our lives.

No one should ever have as much power in business as Bill Gates did, but somebody was going to. I’m glad at least he is that rarest of capitalists, one not afraid to admit he has too much money.

Categories
Technology

Ink With Links

Adverts with links let you buy on the spot.

When I posted yesterday about QR codes, those little symbols used to put Web links on real-world objects, reader Azijn made this thought-provoking comment:

I find QR codes a bit weird. Why not have an app that can simply recognize a certain default font in which advertisers will agree to publish their URLs? Humans and phones alike can recognize that!

Indeed, I can find no such app. How come? Azijn’s idea would surely work.

But then you have to remember that most design actually happens by accident. QR codes are prevalent for this purpose mainly because they’ve been around long enough to catch on. They were invented by Denso Wave, a Toyota subsidiary, for labelling components, and it was in Japan that they were first used on phones. But that doesn’t mean of course that they’re the best solution.

QR codes did have a couple of advantages. They were designed expressly to be read by machine and have built-in error correction, so they were easier for simple devices to process. But now that phones are very powerful computers they should have little trouble handling text recognition – I doubt if there’s even any need for special fonts¹.

I can think of one way to speed things up though: A typographical convention to indicate where a website address begins and ends, such as putting it between two easily recognised symbols, so that the phone doesn’t need to scan whole pages. Example:

►http://i.doubt.it◄

Any such text will be highlighted on your phone’s screen, showing you that it’s clickable.
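
For what it’s worth, here’s a very rough Python sketch of how little the phone’s side of this might involve once an OCR step has already turned the page into plain text. The function name, the pattern and the sample advert are all invented for illustration:

import re

# Illustrative only: look for a web address printed between the marker
# symbols, so the software need only examine that short, clearly delimited
# span instead of scanning whole pages for anything URL-shaped.
LINK_PATTERN = re.compile(r"►\s*(https?://\S+?)\s*◄")

def find_links(ocr_text):
    """Return every marked-up address found in the recognised text."""
    return LINK_PATTERN.findall(ocr_text)

print(find_links("Crochet patterns half price! ►http://i.doubt.it◄ while stocks last"))
# prints: ['http://i.doubt.it']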

Can I get a patent on that?

  1. There have been fonts designed to be easily read by machine since at least the 1960s, for example the hardcore OCR-A, the friendlier OCR-B, or the space-age classic Westminster – which I had always thought belonged to NASA or IBM or some such but turns out to have been created by a British bank. These days Optical Character Recognition software is so good that such fonts are no longer really necessary, though obviously plainer, less ornate designs are likely to get better results.

Categories
Cosmography Technology

This Is Your Brain On Screen

IBM has a really interesting – and just slightly scary – plan. In cooperation with Switzerland’s École Polytechnique Fédérale de Lausanne, they want to simulate the human brain.

They’re building a computer model. This is not the same thing as Artificial Intelligence (AI), programming a machine to act human. That would be a ‘top down’ approach; trying to understand how the mind works by looking at what it does. Instead this is ‘bottom up’, simulating the nuts and bolts of the brain, its biological wiring, its cells, even its molecules.

Which is quite an undertaking – in fact it is hard to exaggerate how big the task is. The brain is often described as the most complex thing in the known universe. Complexity is a thing that’s difficult to define but easy to perceive. Looking into the back of a TV, you’re instantly aware that it’s more complex than, say, a food mixer. Basically, it looks trickier to fix. The parts are small, numerous, and connected together in many different ways. Perhaps that’s the most intuitive shorthand measure of complexity – the number of different ways that the parts of something interconnect. The human brain has far more connected parts than any other thing known, certainly more than any computer. Even Japan’s Earth Simulator, built to model the climate of the entire planet, is nothing compared to the brain of an average person.
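
To put rough numbers on that interconnection idea – the part counts below are invented orders of magnitude, and the neuron figure is just the commonly cited estimate rather than anything from IBM – here’s a tiny sketch showing how the number of possible pairwise connections runs away from the number of parts:

# Purely illustrative arithmetic: with n parts there are n*(n-1)/2 possible
# pairwise connections, so complexity by this crude measure grows far faster
# than the part count itself.
def possible_connections(n):
    return n * (n - 1) // 2

for name, parts in [("food mixer", 10),
                    ("television", 1_000),
                    ("human brain, in neurons", 86_000_000_000)]:
    print(f"{name}: {parts:,} parts -> {possible_connections(parts):,} possible connections")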

It’s no surprise therefore that they aren’t trying to do the whole thing at once, or anything approaching that. They are starting with the best bit though: the neocortex, the outer layer of the cerebrum and the part of the brain most recent in evolutionary terms. It’s not unique to us, but it is far more developed in humans than in any other animal and appears to be responsible for what we experience as thought.

Even alone though, this is still far too complex for current technology to tackle. All they’re hoping to simulate right now is what’s known as a neocortical column. This can be described as a single ‘circuit’ of the brain, one of its processing units. The whole neocortex contains about a million of these. And for the moment at least, they only plan to model it on the level of its cells; to get down to the molecules that make up the cells will take vastly more computational power again. Yet even this is an immensely ambitious target. To model just one circuit of the brain in this (relatively) simple way will require four whole racks of Blue Gene – the technology IBM used to take the title of world’s fastest supercomputer back from the Earth Simulator.

So how far are we then from modelling the whole brain? Well assuming this first stage succeeds – it won’t be easy – all they really need to do is scale it up. Vastly. These four Blue Gene racks would fit in a normal kitchen. Four million? They would take up a golf course, and require the energy of five medium-sized power stations.
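
Put as bare arithmetic – using only the ratios quoted above, four racks to one column and roughly a million columns in the neocortex – the scale-up is simply:

# Back-of-envelope only; both figures are the estimates quoted in the text.
racks_per_column = 4                 # to model one column at the cellular level
columns_in_neocortex = 1_000_000

racks_needed = racks_per_column * columns_in_neocortex
print(f"{racks_needed:,} Blue Gene racks")   # 4,000,000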

When you consider that your actual brain fits inside your head and runs reasonably well on sandwiches and cups of tea, you realise what a gap there is between nature’s technology and our own.

What’s the point then in going to all this trouble when a brain can be made much more cheaply using just two humans? If the object were to create machines that think, this would clearly be a madly inefficient way to go about it. But that’s not the object. The fact is we know amazingly little about how our own brains work. Simulating a part of one, even a solitary neocortical circuit, will teach us so much about what is really going on in there. Modelling allows you to find out why something is the way it is, because it can show you what would happen if it were different. The beneficial applications of that are obvious; as we see how it works, we gain greater insight into why it fails – what causes schizophrenia, Alzheimer’s, autism, the things that plague our minds.

But though it’s always good when research has palpable benefits, I think we need no such excuse when it comes to researching the structure and function of the brain. To know one’s own mind – that is surely a philosophical imperative.

(For more fun with human brains, see the comic strip)
