Do Not Try This At Home. Do Not Try Something That Looks Even Vaguely Like This At Home.
Deadlines were passed and I could at last return to the furniture restoration. Working in the sun, I was making fine progress with a new fancy flappy sanding attachment. And then my drill decided to start stopping.
Bugger. Checked the cable, it didn’t seem to be that. This job might be non-urgent but I can’t really afford to be without a drill – or, to replace it right now. Nothing to do but strip the thing. It seemed to stop working when held at certain angles so I worried that the bearings were worn, allowing the moving parts to slide around excessively and foul – that is, hit something they ain’t meant to hit. My drill is old.
Fortunately though it’s a Bosch, nicely robust and built with a view to maintenance. Eight screws and it’s open. Here be the innards; the yellow bit on the left is the trigger and speed control, the white bit holds the carbon brushes that transmit current to the moving core of the motor, the black part is the torque control, and the big grey lump on the right is the outer, non-moving coil of the motor.
And here’s the core of the motor and the gearing, placed back into the casing without the parts that should surround it so that I could check that it moved smoothly and looked right. It did. The mystery deepens; nothing appears to be wrong with this drill.
Here it is all cleaned and reassembled. As you can see in the video up top, it runs. But only sometimes… This was infuriating now. Hours of re-checking and tweaking later, I finally realised.
It was the cable all along. Though I’d looked to that first as the most obvious thing, I hadn’t checked it thoroughly enough. The fault was intermittent – a break in a wire somewhere near where it entered the handle. Which of course caused it to stop only when held at certain angles… And it shouldn’t have been a surprise; the rubber boot thinger meant to prevent too much flexing at precisely that point had worn out years ago. I cut a few centimetres off the cable, rewired it, and remade the rubber boot thinger.
It now runs perfectly. So, good to do some maintenance on an essential tool of course. But basically I lost most of an afternoon and the whole evening because I’d leapt right in instead of being slow and methodical. I’d like to say that life taught me a lesson here, but to be honest life keeps trying to teach me that one.
Picture whipped without any hint of permission whatsoever from Irish Weather Online – hope they don’t mind. Click image to visit their Facebook page, full of climatic gossip.
Well damn. They had to go and find it, didn’t they?
The cloud, the huge one that usually sits neatly over Ireland. They finally tracked it down yesterday – see picture – and must have dragged it back last night. Probably the farmers did it. Those thirsty, thirsty farmers.
So today was the first non-rock-splitting day for over a week. I got up early and thought it was just a morning mist, so often the harbinger of a solar barrage to come. But it never lifted.
Perhaps I should be glad. It was really hard to concentrate in the sun, and yesterday I was researching an article on Big Data and Human Resources. If that means nothing to you I won’t spoil your happy innocence for now, I’ll just say that it was a bit on the technical side, requiring more concentration than I could easily muster. In the end I gave up and switched to a job that actually required a trance-like meditative state. Until the sun went down I stayed in the garden with my shirt off doing a thorough job with an electric sander on that piece of furniture I’m restoring.
The sun meanwhile was doing a similar job on my skin. It feels leathery and itchy today, which somehow seems contradictory. Another reason why I should really be glad it’s overcast. But with the help of the cool and twelve hours of almost unbroken writing I did get my article finished.
Now night has long fallen. It’s quiet – except for a neighbour’s donkey letting out the occasional long, lonely bray. That must be about the most heartbreaking non-human sound in all the world. I’m sitting up late, upgrading a friend’s Mac. As you do. It seems to have worked – which is a relief as I went straight from Tiger to Snow Leopard without any intervening Leopard, something that’s not officially possible.
And I have all the windows open, in the hope of making more flappy friends. I think I’m getting exclusively the tiny, buzzy, feeds-on-blood kind of friend though. But it doesn’t matter, I’m doing it just for the atmosphere really. The insect-laden atmosphere. When I was a child I lived for several years in a caravan, and that made me intimately acquainted with the beasts of the rural dark. We basically couldn’t keep them out. So having them around again is just kind of nostalgic. It’s not proper night air unless it bites.
Douglas Engelbart, who has just passed away at the age of 88, is referred to more often than not as the inventor of the mouse. It would be an injustice though if he was remembered only as the creator of a device now already beginning to seem dated.
The point was not the device itself but its purpose: to select and activate visual representations on a screen. Not just icons and menu items, in the fashion later made famous by Steve Jobs, but also links between texts – a vision he was promoting two decades before Tim Berners-Lee made the Web a practical reality. Despite living in the age of punchcards and paper tape, Engelbart foresaw a whole new way for humans to use computers.
This though was merely part of a wider vision, of a world where human intelligence would be augmented by machines. We have not achieved noticeably greater intelligence yet, it must be admitted, but it would take effort not to see today’s instant access to information as a big step in that direction. We are living in a world that Engelbart helped create.
Terrible fuss was made when Windows 8 introduced a whole new interface designed around touch, completely lacking the familiar and comforting Start Menu. Now instead of mousing through a list to find an app you were supposed to tap or click on its big bright “tile” on the new home screen.
This is an attractive interface, and as well designed for touch as anything from Apple or Google. The little problem is, the vast – indeed, vasty vast – majority of users do not have a touchscreen. They are still using mice, touchpads, and similar pointing devices. Because while touchscreens are cute and all, most people use Windows PCs for work things like typing reports or articles, or entering numbers on spreadsheets – things you need a keyboard for. And in those situations, a touchscreen is at best a frivolity. It’s actually inefficient because, at least compared to a touchpad, it requires you to move your pointing hand further from the keys.
For all these people, having to use an interface designed for touch is a small irritation but a constant one – and we all know how infuriating constant small irritations can be. In response, some PC vendors introduced their own solutions: third-party apps that imitate the old Start Menu. Samsung took a different turn, and equipped some of their Windows 8 laptops with an extension that looks remarkably like the Dock from the Mac OS X desktop. This has led me to formulate the theory that Samsung actually like Apple’s legal team personally, and look forward to meeting them.
Eventually though Microsoft responded to the outcry and yielded with good grace, restoring the Start Button to its pride of place in the free update called Windows 8.1.
Did they buggery.
They said they restored it. But if you’ve downloaded the 8.1 preview (or more likely, watched the demo video), you’ll see that all they’ve really done is placed a button on the taskbar of the Windows Desktop – a button that opens not the Start Menu, but that same old shiny tiled home screen. It is an improvement in that you can find your applications in the place that your hand has spent the last fifteen-odd years going to and so don’t have to change direction every. bloody. time (the “proper” shortcut is at the right-hand edge of the screen), but it’s still a touch-oriented interface on a mouse-oriented device.
Similarly, it won’t let you boot straight to a desktop like all previous versions of Windows. Even if you only ever want to use applications on the desktop, you have to get there through that damn screen of tiles. Every time.
A tip: If you move the Desktop tile to the top left position it becomes the default option, and so can be selected without any mouse movement at all by hitting the “Enter” key. Similarly you can put the Windows Media Center tile here – if you have it – to make Windows 8 more usable with a remote control. Or perhaps I should say, less unusable.
Why does Microsoft not allow these as options – even turned off by default? The reason is they want to ‘encourage’ software vendors to develop for the touch interface – by taking away any other option. In its visionary ruthlessness it’s a very Apple-like move, certainly a bold one. Probably, once the new religion catches on, they will allow flexibility and convenience again. But right now it’s just another little thing that makes me want to spend less time as a Windows user and more as a Linux one.
By the way, this is the first Windows version with a point-release name since Windows NT 3.51, way back in 1995. Though it should be pointed out that Windows 8.1 is known internally as NT 6.3. As the NT series started not at 1 but at 3.1, we can deduct that from 8.1 to find that Windows 8.1 is really NT 5 – which was Windows 2000.
Among the more pointless things I’ve done recently is install a blacklight CFL in my bedside lamp. This is to encourage me to take up reading books again. Do you follow? It’s a simple idea really. I’ve grown so used in the last decade or so to reading from screens that paper seems a bit weird now. But turn on a UV lamp and what happens to a book? It glows. Like a screen!
Bleaching agents in the paper must make it fluoresce. Not all the fibres seem to have it equally though, and the page takes on an oddly speckled, grainy look. It is bright enough to read by, just about.
All right to be honest this isn’t really why I got the bulb. I bought it because I’d never seen a blacklight CFL before, it wasn’t expensive, and I thought it was too interesting not to buy. In action it seems more violet than invisible, but white things around the room glow in an eerie way. The shirts I have hanging look particularly fierce, and the pale neon emanating from my map of Europe hints at the trippy possibilities. This all gives the room an… interesting look – somewhere between clinical laboratory and tatty ghost train.
And I notice that it actually makes the photochromic lenses of my glasses go dark, so it really does seem to put out a healthy (?) amount of ultraviolet light. Perhaps if I sleep with it on I’ll get a tan this summer.
I didn’t speak before now about my last exam. The thing is, I’m really not sure how I did.
It felt good. I left the exam hall exhausted, elated, as if I’d given my all.
I just wish I could be sure that my all is the all they wanted.
I have no complaints about the paper. Couldn’t really have been better from my point of view. I was able to avoid the cost analysis question I dearly wanted not to do. It wasn’t a hard one; basically it’s just a sum. The problem was those two words – “cost analysis”. I had to stay alert through a whole exam, and just looking at them makes my eyelids droop.
The systems theory question on the other hand was all too exciting. Yes, seriously. It involved concepts that have interested me for a long time. Visualising the world not as discrete objects but in terms of interacting systems, flows of activity and information. Emergent phenomena – how all the complexity and wonder of life arises out of apparently simple chemistry, or indeed solid matter out of ephemeral probability. The danger with this was that I could easily blow the entire two and a half hours if I got hooked on a wild-eyed Idea.
So I began with the case study question, which retrod a lot of ground we’d covered in our projects. This made it easier, but had the downside that my head was preloaded with too many things I could say. And I think I said too many of them, because I spent over an hour on that one.
Thankfully, next was what’s known as a decision table. These distil a complex decision-making process into a simple table you can look up. You might – as in the example – be a college book shop trying to decide whether to keep some old titles in stock or return them to the publisher. There are a bunch of factors involved, how do you decide? Well here the table shows that if, for example, an edition is no longer current but has been requested by staff, then the correct response is to keep it. Simplicissimo.
Condition / Rules                             | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10
Edition Is Still Current                      | N | N | N | Y | Y | Y | Y | Y | Y | Y
Old Edition Requested By Academic Staff       | N | N | Y | – | – | – | – | – | – | –
Any Copies Sold In Last 3 Months              | – | – | – | N | N | Y | Y | Y | Y | Y
More Than 15% Of Stock Sold In Last 3 Months  | – | – | – | – | – | N | N | Y | Y | Y
More Than 20% Of Stock Sold By Mid-Semester   | – | – | – | – | – | – | – | Y | N | N
Sales Manager Believes Book Will Still Sell   | N | Y | – | N | Y | N | Y | – | N | Y
Action
Return Remaining Stock                        | X |   |   | X |   | X |   |   |   |
Consider Returning 75% of Remaining Stock     |   |   |   |   |   |   |   |   | X |
Keep Remaining Stock                          |   | X | X |   | X |   | X | X |   | X
Why is the table so small? Having six conditions, each with two possible values – Yes and No – you’d think it would need (2x2x2x2x2x2=) 64 columns instead of 10. The trick is that some conditions make others redundant. Look at what happens if the Sales Manager decides a book will still sell. Their word goes, making all other considerations moot. By examining the logic in this way you can reduce the table to its essentials.
The problem then is making sure you’ve done it right. Do the rules really cover all possible situations? Could two different, contradictory actions be invoked by the same set of conditions? That latter is particularly significant because tables like these form the basis of computer programs, and when a computer is stuck between two conflicting responses it explodes.
Possibly.
Examining a table for logical consistency sounds scary, but when you boil it down it’s a puzzle not unlike a Sudoku. Having practised, I’d got the knack of solving them visually. Well, simple ones… That saved time which by now I badly needed. I’d left myself barely more than half an hour for all the theory. Things were now officially intense.
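The eyeball method doesn’t scale, but the checks themselves are mechanical, so here’s a rough sketch of how you might automate them. This is my own illustration, not anything from the course: the tiny three-condition table, the class name and the actions are all made up. Each rule is a row of Y, N or – (don’t care); the code walks every possible combination of conditions and complains if none of the rules covers it, or if two rules with different actions both fire.

```java
// Toy decision-table consistency checker. The table below is invented
// for illustration; '-' means "don't care".
public class TableCheck {
    static char[][] rules = {
        {'N', '-', '-'},   // rule 1
        {'Y', 'N', '-'},   // rule 2
        {'Y', 'Y', 'N'},   // rule 3
        {'Y', 'Y', 'Y'},   // rule 4
    };
    static String[] actions = {"Return", "Keep", "Keep", "Return"};

    // Does a rule match a concrete situation? Situations are encoded as
    // bit patterns: bit c set means condition c is 'Y'.
    static boolean matches(char[] rule, int situation) {
        for (int c = 0; c < rule.length; c++) {
            char v = ((situation >> c) & 1) == 1 ? 'Y' : 'N';
            if (rule[c] != '-' && rule[c] != v) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        int conditions = rules[0].length;
        // Walk all 2^n concrete situations, checking coverage and conflicts
        for (int s = 0; s < (1 << conditions); s++) {
            String action = null;
            for (int r = 0; r < rules.length; r++) {
                if (!matches(rules[r], s)) continue;
                if (action == null) action = actions[r];
                else if (!action.equals(actions[r]))
                    System.out.println("Contradiction at situation " + s);
            }
            if (action == null) System.out.println("Gap at situation " + s);
        }
        System.out.println("Check finished");
    }
}
```

For this little table the check comes back clean; add a fifth rule that overlaps rule 4 with a different action, and the contradiction shows up immediately.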
So I don’t recall clearly what I wrote… I do know though that somehow I got stuck on aspects of systems theory that bug me. Couldn’t I write a happy answer about the many aspects that I think are cool and interesting? No, apparently I can’t do that.
Really it was one particular lecture slide I was hung up on. This had compared science to the systems approach, contrasting them as analytical versus holistic, qualitative versus quantitative, so on. In other words presenting the systems approach as a counterbalance, even an alternative, to science. That struck me as just wrong; overshooting the holistic and heading into homoeopathic country. Or “needlessly messianic”, as I described it. (Which incidentally was the second entirely pointless Hitch Hiker’s Guide to the Galaxy reference I found myself slipping into these exams.)
In particular it described science as “reductionist”, which to me is to misunderstand it completely. Sure, science takes things apart and examines the components. But it doesn’t do that to understand the components; rather the objective is to see how they all work together – as a system. As a whole.
Holism is right there in science. To claim otherwise is to traduce humanity’s most important philosophical tool for one’s own obscure – or obscurantist – motives.
OK I didn’t say that last sentence, thank God. I was having a bit of a head rush but I still knew better than to condemn the subject I was being examined in as an evil conspiracy. I’m not doing English lit any more. And I don’t think that of course. What I hope I managed to convey is that I find systems theory attractive, but at the same time worry that this very attractiveness may make it dangerous. Is it a useful way of looking at the world, or a friend to fuzzy thinking? Well, I’m not sure – but I want it to be useful.
Maybe my suspicions were refreshing, maybe I’ll be marked down for insufficient imbibing of the Kool-Aid. In short, yet again I am certain that I either (a) did a really good exam or (b) plunged off the cliff in a ball of blue flame. One or the other.
As far as we can ascertain, this is all Apple actually made in Ireland
No.
OK maybe I should expand on that a little.
Hell No.
All right, let’s break it down: Should Apple and Google pay more tax?
Yes.
Should they pay that tax in Ireland?
Should they shite.
They ought to be paying the tax in – ooh, I don’t know – the countries where they actually owe the tax? The places where they did the work and made the profit. As opposed to giving it to us for letting them pretend they do their business here. Apple and Google are not the only examples of this of course, and I’m sure that they’re far from the most egregious. They do actually do some stuff here, unlike hundreds of companies that have their brass-effect plaques in the IFSC. But they are immensely profitable and we are helping them keep more of those profits for themselves. For a cut.
There is nothing fundamentally wrong with offering a slightly lower rate of corporate tax to attract business, especially if it’s a loss you’re willing to take in order to compensate for another disadvantage – a fairly peripheral location, for example. It could, and I’m sure it once did, attract people to do real business and create real employment here that they would not otherwise have.
But when the rates are so low that they tempt corporations to just start trucking money through the country, and when we provide them with “pro-business regulation” that doesn’t check excessively carefully to make sure all that money is really being made here, then we are stealing. It’s as simple as that. Those companies should be paying taxes to the people of other countries, but we’re taking it.
And ultimately, it does us no good. Just look. This easy-money attitude helped create a soufflé economy that grew and grew and grew until it wasn’t there. Some people made billions out of it of course, but all most of us have to show is debt, negative equity, unemployment.
To this we can add international pariah status. Did you not notice Eurovision?
So now we begin again. What if we try to rebuild the economy on radical principles – like proper regulation, reasonable taxation, and actual value?
Intellectual Property and YOU (Photo credit: Thomas Gehrke)
Another exam this morning. Christ what a paper. Answer three questions out of four; was going to be out of five but they had to cancel a lecture or two so they curtailed our choice to compensate… Which meant that being weak in even one area was a big risk.
And I was weak in one. This paper was Information System Innovation, a strange mix of investment decision-making, Intellectual Property law, and Open Source idealism. At all costs I wanted to avoid a question on business metrics, the tedium of which makes my brain cry.
I got lucky. My favourite area – Open Source Software – came up in two questions. If anyone on the course had been trying to avoid Open Source on the other hand, they were pretty much stuffed and mounted. And this after they told us explicitly that there would be no overlaps.
My only real problem with the paper was that there wasn’t time to say all I wanted to say. So strange to be answering questions about the likes of Richard Stallman and Linus Torvalds, people who before this seemed more like figures out of folklore. Weirder still to think that when I graduated with my primary degree, none of the stuff on this course had happened yet.
So despite the stress I actually enjoyed the exam. This may not be a good sign, as it means I managed to go on at some length about things I have opinions on. Apple versus Samsung, Menlo Park versus Xerox PARC, IP in an age of 3D printing. Did they even want opinion? Did I show I was fully engaged with the material, or rave about stuff that was only tangentially related? Essentially, I can only have done either a brilliant or a disastrous paper.
The connection between image and text is tangential at best.
Wow. Doing an exam is like shoving fistfuls of drugs into your face.
Well, doing an exam after…
Studying frantically in a sort of cold panic for over a week
Waking up at 3 a.m. and not getting back to sleep until an hour before the alarm
Rushing out of the house only to find that the car won’t start
… feels like messing your head up with all sorts o’ bad stuff. Stress, with the stress on stress.
I still don’t know what was up with the car. Yes I had checked it the night before and no, I didn’t leave the electrics on. It was the good new battery that saved me in fact, because as the last desperate throw of the dice I just turned the engine over and kept turning it over until finally, one cylinder at a time, life returned. Perhaps I’d flooded it on the first try.
So now trying to get to my exam through rush hour traffic on very little sleep but oh so much adrenalin. Made it as far as the campus with minutes to spare, knew it would take too long to find a student parking space so threw handfuls of change at a ticket machine. Ran up three flights, downed three cups of water, made it.
This was Java, at once somehow my most feared and enjoyed subject. The course had been challenging – literally half the class had transferred out – but I felt like I was beginning to grasp its rhythms and its symmetries. Some programmers dislike the language; I have little to compare it to but I see a beauty in it.
Java is perhaps the best known example of an “Object-Oriented” language. If I dare try to explain that in simple terms, it means that instead of being long impenetrable lists of instructions, OO programs are made up of small units that attempt to model real things. A program with cars in it, say, would contain a subunit (called a “class” in Java) to represent cars. It would have its associated variables – colour perhaps, size, top speed – and “methods”, which represent what a car does: accelerate, brake, etc. They can be as elaborate or as simple as you need, but cars will exist in your program as discrete entities that can interact with other entities like passengers or junctions or other cars.
You can define subclasses that have things in common with some cars but not others, like 4x4s. Or superclasses – for example, one of vehicles – that comprise cars and other objects. In this way you clarify the relationships between things; you also avoid having to write the same code over and over, as subclasses inherit features from their superclasses. “Accelerate” for example need only ever be defined once to be used by every sort of vehicle. All these knit together in careful, logical ways to represent and simulate how things in the real world can interrelate. It’s elegant and subtle.
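In code, that car example might look something like this. A toy sketch of my own, to be clear – the names and numbers are invented, not anything from the course – but it shows accelerate() being defined exactly once in the superclass and inherited by everything below it.

```java
// Superclass: state and behaviour shared by every kind of vehicle
class Vehicle {
    double speed = 0;
    double topSpeed;

    Vehicle(double topSpeed) { this.topSpeed = topSpeed; }

    // Defined once here, inherited by Car, FourByFour and any other subclass
    void accelerate(double amount) {
        speed = Math.min(speed + amount, topSpeed);
    }

    void brake(double amount) {
        speed = Math.max(speed - amount, 0);
    }
}

// Subclass: everything a Vehicle has, plus car-specific variables
class Car extends Vehicle {
    String colour;

    Car(String colour, double topSpeed) {
        super(topSpeed);
        this.colour = colour;
    }
}

// And subclasses of subclasses, for things some cars have but not others
class FourByFour extends Car {
    FourByFour(String colour) { super(colour, 140); }
}

public class Demo {
    public static void main(String[] args) {
        Car c = new Car("red", 180);
        c.accelerate(50);
        c.accelerate(200);              // capped at the car's top speed
        System.out.println(c.speed);    // prints 180.0
    }
}
```

The point of the hierarchy is exactly the one above: Car never has to say how accelerating works, it just gets it from Vehicle.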
And elusive at times. So I worried that my understanding of the concepts was still quite tenuous and that an unexpected question might blow a hole right through it. But I think the exam went well. One good thing – I started at full speed, and stayed at full speed for three hours. All right, some of the answers may have been a little “Ooh, here’s another thing I remember!”, but I think I displayed a thorough understanding.
Unless of course I don’t understand, in which case I will have displayed a thorough misapprehension. To find out, we must now wait till autumn.
This is all over by 12:30, but the rest of the day is not without incident. Get some things I needed done done, fetch and carry, all in a strange trance of excess energy. I make it home eventually. The idea is to have an early night but I am as wired as I’m tired. It’s one in the morning before I finally – joyfully – go to my bedroom and reach to turn on the light.
And step in something wet.
That is never good. That is never never never good. It’s not much good in a bathroom or a kitchen. But in a bedroom, stepping in something wet is right out.
There is a puddle forming on the floor. The computer I’m building is sitting there powered up to standby, so it’s just as well I “went to bed” when I did. There is a drip from the ceiling. Deftly turning off all electrics and water with a single move, I fetch a ladder and squirm into the attic.
It’s coming from the complex pipework linking the three tanks of water in the attic space (I do not know why there are three tanks of water in the attic space). It is dropping directly onto a box of my personal memorabilia, and from there through the floor. After cutting away some of the nice new insulation I find a weeping joint. I fetch tools and tighten the fitting, squirm out and turn water back on.
Leak much much worse bugger.
Opening offending joint, I find that yet again a pipe has eroded. Don’t know what’s doing this, but it’s maybe the fourth instance of spontaneous dissolving pipe in the last couple of years. What the hell are we drinking? Spend the next hours crawling around in the dusty, glass-fibery, spidery dark doing work almost utterly unlike the pure cerebration of the morning, so tired now that – mercifully – I can’t even feel how tired I am.
Where did Linux come from? Strange as it may seem, its roots extend back to ’60s Counterculture. Not a lot of computers in the Haight-Ashbury of course. In those times the only place a young person was likely to access hardware was at university, and it was on campuses that experimentation with drugs and social non-conformity met the sort of person who doesn’t have a lot of friends but is really good at mathematics. Strange things developed out of this cross-fertilisation. Like, to a large extent, the Internet. It was the beginning of Hacker culture.
In this period one of the leading computer operating systems, on campuses and increasingly in industry, was Unix. It was interestingly designed and well-suited to the networked style of computing that was beginning to emerge, seeming almost a little anarchistic in itself. But it was still an expensive, licensed corporate product. A few brave (and possibly slightly high) young souls decided that, hey, they were programmers. They could make something just as good themselves.
The name of this project was GNU. (Standing for GNU’s Not Unix. What else?) Like Unix, GNU was designed not as a single giant program but as a whole bunch of little ones, each with its task to perform. A lot of progress was made, but the project long lacked its most vital component: the one central program that organises all the others, known as the kernel. Until, that is, 21-year-old Finnish student Linus Torvalds created one for his own amusement. When the two projects were put together, a complete operating system was born. Purists to this day refer to it as “GNU/Linux”, but plain Linux does for the rest of us.
Don’t make the mistake though of thinking of Linux as an inferior imitation of Unix made by hippies. It’s true it was modelled closely on Unix. In one sense it is Unix; its commands and structures are much the same and a person who knows one can use the other. The difference is that Linux is devoid of any patented or proprietary technology, and so can be copied, changed and distributed freely. This openness has allowed countless people to improve the code – everyone from oddball geniuses just showing off to giant corporations motivated by the bottom line. The only rule is that if you distribute modified versions you must give your changes back to the community. This Open Source philosophy has allowed Linux to mushroom in capabilities and refinement, leaving the Unix it once emulated far behind.
This freedom has also led to the huge number of “distros”, as they are called. Linux comes in several major versions, and almost countless minor. Though let’s be careful to be clear about this – they are not different operating systems in the sense that Windows and Mac OS are different. They’re all broadly compatible, the differences reflecting variations more in philosophy than technology.
Nonetheless the sheer breadth of choice may be off-putting at first. Don’t let it worry you, the day is not long off when you’ll believe passionately that one of them is far better than all the others. But that doesn’t matter now (and to be honest, it won’t matter a hell of a lot then either) – what we care about is where to start.