I don't normally think class-action lawsuits move the needle much, but in this case they seem justified: these companies are effectively dumping toxic waste onto the Internet. And make no mistake, these IoT devices have quite a long half-life. A majority of them will probably remain in operation (i.e., connected to the Internet and insecure) for many years to come, unless and until their owners take them offline or manufacturers issue product recalls.
One of the many appalling things about these devices is that many simply cannot be secured at all. It's all smoke and mirrors: the web interface might let you change the default password, but the change might not actually be saved. Or there are other default passwords (for other routes into the system) that cannot be changed.
Some security experts are now coming round to the idea that governments might need to step in and mandate some fixes. The EU appears to be starting down this route now.
The future of work has been in the news off and on for a while, especially as AI and robotics start to make more of an impact. Lorry drivers are one profession said to be threatened by self-driving vehicles, but many white-collar employees are also at serious risk. For instance, computers can do very efficient legal discovery (and a lot more cheaply). We might start seeing higher unemployment, more underemployment, lower wages and the need to work much longer. Tornadoes weaving a path of destruction through the workplace?
A couple of years ago, The Economist magazine talked of:
Just as robots became ever better at various manual tasks over the past century—and were therefore able to replace human labour in a growing array of jobs, beginning with the most routine—computer control systems are able to handle ever more of the work done by human administrative workers. Jobs from truck driver to legal aid to medical diagnostician to customer service technician will soon be threatened by machines. Starting with the most routine tasks.
The article above starts off by quoting David Graeber, who labelled many jobs "bullshit jobs" (in Strike Magazine): massive swathes of mostly administrative jobs in areas like health administration, human resources and public relations. A very different type of economy from the "classic" one of people designing and making things. I might argue, however, that this has really always been the case, at least since the Industrial Revolution.
David Graeber (author of Debt: The First 5000 Years and LSE professor):
While corporations may engage in ruthless downsizing, the layoffs and speed-ups invariably fall on that class of people who are actually making, moving, fixing and maintaining things; through some strange alchemy no one can quite explain, the number of salaried paper-pushers ultimately seems to expand, and more and more employees find themselves, not unlike Soviet workers actually, working 40 or even 50 hour weeks on paper, but effectively working 15 hours just as Keynes predicted, since the rest of their time is spent organising or attending motivational seminars, updating their facebook profiles or downloading TV box-sets.
Maybe the French have a word for this:
I was at the V&A on Sunday looking at some ceramics and glassware. The sixth floor was almost empty, apart from me and, later, three or four other people. But there are still guards around, all day, every day. There are lots of interesting things to look at and read, but the guards' job seems very, very dull.
Some of the glass ornaments and sculptures were extremely striking:
Above: Deep Blue and Bronze Persian Set by Dale Chihuly, 1999.
From the museum label:
Originally reminiscent of the tiny core-formed bottles of ancient Egypt and Persia, the 'Persians' series was begun in 1985. Since then Chihuly has developed the series into a range of different shapes, the outer ones often of enormous size.
My laptop of choice has always been a Thinkpad, firstly as made by IBM and latterly by Lenovo. I own an X220 (and an older X60s, still a wonderful little machine), and even though it's a few years old now it's still a great laptop.
One of the big reasons I'd still buy a Thinkpad is their build quality. Also, if you need to do any maintenance on the system (e.g. upgrade RAM, swap the mSATA SSD), the documentation is very good (much better than Dell's for instance).
People often enthuse about the build quality of Apple laptops, but I'm not willing to spend money with Apple. And even if I were, it doesn't seem such a good idea to replace Mac OS X with Linux. Linux generally runs very well on the Thinkpad.
It's currently running Debian "Jessie" (Testing) with the i3 tiling window manager. It's very refreshing not having all the desktop clutter around. Not really any desktop at all, in fact.
Another recent Adobe Flash security update has me at the Adobe site again, trying to remember the update process for Linux. I use the 64 bit version and have to un-tar the archive and copy the .so to the right folder.
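The process itself is only a couple of commands; a sketch, assuming the 64-bit tarball from Adobe (the filename varies by release) and Firefox's per-user plugin directory:

```shell
# Download the 64-bit .tar.gz from Adobe's site first, then:
tar -xzf install_flash_player_11_linux.x86_64.tar.gz libflashplayer.so
# Copy the plugin where Firefox will find it (per-user plugin directory)
mkdir -p ~/.mozilla/plugins
cp libflashplayer.so ~/.mozilla/plugins/
```

Restart the browser afterwards and check about:plugins for the new version.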
Adobe stopped shipping new versions of Flash for Linux a while back, but promised to keep the Linux versions updated for security (Thanks Adobe). But it's still odd seeing the different versions available for the other platforms.
I have:
Windows, Mac and Google Chrome have:
Note that the version Adobe says is the latest for Linux is wrong: 378, compared to the 394 I just downloaded and installed. Who knows? Security is hard, as is updating web pages!
Not a fan of Flash and I look forward to it disappearing. But I'm even less a fan of computer security problems, and especially the people that inflict them on us. So keep your software up to date!
It's been a very long time since I've played around configuring X Windows on Linux but I've recently had the "pleasure" again. X mostly just works now, and there's no more need to fight monitor modelines or an arcane xorg.conf file (the file usually doesn't even exist anymore). This is a very good thing because X setup was sometimes a nightmare.
For the past few years I've also done well to steer clear of proprietary graphics drivers, for hardware like ATI/AMD or NVIDIA. I've chosen Intel graphics hardware because Intel writes good open-source drivers. If I do happen to be using NVIDIA or ATI/AMD hardware, I try to use the open-source nouveau or radeon drivers respectively.
X was the only big component of the Linux desktop stack that I never compiled from source back in the mid-1990s (when X was "XFree86"). Too scary, and perhaps my 486 CPU, 128MB of RAM and 33.6k modem weren't quite up to it.
When I bought a cheap(ish) laptop to use as my "desktop" (TV/HDMI connected) a year or so ago, I hit a small snag in that it's based around an AMD chipset for graphics (and HDMI audio) and an Atheros for ethernet. I had to download, compile and load an out-of-tree Linux kernel module to get the ethernet working. Very 1990's.
An out-of-tree module means you have to remember to rebuild it if you ever upgrade your kernel. Luckily, the ethernet module (alx) is now "in-tree" from 3.11 and I'm using Debian Wheezy backports.
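For the record, building and loading an out-of-tree module like alx went roughly like this (the source directory and paths are illustrative):

```shell
# Build the module against the headers for the currently running kernel
cd alx-src
make -C /lib/modules/$(uname -r)/build M=$PWD modules
# Install it where modprobe can find it, refresh dependencies, and load
sudo mkdir -p /lib/modules/$(uname -r)/updates
sudo cp alx.ko /lib/modules/$(uname -r)/updates/
sudo depmod -a
sudo modprobe alx
```

And, as noted, every kernel upgrade means doing this again (or setting up something like DKMS to automate it).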
Generally, all's been well. I don't play computer games so have no need for fast 3D, just decent 2D performance. I'm not sure what happened but a while ago I noticed that full-screen desktop video had got very choppy (tearing). The desktop felt "stickier" than usual. So, I decided to (perhaps foolishly) try the proprietary AMD Catalyst driver and see if things are better.
Past experience with graphics driver updates on Linux has been varied, to say the least. I recall painful times and black screens, but this was a long time ago and I have much more experience and confidence now. This stuff is generally still a bit of a black art if things don't work out though!
Got the Catalyst 13.12 release driver installed and working, after a bit of messing around. A bit more playing with xrandr on the laptop to sort out display outputs to laptop, TV and/or both. Made sure I was using kernel 3.11 from backports as well and that the AMD driver supported this. Result? Video playback seemed good now and all appeared well (plus it didn't take long). Success! Or so I thought ...
A Key Problem
A problem quickly manifested itself during the week: key press delays in X applications.
Pressing a key in the web browser search box (for instance) would occasionally exhibit half-second delays, sometimes one or two seconds. Consoles were fine: at least konsole and rxvt. The laptop keyboard directly or a USB keyboard showed the same issue ... very annoying.
So, on the merry-go-round again ...
To investigate, I set aside another Saturday for a look at X. This time I updated the kernel to a new backports version, 3.12, and downloaded and installed the AMD Catalyst driver 13.11-beta (which said it supported kernel 3.12). A bit of trouble:
- I had to do a force install because the previous AMD driver didn't want to --uninstall.
- I had to do some symlinking to link libGL.so.* from /usr/lib to /usr/lib64 (not sure why this was wrong - poor AMD/ATI Debian x64 support?).
But still the same key press problem ...
Key presses worked fine with the radeon driver but not the AMD driver. So I started to look at X Server options, starting with the "easy" stuff via the memorably named amdcccle, the AMD graphics control centre (a graphical application).
I enabled the "tear free" control (sync to vertical refresh), which was off, and this seemed to fix my key press delay trouble.
Apparently, I could have used the following command to enable this as well :
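From memory it was something like this (the aticonfig option name is an assumption on my part, so verify it against the driver's own documentation):

```shell
# Enable Catalyst's "tear free" (vsync'd) desktop from the command line
aticonfig --set-pcs-u32=DDX,EnableTearFreeDesktop,1
```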
Thank goodness for that!
At the end here, I was going to start up the control centre and get a small screenshot of it. But it gave me a segmentation fault ... ugghh.
X Windows is due to get a replacement in Wayland at some point in the future. I can't say I'll miss it. In fact, there are a few quite exciting developments happening in desktop Linux-land just now so it should be an interesting year or two.
As much as I liked the Firefox OS phone, I've stopped using it and bought a new Motorola Moto G Android phone.
The ZTE is just too slow and I found it increasingly painful to use. I hope to see Mozilla get their mobile OS on better hardware and, at that point, I'd have another look. For the money paid (£65) it's no great loss and the Moto G (at £160) is an amazing phone.
There'll be no FF OS updates beyond v1.0 here it seems, and the last straw was discovering my bluetooth headphones won't work with it: extremely minimal bluetooth support. Couple this with a slow touchscreen, sometimes needing multiple presses to get a response, and then a few complete freezes, and I've given up. For me, attempting my own OS builds doesn't seem a reasonable thing to do.
The Motorola Moto G is a new "Google" phone and runs (almost) stock Android 4.3 (upgrade to 4.4 soon apparently). I haven't used Android on a phone since 2.2 (Cyanogen) and the changes are huge. It's an extremely polished interface, the whole thing looking and feeling great. It's fast, has a great screen and seems to have very good battery life as well. I am very happy with it.
Less satisfying is that MTP, the "Media Transfer Protocol", doesn't work very well on Linux. It's ironic this is so bad, considering a) Android is based on Linux and b) Google do a lot of work using (and engineering) Linux. Go-mtpfs seems to work on my desktop (manual mount, fine) but not on my X220 laptop (this morning). MTP support seems to be fragile, spotty and therefore quite annoying!
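When go-mtpfs does cooperate, usage is simple enough (the mount point and storage path here are illustrative):

```shell
# Mount the phone's MTP storage over FUSE at ~/moto
mkdir -p ~/moto
go-mtpfs ~/moto &
# Example transfer: pull camera photos off the phone
cp ~/moto/"Internal storage"/DCIM/Camera/*.jpg ~/Pictures/
# Unmount when done
fusermount -u ~/moto
```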
I've never been happy relying on a cheap ADSL router/modem for firewall security on my home network, but this is what I've been doing for a long time now. How secure is its firmware, and does it get updates? Control and configuration are often poor.
Basically, a little "white" box running who knows what. Usually some version of Linux, often old and perhaps with "patches". Security updates are either non-existent or hard to find.
My replacement runs FreeBSD wrapped up in a very nice web-based GUI to manage a pretty sophisticated firewall/router. A little red box running a known quantity.
What this means in the first instance is that I've had to familiarise myself with firewall rules and logs again, something I've not done for a long time (since I used to run one or two Linux firewalls). I've set the box up as a perimeter device and also plugged my WiFi in on its optional interface. I've been staring at logs and tweaking rules to minimise the noise, in some cases scratching my head over odd packets or hard-to-hide log entries ...
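Under the web GUI it's standard pf (FreeBSD's packet filter); a default-deny inbound policy looks roughly like this in raw pf.conf terms (the interface names are assumptions, and the GUI generates its own ruleset):

```
# em0 = WAN, em1 = LAN (illustrative)
# Block and log everything inbound on the WAN by default
block in log on em0 all
# Allow outbound traffic, keeping state so replies come back in
pass out on em0 all keep state
# Allow the LAN to reach anywhere
pass in on em1 from em1:network to any keep state
```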
As a slightly paranoid system administrator, the easy availability of firewall logs and rules can keep me up a bit later than usual now.
Intersecting this interest was a report I saw about a test Channel 4 TV is running just now called Data Baby. They are monitoring the information mobile phones send out, which turns out to be a lot, even when they're doing "nothing".
As the phone sat, apparently silent, contacts were in fact being made with 76 different servers around the world, in countries from the US to Europe to China and Singapore.
Mr Miller said: "the interesting thing is, and (it) might be surprising to a lot of people is, that (the) phone is always active. It always has an internet connection, and so the applications, if they choose to, can continue communicating after you've put it down."
My Nexus 7 Android tablet sits in my kitchen and I sometimes use it for streaming radio (TuneIn), Skype or browsing. It's idle most of the time, but there is constant traffic to Google's servers, and even the BBC. I haven't captured the traffic to look at it in detail; no doubt it's all quite innocent and normal. But now that we all carry these little networked multimedia computers in our pockets, don't we need some assurance about what information they send out? What is your phone doing? What permissions do you give an application when you install it?
Not only the phone. Recent reports detail how LG televisions might be logging information about files you are using and sending data back to the manufacturer. Spying basically. Not the sort of thing people would expect of a TV, but all of a piece when consumer electronics converge to be multimedia networked devices. Time to get the wire sniffer out ...
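Getting the sniffer out need be no more than this (the interface and the device's IP are illustrative); capture to a file, then dig through it later in Wireshark or on the command line:

```shell
# Capture all traffic to/from the suspect device into a pcap file
sudo tcpdump -i eth0 -w tv-traffic.pcap host 192.168.1.50
# Later: a rough summary of the endpoints it talked to
tcpdump -r tv-traffic.pcap -nn | awk '{print $3}' | sort | uniq -c | sort -rn | head
```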
I own an HP ProLiant MicroServer, a great little box I bought a couple of years ago to act as my main file server/NAS machine. It's held up very well and it was very cheap because I got £100 cashback in a deal (and it was cheap already).
It's not a powerful computing machine by any stretch, but it's a very decent server: I've put 8GB of RAM in it and four 2TB disks in RAID5. It's also very decently built by HP, with the sort of care and attention you'd expect on a bigger server. Hence the ProLiant badge.
One reason I prefer it to my QNAP T419P is that it's got a display connector. The QNAP is serial only, so a bit more fiddly.
To maximise the available storage capacity, I installed Debian on an 8GB USB stick and use the four hard disks for the RAID only. Generally, this has been fine, but I have started noticing some fairly severe I/O latency hits recently, and this has started causing more frequent pain elsewhere. Combine this with some USB filesystem corruption a few weeks ago and I wanted to switch away from this configuration.
However, I also learned that the stock HP BIOS does not enable all the system features, including a "spare" SATA port on the motherboard, supposed to be used for a DVD or CDROM. Without another SATA port, it's impossible to add another drive for the OS.
Luckily, I came across a great web page by Joe Miner describing how to update the HP BIOS and enable these hidden features. The usual caveats apply: this is not an officially sanctioned "update" (in fact it doesn't add anything, it just "un-hides" things; the version remains the same).
Having done the update, I now have an extra motherboard SATA port and have also made all the ports default to 3Gbps. I've also stuck a spare 2.5" SATA hard drive in the empty CDROM space.
With this extra disk installed, I used debootstrap to install a new version of Debian on the disk and configured this new install, adding boot loader etc., while the "old" system was running. On Sunday morning I rebooted into the new system, fixed up a few missing bits and pieces and now have a brand new OS installed on a proper disk. So far, so good.
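The debootstrap route, sketched under my setup's assumptions (device names, mount points and the mirror URL are illustrative):

```shell
# Format the new disk and bootstrap a minimal Debian onto it
mkfs.ext4 /dev/sdb1
mount /dev/sdb1 /mnt/newroot
debootstrap wheezy /mnt/newroot http://ftp.debian.org/debian
# Chroot in to configure fstab and networking, then add a kernel + boot loader
mount --bind /dev /mnt/newroot/dev
mount --bind /proc /mnt/newroot/proc
chroot /mnt/newroot /bin/bash
apt-get install linux-image-amd64 grub-pc
grub-install /dev/sdb
```

All of this can be done while the old system is still running, which is what made the Sunday-morning switchover so painless.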
Computers can be pretty frustrating, even when you think you understand them fairly well. This understanding might make things worse in some ways, as you'll go the extra mile, persevere a bit longer, do the extra debugging and perhaps end up no better off (except even more frustrated).
What's brought this about? Well, over and above the usual nitpicks:
Mozilla Thunderbird IMAP Issues
My domain email stopped working a few days ago. Initially I thought it was just an email dry spell, but some more concerned digging showed a problem connecting to my IMAP server (dovecot).
There's some potential complexity here: IMAP itself, but especially the SSL layered over it (imaps). So there was a fair amount of anxiety about what might have broken: a server update? An expired or otherwise problematic certificate? Or a problem on the client computer, or in the client mail application?
Suspicion settled on the client application, Mozilla Thunderbird, and I went through a slightly painful process of regressing some major releases and finding that version 23.0 broke things.
Somehow I had managed to get through v23.0, 24.0 and 24.0.1 via the automatic updates with a working mail capability. At least until last week. I am not sure how!
I posted some notes and asked for comment on mozillaZine, and then ended up logging a bug. I should have expected this, but I was then tasked with finding the nightly regression point, a potentially painful process. "Luckily", being on holiday meant I had some time to do this ...
Mozregression didn't seem to work well for me, not finding any break point, so I took the manual route of downloading some nightly builds close to the last version that worked for me (release 22.0) and seeing where things failed:
2013/05/2013-05-23-00-40-20-comm-aurora/ ----- BAD
2013/05/2013-05-20-00-40-04-comm-aurora/ ----- BAD
2013/05/2013-05-16-00-40-19-comm-aurora/ ----- BAD
2013/05/2013-05-14-00-40-02-comm-aurora/ ----- BAD
2013/05/2013-05-13-00-40-21-comm-aurora/ ----- OK
2013/05/2013-05-12-00-40-18-comm-aurora/ ----- OK
2013/05/2013-05-06-00-40-01-comm-aurora/ ----- OK
2013/05/2013-05-02-00-40-01-comm-aurora/ ----- OK
So, IMAP to my domain broken with the 2013-05-14 build. Let's see how things go.
- Bug 930878: IMAP with SSL/TLS, normal password fails to retrieve mail after v22.0
One always wonders ... it's probably my fault somewhere. Still Diggin' :-)
Software RAID Failure
Did I mention holiday? A couple of days ago I got an email with the subject line:
Fail event on /dev/md/2:shuttle
That's a disk failure with a RAID mirror I have in a system (where I normally stage the blog). Something to look forward to fixing when I get home. Hopefully the remaining disk stays well, always a slight concern with something like this.
On top of this issue, I have SMART complaining on another system about "unreadable sectors", but this is something I've been monitoring for a couple of months, the number not increasing for now. RAID is not a backup, but it helps mitigate hardware failures.
A quick follow-up to this. The 500GB 2.5" SATA disk I was going to use as a replacement might not be healthy itself. I did a quick smartctl health check on it and it spat out some warnings:
==> WARNING: These drives may corrupt large files,
see the following web pages for details:
I didn't know smartmontools did this. What a great feature. So, looks like I need to flash the Seagate firmware.
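The check itself is a one-liner; -H gives the overall verdict and -a dumps the full attributes and error log, which is where warnings like the above appear (the device name is illustrative):

```shell
sudo smartctl -H /dev/sdb   # overall SMART health assessment
sudo smartctl -a /dev/sdb   # full attributes, error log, firmware warnings
```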
Disk firmware updated, replaced in RAID and syncing the mirror ...
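For reference, swapping a failed member out of an md mirror goes roughly like this (the array and partition names are illustrative):

```shell
mdadm --manage /dev/md2 --fail /dev/sdb3     # mark the dying disk as failed
mdadm --manage /dev/md2 --remove /dev/sdb3   # remove it from the array
# ... physically replace the disk and partition it to match, then:
mdadm --manage /dev/md2 --add /dev/sdb3      # add the new disk; resync starts
cat /proc/mdstat                             # watch the rebuild progress
```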
Laptop Random Hibernations
I'm trying Debian Testing (Jessie) on my Thinkpad X220 and it's generally been fine. In fact, in many ways it's the best and fastest version yet (and the laptop's pretty good as well).
However, I've had it decide to hibernate itself when I'm not looking. This wouldn't be so bad except it has a problem resuming (libgcrypt message, similar to bug 724275), so this turns into a hard reset. As usual, a number of places I could look to solve this (initramfs, acpi, uswsusp etc.) and I'll see if I can find some time and do some debugging. Chasing this sort of issue is particularly tough because of the need for rebooting/hibernating to test things.
I was going to follow up a post on the Debian Users web forum, but it looks like my account has been "deactivated" manually by an admin and I can't re-activate or re-register (username in use!). It's a fair bit of friction having to send a mail to the admins about it, and a bit of a crappy policy if you ask me ...
Maybe I have too many computers, and too many computer-related activities going on. I'm juggling different virtual machines running different versions of Debian, doing different things, and occasionally thinking about synchronisation. Silly questions arise, such as: should I run the development system VM on KVM or switch to VirtualBox? If I use both, what's the best way to sync them up? How do I convert a raw KVM disk to a VDI? And so on.
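The raw-to-VDI conversion, at least, turns out to be a one-liner with qemu-img (filenames illustrative):

```shell
# Convert a raw KVM disk image into VirtualBox's VDI format
qemu-img convert -f raw -O vdi debian-dev.img debian-dev.vdi
```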
No wonder the odds increase that I end up in pain sometimes. The aim is always to get things sorted and arranged in such a way that I can actually do some work, or something worthwhile. Not spend all day fixing or configuring things before managing any of that!
I've been setting up my Raspberry Pi again. Last time I used it to monitor a server room door by taking webcam pictures and emailing them to me. Now I'm considering if it would work as a WiFi access point.
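If I do try the access-point idea, the usual route is hostapd (assuming the WiFi adapter supports AP mode); a minimal configuration sketch, with the SSID, passphrase and interface name all placeholders:

```
interface=wlan0
driver=nl80211
ssid=pi-ap
hw_mode=g
channel=6
wpa=2
wpa_key_mgmt=WPA-PSK
wpa_passphrase=changeme123
```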
Downloading and imaging the latest Raspbian to an SD card worked fine, but I kept having problems. It would work fine initially, but the next morning I'd see massive filesystem corruption (segmentation faults, signal 11 and so on, just doing an ls). Maybe a bad SD card? So I tried another: same thing the next day. Maybe my Pi was broken?
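The imaging step, for reference, is just dd (the image name and target device are illustrative; double-check the device, as dd will happily overwrite the wrong disk):

```shell
# Write the Raspbian image to the (unmounted) SD card, then flush caches
sudo dd if=2013-09-25-wheezy-raspbian.img of=/dev/sdX bs=4M
sync
```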
Turns out to be a poor power supply. I'd plugged it into a micro-USB phone charger that just isn't giving out the right power (needs a good 5V). Try a better supply and it seems fine now. This shows how critical a good power supply is!
How critical? Well, the wrong type can kill you, so be careful. A while ago, a Chinese iPhone user was supposedly electrocuted and died, probably using a poor (and fake) charger. For a detailed look at this, see Ken Shirriff's blog.
This is a big turnaround for me, however, because not that long ago I would have sworn never to use KDE. I last tried it over 10 years ago and thought it had a lot of very rough edges, and it felt and looked "cheap". Alongside too many half-baked "K" applications and a ridiculous number of configuration parameters and settings, a significant part of it didn't work very well.
It's completely different now and the Debian Wheezy version I'm using (KDE 4.8.4, a few versions out of date now) not only looks fantastic but almost all of it works as expected.
I've had it crash once whilst messing with a particular desktop setting, but it restarted itself automatically and carried on from where it left off. In comparison, I booted Gnome 3 a few weeks ago, just to have another look, played with the ALT+TAB/ALT+` switching for a few seconds and it crashed the desktop. Unlike KDE, it didn't recover but logged me out, losing everything.
The desktop still has a lot of settings to browse but the control panel makes sense and it all seems properly integrated. There are still a few areas I don't properly understand, KDE activities being the main one, but you can ignore them until you feel like having a better look. Virtual desktops work as usual.
All in all, I'm really liking KDE. The developers are doing a wonderful job and the QT toolkit gets better and better. Definitely worth a look.
If you want to have a play yourself, you can download a KDE live DVD (e.g. SUSE does KDE well), boot from it and experiment without affecting or changing anything on your PC.
In 1981, the BBC launched a personal computer, the BBC Micro (designed and built by Acorn).
Although fairly expensive, they were quite popular - even more popular in schools and colleges (which got subsidies for them). This machine might have been responsible for the start of many (perhaps most) computer departments in schools!
There's a great YouTube video online from "Computerphile" called Original Elite on the BBC B that really takes you back to the "golden" age of the hobbyist computer in the UK. Lots of odd machines were coming out, like the Sinclair ZX80 (later the ZX81 and Spectrum) and the Acorn Atom. On the Atom, from Wikipedia:
At the time 256×192 was considered to be high resolution.
Richard Hill, now a physicist, talks in a very engaging way about the computer game Elite, a game that a lot of people fondly recall and one that introduced many to realtime 3D computer graphics. This was all done in just 32KB of RAM and with very little processing power compared to today. Not only does he demo the game (loading off cassette tape!), he also shows off a game he wrote himself inspired by it (coded in assembler, later C). Programming was much harder work in those days.
One thing he mentions near the end is that if he was starting out now, he might never have got into programming his game because of the huge number of distractions that make up the modern computer experience. Yes, we have much more power and they are much easier to use, but we also have a massively expanded number of distractions: emails, notifications, tweets, alerts, posts, streams, chats, messages, funny videos. The list goes on. The internet, for all its amazing usefulness, can also be a real time-waster.
It's hard to imagine the enthusiasm and joy this early tinkering can bring to a youngster. This early spirit is something that both the Maker Movement and the Raspberry Pi Foundation are trying to encourage again.
The Raspberry Pi (right) is a real computer and much more powerful than the BBC Micro. Complete with built in distraction though!
Amazon, the online store, do a lot more than just sell books and computers. They also created a massive cloud computing platform to enable their own huge operations, and they sell it as a service to everyone else as an Infrastructure as a Service (IaaS) provider. I've been meaning to take a closer look at Amazon Web Services (AWS) and I think now is a good time.
I actually created an AWS account four years ago but never actually used it. I was probably too busy, and perhaps also baulked at adding a credit card to the account at the time. I've decided to reactivate it and have a play in Amazon's free tier. This lets you use a few services for free for a year, as long as you use a small system and stay within certain resource limits.
So, I have a test EC2 instance up and running, Apache listening and a single static web page being served. It's using an EBS volume and ... well, the alphabet soup soon kicks in here. Awash in a sea of TLAs. This is one of the reasons I wanted to have a play: to learn some of the terminology, including how to operate and manage things programmatically. I've created and used a number of virtual machines on Linux, and use quite a few now: this blog is written inside one, and hosted inside another, for instance. So one of my questions is how different AWS is to the VMs I run now.
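On the programmatic side, the (fairly new) AWS command line tool can drive EC2 directly; a sketch of launching a free-tier instance, where the AMI ID, key name and security group are all placeholders:

```shell
# Launch a single t1.micro instance (free-tier eligible at the time)
aws ec2 run-instances --image-id ami-xxxxxxxx --instance-type t1.micro \
    --key-name my-key --security-groups my-web-sg
# List the public DNS names of running instances
aws ec2 describe-instances --query 'Reservations[].Instances[].PublicDnsName'
```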
Good to see that Debian 7.0 Wheezy was released last week after a 10 month release freeze. Some people think this is far too long, myself included.
Russ Allbery and Lars Wirzenius have written up a proposal to improve the Debian release process, with much inspiration from the agile development method. In short, they want to promote the Testing distribution to a state that it is always releasable, at least to the extent that the entire release process takes only 2 weeks to a month (or so, at most).
In brief, the proposal covers :
- Keep the Testing distribution as close to releasable as possible, with more constant attention to bug fixing (especially release critical bugs).
- More constant and automated testing done of the distribution.
- Focus on ensuring the right core packages are ready and releasable, being less concerned with those packages deemed of secondary importance.
The idea that Testing be changed to either a rolling release or to some form of constantly releasable distribution comes up regularly, e.g. see Tanglu and CUT. Allbery and Wirzenius are very well known Debian "old hands" though, so may be able to make more of an impression on the project. I hope so, but Debian is a very democratic and distributed organisation, and consensus is hard to build on this type of question. Debian is also well known for its focus on stability rather than freshness, and stability is an admirable goal. I think the process Allbery and Wirzenius describe could speed up a release without sacrificing this, though, by optimising and focusing resources better. Automation may be the key.
Finally, I really appreciate all the hard work done by all the Debian developers and contributors - I've been running Debian Linux on all my machines for a few years now, a very satisfied user. Like many, I installed Testing many months ago but the long stabilisation period is a bit of a drawback. Right now, it's fine but fairly soon it will start to seem slightly stale. This is less of a problem for a server but more of one for a desktop. A way of optimising the release and update process would be very welcome.
Zach thinks CA is a less optimal way to learn because it's all done inside a web browser and doesn't teach you how to set up a real development environment i.e. the tools and resources you need to program properly standalone.
Learning some basics using a simple browser based environment can definitely be a useful start though.
If you're interested in learning to code, the post is worth reading.
Vim is a great text editor and I'm constantly surprised (and amazed) by the sort of things I discover you can do with it.
The latest surprise was:
If you place your cursor on a number and press Ctrl-a, the number is incremented by one; Ctrl-x on a number will decrement it by one. Further, you can prefix either command with a count to increment or decrement by that amount, e.g. 6 Ctrl-x subtracts 6 from the number under the cursor.
Vim is full of odd little features like this.
Speaking of online education, as you might imagine the field has been particularly attractive to people in the computing industry. Those who build and design the internet, and all the various programs and services that run on it, have been early adopters of the technology. Podcast-style videos for instruction, demos and training are fairly common and often very good. They're much easier to create nowadays as well.
One of these new companies is called Codecademy.
The course starts as a very basic introduction to computer programming itself but builds up to decent coverage of topics like functions and objects. What makes these courses good is the interaction and encouragement the system gives the student. This is all programmed into the web site's code of course, but it makes the experience much more engaging.
Every track is broken down into sections, further sub-divided. A key here might be that the exercises make up a great deal of the instruction: a short introduction and explanation of a concept leading very quickly to hands-on coding.
To keep you interested, encouragement is given by earning "badges" and you get emails praising certain accomplishments. These might be completing a section or working on the course for multiple days in a row. This sort of immediate feedback is one of the key elements of a course like this.
So I earned a few badges ... :-)
MOOC stands for Massive Open Online Course, a bit of a mouthful and not a particularly nice word, but something that will be a very important component of the education landscape in the future. Students, teachers and parents should be watching developments here very closely.
In translation: teaching over the internet. Courses covering everything from maths to art and using all the latest multimedia and web-based technology.
From the Khan Academy to a multitude of universities and colleges putting courses online (often free), things are taking off. In the USA, prestigious institutions like Stanford, Harvard and MIT have blazed the trail but there are many more, all over the world.
Open Culture has a list of 700 free online courses. Everything from Ancient Greek History at Yale with Donald Kagan to Reading Marx’s Capital by David Harvey at City University of New York.
Also check out Coursera - "Take the world's best courses, online, for free".
The modern web environment is an increasingly rich one but there are many challenges ahead of a non-technical nature. These will include how grading and certification will work. To see proper recognition of online qualifications we will need to ensure some sort of reliable quality standard and a way to properly compare grades from very different institutions, perhaps across continents.
In the future, the hope is that the cost of education will fall significantly and quality rise. Great teachers will no longer be available to only a privileged few but can, if allowed, reach huge (massive online) audiences: education can be interesting and exciting in the right hands.
This will all take some time to shake out and there are many vested interests that might work to obstruct online education. Like with a lot of things the internet touches, some wrenching change is on the horizon.
Well, I tried. I actually ended up liking Gnome 3, or at least Gnome 3.4. Yes, there were some rough edges and parts of it didn't work the way I wanted, but most of that I fixed using an extension. I was never 100% happy with things like application switching (alt-tab/alt-`) but learned to live with it.
I'd switched to Gnome 3 almost everywhere: my laptops and my work workstation, everywhere except my living room (still on Debian Stable/Squeeze, so running Gnome 2). Then it all went wrong ...
- Laptop - my primary laptop, a Thinkpad X220 (so pretty new), suddenly wouldn't load the full experience, insisting on "fallback" mode. OK I thought, I'll live with it and it will eventually fix itself with an update somewhere. This is Debian Testing using Intel open-source drivers, after all. But it wasn't fixed for as long as I waited, eventually 2 or 3 months. I started getting (sort of) used to "fallback". But this is clearly a crippled Gnome 2.
- Work system - Debian Stable using the nouveau NVIDIA driver. All is well for a couple of months, then - an update somewhere and boom. Now the desktop is locking up. Switching workspaces freezes it for 10-20 seconds, other desktop interaction similarly causes lockups or high loads. Basically unusable.
At work, I need a working desktop now and don't have time to wait or mess around. So I booted into XFCE and carried on.
So I set it up and configured it, and then realised that XFCE (pretty much) just works. In fact, it works much like Gnome 2. I also realised that I like the Gnome 2 style, and I notice all the papercuts that are no longer present.
I've now switched everything I run to the XFCE desktop and am happy, comfortable and productive with it.
I may try Gnome 3 again in the near future, especially if and when v3.6 hits Debian, but thank goodness I have such a great choice of free software alternatives.
I got my Raspberry Pi on Friday and although I knew what to expect, it still amazes when you see how small it is.
The Raspberry Pi is a tiny ARM based computer, with 192MB RAM, CPU, GPU, network and USB on a board a few inches across. It only costs £25, so I felt, why not? Here it is sitting on top of a laptop, hooked up and running :
This is a real computer not a toy. I just need to figure out what I might do with it now!