Coding 2 Learn

Education and Technology Ramblings with a little Politics for good measure.

One Year On

| Comments

About a year ago I wrote a post called Kids can’t use computers… and this is why it should worry you.

In the post, I expressed my opinion that the concept of children as digital natives is a myth, that computer literacy is in decline thanks to the ever-increasing sophistication of our devices and software, and that we need to refocus our efforts as educators and parents to ensure that kids can use computers.

I thought there was nothing much more to say on the issue; my opinion had been stated and I’d put forward what I thought the solutions to the problem were. I had no intention of writing a follow-up post.

Two things occurred recently that changed my mind. The first was my involvement with the Young Rewired State Festival of Code. The second was the release of Ofcom’s Communications Market Report 2014.

Ofcom report

The Ofcom report has been widely publicised in the media under titles such as Tech is child’s play, unless you are an adult and Ofcom: six-year-olds understand digital technology better than adults, once again propagating the myth that preteens and teens are digital natives and that those of us over thirty are clueless when it comes to technology.

Let’s make something clear: the Ofcom report focuses on the UK population’s use of digital media; it is not a report on computer literacy. An example question from the survey that Ofcom commissioned specifically mentioned the app Snapchat. The interviewers asked people which statement best described their knowledge of it.

  • I know a lot about it and I have already used it.
  • I know a lot about it, but haven’t used it.
  • I know a bit about it.
  • I’ve heard of it but don’t know much about it.
  • I’ve never heard of it.

Unsurprisingly, 47% of adults had never heard of Snapchat. I fall into the “I know a lot about it, but haven’t used it” category. For instance, I know Snapchat is a messaging application that allows users to send each other images and video that are hidden after a few seconds. I know that Facebook and Google have both tried to acquire Snapchat for billions of dollars. I know that Snapchat has faced controversy over local storage of photos on some devices, over its compliance with Government agencies regarding photos stored on its servers, and that it had its database of user details hacked. I know all this because I read Hacker News, and if I didn’t then I’d be in the “I’ve never heard of it” category.

Analysing the digital divide between adults and juveniles based on whether or not they have used or heard of Snapchat is ridiculous. Snapchat is aimed at a youthful demographic that uses smartphones, not at people in their mid-forties. Over many years I have managed to whittle my friendship group down to about half a dozen people, and if I were to send any of them a Snapchat, they’d think I was being a bit weird and very creepy.

Using an individual’s knowledge of Snapchat to judge digital literacy is akin to judging an individual’s ability to compose and perform music based on whether or not they have heard of, or listened to, One Direction. But this hasn’t stopped the popular press jumping all over a few statistics and proudly announcing that our six-year-olds are more digitally literate than their parents.

There are plenty of other skewed statistics that the press have sunk their claws into. For instance, the “Digital Confidence Scores” show that kids aged 14-15 are at the pinnacle of digital confidence. In reality, their digital confidence relates only to the products and services they use. My six-year-old son is very confident with the iPad I let him use. He’s a whiz at Minecraft, and his knowledge of the game mechanics far exceeds mine. He’s a pretty dab hand at Angry Birds, iMessage and YouTube as well. He would score quite highly when it comes to digital confidence, and when I asked him whether “new technology confuses him” he proudly said “No”, after I had clarified what “new technology” means.


New technology confuses me all the time. For instance, every time a new JavaScript library pops into existence, I feel an overwhelming dread of being overtaken by technology, and a slow descent of my digital skills into obsolescence. When Apple announced the Swift programming language the weight of yet another set of syntactic rules to battle with pressed down heavily on my shoulders. Every time I decide to sit down and learn a functional programming language I break out in a sweat and have to steady my nerves with a couple of shots of Scotch. I recently spent a couple of days playing with Google Glass, and was exceptionally confused, as I really couldn’t understand what the fucking point of it was, or why anyone would want to pay over a grand for such an ugly and limited gadget.


My son scores highly on the digital confidence scale as tested by the Ofcom interviews and questionnaires, yet he’s the last person I’d advise you speak to if you needed help with formatting a Word document, debugging an Excel formula or setting up an Outlook appointment; all of which many adults find exceedingly easy.

I’ve denigrated kids enough now, so I’d like to spend some time looking at the other end of the spectrum, and explain why I am still filled with hope for the future of digital literacy amongst the youth of this country.


The YRS Festival of Code is in essence a week-long hackathon for teenagers, based at centres all over the country, culminating in a weekend get-together of all the participants for a competition to judge who created the best apps. If you’re a teenager and reading this, go sign up for next year’s event now. Even if you’ve no coding experience at all, you’ll get something out of it. If you’re a developer then you should go and sign up as a mentor or centre lead.

This year I mentored a group at the CompareTheMarket offices in Peterborough, along with Alex Shaw who heads up the Labs division at the company. We both gave up our time, as volunteers, but in reality we both got far more out of the experience than we put in.


For five days we helped eleven kids hack together a couple of apps using Open Data, marveling at their drive, enthusiasm and ingenuity. We then spent three days in Plymouth and were quite frankly blown away by the quality and originality of the apps that had been created by the teenagers from across the country.

The Festival of Code has been running for a few years now, but the number of children attending keeps doubling each year, as more and more kids catch the coding bug. There were fifteen-year-olds at the event who already dwarf me in terms of their skills and knowledge. We saw kids presenting apps that ran on Android, iOS and the web, demonstrating a youthful ingenuity and ambition that was simultaneously heart-warming and terrifying. Check out some of their apps – here, here and here.

The kids I met at the Festival of Code are the future of Computing in this country, and the fact that their numbers are growing year on year assures me that it won’t be long before kids are back in their rightful place as masters of technology, and putting us oldies to shame once more.

Computer Games Are a Waste of Time


When I was a kid I played a lot of computer games. I once played Betrayal at Krondor for so long that I started hallucinating from sleep deprivation. When I was at Uni I chose to mow down humanoid warthogs in Duke Nukem rather than learn metabolic pathways for amino acid synthesis and failed an important exam.

When I was a child, I spoke as a child, I understood as a child, I thought as a child: but when I became a man, I put away childish things.

I gave up gaming in the traditional sense, years ago. I now recognise the stupidity of playing a computer game for hours on end, hoping to make the value of some variable creep upwards. Thank goodness I don’t do that anymore. These days I’m far more sophisticated.

The Hacker News Game

I’ve been playing The Hacker News game for just under a year. When I first started playing I was killed fairly quickly, what’s known in HN circles as being Hellbanned. It’s a neat little mechanic whereby the NPCs introduce large amounts of lag into the game and hide your character from the other players. I’m still not sure what got me Hellbanned, so I just started again.

My second attempt at playing has been a little more successful. I created a new character called ‘coding2learn’, who is a kind of geeky computer science teacher. Since playing this new character I have gained nearly 1000 xp, known as Karma in the game. The aim of The Hacker News Game is to go on a search for websites that would be of interest to Programmers and Technologists. When you find a website, you submit it to the community, who can then upvote your submission and reward you with xp.

Like most MMORPGs, The Hacker News game allows you to interact with other players. To do this you use comments. Comments can also be upvoted, for additional xp, and if you are a high enough level you can even downvote comments. You’re best off only commenting on things you are quite knowledgeable on. If you want to gain lots of upvotes, the more pedantic you are the better.

The Hacker News game is web based, although there is talk of a mobile app or optimisation coming soon.

The Twitter Game

I used to play The Facebook game, but I stopped when they kept changing all the game mechanics without notification. Besides, I interact with my friends and family enough irl. Instead I thought I’d give The Twitter Game a chance.

The basic idea of The Twitter Game is to gain followers. Followers are other players who might decide to join your clan, so that they can receive your pronouncements. There is no limit to the number of clans that a player can join, although it is often deemed a measure of success if you keep the ratio of people in your clan, to clans you are a member of, as high as possible.

You gain followers by making pronouncements. Humorous pronouncements are often best, but you can also make informative and controversial pronouncements in an attempt to gain followers. One of the quirky mechanics of The Twitter Game is that you have to make pronouncements in less than 140 characters. Sometimes your pronouncements are spread further by your followers. This enables you to reach a wider audience and therefore increase your clan size.

In The Twitter Game I play more or less the same character as in The Hacker News Game. There are a few differences though. ‘coding2learn’ in The Twitter Game tries to use humour a little more and is a little less arrogant (although not much).

I’ve had moderate success in The Twitter Game, gaining over 1000 followers. I don’t really play it enough to become an elite player.

The Twitter Game is web based, although there are apps for mobile that are quite good.

The Blogger Game

The Blogger Game is my favorite game of all. If you’re reading this you’re playing The Blogger Game right now, and so I should really say ‘thank you’ for the additional xp.

In the Blogger Game you write what are called ‘posts’. It’s a good idea to write posts that, like in The Twitter Game, are either funny or informative. There’s no character limit in The Blogger Game though, and you can write really long posts if you want, although be aware that sometimes your post might be so long that people will comment TL;DR (too long; didn’t read).

Success in The Blogger Game is determined by your scores in something called Google Analytics. Google Analytics is like a personal high-score table that can tell you how well each of your posts has done and how well your blog is doing in general. There are loads of stats to play around with, such as first-time visitors, bounce rate and time on page.

If you want to play The Blogger Game it’s a good idea to decide on a topic to ‘blog’ about and stick with it. Regular posting is quite important, and something I struggle with. To get really high scores you need to be ‘blogging’ about once a week.

What I like about The Blogger Game, is its crossover with The Hacker News Game and The Twitter Game. Success in any of them can lead to success in the other two, even though they’re created by completely different studios.

The Blogger Game is available on the web or as a stand-alone app on mobile, PC, Mac and Linux.

So that’s my gaming life these days. I’m sure we can all agree that it is far preferable to, and more productive than, wastes of time like CoD or WoW.

Please Stop Sending Me Your Shitty Word Documents


Throughout this rant I use the second-person personal pronoun (you) quite a lot. This does not necessarily mean I am speaking to ‘You’ the reader, but rather some other ‘You’ who will probably never read this anyway.

When Microsoft announced Office for iPad I shed a small tear. Excel is an incredibly useful application, without which my managers couldn’t inundate me with graphs, statistics and indecipherable look-ups that reference hidden and protected sheets. PowerPoint allows literally anyone (regardless of their public speaking skills, understanding of image aspect ratios, or ability to use fewer than 15 different fonts on a single slide) to prepare presentations for their audience. These apps becoming available on iOS did not worry me. What upset me, however, was the fact that all of a sudden swathes of iPad users would have the ability to view, edit and, most worryingly of all, create Microsoft Word documents.

Here’s what I have installed on my Mac:

  • Alfred – searching for anything
  • Python – coding anything
  • VLC – watching anything
  • FireFox – browsing anything
  • Homebrew – installing anything
  • Emacs – anything

You’ll notice that Word is not on the list. I have nothing against people who use Word, I am just not one of them. There was a time when if I wanted to put text on a screen, it was my go-to software, and I thought I was a pretty 1337 hacker because I knew how to do mail merges. I’m not that guy anymore. I don’t tell you what software to install on your computer, and I don’t assume you have the same software installed as me. For this reason I am careful to use non-proprietary file types when sending documents via email. I expect the same courtesy from you, and here’s why…

I don’t have Word installed

When you send me a Word document, you are making some pretty major assumptions, and as Samuel L. Jackson once said in the outstandingly amazing film The Long Kiss Goodnight

“When you make an assumption, you make an ass out of ‘u’ and ‘mption’.”

Firstly, you assume that I have Word or some clone of it installed. I know you think the words ‘Computer’, ‘Microsoft’, ‘Windows’ and ‘Office’ are synonymous, but they’re not, and there are plenty of people in the world who use *nix operating systems. By sending me a .docx file you’re forcing me to find a workaround so that I can use your document. What are my options? Well, I could install an Office clone like Libre or Pages. I could use an online service like Google Docs or Zoho. I could even attempt to get Emacs to read the data and make a go of presenting it to me in some recognisable format. Do you see what you’ve done? You’ve made more work for me. You’ve sent me a locked box and asked me to either pay to get a key cut or smash it open with a crowbar.
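As it happens, the crowbar isn’t even that heavy: a .docx file is just a ZIP archive with XML inside, and the visible text sits in <w:t> elements in word/document.xml. Here’s a rough sketch of prising it out in Python; the function name is my own, the regex approach is crude, and it only suits plain prose, but it makes the point that your locked box isn’t much of a lock.

```python
import re
import zipfile

def extract_docx_text(path):
    """Pull the visible text out of a .docx file.

    A .docx is just a ZIP archive; the prose lives inside <w:t>
    elements in word/document.xml, so a crude regex will do here.
    """
    with zipfile.ZipFile(path) as z:
        xml = z.read('word/document.xml').decode('utf-8')
    return ''.join(re.findall(r'<w:t[^>]*>(.*?)</w:t>', xml, re.DOTALL))

# Build a minimal stand-in .docx to demonstrate (a real one works the same way).
with zipfile.ZipFile('demo.docx', 'w') as z:
    z.writestr('word/document.xml',
               '<w:document><w:body><w:p><w:r>'
               '<w:t>Why not just send plain text?</w:t>'
               '</w:r></w:p></w:body></w:document>')

print(extract_docx_text('demo.docx'))  # -> Why not just send plain text?
```

Of course, the fact that this is possible doesn’t make it any less work than if you’d just sent the text in the first place.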

Plain text should be plain

What happens when I finally manage to open your document? Well, 90% of the time, all it contains is text. That’s it. Text. Strings of characters. So why the hell did you send it as a Word document to begin with? Why not just write the text directly into the body of the email? If it’s that important for you to write in Word, then save it as a .txt file. There’s not a computer on the planet that can’t read plain text. (Well, that’s not technically true, as I’m pretty sure my microwave contains a computer, but that’s beside the point.)

Are you really that good a designer?

The only possible reason I can imagine that you had to send me the document in Word format is because you are the world’s finest graphic designer/type-setter. Maybe your choice of fonts, margins, kerning and paragraph indentation are so awe-inspiring, that the very act of viewing the document will have me gouging my eyes out with a spoon, knowing that the gift of sight is no longer of any consequence as I shall never again behold a thing of such beauty. Of course the small flaw in your plan is that I don’t have the Lucida handwriting font installed on my system, and Preview struggles to display Word-Art clearly, so all your efforts are probably in vain.

Tables grrrr!

Sometimes you send me the Word document as a container for other joys, such as tables. I understand that a .csv is ugly to behold, but computers don’t tend to worry about aesthetics too much, so they really are preferable. There are prettier tables available if you’re into that kind of thing. HTML tables are great, easy to parse and render, but Microsoft obviously think they’re the devil’s work and so prefer to use their own method of tabulating data. I don’t know how Microsoft has chosen to represent tables in their .docx files, but I do know that if Linus, Stallman and ESR got together and hacked away for a decade or so, they wouldn’t be able to create a program that could correctly render a sodding table created in Word.
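To labour the point: going from an ugly-but-honest .csv to one of those prettier HTML tables is about a dozen lines of Python. A quick sketch, with the function name and the choice to treat the first row as a header both my own:

```python
import csv
import io

def csv_to_html_table(csv_text):
    """Render CSV text as a plain HTML table - no Word required."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    lines = ['<table>']
    for i, row in enumerate(rows):
        tag = 'th' if i == 0 else 'td'   # first row becomes the header
        cells = ''.join('<{0}>{1}</{0}>'.format(tag, cell) for cell in row)
        lines.append('  <tr>{}</tr>'.format(cells))
    lines.append('</table>')
    return '\n'.join(lines)

print(csv_to_html_table('name,score\nAda,99\nChuck,100'))
```

The output opens in any browser, diffs cleanly in version control, and can be parsed by any script, which is more than can be said for whatever Word is doing in there.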

What’s with the crud?

Sometimes the documents you send me contain other interesting elements. You feel the need to augment your text with such things as: little animated gifs of a stick-man who is frustrated with his computer, borders of coloured apples and 3D Word-Art. Now I know you think that such embellishments will bring a smile to my face and ease my reading of your text, but I’m sorry to inform you that you’re wrong. Very wrong. Criminally wrong. You see, without Word installed, I won’t be able to view these quirky little additions to your plain text. Your efforts were in vain. I could additionally argue that if your text was too boring without such witty little quirks, then you might like to consider whether the content is worth reading in the first place.

A heartfelt plea

So please… pretty please… please with bells on top, borders of apples and the word PLEASE written in bright blue Word-Art: the next time you want to send a Word document by email or put one on your website, think about your recipient. Could you use the body of the email or a page on the site? Perhaps you could save the file as a .txt, .rtf or PDF. Just spare a thought for those of us who choose not to use Microsoft Word, and respect our right not to do so.

Oh… and learn to write in sodding Markdown.

Installing Pygame on Mac OS X With Python 3


This has been a bugbear of mine for some time now. I like using Python 3.x. I like teaching kids how to use Pygame. I use a Mac. Trying to get all three to play nicely with each other has been impossible for me up to now.

I’ve trawled through web pages and blog posts that recommend all manner of ways in which you can install Pygame on a Mac for Python 3, I’ve tried numerous solutions on StackOverflow, and I’ve even tried angrily shouting at my computer and threatening to throw it out of my classroom window. None of them worked.

Today I finally nailed it, and I have Pygame running. Here’s what I did.

1) Install XCode and command line tools
2) Install Homebrew (ruby -e "$(curl -fsSL")
3) brew install Python3
4) brew install git
5) brew install sdl sdl_image sdl_mixer sdl_ttf portmidi
6) Install XQuartz -
7) brew tap homebrew/headonly
8) brew install --HEAD smpeg
9) brew install python (needed to install mercurial)
10) brew install mercurial
11) pip3 install hg+

And that’s it. If you have any problems yourself or a better way then please let me know in the comments.

Note: the smpeg install is failing for me at the moment, so I’ll look into this a little more. Pygame seems to be working without it though.


I had some brew doctor issues (around 20!), which might have been due to me trying to install Pygame from source earlier and therefore manually installing all the dependencies, which then conflicted with homebrew.

I deleted everything brew doctor suggested and overwrote all links as suggested. The brew install --HEAD smpeg suddenly worked (although that might have been because I was no longer behind a proxy). I then did a brew unlink jpeg and brew link --overwrite jpeg.

Everything is working perfectly for now. (Crosses fingers, touches wood and searches for a black cat to cross his path.)

What Exactly Are We Teaching Anyway?


On Twitter, on numerous education blogs and even on Hacker News, there have been more than a few debates of late regarding the education of students in Computer Science/Computing/Coding/IT. In the UK, in particular, the debate has been fuelled by Lottie Dexter’s “Year of Code”; a government-backed scheme to encourage everyone to learn a little more Computer Science/Computing/Coding/IT this year.

The usual suspects have all weighed in on this debate.

There are those that consider Computer Science and Computational Thinking the very purpose of a modern education. They argue that without the ability to fully comprehend the Halting problem, no child could ever tie their own shoelaces without entering some sort of bizarre shoelace-tying infinite loop.

Then there are those who argue that it is impossible for a student to understand any abstract Computer Science problem. In fact, children are incapable of writing code, finding an on switch, or even sounding out the words on a “C” is for “Computer” nursery school flash card. They make the case that we should give up now and all go back to making pretty pictures in Paint.

C is for Computer

The arguments seem, more often than not, to be focused on what name we give to the subject we are teaching. In reality, of course, the name means very little. During a child’s early years we teach them what we think they need to know, and during the latter years we’re subject to the whims of the exam boards and organisations such as Ofqual.

It is for this reason that I propose a radical reform of the name given to Computer Science/Computing/Coding/ICT, one that I hope will clear things up once and for all, and prevent any future arguments, back-biting and bullying. From this day forward I intend to teach students “Getting Shit Done With A Computer”.

Getting shit done with a computer is, at the end of the day, what I’d like every kid to be able to do. Regardless of what you call my subject, I’ll always teach students how to get shit done with a computer, as that’s what I think they need to know.

I like my students to be able to recognise when the cable has been removed from the Ethernet port, and understand the reason why they have no network connection. I like my students to understand the basics of a file system and how to navigate it. I like my students to be able to use a spreadsheet, operate a database and write up a project. I like my students to be able to choose the right tools for the right job (The right tool being emacs and the right job being any job, always.). I like my students to be able to knock together little scripts that will recursively grab files out of a nest of directories, or as one enterprising young fellow did the other day, write a script that replaces all your files with this picture of Chuck Norris.
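For the curious, that recursive file-grabbing script is only a handful of lines of Python. Here’s a sketch of the sort of thing the students knock together; the directory layout and names are invented purely for the demo.

```python
import os
import shutil
import tempfile

def grab_files(root, extension, dest):
    """Walk a nest of directories and copy every file matching
    `extension` into one flat destination folder."""
    os.makedirs(dest, exist_ok=True)
    grabbed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in sorted(filenames):
            if name.endswith(extension):
                shutil.copy(os.path.join(dirpath, name), dest)
                grabbed.append(name)
    return grabbed

# Demo on a throwaway directory tree.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, 'homework', 'drafts'))
open(os.path.join(root, 'homework', 'essay.txt'), 'w').close()
open(os.path.join(root, 'homework', 'drafts', 'notes.txt'), 'w').close()
open(os.path.join(root, 'homework', 'drafts', 'game.py'), 'w').close()

found = grab_files(root, '.txt', os.path.join(root, 'flat'))
print(found)  # -> ['essay.txt', 'notes.txt']
```

Swap the `shutil.copy` for a write of a certain picture of Chuck Norris and you have the other script, which I am absolutely not endorsing.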

Chuck Norris is the Internet

Along the way I’ll teach them a little Boolean logic, some binary and maybe Big O notation. This isn’t just high-flown theory with no practical use though. I recognise that when you want to get serious shit done with a computer, these are important concepts to have at hand.

When all is said and done, if we could just lose the pathetic tribal mentality that causes some of us to identify with the moniker Teacher of Computer Science, or Teacher of Computing, or Teacher of IT, the students would benefit in the end, and we could all just get some shit done using our computers.

X Days of Christmas


Just a quick one from me today.

I woke up this morning with a lesson idea in my head, that was also a Python script.

I’ve a few teacher followers, so I thought I’d shove it up here for others to use if they want. You’ll have to forgive my poor coding and poorer use of the English language.

The challenge for the students is to create a program that will produce the lyrics for ‘The X Days of Christmas’.

You can find my solution at the bottom of the post, and here are a few files – nouns.txt, verbs.txt

The results can be quite amusing – my particular favorites have been “20 kittens a bleeding”, “99 rats a computing” and “11 creators a mating”.

I’ve even had ones that make sense, like “8 pies a baking”.

Here’s an example final verse that I quite enjoyed.

On the 12th day of Christmas my true love sent to me:
12 roads a developing
11 faucets a mating
10 planes a combing
9 worms a solving
8 tents a liing
7 kittens a slaying
6 tubs a handing
5 passengers a scattering
4 bats a utilizing
3 mornings a promoting
2 pollutions a foreseing
and a partridge in a pear tree

from random import randrange

days = int(input('How many days of Christmas are there?')) + 1

# Build a list of pluralised nouns.
nouns = []
with open('nouns.txt', 'r') as file1:
    for line in file1:
        noun = line.rstrip()
        if noun[-1] == 's':
            pass                          # already ends in 's' - leave it alone
        elif noun[-1] == 'y':
            if noun[-2] == 'e':
                noun = noun + 's'         # 'monkey' -> 'monkeys'
            else:
                noun = noun[:-1] + 'ies'  # 'pony' -> 'ponies'
        else:
            noun = noun + 's'
        nouns.append(noun)

# Build a list of (naively) gerundified verbs.
verbs = []
with open('verbs.txt', 'r') as file2:
    for line in file2:
        verb = line.rstrip()
        if verb[-1] == 'e':
            verb = verb[0:-1] + 'ing'     # 'bake' -> 'baking'
        else:
            verb = verb + 'ing'
        verbs.append(verb)

# Pick a random noun/verb pairing for each day.
pairings = []
for i in range(days):
    pairings.append(nouns.pop(randrange(len(nouns))) + ' a ' + verbs.pop(randrange(len(verbs))))

for day in range(1, days):
    if str(day)[-1] == '1' and day != 11:
        ending = 'st'
    elif str(day)[-1] == '2' and day != 12:
        ending = 'nd'
    elif str(day)[-1] == '3' and day != 13:
        ending = 'rd'
    else:
        ending = 'th'
    print('On the', str(day) + ending, 'day of Christmas my true love sent to me:')
    for count in range(day, 1, -1):
        print(count, pairings[count - 1])
    if day == 1:
        print('a partridge in a pear tree')
    else:
        print('and a partridge in a pear tree')
    print()

How I Rediscovered Experimental Learning and Why It Doesn’t Matter


A few months back I received this tweet.

I was a little surprised, but DM’ed them back with my address and then promptly forgot all about it. (I don’t know what the past tense of the abbreviation of Direct Message is, so please feel free to correct me in the comments below.)

About a month later I was at home when the doorbell rang and a delivery driver handed over a parcel for me. I opened it up and was surprised to find three littleBits kits.

Now I’d just (literally that week) started teaching a new subject called Systems & Control, which has a heavy element of electronics involved, so I packed the kits into my car and took them to school the following day.

For those that don’t know, littleBits make electronics kits consisting of snap-together magnetic modules. You can use the kits to make a range of electronic circuits – driving motors and buzzers and blinking LEDs.


My colleagues and I stood around the opened boxes, picking up the little plastic modules, snapping them together and building an array of little projects. After fifteen minutes or so we all came to the same conclusion. The kits were clever, easy to use, accessible for students but completely impractical for a classroom of thirty secondary school students.

The kits were placed back in their boxes, and left in a cupboard, forgotten about.

Then, due to personal circumstances, my six year old son, Jimi, had to be home schooled by his grandmother. I remembered the kits in my classroom cupboard and took them over to her, suggesting she might like to put together some of the kits with him. This she did, dutifully following the instructions and assembling the kits. She reported back that they had built the circuits, and then the kits came home, to be once again forgotten about, this time in a drawer in Jimi’s bedroom.

On Saturday I was standing in the kitchen, enjoying a steaming cup of coffee and my morning Nicotine Replacement Therapy Lozenge, while perusing Hacker News. Jimi came bounding in.

“Dad, come and look at the machine I’ve made.”

I’ve been caught out by this one before. The “machine” is normally a cardboard box with a cushion inside it and a pencil stabbed through the side. Sometimes it’s supposed to be a rocket, sometimes a train or sometimes an Angry Birds catapult.

“Why don’t you bring it through here?” I asked, reluctant to leave my coffee and laptop.

“It’s a bit delicate,” he said, “but okay.”

He came in a few minutes later, but I wasn’t really paying much attention as he began messing around on the kitchen floor.

“Look Dad.”

I looked over and was stunned by what I saw. Jimi had pulled out the littleBits kits and had assembled a monstrous creation.

I got down on the floor with him and asked him what he had made. He explained the contraption. How you had to turn this thing and push that thing and hold down this part and then this part lights up and this thing makes a noise. He didn’t use a single punctuation mark in his excitable and breathless sentence.

We moved his machine onto the kitchen table and his big sister suddenly made an appearance. Within minutes the two of them were busily clicking together bits of electronics, having fun, amazing themselves, and more importantly learning.

Jimi and his sister concentrating

Lantern

Jimi figured out that a push-to-make switch needs to come between a power supply and an LED to have an effect. He learned what a variable resistor does in series with a motor. He learned what a piezoelectric sensor does when combined with a buzzer. Obviously he learned none of the terminology, but that wasn’t important.

I’d dismissed the littleBits kits because I have preconceived ideas of what education should be. These misconceptions have been honed by years of operating in a pressured school environment, where results rule above all else, and where you’re required to demonstrate progress in every lesson, term, year and key stage. Watching my son experiment, succeed and learn, all while having fun was sobering.

Unfortunately the lesson I learned is one that I will struggle to apply in my professional life. There just isn’t the time available to allow students to experiment at their own pace. I would love nothing more than to give the students the tools they require, be that in computing or in electronics, and allow them to explore the possibilities available to them, but alas I operate in a system where data rules and demonstrable progress is required.

There are areas in which I can allow students to experiment. Thanks to a particularly keen and talented student, I have become involved with a project called THINKSPACE. This involves giving students a time and place that they can come and begin to develop apps. They work on what they want to, at a pace that suits them and where I am a facilitator and troubleshooter as opposed to a driver of progress. I can spare only an hour a week to THINKSPACE. It deserves two or three hours a day.

I admire littleBits, THINKSPACE and similar initiatives. I think they’re admirable endeavors with the right mindset when it comes to education. I just wish that policy makers within the education system shared their ideals and attitude towards learning, so that we could give all our students the opportunity to truly experiment in the classroom, to succeed and to fail, to have fun and most importantly to teach themselves to learn.

Computing Is Much More Than Coding?


Trying to justify that every student in the country should learn computing is quite a tricky endeavour, and it shouldn’t be. We’re all users of technology, after all, and live with both the advantages and disadvantages of being so. We should all have a basic understanding of computers, networks, encryption and software. Knowledge of technology can help keep us safe, make us money and improve our productivity. Why should every student in the country not be given access to such knowledge? After all, we don’t balk at the fact that every student in the country should study Shakespeare, trigonometry, the causes of The Second World War or how to throw a rugby ball. I probably use trigonometry about once a month, and I haven’t thrown a rugby ball in decades, but I use a computer every day.

I studied French for five years at secondary school. I hated it. There’s nothing worse than sitting at a parents’ evening and listening to your mother speak fluent French with your teacher, and knowing, despite not understanding a word they are saying, that he’s detailing your every shortfall in the subject. Do I resent the fact that I was made to study French? No. Of course not. I have the utmost admiration for multi-linguists. I marvel at their ability and I know they perform an essential role in our society. Every student should study a foreign language, because a few of them will find they have a talent for it and go on to greater things. Did attempting to learn French benefit me? No. Not one bit, and that doesn’t matter.

Other subjects have no need to justify their existence in schools. English, Foreign Languages, Maths and Science have the weight of centuries of educational history behind them. So does History. Computing is new: the bastard child of Maths and Electrical Engineering. It’s a subject that’s only been in existence for a few decades. Because of this it has struggled to gain a firm foothold in our schools, yet I would argue that it has a far greater impact on our lives and the lives of our children than any of the others.

There are quite enough reasons to make Computing a compulsory subject in schools. We don’t need spurious excuses that confuse the issue. What annoys me about many of the “Computing Apologists” is their overwhelming desire to insist that Computing has benefits outside the sphere of technology. I’m all for Computing in primary and secondary schools, because I think the ability to code is important and I feel that every student should have the opportunity to learn to program. If we teach a thousand students to code, then maybe we’ll find a hundred who enjoy it, ten who excel at it and one who goes on to revolutionise our society.

When advocates of compulsory Computing education are asked why it is so important, they often hail “Computational Thinking” as the panacea to all our woes. They state that even outside the field of programming, Computational Thinking is an important life skill. They argue that it can be applied to many problems in the real world and that every student should learn to tackle problems in a “Computational way”. I teach Computing, and to be honest I have only a vague idea of what Computational Thinking is.

I know how to teach programming. I know how to teach students to break down problems and tackle bite-sized chunks with different algorithms. I know how to get students to manipulate large data sets, and to make simplified models of real-world situations. I can teach students all of these things, once I’ve taught them the basics of the programming language we are using, be it Scratch, Python, Javascript or whatever the flavour of the week is.

The idea that Computational thinking is an essential life skill is nonsense.

If I were to shuffle a pack of fifty-two playing cards, hand them to you and ask you to sort them, you’d do what any sensible person on the planet would do. You’d sort them into suit order first, then into value order. Is this decomposition, or just common sense? Did you require lessons in Computational Thinking in order to achieve this task?
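For what it’s worth, that common-sense approach maps straight onto a two-key comparator. A minimal sketch in Javascript – the card representation and suit order here are my own, purely for illustration:

```javascript
// Hypothetical card representation: { suit: "hearts", value: 2 }
var suitOrder = ["clubs", "diamonds", "hearts", "spades"];

function sortCards(cards) {
  // "Decomposition", if you like: compare by suit first,
  // falling back to value only when the suits are equal.
  return cards.slice().sort(function (a, b) {
    var bySuit = suitOrder.indexOf(a.suit) - suitOrder.indexOf(b.suit);
    return bySuit !== 0 ? bySuit : a.value - b.value;
  });
}

var shuffled = [
  { suit: "hearts", value: 2 },
  { suit: "clubs", value: 9 },
  { suit: "hearts", value: 1 }
];
var sorted = sortCards(shuffled);
// sorted: clubs 9, then hearts 1, then hearts 2
```

No Computational Thinking lessons required: it’s just the card-sorting instinct written down.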

My degree is in Biochemistry. I would never argue that Scientific thinking is crucial for everyone. It’s useful, in certain situations, but not essential. When some Daily Mail reader argues that the presence of a minority group in our country is resulting in a broken society, I might apply the Scientific Method to analyse their evidence, find its flaws and disprove their hypothesis. If they told me they’d eaten a bacon sandwich, I’d probably just believe them. My wife is an English teacher. When she reads a novel she drills down into the layers of meaning and the subtext of the book, to elucidate a truer understanding of the author’s message. When she reads the menu in a local cafe, detailing the contents of their bacon sandwiches, she just takes it at face value.

You can live a successful life without knowledge of the Scientific Method. You can live a successful life without knowledge of Literary Deconstruction. You can live a successful life without knowledge of Computational Thinking.

If I ask you to build me a shed, do you pick up an armful of planks and repeatedly throw them into the air until a shed has been built? Of course not. Maybe you’ll start by building a floor, then some walls and finally a roof. Is this decomposition? Are you “Thinking Computationally”? Maybe you’re actually engaging in abstraction. After all, there’s no such thing as a roof. A roof is just a series of planks of wood, joined together at an angle that is optimal for self-support and the shedding of rainwater. Of course, there’s no such thing as a plank of wood. That’s really just bundles of xylem vessels, cut into regular geometric patterns. Of course, there’s no such thing as a xylem vessel. They’re really just arrangements of cells composed of cellulose and… well, I’m sure you get my point. We’re all fairly familiar with abstraction. We just might not recognise it for what it is, and we certainly don’t have to be taught Computational Thinking in order to build a shed.

I think one of the major problems is our labelling of the subject.

Computer Science is no more about computers than astronomy is about telescopes

Edsger Dijkstra

Unfortunately, Computer Science has little to do with Science either. Let’s get straight what we are actually teaching here. We’re teaching programming, network infrastructure, databases, communication protocols and markup. We’re teaching these things because these technologies are so ubiquitous and important that it will benefit everyone to have a little understanding of them. We are not teaching a revolutionary new way of thinking that will have wider benefits to society.

Why should we give every student an opportunity to learn Computing? Some of the students we teach might one day become the next Linus Torvalds or Steve Wozniak. Some of our students might become senior developers, collaborating on amazing new technologies and changing the world for the better. Some of our students might develop the algorithms for more realistic flag fluttering in Call of Duty XIII. Some of our students might become Greggs executives and not give the developers such a hard time when they can’t sort six million customer records according to when they last ordered a bacon sandwich, in real time, in a browser… that’s IE6.

Let’s stop trying to make Computing something it isn’t, and instead be clear as to what we’re teaching and why we’re teaching it. Let’s stop being afraid of the words programming and coding, as if they’ll scare students away. Let’s be honest about Computing, and we’ll see its popularity soar.

A Rant From My Brother

| Comments

My brother is the reason I learned to code. To be honest, he’s probably forgotten more about programming than I’ll ever know, and I’m not exaggerating. His preferred languages are Haskell and OCaml, but he’s recently had to dive into Javascript for a project he’s working on. I received this email from him tonight, and I found it amusing so I thought I’d share it. (Note – he talks about Python a lot as it’s the language I understand the most.)

Javascript is pretty pathetic when it comes to bug-finding. Here’s some Python:

>>> foo = {}
>>> foo["bar"] = 3
>>> foo["baz"]

The dictionary foo doesn’t have a key “baz”, and this is most likely a typo. Python sensibly raises a KeyError, and execution will not continue.

In Javascript:

>> var foo = {};
>> foo["bar"] = 3;
>> foo["baz"]

This does not throw any errors, but instead returns undefined. This is not entirely retarded, until we find that Javascript happily coerces undefined to NaN (Not a Number) whenever it appears in arithmetic expressions. Since NaN is a valid floating point number, it can happily propagate through running code. Things go from entirely retarded to completely fucking braindead when we find that Javascript will accept NaN as an argument in most functions:

>> ctx.fillRect(x, NaN, width, height);   // no error – the rectangle is silently not drawn
In other words, what started out as a typo which would have Python raise an error at the earliest possible opportunity is silently ignored by Javascript, only to be found if one notices certain rectangles not being drawn. Tracking down such a typo from a bit of missing graphics is going to be a pain in the arse.
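To see the whole failure in one place, here is the typo played out end to end – the variable names are illustrative, not from any real codebase:

```javascript
var foo = {};
foo["bar"] = 3;

// The typo: "baz" instead of "bar". No error – just undefined.
var width = foo["baz"];      // undefined

// undefined is coerced to NaN in arithmetic, and NaN propagates silently.
var area = width * 10;       // NaN
var scaled = area + 5;       // still NaN

console.log(isNaN(scaled));  // true – the only symptom, far from the typo
```

Every step after the typo runs without complaint; the first visible evidence is a NaN several expressions downstream.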

Now functions: Javascript has no time for conventions of mathematics, programming, or basic sanity. In Javascript, any function can be passed any number of arguments without raising an error. The concept of arity be damned. Extra arguments in Javascript are ignored. Missing arguments are set to undefined. And, as explained before, undefined will be coerced to NaN in arithmetic expressions to create lots of great bug-full code when you forget the number of arguments required of a function. For further hilarity, undefined can be used as a key to a dictionary. So if you do:

function insert(y,x) {
   dict[x] = y;
}

and you accidentally call insert(3), you won’t be told, as you would be in Python, that you are missing a required argument. Instead, x gets bound to undefined, and the dictionary will become

{ undefined : 3 }

That’s almost certainly an unexpected behaviour.
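If you’re stuck writing Javascript, the usual defence is to check the argument count yourself. This guard is a suggestion of mine, not something from the email:

```javascript
var dict = {};

function insert(y, x) {
  // Fail loudly, the way Python would, instead of binding x to undefined.
  if (arguments.length < 2) {
    throw new TypeError("insert() requires exactly 2 arguments");
  }
  dict[x] = y;
}

insert(3, "bar");   // fine: dict is now { bar: 3 }
// insert(3);       // would throw a TypeError instead of creating { undefined: 3 }
```

It’s boilerplate that Python gives you for free, but it turns a silent data-corruption bug into an immediate error.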

The way that function parameters are interpreted leads to this truly bizarre example, which I got from another site:

>> ["10", "10", "10", "10", "10"].map(parseInt)
This yields the truly weird

[10, NaN, 2, 3, 4]
The function map is supposed to apply its argument to every value in a list. In sane languages,

["10", "10", "10", "10", "10"].map(parseInt)
should give you the list

[10, 10, 10, 10, 10]
In Javascript, for likely dumbfuck reasons, map calls its function with three arguments. The first argument is bound to the element in the list. The second argument is bound to the index into the list. The third argument is bound to the entire list. This will cause surprise when the function you pass to map (parseInt in this case) quietly accepts more arguments than you expected, and don’t expect a prompt error in case of mistakes, as you would get in Python.

It turns out that, in this case, parseInt takes an optional second argument which is the base in which the first argument is to be interpreted. For unexplored reasons, when the base is 0, the argument is read in base 10. In base 1, NaN is always returned. This explains the first two elements in

[10, NaN, 2, 3, 4]
The third element is “10” in base 2. The fourth element is “10” in base 3. The last element is “10” in base 4.
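The standard workaround is to wrap parseInt so that map’s extra index and array arguments never reach it, and the radix is pinned explicitly:

```javascript
var strings = ["10", "10", "10", "10", "10"];

// The wrapper forwards only the element and fixes the radix at 10,
// so map's index argument can't be misread as a base.
var numbers = strings.map(function (s) {
  return parseInt(s, 10);
});

console.log(numbers);   // [ 10, 10, 10, 10, 10 ]
```

One extra function expression, and the behaviour matches what a sane language would have done in the first place.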


How We Were Trained to Lower the Drawbridge

| Comments

A few years ago, the parody news network The Onion released a video claiming that Facebook was a massive CIA surveillance project. It was funny at the time. It’s not so funny any more.

Perhaps naively, I believe that Facebook, Google and the other tech giants reluctantly cooperate with the NSA. I believe that they comply with FISA requests because they have to and that they have remained tight-lipped about their cooperation because if they don’t then whistleblowers could expect the same fair, just and proportionate treatment as has been meted out to Chelsea Manning and Edward Snowden. These corporations exist, after all, to make money and handing over vast swathes of user data to spy agencies just isn’t in their financial interests.

I feel however, that the tech giants have accomplished something far more insidious, and in many ways more detrimental to our privacy than is claimed in the video. They’ve trained us to devalue privacy.

There’s an old saying that actually used to mean something.

An Englishman’s home is his castle.

In the UK at least, it used to be the case that we valued our privacy. What went on between the four walls of our homes was our business and nobody else’s. There were only three people in your life you would ever share your private life with: your doctor, your priest and your spouse – in that order.

Then along came the Internet. At first it was a place where only a select few could publish. You had to have the technical ability to set up a web server and write HTML, and the World Wide Web was a curious place filled with niche websites created by geeks, academics and hackers. But it didn’t stay like that for long. Facebook, WordPress, Twitter and Google+ all came along and made it easy to share everything.

We’ve been trained to lower the drawbridge, lift the portcullis and let the world into our castles. Social networks reward us every time we publicise our lives, and we eat it up. This is most startlingly apparent amongst Generation Y, for whom sharing their lives with the world is so natural and ingrained that they almost see it as a basic human right. They consider privacy archaic and quaint, no longer relevant to the world we live in. They like it when they Google their own name and see images of themselves on the front page. They compete to gain followers on Tumblr, friends on Facebook and mentions on Twitter.

We don’t yet know what the full consequences of the sharing culture will be. When today’s fifteen-year-old students attempt to stand for Parliament in twenty years’ time, and the front pages of the red-tops are plastered with embarrassing Snapchat selfies, will we look at them and decide that they are not fit to represent us in Government, or will we just shrug and acknowledge that ‘everyone used to do that’?

We can see one consequence of our training by the social networks here and now though – apathy. When Snowden’s revelations first hit the Guardian’s front page, almost nobody cared. Hacker News was filled with NSA stories, but you’d expect that from a community of technophiles. The BBC seemed to include Snowden stories as an afterthought though, and even then they focused on the human element of where he was and what he was doing, rather than on the surveillance programs he had exposed. Glenn Greenwald promised there would be more to come, and he didn’t disappoint. But the latest revelation, that the NSA and GCHQ consider most of the widely used encryption technologies a mere hindrance to their dragnet data gathering, has caused barely a ripple in the public consciousness.

Why are we not out on the streets protesting these flagrant invasions of our privacy? Why are we not holding our parliamentary representatives to account, and demanding the end to mass surveillance of innocent citizens? Why are we not doing something… anything?

Generation W didn’t think about privacy, they just had it. Amongst Generation X there are precious few of us who care. Most of Generation Y consider privacy a barrier to their lives. It’s Generation Z where the only hope lies.

As a teacher I grow tired of the government and media constantly passing the buck and demanding that all societal ills be cured in schools. Teenage pregnancy rates too high? Teachers can fix that. Young adults can’t manage their finances? Teachers can fix that. Too much apathy at the polling booth? Teachers can fix that. Government, the media and parents constantly abdicate responsibility and throw more into the curriculum in an attempt to fix society. When it comes to teaching about online privacy, however, I don’t see who can help Generation Z other than the teachers. The media, the Government and the corporations have no vested interest in a generation that considers privacy important. As for parents, they’re already setting up Facebook accounts for their babies so that the world can track their offspring’s progress from cradle to grave.

I’ll start. I’m currently working on a scheme of work about cryptography. There’ll be plenty of Computer Science concepts in there, but I also intend for students to understand the importance of strong cryptography from a societal perspective rather than just a technological perspective. My hope is that it will make them think a little when using digital communication technologies, about exactly who has access to the content they send. I’ll publish it here when I’m done. If you would like to join me in this campaign, then please feel free to share links to resources in the comments section, or on Twitter, and I will endeavour to curate and publish what you send.