
Never teach Microsoft Word in schools again, please!

This week the UK announced that it will scrap its failing IT education program.

In a speech, the education secretary will say the existing curriculum in Information and Communication Technology (ICT) has left children “bored out of their minds being taught how to use Word and Excel by bored teachers”.

Instead he will, in effect, create an “open source” curriculum in computer science by giving schools the freedom to use teaching resources designed with input from leading employers and academics, in changes that will come into effect this September.

The announcement follows pressure from businesses critical of a shortage of computer-literate recruits.

As a professional computer engineer I welcome this news and hope that any replacement program is better able to foster understanding and enthusiasm regarding computers, and ultimately to lead more people into what is a deeply satisfying and rewarding career. Before we talk more about education I will take a few moments to show how and why I see this career as so important.

As an HPC (High Performance Computing) engineer, I write scientific software that is used on the world's most powerful supercomputers. In 2008 I ran the first program to achieve 1 PetaFlop on an open supercomputer (meaning one quadrillion calculations per second). The technical challenges involved in making that program work were huge and I will discuss them in a later blog post, but the techniques and methods I used were the same ones I use and develop on a day-to-day basis. The world's most computationally demanding science is run on our systems, and it requires unprecedented advances in computer software in order to even run correctly.

Now, although I work at the extreme end of computing, I believe that the challenges and rewards that my team and I face are fairly typical of a career in the computational sciences, and even of computing in general. The most important contributing factor to this comparison is also the most commonly misunderstood – computer programming is not a purely technical science but a blend of technical and creative expertise, and so "super-nerds" with massive technical knowledge and no people skills do not make the most successful computer engineers. Designing a technical solution from scratch requires a vast range of problem-solving, creative and interpersonal talent. So those kids with a strong technical basis combined with highly imaginative faculties and good interpersonal skills are the ones best suited to a career in technical computing. It is also important to point out that these skills are transferable, and those of us who can demonstrate them remain in high demand, seen here for my specific technical field and here more generally in Britain. Generally, computer engineers are well-paid professionals (especially in the US), and technical entrepreneurs are among the best earners in the world.

JaguarPF - A Cray supercomputer at Oak Ridge National Laboratory in Tennessee

Now, I am the first to admit that this is not the sexy job I dreamt of during my teens but, since Nirvana already had a guitarist and since my musical skills were vastly inferior to my technical ones (as it panned out, at least), here I am. Many people like myself realize after significant scientific training that they are better suited to computer science than to science itself, but when the former can be used to advance the latter it is even more rewarding. It is perhaps surprising that I didn't envisage a career such as this, because in hindsight I can see fairly clearly that I and my peers were primed to engage with technology and to 'program' rather than 'use' computers from a very early age.

In the 1980s we were on the brink of a technological revolution, though we were not aware of it at the time. "IT" was a term that didn't appear until much later, and professional computer programmers were extremely rare. Parents were utter digital illiterates, which was a source of great amusement to us kids, who seemed innately able to program the new devices. And the nature of those devices themselves was, I believe, partly responsible for the technical priming of a generation. At that time every school in Britain (where I grew up) owned at least one BBC Micro – a programmable and hardy system built entirely for the education of children. In my school we had a single unit that was permanently positioned on a giant trolley and shared between all the classes in the school. One afternoon every two weeks it would be the turn of my class to have the computer! The BBC Micro was an inherently programmable computer, and all kids needed a few commands just to be able to do anything.

The Sinclair ZX Spectrum was perhaps the most programmable home computer ever made

I am not pretending that many children were skilled programmers, but the lack of any significant user interface instilled a closer relationship to the hardware from the outset. At my home, like millions of other 80s homes, we had a ZX Spectrum – possibly the most programmable home computer ever made. Even just to load a computer game, one would have to type the following command at the cursor:

LOAD ""

This simple command told the system to load an unspecified program from an external storage device. One would then sit for several minutes watching pretty borders sparkle to the accompaniment of strange sounds, as the computer "loaded" a program from external storage (tape) into physical memory (around 64k at the time). The only magazine I have ever subscribed to for any length of time was called "Input", a computer programming magazine that included full code listings for the Spectrum, the Amstrad, the BBC Micro, the Acorn Electron and others.

The BBC Micro - a computer designed entirely for the digital education of children

From the magazine's listings you could write graphics programs, system software for controlling peripheral devices and even games for your home computer. I spent hundreds of hours typing the listings word-for-word and then slightly modifying the resulting code to understand what was happening in the machine. My main preoccupation was games, but these tech experiences at a young age taught me about programming languages and machine code, and unquestionably conditioned me to be comfortable with programming computers rather than simply navigating user interfaces. There was something we were getting right with computers and kids back in the eighties, and I would like the opportunity to thank whoever it was that pushed the BBC Micro programme out into British schools.

But I keep hearing that children are even more "digitally literate" than they were 25 years ago – what does that mean? Programming is no longer required to make computers work correctly. Apple have made such enormous advances in user-interface design (and even Microsoft computers work pretty well these days) that a hugely insulating layer of well-running and intuitive software now sits between the user and the computer. "Digitally literate" might more accurately be described as a "digital comfort" with machines, or as literacy in the specific styles of user interface used on modern machines and the web. Very little awareness of computer programs is gleaned from a modern computing experience without digging, and even less awareness of the hardware underneath that is performing the transactions.

Does this "comfort" translate into a prospering market of professional computer scientists? No; in fact, worryingly, computer science admissions at university are drastically down. In the US, the 2000s saw a 70% decline in computer science college graduates, and this was mirrored in the UK. No employer ever hires a person based on their ability to use Facebook, and a basic knowledge of Word or Excel is considered so trivial and ubiquitous that it has little if any bearing on recruitment. In other words, the IT skills that our youngsters have been taught, either at school or by themselves, yield little in terms of job prospects and even less with respect to life skills. Perhaps even more devastating is the effect that attempts to "teach" IT can have on children's enthusiasm for engaging with computer science. To teach a modern child how to use an application (which is what IT education has amounted to) is like asking the child to re-learn how to dress in the morning, and is certain to induce apathy and cynicism towards the subject. To confuse Microsoft Word with computer science is to confuse boiling the kettle with particle physics.

So I welcome the news that the UK has scrapped its computer education programme, and I hope that what replaces it will develop real skills as well as real technical ambition in our kids. But what do we replace it with? The technical specifics matter less than empowering children to actually do things with computers: HTML5, Ruby on Rails, cloud computing, graphics. Highly accessible frameworks exist for each of these and could very easily be used to let children create cool things that they can show other people. I find it very hard to imagine that a child who has just learned to write a small, fun application on her own Android phone would not find this deeply amusing and stimulating. Only when we have a generation of children able to build tools that confuse their parents, and when those children are familiar with the workings of a computer rather than the specific layout of its user interface, will we once again be primed for a generation of entrepreneurs capable of revitalizing a country through the emergence of a "digital economy".
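To make the point concrete, here is a sketch of the kind of small, self-contained program a beginner could write, change and show off within a lesson or two. It is my own illustration, not part of any proposed curriculum, and every detail in it (the game, the messages, the 1-to-100 range) is simply an assumption for the sake of example. It is a number-guessing game in Python:

import random

# A tiny guessing game: the computer picks a number and the player
# tries to find it in as few guesses as possible.
secret = random.randint(1, 100)
guesses = 0

print("I am thinking of a number between 1 and 100.")

while True:
    guess = int(input("Your guess: "))
    guesses += 1
    if guess < secret:
        print("Too low!")
    elif guess > secret:
        print("Too high!")
    else:
        print("Got it in", guesses, "guesses!")
        break

Fewer than twenty lines, no framework, and every line of behaviour is something the child herself can change – the range, the messages, the rules – which is exactly the relationship with the machine that typing in those 1980s listings used to build.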

I do not want to see any child in the UK or elsewhere taught how to use Microsoft Word ever again! It is a waste of time and resources, it is an insult to the intelligence and creative potential of children and, most importantly, it is a wasted opportunity for technical and economic flourishing.


Agnostic? or Apatheist…

Religion is not a topic that is acceptable at most bars and dinner tables, but when it does come up it is very common to hear people describe themselves as "Agnostic". Of course I am deeply interested in Agnosticism, in both its historical/philosophical sense and in this more popular usage, and especially in terms of what has driven people towards this position. But I also have to note that the position taken by many people appears, on examination, to be somewhat confused.

When used by the cocktail-party Agnostic, the term is intended to state a position of being undecided, and it evokes an image – a sliding scale of belief, with the Pope and the clergy on the left, Richard Dawkins and other heathens on the right, and the middle clearly labeled "Agnostic". This attractive center ground becomes an alluring default intellectual position for those who want to close conversations on the topic as soon as possible, and is also a comfortable respite for those who value the spirit of "religious tolerance", which seems a natural, though misguided, perspective for those of a liberal and fair-minded nature. I will show that the very idea of seeing Belief as a sliding scale, and of taking a middle position on it, is logically incoherent and cannot be captured by Agnosticism at all. I will begin with a couple of definitions:

Agnostic – A (not) Gnostic (knowing). So Agnostic is the position of not knowing, in an active sense. We can be agnostic about many things, though the term has a popular usage referring specifically to the position taken on the existence of god. It was first used by Thomas Huxley in the 19th century to describe a position of deep reflective practice in the natural sciences. It was a highly controversial position at the time and was equivalent to heathenism in the eyes of the religious.

Atheism – A (not) Theism (belief in god). So Atheism is the lack of belief in a god. No surprises there. But let's also point out that culturally this has been twisted a little, and the term is seen as an extreme position only. Really the definition implies that anyone lacking such beliefs, whether passively or aggressively, is a literal Atheist. This also applies to the various gods when viewed from outside – so Hindus are Atheists when we are discussing Christianity, and vice versa.

It is clear that the two definitions reflect two related but differing aspects of experience – Knowledge (in the case of Agnosticism) and Belief (in the case of Atheism). What is the relationship between Knowledge and Belief? I can believe or disbelieve an assertion based on knowledge of some relevant facts, or on a lack of them. When we see that those facts are relevant to the issue at hand, the combined knowledge of those facts can lead us to believe something. Sometimes that belief will prove valid and useful, as when I believe a storm is coming based on weather formations and previous intuitions, and sometimes it will prove to be poor judgment, as when we confuse correlation with causation. Clearly knowledge represents relationships and states of the observed world, and we use those known relationships and states to infer relationships and states in the unobserved world, including the future.

Now it is important to point out that religious belief does not quite operate in the way we just described. On the contrary, it appears that religious belief often requires one to ignore certain facts that are well known and understood (e.g. contradictions within scripture, the metaphysical incoherence of the scriptures, the scientific invalidation of the scriptures). However, even in the case of those who scorn evidence-based thinking, the believer uses judgments based on knowledge statements in order to validate a system of belief – e.g. "god speaks to me, so I believe…", "the feeling I have when I walk into a church is so strong, so I believe…"; it is just that they have made poor choices about which pieces of knowledge supersede others. Even in this case, there remains a separation of knowledge (the voice or feeling) and belief (I believe in god). Neither in this extreme religious position nor in the case of everyday beliefs can one coherently see knowledge and belief as different extremes of the same thing. Thus I can no more construct a coherent sliding scale between the extremities of knowledge and belief than I can between whether you are a lover of animals and whether you are a liberal.

The sliding-scale model is also reinforced through a misunderstanding about Atheism. As pointed out in the definition above, any person not holding the belief in god is an atheist by definition; it is therefore not an extreme position but the default position. What makes this seem strange is a subtle but popular misunderstanding of what Atheism itself means. Atheism is not the belief that god does not exist – that is Antitheism, and before you balk at the semantics please consider the difference carefully. Atheism itself is not a statement or theory to be tested and proven. It needs no evidence because it declares nothing. It is simply the lack of belief in an alternative thesis, and this is extremely important. This popular misunderstanding of Atheism as a competing notion to be weighed up against Theism contributes to the image of polar extremes and the sliding scale. To stir things up further, it has to be pointed out here that many Atheists are Agnostic – another point that would be contradictory if this sliding scale were valid, and to which I return later.

Now, one can envisage a sliding scale that represents belief in a single notion, including belief in a god. But that sliding scale is actually a meter of a single measure: the extent to which I believe X to be true, or in other words the probability I would assign to X being true. Complete lack of belief could also be represented on this single measure, as the rightmost or lowermost position, but not-knowing, or Agnosticism, would obviously have no role to play whatsoever. A middle position here could only honestly be described as "I half-believe in god". In my experience this is never the position that is meant by the Agnostic, and it is debatable whether a religious belief can really be half-held.

Agnosticism, therefore, cannot and should not be seen as the middle ground between belief and lack of belief. But to many this conversation will seem like so much semantics. The self-labeled Agnostics may agree entirely with the argument made above, but may say that the word Agnostic is not an accurate reflection of the position they were trying to take on the issue. So what is a more accurate description of what they mean? Here are a few of my attempts to guess:

“I do not know whether god exists. I am not afraid to say this because nobody really knows. But I have not studied the arguments for his existence carefully because, quite frankly, I do not class this as the most important issue in the world.”

“I do not know whether god exists. Really it sounds very silly but if it really is true then the consequences are so huge that it seems illogical to take a strong anti-god position. So I prefer not to close all doors and stay undecided.”

There are two things at play when statements such as these are made – a residue of religious belief, and ignorance of the thesis being stated – and I believe that the first is a manifestation of the second. Ignorance is not meant in a pejorative sense, but simply in the sense of passive not-knowing. We do not know because we have not inquired, in the same sense that I am ignorant of the best practices of midwifery and of the social etiquette of Central Asian countries. So is Agnosticism ignorance? Certainly not, since Agnosticism is the active form of not-knowing, in which, after taking into account all the available information regarding a question, one can only conclude that one does not know. Agnosticism and ignorance are hugely different. Whilst I am ignorant of midwifery and Central Asian custom, I am agnostic on whether string theory is valid and on whether supercomputers should be built on graphics processors rather than traditional CPUs. To return to belief: to say that I 'don't know' because I have not looked is a declaration of ignorance.

The residual belief demonstrated by the first statement does not take long to crack open from either perspective. It should be clear that the religious traditions of the world would see the holder of this notion as no different from, or even worse than, heathens like myself come judgment day. To come at it from the other side, one does not have to look very hard to see holes in the Theistic thesis. One can do so, for example, by watching a David Attenborough documentary, in the prone position if so desired. This is why I believe that the residue of belief is actually a different expression of ignorance on the subject and can be treated in the same way.

Are there valid Agnostics at all? Yes, though not in the way that is popularly thought. Many Atheists are Agnostic, as pointed out earlier. Though the lack of evidence for god speaks volumes, we cannot honestly say that we know that god does not exist. Nor does that make one particularly uncomfortable, since this is the only workable starting place for scientific enquiry. Science, as human beings' attempt to understand the way things are based on evidence, is the great Agnostic practice.

Since we have seen that Agnosticism is rarely an accurate description of the undecided believer, I will introduce a new word that better represents the positions we have described – Apatheism: Apa (from apathetic) plus Theism. So an Apatheist is someone who does not care whether they believe in god or not. The term performs as required, since it can coherently be viewed as the central point on the sliding meter of belief in the Theistic thesis, and we can adopt it without contradiction or the muddying of more useful notions. I believe that the people we are discussing have three options available:

  1. You are an Apatheist. Thanks for taking the time to read this
  2. You are a religious person and should get more involved with the church
  3. You are an atheist

If you are an Apatheist then there is a very good chance you are also apathetic to the institutionalized rape of children, science denial, the social and moral manipulation of the African continent, the subjugation of women, the lack of sexual freedom, faith-based violence, the right to die with dignity, asymmetries of natural rights, freedom of speech and, most importantly, the excavation of the true state of things. Adopting the label of Atheist, by contrast, is a small step of opposition to those things, and it will reveal the presence of a vast, growing and empowered movement that is making steady, incremental progress against each of them. Since we have shown that the middle ground was not ground at all, and that the previous position must involve some dissonance in logic and hence in mind, unravelling one's true position can lead only to greater ease and acceptance, and to a better understanding of oneself.