Technology

Born in the computer age, ‘digital natives’ have little clue as to what goes on behind the reflective screen, leaving levels of tech literacy dangerously low. By Ruby J. Murray.

Computer coding for the future

Children attend a coding camp in Silicon Valley. Credit: Stephen Chin / Flickr

Apple and the FBI are fighting. Last month, the bureau tried to force the computing behemoth to write software that would disable the security features of an iPhone used by one of the terrorists who killed 14 people and injured 22 more in an attack in California. In response, Apple didn’t roar about privacy. Instead, it threatened to invoke the First Amendment, which protects freedom of speech. Programming languages are a form of speech, said Apple, and the United States government is forcing us to speak on its behalf. It isn’t the first time the argument has been made – court cases over code as speech have been popping up since 1995 – but it is the most high profile.

The divide between those who can speak the new language of code and those who cannot is widening. As decades pass and technology gets smarter, the advent of thousands of computer programming languages has meant many of us, especially kids, are becoming increasingly deaf and dumb to the modern world.

Late last year, the Australian federal government finally endorsed the first national curriculum to include computer programming in primary and secondary schools. The new curriculum will roll out across the country over the next two years. Sadly, it’s too little too late for the first generations of “digital natives”.

Modern civilisation floats on a vast sea of code, and the tide is rising with every second. Beneath the surface of the screen, the code is always shifting. A million lines of code translates to about 18,000 pages of printed text. Slap some text on a lolcat: Photoshop runs on four million lines of code. Send it to a friend: Google’s combined internet services run on two billion lines of code. When you put your ATM card into a machine, you touch the edge of many billions of lines of COBOL supporting today’s core business applications – a corpus that makes Google look like a haiku.

The term “digital native” was coined by academic Marc Prensky to describe kids growing up after 1980 in the brave new digital dawn. Technology was meant to have fundamentally changed the way we digital natives think. “Today’s students,” wrote Prensky in 2001, “are no longer the people our educational system was designed to teach.” He urged teachers to bring more technology into the classroom.

To educate the digital natives, the Australian approach – like most of the rest of the world’s – was to plonk laptops down in schools. Kevin Rudd’s 2008 “education revolution” was in large part based on his promise to provide every high school student in the country with a screen. It was hugely expensive, and rapidly obsolete.

We’re only now beginning to realise the more-screens-equals-better-digital-education model is false. A laptop on the desk hasn’t helped students understand what’s going on behind its reflective surface. In fact, there’s evidence that some interfaces, such as the proprietary black box of the iPhone, are barriers to understanding.

As the machines get smarter, we run the risk of getting dumber; the largest study on Australian kids’ media use found that 80 per cent of 16-year-olds spend more than the recommended two hours a day on screens, yet they understand less than ever about the computer in their hands. Two generations of so-called digital natives have grown up as passive consumers of technology. In 2014, the percentage of year 10 students attaining a “proficient” computer literacy standard was 52 per cent, the lowest since the Australian Curriculum, Assessment and Reporting Authority began national evaluations in 2005.

The first digital natives are now classroom teachers themselves. In an age when we interact with computers in nearly every aspect of our lives, computing has the second-highest number of “out-of-field” teachers in Australian high schools. A third of teachers in IT classrooms have not studied a single computer-related subject after their second year of university.

The shortage is not confined to the education sector. Organisations from casinos to hospitals struggle to find enough qualified IT professionals. World governments are constantly hacking each other; in February, the New South Wales government sheepishly admitted to being hacked by China. According to the Australian Department of Education and Training, 9062 students graduated from tertiary IT courses in 1999. Since then, the numbers have almost halved. Students studying information technology as an award course dropped to a low of 4293 in 2010. In the professional world, 69 per cent of developers are wholly or partially self-taught. The education system has clearly failed us.

The fanfare around the coding aspect of the new curriculum’s digital technologies component was immense. But what does it mean to teach computing languages?

More than 170 years after Samuel Morse tapped out “What hath God wrought?” on the first telegraph line, computers themselves are still monolingual at base: they speak binary. Humans, in the meantime, have created thousands of computer languages.

We’ve developed programming languages because, as time passes and our hardware improves, the sets of instructions needed to perform even basic functions keep piling up. Increasingly abstract languages are written to cope with the sheer volume of code that complex tasks demand. Instead of spelling out every instruction, programmers use shorthand that calls on existing banks of code. A compiler then translates the programming language into machine code. The more “high level” and abstract the language, the fewer keystrokes required to perform a task, and the more the code begins to resemble human speech.
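To make the idea of abstraction concrete, here is a small sketch in Python – an interpreted high-level language chosen purely for illustration, not one singled out by the curriculum – showing the same task written twice: once step by step, and once using the kind of shorthand described above.

    # The same task at two levels of abstraction: adding up a list of numbers.
    scores = [72, 88, 95]

    # Lower level: spell out every step yourself.
    total = 0
    for score in scores:
        total = total + score

    # Higher level: one word of shorthand that calls on an existing bank of code.
    total = sum(scores)

    print(total)   # 255

    # Underneath it all, the machine still only speaks binary. Even a single
    # character is stored as bits:
    print(format(ord("A"), "08b"))   # 01000001

Whether the translation happens through a compiler or an interpreter, both versions are eventually reduced to the binary instructions the processor actually executes; the high-level one simply leaves more of the work to the machine.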

Developed in 1959 by a team inspired by the visionary computer scientist Grace Hopper, COBOL (common business-oriented language) was one of the first programming languages designed to be written in English-like statements rather than machine code. “A language has to help people talk to people. People do not talk to machines,” said one of COBOL’s developers on its 25th anniversary. In 2013, a University of Limerick academic estimated that 90 per cent of Fortune 500 companies still ran on COBOL.

“COBOL-based systems touch our lives at a minimum of 10 times a day,” explains Ed Airey, solutions marketing director at Micro Focus, an international consulting firm that helps organisations deal with the problems that come up in their code banks. “Most people travel by rail or by plane; those systems are underpinned by COBOL in the mainframe. If you go into any sort of bank, you’re interacting with a banking system, if you carry an insurance policy for your house or car … COBOL touches every aspect of our lives.”

COBOL might be functional and reliable, but it’s not popular. Certain languages are better fits for specific jobs – COBOL was developed decades before the internet, after all – but there is a cool factor at work, too. The internet is a battlefield of nerds fighting for their favourite language. COBOL is seen as slow, inelegant.

Any code bank is a living, changing field of instructions. If you can’t understand what the person before you wrote, you can’t alter it without bringing everything to a halt. Once a year, the software industry debates whether the world is about to “run out” of COBOL programmers. Ed Airey thinks the hysteria is hyperbolic; it doesn’t take long for programmers to learn a new language. The problem, however, is on the design side. Writing good code involves communicating with other people, sometimes across many generations. Design paradigms shift in the blink of an eye. Bad programmers churn out incomprehensible instructions, when what we need is elegance and intention.
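To see what that difference can look like, consider a small, hypothetical sketch in Python – not drawn from any system mentioned in this piece – of the same job written twice: once as the terse, opaque instruction a future maintainer will curse, and once written to communicate its intention.

    # A hypothetical "legacy" snippet: it works, but only its author knows what it's for.
    def f(d, x):
        return [i for i in d if i[1] > x]

    # The same logic, written to communicate with whoever reads it next.
    def overdue_accounts(accounts, day_limit):
        """Return the accounts that are more than day_limit days overdue."""
        return [account for account in accounts if account[1] > day_limit]

    accounts = [("Acme Pty Ltd", 45), ("Bluegum Cafe", 3)]
    print(overdue_accounts(accounts, 30))   # [('Acme Pty Ltd', 45)]

Both functions do exactly the same thing; only the second tells the next programmer – perhaps decades later, fluent in languages that don’t yet exist – what it was for.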

This is the problem of legacy code: code that is unintelligible to everyone but the person who wrote it. Some programmers consider code “legacy” from the moment it’s written. For veteran software engineer Robert C. Martin, legacy code “conjures images of slogging through a murky swamp of tangled undergrowth with leeches beneath and stinging flies above. It conjures odours of murk, slime, stagnancy and offal.” Dealing with legacy code takes up vast amounts of resources, and it’s a problem that grows more dire by the minute.

When the digital technologies component of the Australian national curriculum was first announced, commentators and industry alike began crowing that teaching kids to code was the best way to ensure their employability in the 21st century. Whatever they learn could be tomorrow’s COBOL. The language is a means to an end: communication.

We have always been too transactional when it comes to understanding the power of computers. Today’s illiterate digital natives need to learn how to code, but more importantly, they need to know how to think computationally; how to talk to each other as well as to the impenetrable black box and permanent companion in their hands. Understanding code is just the first step in designing a digital future in which we retain our agency, both on and off the screen.

Our digital lives increasingly involve complicated ethical choices, spanning hardware, software and culture. Is code speech? Can the government force Apple to write code? What does it mean to retain privacy, and how should we do it? As the first generation of digital natives has taught us, sitting in front of a screen doesn’t automatically teach you how to manipulate it.

There are plenty of good resources for teaching kids to code, from the Massachusetts Institute of Technology’s free Scratch program to non-profit groups such as Code the Future, which places developers in classrooms. When it comes to teaching the crucial questions of intent and design, though, we aren’t ready. The federal government has pledged $3.5 million towards developing support resources for teachers. It doesn’t seem like a lot for teachers battling to reverse the downward trend in tech literacy while rolling out a whole new curriculum – not to mention the fact that the teachers themselves need teaching.

We have the choice now over whether we want to live in a black box floating on a mass of stinking code, or in an elegant world where we understand what is happening behind the reflective screen.

The future descendants of Samuel Morse won’t be looking at each other and asking, “What hath God wrought?” They’ll be looking back at us, the first generations of so-called digital natives, and asking: “What the hell have you written?”

This article was first published in the print edition of The Saturday Paper on April 9, 2016 as "Coding the future".

