Wednesday, March 5, 2014

Australia's Digital Technologies curriculum

I recently stumbled across this article on Australia's new Digital Technologies curriculum at The Conversation.

I was considering writing a reply to it here when I scrolled down and found a brilliant comment from Bruce Fuda that I think deserves highlighting. Re-posted here with Bruce's permission. Bruce writes:
As an IT teacher who has the skills and knowledge to deliver this curriculum, I get a little frustrated by some of the ongoing concerns people keep expressing about the curriculum, largely because many of the criticisms rest on underlying assumptions that need to be challenged.

The Digital Technologies curriculum does not insist that students become programmers - at least no more than the English curriculum insists they become authors, the Mathematics curriculum insists they become mathematicians or the Science curriculum insists they become scientists.
Many of the same questions about the relevance of the included content can be asked of other learning areas - such as the need for students to understand stem-and-leaf plots in Mathematics, or the structure of multi-cellular organisms. Look at all of the curriculum documents (and it is important we differentiate the curriculum from a syllabus - they are different things) and you'll find that, if it really came down to it, you could question the inclusion of many of the skills and understandings the writers in each area have chosen to focus on.

That aside, the other major concern people have is the crowded nature of the curriculum and the time it demands; however, this only arises because many commentators still insist on looking at the subjects as independent of one another. We look at the Science curriculum and then, at school, we teach kids Science. We do the same with Maths, English... Why? How many times in the real world do we look at a problem and say "oh, that's a problem that can only be solved by mathematics, I'm not going to consider any of my scientific or social understanding to come up with an answer"?

The curriculum has been written with the interdependence and relationships between the learning areas in mind - or at least that is my understanding. We talk about falling levels of literacy and numeracy, and then argue that this is a case for eliminating 'non-critical' subjects from students' learning? Surely the reason students are not engaging with school is that the way they are being taught isn't working for them. It is possible to teach many numeracy and literacy concepts using much of what has been included in the Digital Technologies curriculum. Similarly, you can teach programming within the context of mathematics [see the first sketch after Bruce's comment], algorithms as recipes in a kitchen, and data representation as an exploration of pattern recognition and language translation.

To dismiss the inclusion of programming simply because not every kid needs to become a programmer completely fails to recognise the importance of logical reasoning and the methodical development of algorithmic solutions to complex problems - a critical skill that can be developed through learning computational thinking. Not every student will end up being a mathematician, so why do they need to know about polynomials and parabolas?
And I also don't think it is sufficient to argue that a lack of trained teachers is reason enough for the subject to be relegated to a position of less importance. The curriculum should be both aspirational and intended - it is up to schools, society and teacher-training programs to find reasons to encourage people with the skills and knowledge required to teach the curriculum to consider joining the profession. The same argument would not be applied to any other learning area - we would never say that not having enough English teachers would be reason enough to stop teaching English, would we?

The use of technology for the "thrill" of using it is fine - I've got no problem with people making use of the great technology available to better their lives etc. But accepting technology as "magic" is not acceptable in the longer-term if we want to continue to develop as a society. Would we be where we are today if we had simply accepted the idea that rain just happened and didn't instead seek out a reason for it? We have the technology that we have today because people who found the passion and excitement to learn more about it did so through curiosity and interest.

We can make the Digital Technologies curriculum interesting for all students, just like we can for every other learning area. The first step in making that a reality is to stop artificially segregating the subjects and to emphasise the interdependence that exists across every discipline of knowledge. When designing a lesson or unit of work, what we need to do is look across multiple learning areas and find ways to engage students with lots of different interests - to connect what they are learning to their world.

Does this mean every child will like learning every aspect of the DT curriculum? No, just like not every child will enjoy Maths, Science or other subjects. But we can at least develop in them an appreciation of the value each discipline has, and the impact of each on their way of life now and in the future.

Oh - and on the last point re: not including Scratch (or anything else) in high school - the curriculum doesn't do that. There is nothing that precludes the use of visual programming to teach concepts from any learning area. What has been expressly mentioned is that students learn about general-purpose programming languages. These differ from drag-and-drop visual languages in that they allow significantly more general computation. They are important, but that doesn't mean that other, more familiar platforms or languages can't be used to address other aspects of the curriculum. I use a similar technique to explore recursion with my students, producing fantastic-looking artwork using Context-Free grammars and exploring randomness as well (which is a nice way of visualising genetic mutation) [see the second sketch after this comment].

We need to stop looking at movement through the bands as discrete periods of learning - it is a continuum and the learning that takes place in earlier bands should be used as the foundation for learning in later ones.
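A quick note from me on Bruce's point about teaching mathematics through programming. Here is a minimal sketch of what that can look like - my own illustration in Python, not anything prescribed by the curriculum - tabulating points on a parabola so students can experiment with how the coefficients change its shape:

    # My sketch, not part of the curriculum: tabulate points on
    # y = a*x^2 + b*x + c and let students vary the coefficients.

    def parabola(a, b, c, xs):
        """Return (x, y) points on y = a*x**2 + b*x + c."""
        return [(x, a * x ** 2 + b * x + c) for x in xs]

    # y = x^2 - 4: where does it cross the x-axis?
    for x, y in parabola(a=1, b=0, c=-4, xs=range(-4, 5)):
        print(f"x = {x:3}  y = {y:4}")

Change a, b or c, re-run, and the table shows you how the parabola moved - exactly the kind of experimentation that connects the two subjects.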
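And on Bruce's recursion example: he uses Context-Free grammars for his artwork, but the same spirit fits in a short recursive sketch with Python's standard turtle module. Again, this is my own hedged illustration rather than Bruce's actual setup; the random perturbation of angle and branch length is what plays the role of 'mutation':

    import random
    import turtle

    def tree(t, length, depth):
        """Draw one branch, then recurse into two smaller,
        randomly perturbed branches."""
        if depth == 0:
            return
        t.forward(length)
        angle = random.uniform(15, 35)     # random spread per fork
        shrink = random.uniform(0.6, 0.8)  # random shortening
        t.left(angle)
        tree(t, length * shrink, depth - 1)
        t.right(2 * angle)
        tree(t, length * shrink, depth - 1)
        t.left(angle)                      # restore original heading
        t.backward(length)                 # return to base of branch

    t = turtle.Turtle()
    t.speed(0)    # draw as fast as possible
    t.left(90)    # point the turtle upwards
    tree(t, length=80, depth=8)
    turtle.done()

Every run produces a different tree, which gives you the genetic-mutation visual Bruce mentions for free.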

Wednesday, January 19, 2011

Digital Literacy - a follow-up

Last time I posted here, I critiqued the concept that we should all become programmers. Instead, I suggested that interface literacy was the key to giving people the power they need to make informed decisions about current technologies.

Today, I've come across this article - Teaching Digital Literacy - which includes a video featuring Douglas Rushkoff talking about the subject of his new book Program or Be Programmed: Ten Commands for a Digital Age.

I'm currently trying to purchase a copy of the eBook to get a better understanding of the arguments Rushkoff is making (the vendor, OR Books, keeps 404-ing when I try to pay - nice work*). I still have issues with the terminology used, but his point that the digital medium is just as full of bias as television, print and radio is really valuable. And that bias lies not only in the content itself, but also in the method by which the content is shared.

Will post a follow-up here when I've had a chance to sit down and spend some quality time with the book. (If OR Books manages to fix their website.)

*UPDATE 29th Jan: I emailed them regarding the payment issue; still unable to pay via PayPal, and their non-PayPal payment method is not handled with a sufficient level of security, so I won't be purchasing the book for a while longer. Still, I have found this free sample chapter. And ooh, ooh. I have something to say about this. Next time.

Monday, November 1, 2010

Programming - the New Literacy?

We use technology to create content. Literacy is a measure of how well you can create, contextualise, and accurately interpret content.

That content could be updating your status on a social network. It could be your SmartRider card logging your daily trips. We are all generating data, telling stories, leaving imprints with our technology - be it pens and paper or binary code in a machine.

In his article Programming is the New Literacy, Marc Prensky postulates that knowing how to write programming code will become an essential skill. You will not be considered literate without it.

It is an interesting argument. But I think he is somewhat confused about the nature of programming.

A computer is a tool requiring both hardware and software. This is where the confusion lies. Prensky confuses the language of software with the language of human thought. But software, while written using a 'language', does not communicate human ideas. It communicates machine ideas. Software itself is a construction - it is a tool made out of code. And just as we are not required to build a telephone to be considered capable of holding a discussion, we will not all be required to build software to be considered literate in computer interfaces.

The author also seems to confuse interface literacy with programming. The examples he provides of teens 'programming' are things like "downloading a ringtone" or "customizing your mobile phone or desktop". This is not programming. This is understanding how to use a tool - much like understanding how to change the ambient temperature of your refrigerator.

I love programming (well, on good days) but do I think it will become a required skill? No. Do I think interface literacy will become an essential communication skill? Yes! Eventually. Just as we are required to understand how to use pen and paper in order to write essays in Year 10 English Literature.

You might think I'm being pedantic. Listen, this is me being pedantic: Flash is not a programming language! It is a piece of software! ActionScript is the 'programming language' used by Flash. That said, you may hear old-school programmers refer to it as 'scripting' rather than 'programming', due to it not needing to be compiled, amongst other things. (Okay, one of the 'other things': true programming languages usually let you write any sort of utility you want. While I could make a game or short animation with ActionScript, I could not write a boot loader with it - hat-tip to Pixelseeker for the example.)

Yes. Now I am being somewhat pedantic.

Towards the end of the article Prensky re-defines programming as 'the ability to control machines'. While the semantics make me cry (oh, fine, not really, at most they add another micro-twitch to the nervous tic I'm working on to increase my nerd cred), I agree completely with the message. The ability to control our society's current, predominant technologies is vital for any individual.

Once upon a time, our latest technologies were zippers and velcro. Today, computer interfaces. We need to understand these in the same way we need to understand how to use a zipper; if we fail, we'll end up looking like a bit of a dunce.