22 October 2016

1:1 - Why?


1:1 via edtechteacher.org

Every now and then I come across an article that, while seemingly innocuous on the surface, causes me incredible consternation. Articles like this one:

"Kids Who Have to Share iPads Learn Better Than Kids Who Have Their Own".

The article is not a new one, but like many articles of its ilk, it has a habit of resurfacing periodically, as it did this week, finally motivating me to put fingers to keys.

There are so many things wrong with the assumptions made by the writer of this article that it’s hard to know where to start. So in the absence of any better course of action, I’ll start at the beginning.

Firstly, can we all just agree that sharing is, of course, a good thing, and so, by implication, is learning to share? But the truth is that it's the sharing that is beneficial, not the device being shared. I see kids sharing and collaborating all the time, even when using their own screens; the extent to which this happens is all to do with the classroom culture carefully crafted by a caring teacher and nothing to do with the nature of the particular item being shared.

Secondly, what is the evidence basis for the findings of the research? Performance in “a standardized literacy test at the end of the year compared to the beginning”. Oh, that’s okay then; God forbid we should have any other metric in school for judging the efficacy of any initiative other than a test. I hate to imagine what the nature of this test was, but something tells me it involved a lot of multiple choice questions, maybe even a few cloze passages... I loathe the way so many of these kinds of studies assume that a standardised test is an acceptable measure of success for everything. It isn't acceptable; it's completely unacceptable, not to mention completely irrelevant... Just because it's easy to measure doesn't make it valuable. Plenty of other people have made this argument better than I could here, starting with the magnificent Alfie Kohn.

An improvement of 28% v 24% in a study of 352 students really is not statistically significant, despite what the study's author says; another reason why we don't rely on one source for anything of any real substance. Then, if that wasn’t bad enough, the study extrapolated the results of a literacy test to relate to their work with basic geometry?
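The study's actual design isn't given here, but a quick back-of-the-envelope check supports the point. If we treat the 28% v 24% figures as the proportions of students who improved, with the 352 participants split evenly between the two groups (both assumptions of mine, not details from the study), a standard two-proportion z-test sketch looks like this:

```python
from math import sqrt

# Assumption (not from the study): 28% vs 24% are proportions of
# students improving, with 352 participants split evenly into two groups.
n1 = n2 = 176
p1, p2 = 0.28, 0.24

# Pooled proportion and standard error for a two-proportion z-test
p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))

z = (p1 - p2) / se
print(round(z, 2))  # well below the ~1.96 threshold for p < 0.05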

I could possibly accept basing the efficacy of a study on a standardised test if the focus of the study was specifically related to the test, eg working on improving spelling, but in this case, as in most cases of this kind, they make no effort whatsoever to relate the standardised test to the actual nature of the use of the devices. That tells you a great deal about the study: they didn't feel it worthwhile to actually describe what the iPads were being used for, which would seem to be as glorified textbooks, which in turn would explain why they felt a standardised test would be a valid measure. All they are concerned with is measuring the extent to which students have absorbed specific surface content, without any consideration of deep conceptual development, creativity, and all those other soft skills that really do matter much more. You see, in a classroom where all the iPads are used for is glorified textbooks, or for educational "games" and skill drill, then sharing one iPad between five, or ten, or even twenty really is not a problem. But a classroom where the teacher expects kids to actually create things that are meaningful over time is a classroom that benefits from the lowest possible ratio of students to devices.


What has all this got to do with 1:1?

Whenever I encounter someone who is under the impression that providing students with their own device is a little, well, excessive, I know there is something profoundly dubious about the assumptions they make about the way we encourage students to use these devices. The truth is you can be sure that any advocate for shared devices never shares their own device 50:50. Can you imagine how far you would get in your daily work if you had to share your laptop 50:50 with a colleague in the office? You can be sure that the same person so gleefully anticipating a social nirvana where all of these students happily share their devices is suffering from a profound case of media bias, or device disorder. I’m sure the same person would never countenance asking the same students to share a pencil, or a paintbrush. How about an exercise book? You start from the front, and I’ll start from the back... These devices are all tools, very few of which were purpose built for a classroom, but all of which can be very successfully repurposed for an educational context by skilled teachers. I find that teachers who are blasé about the need for students to have their own devices reveal more about the lack of importance they attach to the device than about the use of it.

Don’t misunderstand me, I am not saying that a 1:1 context is a prerequisite for successful learning; many teachers all over the world do amazing things every day with limited resources, but that doesn’t mean that this paucity of resources is something they find preferable! Anyone who thinks so clearly has never attempted to use these devices themselves.

Allow me to illustrate with an analogy.

Cycling is good for you; it’s also much less harmful for the environment than an aeroplane. So next time you want to travel between, say, London and Singapore, don’t fly, cycle!

This logic only makes sense if you have never had to actually travel between London and Singapore yourself (and if you’re not in a hurry). There is something to be said as well for determination: I have a good friend who shares his laptop with the 24 kids in his class, on a rota basis. Do they benefit? Yes. Is the sharing beneficial for them? Maybe. Is this his preferred arrangement? Of course not.

Back to the bicycle.

Would I ever countenance the idea of cycling from London to Singapore? No ... unless that was the only way I was ever going to visit Asia, and time was no object. Consciousness of the desirability of the goal has a direct bearing on one's determination to persevere despite the obstacles that may be present. Would it be good for me? Yes. So am I going to do it? No. I am not. Well, maybe. For many years, teachers who are profoundly aware of the value of designing experiences for their students to enhance their learning with digital tools have persevered despite many obstacles to make this a reality for their students. But would they prefer 1:1? Of course they would. How do I know? I was one, more than once. Scavenging abandoned computers, salvaging parts, and spending hours beyond number to build a rudimentary ‘lab’ for my students was a frequent experience for me when I was wrestling to enhance the learning of my students in the early days at the turn of the century, when ‘TEL’ was yet to become a ‘thing’.

1:1 works better - shall I count the ways?

When my school announced five years ago that we were embarking on a tech enhanced learning (TEL) initiative, it was assumed that the 1:1 ratio applied only to older students, middle school and up. While the ratio of devices in the primary school was going to be increased, from about 5:1 to more like 2:1, the intention was never to provide 1:1 in the primary school as well. So what changed their minds? I did.

Can we work with shared devices? Yes. Can we work better when we have our own device? Yes. Interestingly, the main pressure to go 1:1 came from our teachers: even after we expanded to a 2:1 ratio, the more effective they became at utilising digital tech, the more ridiculous expecting the kids to share devices became.

The truth is that the benefits of 1:1 have really surprised me; I was somewhat oblivious to how powerful it really is, even just from a logistical standpoint. With shared devices it is all too common for students to accidentally delete each other's work, which is quite soul destroying, and with video editing in the junior school especially, attempting to work on a project over several weeks is impossible on a shared machine. This means that any creating on the device (the most important use) has to be confined to short, simple activities that can be started and completed within one lesson, which really does diminish the power of these tools.

This means that the main reason for going 1:1 is not really about two kids needing to use the device at the same time, although that is a factor; it's about honouring and protecting the importance of the media created by each individual child. The biggest advantage I found in going 1:1 is that the work on the device cannot be accidentally tampered with or deleted by a well-meaning (or maybe not so well-meaning) friend. If all the kids use the device for is shallow tasks like skill and drill apps, taking tests, and passively consuming media, then clearly sharing them is less of an issue. However, I think this actually highlights a bigger problem! If we are encouraging our kids to do meaningful creative work on these devices, then they will have media saved on the device that they would be upset about losing if it were accidentally deleted by a classmate.


Not to mention the issue of 'ownership'. A child who is responsible for their own device, apart from the obvious personal and social merits of having to take that responsibility, is also a child who feels that the work on there is work that is all theirs. This aspect became apparent quickly: kids really do benefit from their "ownership" of one device, including in ways we hadn’t anticipated, such as customising it so that it operates the way they want it to; using a picture of their face for the wallpaper; and being able to choose to share content on their iPads with their parents directly, the kind of thing that a one-to-one environment makes very straightforward but that a shared environment would make quite difficult. This even extends to the physical device itself: sharing ‘their’ device with their parents at a parent teacher conference. There is something quite empowering about that kind of "ownership", even at such a young age. It encourages a sense of responsibility that is powerful in terms of 'digital citizenship': the teacher can expect the student, for example, to curate and manage their camera roll and its media responsibly, and there is no way the student can evade responsibility by blaming other students who also use the iPad, a common issue with shared devices.

So when I encounter people who are under the impression that 1:1 is excessive (the implication in this article), I know there is an assumption behind these ideas that the digital tools are used so infrequently and so ineffectively (ie skill drill, and games) that expecting kids to share them is no big deal. But in classrooms where these tools are effectively integrated and used to record, reflect and create, they are actually very difficult to share: not because of a lack of willingness to do so, but because both kids actually need to use the device at the same time, and really value the content they are curating and collecting on their own device. You can be sure the journalist who wrote the article wasn’t using a machine she was sharing; why?

She uses it to create

16 September 2016

Creative Commons—Criticised


Ever since I ventured into the world of pedagogical technology/tech integration I have never ceased to be amazed at the way so many tech integrators wear the Creative Commons (CC) thing like a badge of honour; it annoyed me then, and it annoys me now. A simple bit of Googling will reveal that I am not alone in my criticism, although John Dvorak's stinging criticism of Creative Commons puts into words all of my concerns better than I could or will attempt to do in this post, not to mention this post by Kent Anderson. My concerns here are as an educator who feels constantly pressured by a minority of 'techie types' to push this agenda in my practice. There are few better examples of the way tech types can lose contact with the reality of the classroom than this. So here they are: TEN reasons (in no particular order of importance) why Creative Commons drives me crazy.

Reality Check #1

Ignoring Google for image searches is inauthentic and quite frankly ridiculous. Look, I hate to break it to you, but the stunning fact is that anyone (other than members of the cult of CC) who is looking for an image to adorn any aspect of their digital practice will ... GOOGLE it. That's it. Expecting anyone, but especially kids, not to use Google and instead to search for inferior images in obscure (to most people) corners of the Web really is ridiculous.

Reality Check #2

If IT interferes with the actual goal of teaching and learning, the actual curriculum content, forget IT (geddit?). When I send my kids on an image scavenger hunt (eg find images of 5 famous people on the internet who inspire you) I don't care about CC, I care about their grappling with the key concepts and content, so I want my students to google it, not CC it. Pursuing CC practice effectively turns what should be a 10 minute activity into a 40 minute trawl accompanied by tedious (not to mention ugly) attribution. The focus effectively shifts away from the content and onto the technology, the tool, which misses the point; the tech should be the medium, not the message.

Reality Check #3

It's theft? Nonsense. Can we please STOP lying to our kids? While a handful of CC activists relentlessly pursue their moral crusade, the rest of us in the real world will need to figure out how best to work within a web where 'piracy' is rampant, symptomatic of deeper issues that are a great deal more complex than the simplistic arguments pushed by those who should know better. Paul Tassi over at Forbes explains this nicely:

"Piracy is not raiding and plundering Best Buys and FYEs, smashing the windows and running out with the loot. It’s like being placed in a store full of every DVD in existence. There are no employees, no security guards, and when you take a copy of a movie, another one materializes in its place, so you’re not actually taking anything. If you were in such a store, you’d only have your base moral convictions to keep you from cloning every movie in sight. And anyone who knows how to get to this store isn’t going to let their conscience stop them, especially when there is no tangible “loss” to even feel bad about.
It’s not a physical product that’s being taken. There’s nothing going missing, which is generally the hallmark of any good theft. The movie and music industries’ claim that each download is a lost sale is absurd. I might take every movie in that fictional store if I was able to, but would I have spent $3 million to legally buy every single DVD? No, I’d probably have picked my two favorite movies and gone home. So yes, there are losses, but they are miniscule compared to what the companies actually claim they’re losing."


Reality Check #4

If you don't want it stolen (or used), simple: secure it. Now I can accept that there are 'starving artists' out there who desperately need to be paid for the work they do; well, if you want to get paid for it, you need to take some basic steps to secure your own intellectual property, just like you would with your physical property. Let's face it, it's not hard to watermark an image, or to only make low res images available for browsing purposes (just two examples from many). Look, I own a car, a motorbike, and a bicycle, but when I park them I ... LOCK them, to prevent their theft. If you're going to put your content out there, without restriction, then you can expect it to get used; in fact I operate on the assumption that the images thrown up by Google fall into this category: either free to use (public domain, and let's face it, domains don't come much more public than Google search results), or the owner doesn't really care. Like the newspaper I perused the other day that some kind soul left on the train, the friend's book I've borrowed, the DVD I watched at a relative's abode, my public Picasa albums, this is fair use. More on that later...

Reality Check #5

Not everyone is bothered about getting credit. In fact in the 'real world' the norm, especially amongst teachers, is 'freely receive, freely give'. When I share teaching resources with teachers, do I expect to be cited or credited? No. Why would it matter to me that a bunch of people I don't know, have never met, and likely never will meet don't know that I created it? My name, my credit, my status are not important here; I would rather my content gets used regardless of who gets credit. That's the goal: not status, not ego, but helping others with materials I had to create for myself anyway. I'd rather they got used than languish in obscurity. So my resources are out there, free to use. Granted, I used images I acquired freely via Google, and for that reason I'm happy for the content I 'remix' or repurpose to be used likewise; it would be rather hypocritical of me if I weren't. Let's face it, if you really believe that everything you create is truly unique, never been seen or done before, I am sorry to say it, but you are naive my friend, and you need to watch the 'Everything is a Remix' videos, and/or the TED talk ASAP.
"If I write something on my blog, for example, and decide not to cover it with the general copyright notice, I can simply say that it is in the public domain and be done with it. I do not need permission from Creative Commons, nor do I need to mention Creative Commons or anything else. It's in the public domain by my personally allowing it to be so. This is my right! I don't need a middleman—a Creative Commons Commissar—to approve my decision. And yet there is this perception that I do." (Dvorak, 2005)

Reality Check #6

Not all content is created equal. There is a big difference between the use (and reuse/repurposing) of text and the use of images. When we're watching someone's amazing, riveting, bullet point riddled PowerPoint, I do not assume that any images contained therein are images they created themselves or 'own'; in fact, like most people, I believe, we assume the opposite: the images they use to illustrate their work are not theirs unless they say otherwise. BUT, the same cannot be said of their writing... Any writing in a published piece of work is assumed to be the creation of the author unless it is appropriately cited; put simply, copyright is a very different issue from plagiarism, so let's not convolute things by confusing them. As far as I'm concerned, I accept the reasons why it is important to cite text, but I don't feel the need to attribute images found in the 'public domain'.

Reality Check #7


Attribution is not simple, and it is not age appropriate. In a primary school, just teaching 8 year old kids how to search for an image in the default search window is tricky enough without further complicating things by making them search only specific types of databases, and of course extending that practice home is even more tricky. A CC search doesn't even aggregate images, so kids could need to go to literally 10 different sources (Europeana, Fotopedia, etc.) to search effectively for one image. CC attribution, with its licences and the permutations thereof, is complex (not to mention contentious). CC is understood by very few people, especially the permutations of the six types of attributable licences that may be applied. Are we really expected to get into this with school kids, and with their teachers, who are struggling to rename a folder? Sounds like misplaced priorities to me.

Reality Check #8

Most of us are not living in the USA. US copyright law does not have global jurisdiction. For example, Singapore (where I live and work) has its own version of 'fair use' copyright law, called 'fair dealings':

Permitted Acts
12.1.11    The CA has several provisions permitting certain acts which do not constitute copyright infringements. These acts are intended to strike a fair balance between the interest of copyright owners and the public interest. They include acts (popularly known as “fair dealings”) for the purposes of research and study, criticism or review, and reporting current events.
http://www.singaporelaw.sg/sglaw/laws-of-singapore/commercial-law/chapter-12


'The effect on the potential market' is a critical element. The use by our kids of Google images has not, does not, and will not affect the market potential; ie I'm not charging for these, and they're not losing income by my use of them. There is no way I could have paid for any of them even if I wanted to, as actually locating the original copyright holder is very difficult, and certainly not something I would encourage primary school kids to do. "Hey kids, email <complete and utter Internet stranger> and ask if you can use their Flickr image..."? No way. And yet this is exactly the advice I hear CC proponents giving. No, really.

Reality Check #9

This is not a legal issue, it is a moral one. We're no longer talking about a question of blind adherence to law; we're now engaged in a moral argument instead. And that's where I get uncomfortable: I have all sorts of moral positions I keep to myself, and I feel that this is a case in point. It feels too much like someone else's moral perspective being forced upon me.

Let's face the FACT (see what I did there?) that this is a moral issue, and accept that, like many moral issues, there are nuances to these arguments that are perhaps best not placed in the hands of a teacher who just wants their kids to find a picture of Pol Pot and Mother Theresa (probably not in the same image) for a project illustration.

What with T-shirts and passionate evangelism, the proponents of CC sound a lot more like religious activists than people who are serious about engaging with the nexus of technological tools, pedagogy, and subject content (TPACK). So, like fundamentalist religious beliefs, let's keep the moralising to ourselves, and please, keep it out of the classroom. The rest of us (yes, I'm a 'fundamentalist', but about what?) manage to do that; all I'm asking is that CC fundamentalists do likewise.
Fundamentalist: A usually religious movement or point of view characterized by a return to fundamental principles, by rigid adherence to those principles, and often by intolerance of other views... 

Reality Check #10

Treatment of virtual and actual property needs to be consistent. If we're going to continue down this road of virtual content as 'property', are we going to follow through? No more lending of books or DVDs to friends; no use of any music you 'own' (you don't own it); purchasing your media in multiple formats, that film on DVD, mp4 and Blu-ray... things will get very tricky very quickly.

A case in point: I'm currently reading a book given to me by a friend; of course he read it himself first, but now will I purchase that book? No. And what if he picked up the book in a used book store in Phuket (like I often do)? That's two purchases the publisher (with 10% to the author) will never see. And what if I pass my copy on to someone else?

The point is that the book is a COPY, and ownership of a copy is a very different matter from intellectual ownership; ie at no point in this little chain of exchange did anyone assume that any of the parties was actually the author of the book. Likewise with images found on the Internet. In my experience the people who shout loudest about intellectual property do so because they are oblivious to its complexity and its irrationality; and is there any chance that they have no problem with loaning copies of their books (et al) to their friends?



Creative Commons Conclusions

In the interest of keeping an open mind, I have tried to embrace CC... this post is largely the result of the frustrations of those attempts. Whenever I have attempted to restrict myself to CC images, the pickings were very poor, so I voted with my feet, and went back to Google.

The images I found in CC sources were generally, well, less than useful; I'm not looking for pretty desktop 'wallpaper', I'm looking for a powerful image to illustrate a point. For searches like 'problem solving', 'frustration', 'gaming', 'balance': the images in CC = useless, Google = awesome.

All of my presentations definitely fall within the legal definition of:

"for the purposes of research and study, criticism or review, and reporting current events"

Like one I worked on about gaming, where I needed to illustrate all sorts of things: lots of game covers for starters, COD4 etc., but also images of kids gaming, broken tennis racquets, frustration, anger, joy, flow, boss fights, screen shots of recommended websites (that incorporate incidental trademarks). I need powerful images to make my point; my Keynotes are about 90% images, 10% text. Does anyone really expect me to talk about the pros and cons of COD in a presentation and not have an image of the game covers under discussion on display? I mean, check out these puppies I got for a search on 'call of duty'. Or, ironically, I could get this, which is listed as 'Showing Creative Commons-licensed content' but clearly is NOT.

More to the point, most of the game covers I showed will more than likely be purchased by parents in the audience, for their children, specifically because of my recommendation. So my use of non CC images, far from conflicting with a 'potential market' is actually creating a potential market.

I actually began the laborious process of finding those images (about 80 in total so far) in good faith, using CC, Wikicommons etc., all added to my search engine list, and it was dire. It quickly became apparent that it was utterly impractical.

Would I expect this of my students when I can't cope with it myself? No way.

Fair use is most likely our best solution...

fair use
noun
(in US copyright law) The doctrine that brief excerpts of copyright material may, under certain circumstances, be quoted verbatim for purposes such as criticism, news reporting, teaching, and research, without the need for permission from or payment to the copyright holder

Free & Easy

It all boils down to the old adage, 'Do as you would be done by', or another one, from one of my favourite books, "Freely you have received, freely give". From time to time I find myself forced into creating content, only because a thorough trawl of the web yields nothing suitable; so I find what I can, create what I can't find, and cobble it all together (remix) to make something fit for purpose. Do I smother it in CC? No, I don't. It's out there, it's public, help yourself. Sure, it would be nice if you gave me credit for creating it, but do I really care whether a bunch of people somewhere, who I don't know, have never met, and most likely never will, have heard or read my name? Not really.

Do I want my work to go further than myself? Absolutely. 


Do I condone anyone stealing someone else's intellectual property? Absolutely not. 


There is a world of difference between copying and repurposing content, and copying and pretending that you are the original creator/author of it. Blurring these boundaries with the confusing and condescending 'creative [not so] commons' is not helping anyone, least of all the creators [and REcreators] of content themselves, and certainly not their teachers, the vast majority of whom ignore it or are oblivious of it anyway.


15 June 2016

Laptops and Lowest Common Denominators


A colleague shared the outcome of recent research into the efficacy of 1:1 laptop schools; more specifically, a meta-analysis of 10 studies that examine the impact of laptop programs on students’ academic achievement.

I read it with mixed feelings. On the one hand, the fact that the researchers concluded that schools where students have their own laptops see "significantly positive average effect sizes in English, writing, mathematics, and science" was encouraging. They felt that "the impact of laptop programs on general teaching and learning processes and perceptions was generally positive".

So that's good then. Right?


Well, no; not really.

I couldn't help but notice that their expectations of the actual use of these devices were, to put it mildly, far from tapping into the true potential of these devices. This is despite their inspirational opening clarion call to change the world:

"We believe that the affordances of computers for learning and knowledge production are radically different from those of radio, television, and film, which explains why computers, unlike those previous technologies, are bound to have a very different educational fate from the one suggested by Cuban (1993a, p 185), who wrote that “computer meets classroom: classroom wins.”"

So exactly what uses do they have in mind? How do they envisage these radically different affordances? By inspiring the creative expression of learning through the exciting synergies between video, image, text, audio and the deft analysis and application of data?

No, they see the main affordances of these devices in terms of use "to write and revise papers, conduct Internet searches, and engage in personalized instruction and assessment using educational software or online tools". (p 2)

What? That's it? These devices hold the potential to radically transform their world, but let's just use them to type up reports (so they're nice and neat), Google stuff, and take online tests.

How depressing.

Then it gets worse. Having applied the law of the lowest possible expectations of these tools, they proceed to use the worst possible measure to determine their efficacy. We find ourselves in familiar territory: faced with the option of assessing the aspects of learning that are the most important (creativity, solutions, innovation etc), but of course the most difficult to quantify, they instead opt to measure the aspects of learning that are the easiest to quantify, with, yes, you guessed it, standardised tests:

"quantitative findings of the impact on students’ academic achievement. [...] Measurements of academic achievement were standardized assessments or norm-referenced district- or school-wide tests." (p 5)

So the measure of efficacy all boils down to that which can be measured on a standardised test. How depressing, and how inappropriate for a medium as rich as digital technology. Like judging a ballerina's dancing ability based on her spelling. So we use them ineffectively, then assess the efficacy of their use in ways that are utterly unsuitable. Is this really what we expect when we talk about 'technology enhanced learning' in 1:1 environments?

I sincerely hope not. I tell you what though, it would make my job a lot easier if I did.

To be fair to this study, they do accept that there are problems with the ways they are assessing the efficacy of these devices, "studies on this topic have largely done a poor job of assessing learning outcomes that are not well captured by current iterations of standardized tests. As the United States and other countries move to more sophisticated forms of standardized assessment, these new measures may be better aligned with the learning goals believed to be associated with laptop use." (p 25)

I have to wonder whether the corporate world is as obsessed with trying to validate the influence of digital technologies in the workplace as we are with attempting to defend them in the classroom. Do any of us really believe that the corporate world would be better off without digital technologies? Then why would we believe that classrooms would be better off without them? Do we really believe the corporate world would spend the millions, if not billions, it must cost every year to maintain their IT infrastructures if they did not feel it was essential, important, effective?

Don't Settle

As it is, I'm not prepared to settle for a lowest common denominator approach, where we abandon any attempt at using these devices anywhere near their potential, and instead settle for using them in the ways that are the easiest, and therefore the most common, even if they are far from the most effective. By easiest I mean the ways of working that most closely replicate the traditional approaches to learning that were the norm before the advent of the digital revolution: writing becomes typing, and researching in the library becomes Googling, the results of which we present, in writing. That's it.

Vitamin D[igital]: Video, Image, Text, Audio, Data. VITAD.

No, these laptops should be used to create in all five domains: not just text, but image, video, audio and data, and all sorts of overlaps between them. These technologies should exploit all of the attributes that digital tools excel at: situated learning/working, access to global communities/experts, multimodal artefacts, mutable work flows, and sharing and collaborating on content using social networks.

In the paper, they state,

"Contrary to Cuban’s (2003) argument that computers are “oversold and underused” (p 179) in schools, laptop environments are reshaping many aspects of education in K–12 schools." (p 24)

But the truth is that if all we are exploiting is their use as word processors and web browsers, these machines are definitely underused. I guess this could be described as giving up on transformation and focusing on amplification instead; if I'm honest, is that a bad thing? It would certainly make my job easier... Maybe having tech that is oversold and underused is better than having it ignored and unused.

I guess the question that I need to wrestle with is whether I need to lower my expectations... Maybe if we just focus on using one domain (text) effectively, we'll still see benefits in terms of learning, perhaps more effectively and consistently; less is more? Perhaps, but I doubt it; focusing on just one domain out of five strikes me as making as much sense as buying a car and using it only to keep you dry in the rain, or cool in the sun.

Useful? Yes.

Appropriate? Maybe.

Ideal and Effective? ... 😕🤔😬


Zheng, B., Warschauer, M., Lin, C. H., & Chang, C. (2016). Learning in One-to-One Laptop Environments: A Meta-Analysis and Research Synthesis. Review of Educational Research, 0034654316628645.

09 April 2016

Desktop Zero - 4 compelling reasons to make this an essential habit

Yes, this is mine. No I did not cheat, well maybe one folder... 


You don't need me to tell you that your environment affects your productivity. Since a great deal of our work is now done with a screen, it stands to reason that your desktop environment plays an important role in your productivity. Seriously, can you really look me in the monitor and tell me that you'd rather work on a desktop that looks like this?

Messy Desktop by RuthOrtiz


Four Reasons to Change

There are plenty of reasons for making the effort to aim for 'desktop zero'; I'll attempt to lay out a handful for you here:

It is Irresponsible. 

Desktop etiquette—every teacher is a role model, and as a teacher, every time you share your desktop with your students, you demonstrate to them the kinds of organisational and work habits you expect them to imitate. 

Every time we share a cluttered desktop with a class, or even with parents, we effectively also share our inability to self-manage, our lack of organisation, perseverance, diligence... need I go on? The biggest problem is that all of these behaviours are built on bad habits, bad habits I see teachers (and parents) passing on to their children every day.

It is Insecure.

Ironically, one of the most common reasons I hear for storing files on the desktop is their critical importance: 'those are files I need, and I can't afford to lose them...' Really? Because unless you are in the habit of fastidiously backing up your Mac with Time Machine, like every day (in which case you are probably already at Desktop Zero, or close enough), you run the risk of losing it all; one hard drive failure, and that's it, all gone. Desktop files are the most common place where data is lost, in my experience. If those files had been placed in a Google Drive folder (or Dropbox) they would have been safe, literally every edit backed up in real time. But nothing on your desktop (or your students', if they're imitating you) is being backed up to the cloud. Nothing.

Top Tip: on the Mac, you can create an Alias (right click, or command+option drag and drop) of any 'buried' folder or file so there is a shortcut to it on the desktop. It acts just like the real thing, but with the advantage that the original stays safely ensconced within a cloud-backed-up folder.
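The Finder's alias feature is a GUI affair, but the same idea can be sketched from code with a POSIX symbolic link (a close cousin of an alias; note that a Finder alias keeps tracking the original if it moves, while a symlink only stores a path). A minimal Python sketch, using throwaway temp paths as stand-ins for your real home folder:

```python
# Sketch: a desktop "shortcut" to a cloud-synced folder, so files saved
# through it really live inside the backed-up location. All paths here
# are illustrative temp directories, not real system paths.
from pathlib import Path
import os
import tempfile

home = Path(tempfile.mkdtemp())              # stand-in for your home folder
synced = home / "Google Drive" / "Projects"  # the cloud-backed-up folder
synced.mkdir(parents=True)
desktop = home / "Desktop"
desktop.mkdir()

shortcut = desktop / "Projects"
os.symlink(synced, shortcut)                 # like a Finder alias, terminal-style

# Saving "via" the shortcut lands the file in the synced folder
(shortcut / "plan.txt").write_text("term 2 planning")
print((synced / "plan.txt").read_text())     # -> term 2 planning
```

The payoff is exactly the one described above: the desktop keeps its convenience, but the actual data sits where the backup tool can see it.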

It is Inefficient.

Your computer's desktop is the starting point for your entire computing experience, but, like anything else, if you let it get cluttered your productivity will take a dive and your stress levels will rise; few things are as frustrating as you or your students not being able to find a file exactly when you/they need it, especially if that entails creating it again... and again... Next time you save a file to the desktop, wouldn't it be nice to be able to find it immediately, and not have to engage in an insanity-inducing game of 'Where's Wally'? That's a game I have to play almost every day that I work with a teacher on a desktop like... that *shudders*


Clean-desk-high-productivity-toblender.com [modified]

It is Impeding.

Because of the way OS X's GUI (graphical user interface) works, the icons on your desktop take up more of your resources than you may realise... Every single icon on your desktop is actually a small transparent window with graphics (the icon) inside, so if you have, say, 100 icons on your desktop you have 100 windows open, each one stealing memory. And no, dumping them all in a folder doesn't really help much; the fact that there are 2764 files in ONE folder still means that OS X will have trouble handling it.


Computer Desktop & Table Desktop

When we work with students on this, we are attempting to inculcate good habits, habits that will last a lifetime. One such habit is to work from desktop zero. An analogy we find helpful is for them to treat their computer desktop the same way they treat their table desktop in the classroom: as busy as it can get in the course of a normal working day, before they go home each day they are expected to return that space to what is effectively desktop zero 'IRL' (in real life). Everything gets put in its right place; whether they have finished with a project or not, it goes in the appropriate folder. The difference with computers is that you can actually work on files while they are in the folder; there's no need to take them out, and so no need to put them back, which is why desktop zero on a computer is easier than desktop zero IRL. In the same way, when you place a file in the appropriate folder (in Google Drive, in the Finder) you can leave it there and work on it while it is in there.

So, with this in mind, you shift your conception of the role of the desktop: it becomes a temporary, easy-to-locate, grab, upload, rename, 'I need it in ten minutes or so' dumping ground. I only use my desktop as a temporary holding place for files I'm working with. Nothing remains there past the end of the day.

Cluttered desk via abcnews.com (Getty Images)

Upgrade Your Workflow

In actual fact the desktop is a folder; it's just the folder that you start from, and while it can function as a storage folder, as so many people have unfortunately proven, that is not its purpose. It was only created as an analogy, so people would have something analogue to relate the new digital experience to, just like the trash can in the corner: we don't really keep tiny trash cans on the corner of our table tops, but it functions as an approximate analogy. And like most analogies, it has its limits. One way forward is to start working the way you do when you use an iPad or similar device.

Newer OSes like iOS and Android have thankfully ditched the "file icon sandbox" idea. The only thing you are presented with when you look at your device is a launchpad for apps and services. Your data is invisible and agnostic, available only when you are in a program that knows how to display or use it, and you know what? It works just fine. No clutter.

Become more app oriented and less file oriented

In iOS, if you're working on a file, you start by opening the app, then you locate the file from within that app. Well, the exact same method works on the desktop. Working on a Word document? Don't look for the file first; open Word, then you will easily find any recent files. All you need to do is drop down the File menu two spaces past Open, to Open Recent. There, that's not so hard, is it?

Open Recent, don't just Open.

You will find the same feature in any application you use. Trust me. These conventions are cross-platform, which means you will be able to take advantage of this workflow no matter what computer or platform you ever use. Invest in it now, and you will reap the rewards for the rest of your life.

File less, search and sort more

I've written about this already here: spend less time creating and organising folders (although that is important too) and make sure you name your files with keywords you can search for. Instant search is now everywhere on all your devices, and on your Mac you can search in literally any folder you open, from 'All My Files' to 'Documents'; if it's in there, somewhere, search will find it, regardless of the folder it's in. But that's no use if the file is called 'Untitled.doc' or 'Screen Shot 2015-03-14 at 5.38.12 am'. Rename it, then move it.
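To make the rename-then-move habit concrete, here's a minimal Python sketch; the filenames and the keyword naming scheme are purely illustrative, not a prescription:

```python
# Sketch: turn default, unsearchable filenames into keyword-rich ones.
# The folder is a throwaway temp directory; the names are examples only.
from pathlib import Path
import tempfile

folder = Path(tempfile.mkdtemp())
(folder / "Screen Shot 2015-03-14 at 5.38.12 am.png").touch()
(folder / "Untitled.doc").touch()

# Map default names to names you could actually search for later
renames = {
    "Screen Shot 2015-03-14 at 5.38.12 am.png": "grade5-assembly-seating-2015-03-14.png",
    "Untitled.doc": "report-comments-draft-2015-03-14.doc",
}
for old, new in renames.items():
    (folder / old).rename(folder / new)

print(sorted(p.name for p in folder.iterdir()))
# -> ['grade5-assembly-seating-2015-03-14.png', 'report-comments-draft-2015-03-14.doc']
```

The point is the naming, not the code: a file whose name carries its keywords is findable from any search box, whatever folder it eventually lands in.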

Sort out your Sorting

When you have a bunch of files on display in the Finder, make sure you take advantage of the button which lets you 'change item arrangement'; pick whatever option will make it easiest to move the files you want to the top. I personally find 'Date Modified' to be the most useful, but there are options there for everyone.
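For the curious, 'Date Modified' ordering is nothing magical: it's just a sort on each file's modification timestamp. A small Python sketch (with artificial timestamps on temp files) of what the Finder is doing for you:

```python
# Sketch: "Date Modified, newest first" is a reverse sort on st_mtime.
# Files and timestamps below are fabricated for illustration.
from pathlib import Path
import os
import tempfile

folder = Path(tempfile.mkdtemp())
for name, mtime in [("old-notes.txt", 1000),
                    ("todays-plan.txt", 3000),
                    ("last-week.txt", 2000)]:
    p = folder / name
    p.touch()
    os.utime(p, (mtime, mtime))  # set access/modified times explicitly

newest_first = sorted(folder.iterdir(),
                      key=lambda p: p.stat().st_mtime,
                      reverse=True)
print([p.name for p in newest_first])
# -> ['todays-plan.txt', 'last-week.txt', 'old-notes.txt']
```

Which is exactly why that arrangement works so well for a temporary-holding-place desktop: whatever you touched last floats to the top.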

Illustration by Ben Wiseman via nytimes

Don't procrastinate, you can do it today!

The solution is not to just create another folder (which is actually inside the folder which is the desktop) and dump them all in there; that just means you've buried the problem. By all means dump all the files in a (cloud connected) folder (or 3 or 4), just make sure you've deleted the files you won't need again, and give the ones you do need a name you can search for. Once you've done that you'll probably find there are 'themes' forming that lend themselves to folders, but don't let that be an excuse to procrastinate, as you can always change your mind later; computers are convenient like that...

Clean desk[top] policy via awanbee.com


07 April 2016

Aims, objectives and semantics


One of the first goals we faced on our first visit with the T2T Cambodia team was to establish what the fundamentals of a lesson need to be. It is not until you are forced to defend your rationale for the structure of a lesson that some of the issues of semantics really come into focus.

Take the typical traditional lesson structure:
  1. Objectives
  2. Activities
  3. Outcomes

With some seasoning from our recent workshops in formative assessment with Dylan Wiliam, this quickly morphed into something a little more nuanced when combined with the 5 key strategies of formative assessment, the first three of which are more or less synonymous with the traditional lesson structure:
  1. Clarifying learning intentions
  2. Eliciting evidence
  3. Feedback that moves learning forward
[Students as learning resources for one another
Students as owners of their own learning]

We ended up with something more like:
  1. Learning intentions/objectives
  2. Activities that elicit evidence 
  3. Outcomes as a result of feedback  

And before you know it, with a room full of teachers, it looked like this:
  1. Learning intentions/Aims & objectives
  2. Activities that elicit evidence through active engagements 
  3. Outcomes informed by feedback and based on clear success criteria 

Now try explaining all that through a translator to a room full of teachers, in a room without air conditioning, in a temperature in excess of 30°, with only the most rudimentary of teaching resources...

What I have found is that you have to distil everything down to the absolute bare essentials, which for me now look something like this; funnily enough, doing so has enhanced my own understanding of my practice, and will in turn hopefully improve my practice as a teacher.

For me it has ended up being as simple as:
  1. Aim or Goal or BIG IDEA
  2. Activity
  3. Feedback

But, and this is essential, it has to be iterative.
[Aim, achieved? Great. No? Either change the activity (or maybe even the aim) then try again]

Getting the Aim right is CRITICAL, if the aim is any good then in order to achieve it you will have to move through a series of "objectives" which will automatically require the achievement of "learning intentions" and the design of an activity that facilitates those goals, but that ultimately has one outcome, the achievement of the aim...

Example...

The last IT lesson I observed had learning intentions of:
  • Create a table in a spreadsheet
  • List occupations
  • Add a new column for images
  • Insert images that match the occupations

But what was the AIM? A well considered aim would make the individual learning intentions redundant. Of course the aim has to be worthwhile, authentic, meaningful... In this case, because it was a FOCUS lesson, I was able to intervene and redesign the lesson with the teacher right there, right then. What we did was establish an aim, which in this case was...

Use a table to compare a range of at least 5 career opportunities that interest you. Consider the following aspects of each of the occupations you have chosen:

  • Title
  • Brief description
  • Illustration
  • Positives
  • Negatives
  • Salary
  • Qualifications required

With a well-written aim, the specific articulation of learning intentions naturally follows; agonising over them is no longer necessary, as they will have to be identified in order to fulfil the aim of the lesson. Don't they need to be expanded? Articulated in sentences? I don't think so; any teacher worth their salt will put the meat on the bones, and hopefully also provide feedback in relation to those specific learning intentions. Whether or not they actually need to write them on the board is another question.

What was even better was that it quickly became obvious that there were quite a few aspects of the occupations that interested students that none of us were in a position to answer... for example, salary. Instead we asked the students to estimate what they thought the salary per month would be, then we did the same for each of the other aspects of the occupations they chose. Then (using students as learning resources for one another) we asked the students to compare their work... this sparked some passionate discussions, as some students had (for example) the police officer as the lowest paid while other students had the police officer as the highest paid... Discuss!

What was fascinating was when we got the students to then research online to find the actual answers, compare their estimates with the reality, and reflect on the disparities or consistencies that they found.

What started out as a rather banal activity in table creation and meaningless data entry became a transformational lesson in career guidance while also fulfilling the (arguably more mundane) ICT requirements. It's all about the aim.

05 April 2016

Deliver us from tedious tests and rubrics


via hippoquotes
Assessment drives everything educational. So, not surprisingly, assessment is the biggest factor in terms of planning the use of tech in effective ways. This means that it's critical to ensure that we use a varied range of assessment strategies, which is where I find a surprising lack of options.

Why do so many teachers assume that only rubrics and tests are suitable for assessment? Sure they have their place but only within a suite of assessment strategies...


It feels to me like every educational reference I read or hear about, especially in tech circles, assumes that the only viable option has to be a rubric. I don't mean to denigrate any particular assessment tool; clearly rubrics and tests can be effective, but when they dominate, they have an unfortunate tendency to diminish the importance and efficacy of all the other tools that are available. It is depressingly common that in virtually any educational context (classroom, conference, online), when the conversation inevitably turns to assessment, the question defaults to 'what rubric or test will we use?' rather than any awareness that there is a plethora of other tools and strategies that could be just as effective, if not more so.

Now I am conscious that I may be overstating my point; after all, I have to confess I don't hate them, I hate the way they are so often assumed to be the only option worth considering. I loathe the majority I see that are poorly conceived and poorly written: bloated, verbose attempts at teasing out questionable differences in attainment, many seemingly based on the assumption that just adjusting superlatives (well, very well, independently, with assistance...) is sufficient.

Of course I'm not the only one who has a problem with rubrics:

The most famous is probably Alfie Kohn, who speaks to the false sense of objectivity and how rubrics have misled many.

And I really like Joe Bower's take on Rubrics, in 'The Folly of Rubrics and Grades'

"Grades and rubrics are a solution in search of a problem that will further destroy learning for its own sake.  
It’s been five years since I used a rubric. I simply don’t need them, nor do my students.
Rather than spending time asking how can we grade better, we really need to be asking why are we grading. And then we need to stop talking about grading altogether and focus our real efforts on real learning."


Most of the rubrics I've seen could easily be replaced by a continuum; at least then all you would need to do is define the extremes. But, and I guess this is a statement about teaching as a profession, far too many teachers use the term 'rubric' as if it were synonymous with 'assessment tool'.

Rubrics are one of many ways to assess learning, and they are used far too often. Used well, a rubric can be a powerful assessment tool, but in my experience I rarely see them used well, and I often see them used inappropriately. So, yes, they have their place, but only within a suite of assessment strategies...

Here's one way to use a rubric well: make it more student-centred. The teacher defines a central standard (eg a level 3 on a 5 point scale) and then leaves the students to define and justify, with examples, the level they feel their work sits at in comparison to that (above, below, or in the middle).

There are other ways to assess... 


Next time you're assessing, at least consider some alternatives to rubrics. Now before someone accuses this of being more new fangled thinking, here's some out of the Ark:



But one of my favourite summaries of assessment strategies and tools, is this grid from the PYP:


Unfortunately the PYP is allergic to the term 'tests' and (somewhat simplistically, and perhaps somewhat disparagingly, in my opinion) assumes that all tests can be summarised as 'checklists', which may explain their omission from this grid. Still, if more educators made the effort to tick all the elements in the above diagram in one year, everyone would be a winner.

Do less, but do it better.

Now of course it's highly possible that teachers are unaware of the wider range of assessment tools they already use effectively almost every day, from ad hoc/informal conversations with students (conferences, in the jargon) to spirited class debates (not lectures) that utilise skilful Socratic strategies, which are in and of themselves valid assessment tools. The problem is that these are seen as somehow inferior to a "proper" test or rubric. All this does is create a lose/lose scenario for the teacher and the student. Rather than focusing on tests and rubrics, wouldn't it be better for everyone if we were to embrace a much wider tool kit when it comes to assessment? To see them all as valid and powerful? Maybe that conversation/conference was so effective that adding a rubric or a test is not only unnecessary but possibly even counterproductive?

If you asked most teachers why they rely so strongly upon rubrics and tests, as opposed to all of the other powerful forms of assessment, I think you would find they would point to one sad fact: they feel they need paper with marks on it that they can attach a grade to, so they can point to it as hard evidence of their assessment judgement. While there is clearly a place for this kind of formal (usually summative) judgement, in my experience it is far too frequent and far too common. Teachers could do themselves and their students a favour by focusing on the goal of learning rather than the need to have a hard artefact as evidence of every stage of progress.

What if instead we were to focus on the goal? That is, as long as the assessment tools you use allow you to provide effective individual feedback to the student, and enable them to progress in their learning to the point where they are improving compared to their previous level of competence (ipsative assessment), then the goal has been achieved! So why not work a little smarter and use a far more varied range of assessment tools? In so doing you create a classroom environment that is more dynamic, and far more effective for both the teacher and the student.

So what does this have to do with edtech?

From my perspective, a classroom that exploits a wide range of assessment tools is a much richer environment within which to integrate digital tools that can truly enhance and transform the way teachers teach and the way students learn, and demonstrate the extent to which they have mastered the skills, knowledge and understanding that are truly the point, not just in ways that can be measured quantitatively on another test or rubric. You don't have to look much further than an early childhood classroom to see this in action. Why? One thing these very young students can't do is demonstrate their understanding via tests or rubrics, which opens up a whole range of extremely rich, engaging ways of demonstrating skills, knowledge and understanding that would benefit many considerably older students.

04 April 2016

Kids, Concentration, Boredom, & Tech

Photograph: John Slater/Getty Images


Boredom is not a new problem; it is a condition that has, to a greater or lesser extent, been an aspect of human existence for eons. And yet it seems to me that a pervasive myth is developing, along the lines that boredom is the fault of computers, that students who use computers are students who cannot concentrate. Articles like these are a case in point:

"Technology Changing How Students Learn, Teachers Say"

also

"Technology Creating a Generation of Distracted Students"


The general gist of the arguments could be summarised thus:


Teachers (from middle and high schools) say today’s digital technologies “do more to distract students than to help them academically.”

"There is a widespread belief among teachers that students’ constant use of digital technology is hampering their attention spans and ability to persevere in the face of challenging tasks, according to two surveys of teachers..."

".. roughly 75 percent of 2,462 teachers surveyed said that the Internet and search engines had a “mostly positive” impact on student research skills. And they said such tools had made students more self-sufficient researchers.

... nearly 90 percent said that digital technologies were creating “an easily distracted generation with short attention spans.”

... of the 685 teachers surveyed in the Common Sense project, 71 percent said they thought technology was hurting attention span “somewhat” or “a lot.”

That said, these same teachers remained somewhat optimistic about digital technology's impact, with 77% saying Internet search tools have had a "mostly positive" impact on their students' work.

Arguments abound, although ones like this strike me as quite strange:

"This could be because search engines and Wikipedia have created an entire generation of students who are used to one-click results and easy-to-Google answers."


Wait. What?

You're saying that if you can get an answer to a question with one click, that is a bad thing? Sure, there will be times when you will have to do a lot more than one click, because you have not been able to get a satisfactory answer to the question. But... if I could get a good answer in one click, believe me, I would. If anything, access to the treasure trove of information that is the Internet makes it much easier to get a multiplicity of sources, rather than only one; much easier than I ever could with books. Yes, I said it.


If your students can get the answers to your questions with one click, you're asking the wrong kinds of questions: boring questions. Maybe try asking questions that they can't just google, or that are difficult to google?



So. To the hordes of disgruntled teachers who are so quick to blame technology for short attention spans, I have this to say.

Get better. Get creative.

If your kids are bored, that is because you are boring them, you are allowing them to be bored. Face it, move on, build a bridge, get over it, and use this as impetus to improve. As Dylan Wiliam says, "teaching is the hardest profession because you can always get better at it"; and, "A complaint is a gift" (although it won't feel like that at the time).

"The cure for boredom is curiosity. There is no cure for curiosity."

(Widely attributed to Dorothy Parker)

"by removing lecture from class time, we can make classrooms more engaging and human." 

"Why Long Lectures Are Ineffective" Salman Khan


It is unfair to blame technology for short attention spans... We (the human race, not just kids) have had short attention spans for many years; it's just that students are now less inclined to put up with it. Certainly the Time magazine article cites research from 1976, well before the advent of digital technology as we know it. I was a (bored) 6 year old.


I know this may come as a huge shock to anyone who knows me, but I have always had a short attention span, and that predated computers by at least a decade... I am not the only one; chances are many of them are in your class (and are also your students' parents).


In 1996, in a journal called the National Teaching & Learning Forum, two professors from Indiana University — Joan Middendorf and Alan Kalish — described how research on human attention and retention speaks against the value of long lectures. They cited a 1976 study that detailed the ebbs and flows of students’ focus during a typical class period. Breaking the session down minute-by-minute, the study’s authors determined that students needed a three- to five-minute period of settling down, which would be followed by 10 to 18 minutes of optimal focus. Then—no matter how good the teacher or how compelling the subject matter—there would come a lapse. In the vernacular, the students would “lose it.” Attention would eventually return, but in ever briefer pockets, falling “to three- or four-minute [spurts] towards the end of a standard lecture,” according to the report.


Just in case you didn't catch that. Let me just make that a little clearer:

10 to 18 minutes of optimal focus.

That's it.


So, what we need to do is instead of complaining, get creative.


via technorati

Maybe, just maybe, boredom is nature's way of telling you that you need to change.

03 April 2016

To Code or Not to Code?

That is a good question - and one I am commonly asked by both parents and teachers.

Technically this is not code, it is script... 

There is a LOT more to this than a simple yes or no answer, but in my opinion, encouraging kids to become coders (properly, computer programmers; 'coding' is really a slang term) is not necessarily a great idea. I think they should learn to code, if they're keen, but only so they can understand it better, so they can be creative with it. You see, you can employ coders; they are a dime a dozen, all over the web. It's the creative 'big picture' aspect that is lacking, ie what to code, not so much how.

That said... it's hard to know what you can do if you don't know how. Basically, you don't need to be the best coder; you need to be good enough to really know what its potential is.
"Someday, the understanding of computational processes may be indispensable for people in all occupations. But it’s not yet clear when we’ll cross that bridge from nice-to-know to must-know." 
http://www.nytimes.com/2012/04/01/business/computer-science-for-non-majors-takes-many-forms


"But is it really crucial to be able to code? Many content producers use technology virtually every waking hour of their life, and they don't know a variable from an identifier, or an integer from a string. Personally, I'm conflicted: I have a technical background, but for most people I just don't see how being able to compile code is going to prove useful."  
http://m.gizmodo.com/5897020/is-learning-to-code-more-popular-than-learning-a-foreign-language
"Coding is not a goal. It’s a tool for solving problems. [...] However, much of the “learn to code” frenzy seems to spring from the idea that you can achieve fame and riches by starting a tech company and you need to actually code something first. Programming is not a get-rich-quick scheme. Even if you do hit the jackpot, the CEOs of successful tech companies do not spend a lot of time coding, even if they started out behind a keyboard. There are simply too many other tasks involved in running a company. So if coding is what you really love to do, you probably wouldn't want to be a CEO in the first place.."  
http://www.fastcolabs.com/3020126/no-you-dont-need-to-learn-to-code

Please don't advocate learning to code just for the sake of learning how to code. Or worse, because of the fat paychecks. Instead, I humbly suggest that we spend our time learning how to …
• Research voraciously, and understand how the things around us work at a basic level.
• Communicate effectively with other human beings.
These are skills that extend far beyond mere coding and will help you in every aspect of your life.  
http://gizmodo.com/5910497/please-dont-learn-to-code 

So, don't believe the hype; there is no more need for this generation to learn to code than there was for the generations that preceded them to learn the work of a car mechanic.

Clearly there is no shortage of people who want to code, in the same way that there was no shortage of people throughout the 20th century who wanted to become automobile engineers, and those that have the predilection will. The point is, it's not hard to act on it, to make it happen, and... if you can't, then coding is probably not an option for you.

Compare that to, say... learning the oboe; that's not quite so easy to learn if all you have is a computer and an internet connection. But there are millions of people out there who do code, honing their abilities every day, and they don't expect to be paid as much for it as you might think.

So - how do we learn this stuff?

All the people I know who are any use with IT and ICTs (yes, there is a difference) are those who basically taught themselves, myself included. It's almost a rite of passage. My instinct tells me that the kind of kids who can code WILL code, and if they can't find ways to teach themselves using the plethora of resources online, then they probably haven't got what it takes to code. Despite the glowing 'FUN, FUN, FUN!' messages that proliferate from some quarters of the web, the truth is that if you want to code, really code, you will need to work hard and you will need to persevere; nothing that is worth having comes easy, and coding is no exception. Simple as that.

"Top companies expect you to know what a recent comp-sci graduate would know, which could include SQL vs. NoSQL databases, the time complexity of some algorithm, or how to implement binary search. ... opportunities are few and far between."

"While there are some excellent companies willing to hire driven and intelligent self-taught engineers, they lie in the minority. Many companies pass over candidates without a formal degree in computer science before reading on; the stigma of low experience is a hard one to break in any industry but especially in those involving technical abilities."

http://qz.com/193896/no-three-month-course-can-teach-you-how-to-code/

I have never been taught 'IT', but I had to teach myself HTML to design web pages, and ActionScript to create Flash animations. At its best, that is what coding, 'computer science' and subjects like DT teach kids: YOU can solve your own problems, and you can teach yourself how to do it. The WWW can help with getting from A to B, even if it means going through D, H and X to get there the first time.

That's another argument for coding: not so much as a skill for the workplace, but for the process, the reasoning it demands. Here's a quote from my colleague Helen Leeming, who teaches IT in MS and HS, from an email exchange we had on this subject; her point about developing critical/analytical thinking through coding is powerful:

"It isn't the coding… its the critical thinking… they don't need to code any more than they need to be able to do quadratic equations - for most people either would be redundant the minute they walk out of school. But they do need to have stretched their minds, to have made their thoughts work in a different way, which both of those will. Almost none of them need to code (or indeed use a lot of what we teach them in school - ox bow lakes for example), but the ability to problem solve is essential. It could be taught through other things, it simply isn't in many cases… And people rarely choose to learn critical thinking unless they are an 'IT geek' and they are the ones that probably can already do it."

"I don't understand why people question that this needs to be taught as people won't be coders, while we still teach algebra and the periodic table to kids that will not be mathematicians or chemists. Education is not about learning a set of knowledge or practical skills that you can use later, it is about teaching you to think, to think in many different ways, to play with ideas in many different ways and to have a toolbox of techniques to address puzzles or problems you meet later. Abstract, critical thinking is one of the tools…"
It should be remembered that one of the best ways to get to grips with the kind of logical thinking coding demands is by using spreadsheet functions, such as those in Google Sheets, right there in the browser, and then moving on to writing your own formulae to solve basic mathematical problems. That, right there, is the basis of writing code: starting with a formula as simple as =A1+B1 and progressing to things like IF functions:

=IF(A1<B1, "awesome", IF(B1<A1, "amazing", "equal"))
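And that same conditional logic carries over almost directly into a beginner-friendly text language. Here's a minimal Python sketch of my own (not part of any curriculum mentioned here) showing the nested IF as a function; note that the spreadsheet version needs a third argument to cover the case where the two cells are equal, which becomes the `else` branch below:

```python
# The nested spreadsheet IF, expressed as a Python function.
def compare(a, b):
    if a < b:
        return "awesome"
    elif b < a:
        return "amazing"
    else:
        return "equal"  # the branch the nested IF needs for ties

print(compare(1, 2))  # awesome
print(compare(5, 5))  # equal
```

Once a child can read a formula like that, reading an if/elif/else block is a very small step.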


So, my advice to potential coders would be to learn to walk before you run; or more precisely, learn to walk (Scratch), run (Stencyl), jump (Alice), and then you can really get creative (dance) with the source code:



All of the tools below are free, and come with great support materials, tutorials, and communities to get you from A to B, even if you have to travel via N and X.

Coding for kids

Some of the iPad Apps we use to introduce kids to coding

Here's a great set of Apps you can use to introduce your child to coding, even from Kindergarten. This is my suggested sequence of progression, from games that teach the kind of logical thinking needed for coding to Apps that allow free-form creation:


  1. Daisy the Dinosaur
  2. Tynker
  3. Lightbot
  4. Move the Turtle
  5. Hopscotch
  6. Scratch Jr 

Do it yourself...

  1. Start with iPads to learn the basics of control and computational thinking, with Apps like Daisy the Dinosaur, Hopscotch and Move the Turtle. Apps like these use a drag and drop interface through which kids intuitively grasp the basics of objects, sequencing, loops and events by solving app challenges. 
  2. Move to Code.org,  Scratch http://scratch.mit.edu/, or www.tynker.com 
  3. Progress to Stencyl http://www.stencyl.com/ for iOS App coding using a similar 'block' interface, or alternatively App Inventor
  4. Then, if they feel they are ready to actually use Xcode, download it for free from the Mac App Store; there are many online tutorials that can help with this, such as this one.
  5. Try http://www.codecademy.com/ for learning a range of programming languages. 
  6. Then to Alice http://www.alice.org/ 
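To give a flavour of where this progression leads, here's a small Python sketch of my own (an illustration, not drawn from any of the tools above) of exactly the ideas the block-based environments teach: a sequence of steps, a loop, and state that changes as a Logo-style 'turtle' moves:

```python
# Walking a square the way a turtle/Logo-style app teaches it:
# a fixed sequence of moves, repeated via a loop, tracking position.
def square_path(side):
    position = (0, 0)
    path = [position]
    moves = [(side, 0), (0, side), (-side, 0), (0, -side)]  # right, up, left, down
    for dx, dy in moves:  # loop over the four moves in sequence
        x, y = position
        position = (x + dx, y + dy)
        path.append(position)
    return path

print(square_path(10)[-1])  # the turtle ends back where it started: (0, 0)
```

The drag-and-drop blocks in Scratch or Hopscotch are doing precisely this under the hood; the leap to text is mostly a change of notation, not of thinking.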

By then you should be ready for the source code; this site, HackerBuddy (http://hackerbuddy.com/), will help with this final stage: one-on-one mentoring for startup hackers.

… but even then, which language?

C
Python
JavaScript
PHP
C++
Java

or, for coding iOS Apps, Objective-C or Swift in Xcode

And there are many more ... http://langpop.com/

But I would imagine that for most kids the biggest motivator would be to create an App, using Xcode (a free App from the Mac App Store). You can port to Xcode from Stencyl, but you have to pay $150 to enable that feature; so you can learn for free, and only need to pay when/if you're ready to put an App into the marketplace. Clearly it is the desire to create 'Apps' that is driving the current resurgence of interest in coding. For more on this phenomenon, read this article.


We now also facilitate the UWCSEA coding community through our ECA programme, for MS and HS students. If your child is in Primary and impatient to get going, learning Scratch and Stencyl will ensure they are more than ready by Grade 6, and of course from Grade 9 students have the option of following a course in Computing all the way through to Grade 12 if they so choose. Middle School includes a module of coding through Lego Mindstorms in DT, and we offer IGCSE Computing, IGCSE IT, and IB Computer Science in High School.

02 April 2016

The Myth of the Cyberkid



We all know that kids are 'digital natives' and the rest of us, well ... we're just... not.

Right?


Wrong.

Allow me to refocus your cultural lens with a few quotes from some eminent scholars in the know:


"The mother of ten-year-old Anna is surely observing a profound generational transformation when she says: I’ll have to come up to a level because otherwise I will, I’ll be a dinosaur, and the children, when children laugh at you and sort of say “Blimey, mum, don’t you even know that?” . . . Already now I might do something and I say “Anna, Anna, what is it I’ve got to do here?” and she’ll go “Oh mum, you’ve just got to click the—” and she’ll be whizzing, whizzing dreadfully.

For previously new media—books, comics, cinema, radio, and television—even if parents weren’t familiar with the particular contents their children engaged with, at least they could access and understand the medium so that, if they wished to understand what their children were doing or share the activity with them, they could. With the advent of digital media, things have changed. The demands of the computer interface are significant, rendering many parents “dinosaurs” in the information age inhabited by their children.


Young people themselves, conscious of being the first generation to grow up with the internet, concur with the public celebration of their status as “digital natives.” Amir (15, from London) says confidently, “I don’t find it hard to use a computer because I got into it quickly. You learn quick because it’s a very fun thing to do.” Nina (17, from Manchester) adds scathingly, “My Dad hasn’t even got a clue. Can’t even work the mouse. . . . So I have to go on the Internet for him.” But while these claims contain a sizeable grain of truth, we must also recognize their rhetorical value for the speakers.

Only in rare instances in history have children gained greater expertise than parents in skills highly valued by society (diasporic children’s learning of the host language before their parents is a good example). More usually, youthful expertise—in music, games, or imaginative play—is accorded little serious value by adults, even if envied nostalgically. Thus, although young people’s newfound online skills are justifiably trumpeted by both generations, this does not put them beyond critical scrutiny, for the young entrepreneurs and hackers are the exceptions rather than the norm.

...

... one should note that while Ted, like the other two, would appear to a superficial observer to multitask effectively, “whizzing around” in the manner that impressed Anna’s mother, the benefits he gains from the internet are curtailed first by his lack of interest in information, education, or exploration and, second, by his poor skills in searching and evaluating Web sites, though one should not underestimate the importance of gaining communication-related literacy skills, especially for teenagers.

...

As more and more policy emphasis at national and international levels is placed on “media literacy” or “information literacy” or “internet literacy,” critical scholars have all the more reason simultaneously to support internet literacy initiatives, ... (and) to challenge the inflated public claims regarding the “internet-savvy” teenager that accompany them.

(Livingstone, 2009)



“Our research shows that the argument that there is a generational break between today’s generation of young people who are immersed in new technologies and older generations who are less familiar with technology is flawed.

The diverse ways that young people use technology today shows the argument is too simplistic and that a new single generation, often called the ‘net generation’, with high skill levels in technology does not exist.”

Economic & Social Research Council (ESRC)


See? It does not exist. Yet.


Digital nativity vs digital naivety

On the contrary: the skills of this so-called generation of 'digital natives' are, in my experience, woefully inadequate, now that so many schools have effectively abandoned the explicit teaching of a broad and balanced set of ICT skills. Why? Because they wrongly assume that this is what authentic tech integration looks like. I've written about how we can avoid this, but suffice it to say that, somewhat ironically, this generation is set to be less proficient in their use of tech than their parents! Parents, if you were taught IT or ICT skills at school, you probably have more to offer your kids in terms of tech skills than you might think...


Digital natives, redefined.

The problem now is that, try as we might, this boat has sailed, and to a rather disconcerting extent this term has been incorporated into the global vernacular... So perhaps, rather than attempting to subvert it, it's time to correct it. As danah boyd (sic) points out in her book, 'It's Complicated':

Beyond Digital Natives

Most scholars have by now rejected the term digital natives, but the public continues to embrace it. This prompted John Palfrey and Urs Gasser, coauthors of Born Digital: Understanding the First Generation of Digital Natives, to suggest that scholars and youth advocates should reclaim the concept and make it more precise. They argue that dismissing the awkward term fails to account for the shifts that are at play because of new technologies. To correct for misconceptions, they offer a description of digital natives that they feel highlights the inequalities discussed in this chapter:
“Digital natives share a common global culture that is defined not by age, strictly, but by certain attributes and experiences related to how they interact with information technologies, information itself, one another, and other people and institutions. Those who were not ‘born digital' can be just as connected, if not more so, than their younger counterparts. And not everyone born since, say, 1982, happens to be a digital native.” 

References

boyd, danah. It's Complicated: The Social Lives of Networked Teens. Yale University Press, 2014.
Livingstone, Sonia. “Internet Literacy: Young People’s Negotiation of New Online Opportunities." In Digital Youth, Innovation, and the Unexpected, edited by Tara McPherson. The John D. and Catherine T. MacArthur Foundation Series on Digital Media and Learning.


Images:

http://edu.glogster.com/media/9/41/73/77/41737713.gif
http://www.mondaynote.com/wp-content/uploads/2010/07/144-digital_native2.jpg