15 June 2016

Laptops and Lowest Common Denominators


A colleague shared the outcome of recent research into the efficacy of 1:1 laptop schools; more specifically, a meta-analysis of 10 studies that examine the impact of laptop programs on students' academic achievement.

I read it with mixed feelings. On the one hand, the fact that the researchers concluded that schools where students have their own laptops see "significantly positive average effect sizes in English, writing, mathematics, and science" was encouraging. They felt that "the impact of laptop programs on general teaching and learning processes and perceptions was generally positive".

So that's good then. Right?


Well, no; not really.

I couldn't help but notice that their expectations of the actual use of these devices were, to put it mildly, far from tapping into the true potential of these devices. This is despite their inspirational opening clarion call to change the world:

"We believe that the affordances of computers for learning and knowledge production are radically different from those of radio, television, and film, which explains why computers, unlike those previous technologies, are bound to have a very different educational fate from the one suggested by Cuban (1993a, p 185), who wrote that “computer meets classroom: classroom wins.”

So exactly what uses do they have in mind? How do they envisage these radically different affordances? By inspiring the creative expression of learning through the exciting synergies between video, image, text, audio and the deft analysis and application of data?

No, they see the main affordances of these devices in terms of use "to write and revise papers, conduct Internet searches, and engage in personalized instruction and assessment using educational software or online tools". (p 2)

What? That's it? These devices hold the potential to radically transform their world, but let's just use them to type up reports (so they're nice and neat), Google stuff, and take online tests.

How depressing.

Then it gets worse. Having applied the law of the lowest possible expectations to these tools, they proceed to use the worst possible measure to determine their efficacy. We find ourselves in familiar territory: faced with the option of assessing the aspects of learning that are the most important (creativity, solutions, innovation etc.), but of course the most difficult to quantify, they opt instead to measure the aspects of learning that are easiest to quantify, with, yes, you guessed it, standardised tests:

"quantitative findings of the impact on students’ academic achievement. [...] Measurements of academic achievement were standardized assessments or norm-referenced district- or school-wide tests." (p 5)

So the measure of efficacy all boils down to that which can be measured on a standardised test. How depressing, and how inappropriate for a medium as rich as digital technologies. Like judging a ballerina's dancing ability based on her spelling. So we use them ineffectively, then assess the efficacy of their use in ways that are utterly unsuitable. Is this really what we expect when we talk about 'technology enhanced learning' in 1:1 environments?

I sincerely hope not. I tell you what though, it would make my job a lot easier if I did.

To be fair to this study, they do accept that there are problems with the ways they are assessing the efficacy of these devices: "studies on this topic have largely done a poor job of assessing learning outcomes that are not well captured by current iterations of standardized tests. As the United States and other countries move to more sophisticated forms of standardized assessment, these new measures may be better aligned with the learning goals believed to be associated with laptop use." (p 25)

I have to wonder whether the corporate world is as obsessed with trying to validate the influence of digital technologies in the workplace as we are with attempting to defend them in the classroom. Do any of us really believe that the corporate world would be better off without digital technologies? Then why would we believe that classrooms would be better off without them? Do we really believe the corporate world would spend the millions, if not billions, it must cost every year to maintain their IT infrastructures if they did not feel it was essential, important, effective?

Don't Settle

As it is, I'm not prepared to settle for a lowest common denominator approach, where we abandon any attempt at using these devices anywhere near their potential, and instead settle for using them in the ways that are the easiest, and therefore the most common, even if they are far from being the most effective. By easiest/least I mean ways of working that most closely replicate the traditional approaches to learning that were the norm before the advent of the digital revolution: writing becomes typing, and researching in the library becomes Googling, the results of which we present, in writing. That's it.

Vitamin D[igital]: Video, Image, Text, Audio, Data. VITAD.

No, these laptops should be used to create in all five domains: not just text, but image, video, audio and data, and all sorts of overlaps between them. These technologies should exploit all of the attributes that digital tools excel at: situated learning/working, access to global communities/experts, multimodal artefacts, mutable workflows, sharing and collaborating on content using social networks.

In the paper, they state,

"Contrary to Cuban’s (2003) argument that computers are “oversold and underused” (p 179) in schools, laptop environments are reshaping many aspects of education in K–12 schools." (p 24)

But the truth is that if all they are exploiting is their use as word processors and web browsers, these machines are definitely underused. I guess this could be described as giving up on transformation and focusing on amplification instead. If I'm honest, is that a bad thing? It would certainly make my job easier... Maybe having tech oversold and underused is better than having it ignored and unused.

I guess the question that I need to wrestle with is whether I need to lower my expectations... Maybe if we just focus on using one domain effectively (text), we'll still see benefits in terms of learning, but perhaps more effectively and consistently; less is more? Perhaps, but I doubt it; focusing on just one domain out of five strikes me as making as much sense as buying a car and using it to keep you dry in the rain, or cool in the sun.

Useful? Yes.

Appropriate? Maybe.

Ideal and Effective? ... 😕🤔😬


Zheng, B., Warschauer, M., Lin, C.-H., & Chang, C. (2016). Learning in one-to-one laptop environments: A meta-analysis and research synthesis. Review of Educational Research. doi:10.3102/0034654316628645

09 April 2016

Desktop Zero - 4 compelling reasons to make this an essential habit

Yes, this is mine. No, I did not cheat... well, maybe one folder.


You don't need me to tell you that your environment affects your productivity. Since a great deal of our work is now done with a screen, it stands to reason that your desktop environment can play an important role in your productivity. Seriously, can you really look me in the monitor and tell me that you'd rather work on a desktop that looks like... this?

Messy Desktop by RuthOrtiz


Four Reasons to Change

There are plenty of reasons for making the effort to aim for 'desktop zero'; I'll attempt to lay out a handful for you here:

It is Irresponsible. 

Desktop etiquette: every teacher is a role model, and every time you share your desktop with your students, you demonstrate to them the kinds of organisational and work habits you expect them to imitate.

Every time we share a cluttered desktop with a class, or even with parents, we effectively also share our inability to self-manage, our lack of organisation, perseverance, diligence... need I go on? The biggest problem is that all of these behaviours are built on bad habits, and these are bad habits I see teachers (and parents) passing on to their children every day.

It is Insecure.

Ironically, one of the most common reasons I hear for storing files on the desktop is their critical importance: 'those are files I need, and I can't afford to lose them...' Really? Because unless you are in the habit of fastidiously backing up your Mac with Time Machine, like every day (in which case you are probably already at Desktop Zero, or close enough), you run the risk of losing it all; one hard drive failure, and that's it, all gone. In my experience, the desktop is the most common place where data is lost. If those files had been placed in a Google Drive folder (or Dropbox) they would have been safe: literally every edit, backed up, in real time. But nothing on your desktop (or your students', if they're imitating you) is being backed up to the cloud. Nothing.

Top Tip: on the Mac, you can create an alias (right-click, or Command+Option drag and drop) of any 'buried' folder or file so there is a shortcut to it on the desktop. It acts just like the real thing (the parent folder), but with the advantage that the original stays safely ensconced within a cloud-backed-up folder.
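If you're comfortable with a little scripting, here is a minimal sketch of the same idea in Python (the folder names are purely illustrative; note that this creates a symbolic link, which behaves much like a Finder alias for everyday use, though the two are not technically identical):

# Minimal sketch: put a desktop shortcut to a cloud-backed folder.
# The paths are examples only; adjust them to your own setup.
import os
from pathlib import Path

target = Path.home() / "Google Drive" / "Current Projects"  # the real, cloud-synced folder
shortcut = Path.home() / "Desktop" / "Current Projects"     # the shortcut on the desktop

if target.exists() and not shortcut.exists():
    os.symlink(target, shortcut)
    print(f"Shortcut created: {shortcut} -> {target}")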

It is Inefficient.

Your computer's desktop is the starting point for your entire computing experience, but, like anything else, if you let it get cluttered your productivity will take a dive and your stress levels will rise; few things are as frustrating as you or your students not being able to find that file exactly when you/they need it, especially if that entails creating it again... and again... Next time you save a file to the desktop, wouldn't it be nice to be able to find it immediately, and not have to engage in an insanity-inducing game of 'Where's Wally'? That's a game I have to play almost every day that I work with a teacher on a desktop like... that *shudders*.


Clean-desk-high-productivity-toblender.com [modified]

It Literally Impedes.

Because of the way OS X's GUI (graphical user interface) works, the icons on your desktop take up a lot more of your resources than you may realise... Just remember that every single icon on your desktop is actually a small transparent window with graphics (the icon) inside, so if you have, say, 100 icons on your desktop you have 100 windows open, each one stealing memory. And no, dumping them all in one folder doesn't really help much; if there are 2,764 files in ONE folder, OS X will still have trouble handling it.


Computer Desktop & Table Desktop

When we work with students on this, we are attempting to inculcate good habits, habits that will last a lifetime; one such habit is to work from desktop zero. An analogy we find helpful is for them to treat their computer desktop the same as they treat their table desktop in the classroom: as busy as it can get in the course of a normal working day, every day before they go home they are expected to return that space to what is effectively desktop zero 'IRL' (in real life). Everything gets put in its right place; whether they have finished with a project or not, it goes in the appropriate folder. The difference with computers is that you can actually work on files while they are in the folder: there's no need to take a file out, and so no need to put it back, which is why desktop zero on a computer is easier than desktop zero IRL. In the same way, when you place a file in the appropriate folder (in Google Drive in the Finder), you can leave it there and work on it while it is in there.

So, with this in mind, you shift your conception of the role of the desktop: it becomes a temporary, easy-to-locate, grab, upload, rename, "I need it in ten minutes or so" dumping ground. I only use my desktop as a temporary holding place for files I'm working with; nothing remains there past the end of the day.

Cluttered desk via abcnews.com (Getty Images)

Upgrade Your Workflow

In actual fact the desktop is a folder; it's just the folder you start from, and while it can function as a storage folder, as so many people have unfortunately proven, that is not its purpose. It was only created as an analogy, so people would have something analogue to relate the new digital experience to, just like the trash can in the corner; we don't really keep tiny trash cans on the corner of our table tops, but it functions as an approximate analogy. And like most analogies, it has its limits. One way forward is to start working the way you do when you use an iPad or similar device.

Newer OSes like iOS and Android have thankfully ditched the "file icon sandbox" idea. The only thing you are presented with when you look at your device is a launchpad for apps and services. Your data is invisible and agnostic, available only when you are in a program that knows how to display or use it, and you know what? It works just fine. No clutter.

Become more app oriented and less file oriented

In iOS, if you're working on a file, you start by opening the app, then you locate the file from within that app; the exact same method works on the desktop. Working on a Word document? Don't look for the file first: open Word, and you will easily find any recent files in the recent files view. All you need to do is drop down the menu two spaces from Open, to Open Recent. There, that's not so hard, is it?

Open Recent, don't just Open.

You will find the same feature in any application you use. Trust me. These conventions are cross-platform, which means you will be able to take advantage of this workflow no matter what computer or platform you ever use. Invest in it now, and you will reap the rewards for the rest of your life.

File less, search and sort more

I've written about this already here: spend less time creating and organising folders (although that is important too) and make sure you name your files with keywords you can search for. Instant search is now everywhere on all your devices, and on your Mac you can search in literally any folder you open, from 'All My Files' to 'Documents'; if it's in there, somewhere, search will find it, regardless of the folder it's in. But that's no use if the file is called 'Untitled.doc' or 'Screen Shot 2015-03-14 at 5.38.12 am'. Rename it, then move it.
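To see why keyword-rich filenames pay off, here is a minimal sketch in Python of the kind of recursive filename search that Spotlight does for you behind the scenes (the folder and keyword are hypothetical):

# Minimal sketch: find every file whose name contains a keyword,
# however deeply it is buried in subfolders.
from pathlib import Path

def find_by_keyword(root, keyword):
    return [p for p in Path(root).expanduser().rglob("*")
            if p.is_file() and keyword.lower() in p.name.lower()]

# A well-named file turns up wherever it lives; 'Untitled.doc' never will.
for hit in find_by_keyword("~/Documents", "report"):
    print(hit)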

Sort out your Sorting

When you have a bunch of files on display in the Finder, make sure you take advantage of the button which lets you 'change item arrangement'. Pick whatever option makes it easiest to move the files you want to the top; I personally find 'Date Modified' to be the most useful, but there are options there for everyone.

Illustration by Ben Wiseman via nytimes

Don't procrastinate, you can do it today!

The solution is not to just create another folder (which is actually inside the folder that is the desktop) and dump them all in there; that just means you've buried the problem. By all means dump all the files in a (cloud-connected) folder (or 3 or 4), just make sure you've deleted the files you won't need again, and give the ones you do need a name you can search for. Once you've done that you'll probably find there are 'themes' forming that lend themselves to folders, but don't let that be an excuse to procrastinate, as you can always change your mind later; computers are convenient like that...

Clean desk[top] policy via awanbee.com


07 April 2016

Aims, objectives and semantics


One of the first goals we were faced with on our first visit with the T2T Cambodia team was to really establish what the fundamentals of a lesson need to be. It is not until you are forced to defend your rationale for the structure of a lesson that some of the issues of semantics really do come into focus.

Take the typical traditional lesson structure:
  1. Objectives
  2. Activities
  3. Outcomes

With some seasoning from our recent workshops in formative assessment with Dylan Wiliam, this quickly morphed into something a little more nuanced when combined with the 5 key strategies of formative assessment, the first three of which are more or less synonymous with the traditional lesson structure:
  1. Clarifying learning intentions
  2. Eliciting evidence
  3. Feedback that moves learning forward
  4. Students as learning resources for one another
  5. Students as owners of their own learning

We ended up with something more like:
  1. Learning intentions/objectives
  2. Activities that elicit evidence 
  3. Outcomes as a result of feedback  

And before you know it, with a room full of teachers, it looked like this:
  1. Learning intentions/Aims & objectives
  2. Activities that elicit evidence through active engagement
  3. Outcomes informed by feedback and based on clear success criteria 

Now try explaining all that through a translator to a room full of teachers, without air conditioning, in temperatures in excess of 30°C, with only the most rudimentary of teaching resources...

What I have found is that you find yourself having to distil everything down to the absolute bare essentials, which for me now looks something like this; funnily enough, the process has enhanced my understanding of my own practice and, in turn, hopefully improved my practice as a teacher.

For me it has ended up being as simple as:
  1. Aim or Goal or BIG IDEA
  2. Activity
  3. Feedback

But, and this is essential, it has to be iterative.
[Aim achieved? Great. No? Either change the activity (or maybe even the aim), then try again.]

Getting the Aim right is CRITICAL. If the aim is any good, then in order to achieve it you will have to move through a series of "objectives", which will automatically require the achievement of "learning intentions" and the design of an activity that facilitates those goals, but that ultimately has one outcome: the achievement of the aim...

Example...

The last IT lesson I observed had learning intentions of:
  • Create a table in a spreadsheet
  • List occupations
  • Add a new column for images
  • Insert images that match the occupations

But what was the AIM? A well-considered aim would make the individual learning intentions redundant. Of course the aim has to be worthwhile, authentic, meaningful… in this case, because it was a FOCUS lesson, I was able to intervene (and redesign the lesson with the teacher) right there, right then. What we did was establish an aim, which in this case was...

Use a table to compare a range of at least 5 career opportunities that interest you. Consider the following aspects of each of the occupations you have chosen:

  • Title
  • Brief description
  • Illustration
  • Positives
  • Negatives
  • Salary
  • Qualifications required

With a well-written aim, the specific articulation of learning intentions naturally follows; agonising over them is no longer necessary, as they will have to be identified in order to fulfil the aim of the lesson. Don't they need to be expanded? Articulated in sentences? I don't think so: any teacher worth their salt will put the meat on the bones, and hopefully also provide feedback in relation to those specific learning intentions; whether or not they actually need to write them on the board is another question.

What was even better was that it quickly became obvious that there were quite a few aspects of the occupations that interested students that none of us were in a position to answer… for example, salary. Instead, we asked the students to estimate what they thought the salary per month would be, and we did the same for each of the other aspects of the occupations they chose. Then (using the students as learning resources for one another) we asked the students to compare their work… this sparked some passionate discussions, as some students had (for example) the police officer as the lowest paid, while other students had the police officer as the highest paid… Discuss!

What was fascinating was then getting the students to research online to find the actual answers, compare their estimates with the reality, and reflect on the disparities or consistencies that they found.

What started out as a rather banal activity in table creation and meaningless data entry became a transformational lesson in career guidance, while also fulfilling the (arguably more mundane) ICT requirements. It's all about the aim.

06 April 2016

21st Century Spelling


Spilling had never bin maw impotent

Spelling has never been more important, as my example above attempts to illustrate. In an age dominated by screens, misspelling is tantamount to an admission of idiocy. Please note that none of the words in the title are actually misspellings; but mistakes they are, and a right twazzock you will look if your spelling is overly reliant on proofreading tools as a safety net. It's time we took account of the fact that, in a world dominated by screens, the way we teach spelling needs to evolve to take advantage of the unique affordances and challenges of spelling in a screen environment.

These days, interacting with others in a digital environment is an extremely commonplace scenario. Even more critical: people who misspell in these environments are generally assumed to be less intelligent and less articulate, and whatever their actual intelligence or experience, their perspective will be dismissed or demeaned if it is littered with misspellings. It has never been more important to master the ability to spell correctly. Unfortunately most schools, despite the criticality of spelling in the 21st century, still rely on 19th-century strategies to teach spelling. This really does need to change. So, with that in mind...

14 critical considerations:

  1. Spelling should be managed within the context of writing, and not as a separate "subject". For that reason, keeping a separate spelling book is discouraged; a better practice is to think of and learn about words, misspellings and sounds within the context of writing. So, for example, words that are encountered that are challenging to spell should be recorded at the back of a student's writing book, not in a separate spelling book.
  2. Less is more: more frequent opportunities for kids to think about spelling, but for much shorter periods of time (10-15 minutes per day).
  3. Make direct connections between spelling and handwriting: the physicality of forming the letters as they are literally connected is meant to reinforce the way the sounds are connected, and builds visual reinforcement. The idea is to combine physical, visual, oral and aural practice to reinforce the feel, the shape and the sound of a word.
  4. Children (and adults) can only spell words they know. It sounds obvious, but so many of the spelling lists that are used with students contain words they do not know, and so could not possibly spell other than through guesswork, which leads us to...
  5. There is much greater validity to the skill of being able to "guesstimate" in a TELE (technology enhanced learning environment), and 'phonological awareness' is more essential than ever, as computers rely on an accurate phonetic estimation as a substitute for a correct spelling. A student who cannot phonetically 'attack' a word is unlikely to be able to approximate something that a computer can correct. Related to this is the critical importance of being able to spell the first half of a word correctly: most modern computing devices can auto-complete a word if a student is able to spell the first half of it correctly (see the sketch at the end of this list). Apple's 'QuickType' in iOS 8 and apps like SwiftKey utilise this approach very effectively, and the power of dictation (speech to text) has never been greater, though it will still struggle with homophones (same sound, different spelling and meaning). An alternative approach in a 'spelling test' context is to award two marks per word: one mark for spelling the word phonetically correctly, or for spelling the first half correctly, and two marks if the word is perfect.
  6. Stop using whole-class spelling tests with lists of words; this is a nonsensical approach, considering the sheer quantity of words in the typical English dictionary, somewhere in the region of 400,000. The words that children learn should be unique and curated from their own literacy life: related to their own writing, reading, speaking, viewing and listening experiences, or to specific vocabulary that they are using or have used (not will use) in a current unit of study.
  7. Wordlists curated by students should be seen as a source of vocabulary expansion, not just spelling, becoming a personal thesaurus/glossary that they review regularly when writing to enhance the richness of their prose; use it or lose it.
  8. Less reliance upon "spelling rules", which are very rarely consistent and in many cases can lead to a great deal of confusion, like when students are asked to note the position of a certain vowel in a word and its impact upon other vowels or consonants within that word, or acrostics like 'big elephants can always understand...' (you get the idea, and of course they only work for one word). Instead, focus on building familiarity with the way words look and the way words sound, so 'look, say, cover, write, check' still works well as a useful skill/drill practice, but with fewer words, more often. This is strongly related to the student's reading life as a synergetic enabler of their spelling life. It becomes a context where students are encouraged to see words as 'friends' and to build a large community of 'familiar faces'; i.e. the more they see these words, the more likely they are to be able to spell them, or, arguably just as important in the 21st century, to recognise when a word is not spelt properly: 'it just doesn't look right'.
  9. Skill drill tasks (practice makes permanent) should also be related to an activity that reinforces comprehension of the meaning of the word. Ideally, students should invent (not copy) a sentence that uses the word, or even better, more than one of the words in the same sentence, clearly demonstrating that they can use the word(s) with understanding. For some students it might be better to make an oral recording of themselves speaking the sentence rather than writing it, if writing is a challenge for reluctant writers; the focus is on understanding meaning, and oral recall can be just as effective for building meaning. This is especially important with homophones.
  10. More recognition of the kinds of spellings that are particularly relevant to a screen-centred writing environment; this means a greater emphasis on distinguishing between words with similar sounds and different patterns: homophones, homonyms, homographs.
  11. Make smarter use of digital tools to facilitate this kind of practice. While spelling activities built on skill drill using pre-set wordlists have their place, particularly for high-frequency words with younger learners, for older/more proficient students encourage spelling drills built on individually curated wordlists. Unfortunately, Apps that facilitate this kind of curation are not very common, but at least one that does it very well is Squeebles SP, although you have to ask students to pretend to be a teacher to do so.
  12. Use a word processor to spell check before using a teacher. This could be as simple as a Notes app on a mobile device, enabling students to check spellings without the tedium of using a dictionary. The teacher then reviews the spelling for careless mistakes, or more likely mistakes resulting from misconceptions about phonetics/word structure. Students need to be empowered to build habits of capturing/collecting words that they know, but cannot spell, in their curated lists. The point is, it is better for the student to attempt to type the word in a text application and have the computer suggest corrections than it is for them to try to search for it in a dictionary. While the latter is still helpful, the former is a better cognitive process for learning the spelling of a word, and is also more relevant/likely as a skill in the 21st century; very few adults look up words in a dictionary, most rely on the prompt given by the computer in a word processing environment.
  13. Encourage students to learn how to use the "define" search term in Google, effectively turning any Google search window into a handy dictionary, e.g. define: magnificent
  14. Digital technologies are changing which words are traditionally understood to be "tricky" words/spelling demons/sneaky spellings… so, for example, a word typed in a text environment will automatically have the 'ie'/'ei' corrected in a word like receive, but the computer will not be able to distinguish between homophones.
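As promised in point 5, here is a minimal sketch of the prefix matching that auto-complete builds on; the word list is purely illustrative, and real keyboards like QuickType and SwiftKey use far more sophisticated language models:

# Minimal sketch of prefix completion: if a student can spell the
# first part of a word correctly, matching candidates can be offered.
WORD_LIST = ["necessary", "neighbour", "receive", "restaurant", "rhythm"]

def complete(prefix, words=WORD_LIST):
    return [w for w in words if w.startswith(prefix.lower())]

print(complete("nec"))  # ['necessary']
print(complete("re"))   # ['receive', 'restaurant']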


Squeebles Showcase

Squeebles Spelling - multimodal drill and practice
I'm not usually one to emphasise a tool, but from time to time a tool emerges with affordances that are ridiculous to ignore, and Squeebles Spelling is one of those. Digital tools like Squeebles can transform spelling practice, making traditional equivalents pale in comparison. Consider the following:

Flexibility


Click to see Squeebles in action! 
Kids can 'masquerade' as a parent or teacher to curate their own lists; careless errors are mitigated by the built-in spell check (obviously this feature is not activated when they are actually practising!). Alternatively, there is a wide range of built-in word lists to choose from, catering to all skill levels.

Multimodality and meaning

It's not enough to spell a word; kids need to know how it sounds and understand its meaning. In Squeebles kids can record the sound of the word, as well as place it in a sentence, e.g. "Pear. I like the taste of a pear better than an apple. Pear." Better still, make it fun by having the kids make up silly sentences; as long as it shows they understand the meaning, anything goes! This makes the activity aural and oral: the kids say the word, hear the word, and see the word.

Immediate feedback - differentiated

No need to wait for a teacher to collect in all the spelling tests, then wait a few days to get them all back; even then, actually acting on the spelling errors is a chore, never mind tracking these over time. Squeebles provides immediate feedback, and even better, keeps a record of any errors in a collection called 'Tricky Words' that reflects the words this individual is struggling with.

Motivation

Last and maybe least, Squeebles 'gamifies' success with mini games, so kids feel a tangible sense of reward, over and above the real reward: improved spelling.



05 April 2016

Deliver us from tedious tests and rubrics


via hippoquotes
Assessment drives everything educational. So, not surprisingly, assessment is the biggest factor in terms of planning the use of tech in effective ways. This means that it's critical to ensure that we use a varied range of assessment strategies, which is where I find a surprising lack of options.

Why do so many teachers assume that only rubrics and tests are suitable for assessment? Sure they have their place but only within a suite of assessment strategies...


It feels to me like every educational reference I read or hear about, especially in tech circles, assumes that the only viable option has to be a rubric. I don't mean to denigrate any particular assessment tool; clearly rubrics and tests can be effective assessment tools, but when they dominate, they have an unfortunate tendency to diminish the importance and efficacy of all the other tools that are available. It is depressingly common that in virtually any educational context (classroom, conference, online), when the conversation inevitably turns to assessment, the question defaults to 'what rubric or test will we use?' rather than any awareness that there is a plethora of other tools and strategies that could be just as effective, if not more so.

Now I am conscious that I may be overstating my point; after all, I have to confess I don't hate them. I hate the way they are so often assumed to be the only option worth considering. I loathe the majority I see, which are poorly conceived and poorly written: often bloated, verbose attempts at teasing out questionable differences in attainment, many seemingly based on the assumption that just adjusting superlatives is sufficient... well, very well, independently, with assistance...

Of course I'm not the only one who has a problem with rubrics:

The most famous is probably Alfie Kohn, who speaks to the false sense of objectivity and how rubrics have misled many.

And I really like Joe Bower's take on Rubrics, in 'The Folly of Rubrics and Grades'

"Grades and rubrics are a solution in search of a problem that will further destroy learning for its own sake.  
It’s been five years since I used a rubric. I simply don’t need them, nor do my students.
Rather than spending time asking how can we grade better, we really need to be asking why are we grading. And then we need to stop talking about grading altogether and focus our real efforts on real learning."


Most of the rubrics I've seen could easily be replaced by a continuum; at least then all you would need to do is define the extremes. But, and I guess this is a statement about teaching as a profession, far too many teachers use the term 'rubric' as if it were synonymous with 'assessment tool'.

Rubrics are one of many ways to assess learning, and they are used far too often. Used well, a rubric can be a powerful assessment tool, but in my experience I rarely see them used well, and I often see them used inappropriately. So, yes, they have their place, but only within a suite of assessment strategies...

Here's one way to use a rubric well: make it more student-centred. The teacher defines a central standard (e.g. a level 3 on a 5-point scale) and then leaves the students to define and justify the level they feel their work sits at in comparison to that (above, below, or in the middle), with examples.

There are other ways to assess... 


Next time you're assessing, at least consider some alternatives to rubrics. Now, before someone accuses this of being more new-fangled thinking, here's some out of the Ark:



But one of my favourite summaries of assessment strategies and tools, is this grid from the PYP:


Unfortunately the PYP is allergic to the term 'tests', and you may be surprised by their omission from this grid; I believe they (somewhat disparagingly, and somewhat simplistically in my opinion) categorise tests as 'checklists'. Still, if more educators made the effort to tick all the elements in the above diagram in one year, everyone would be a winner.

Do less, but do it better.

Now of course it's highly possible that teachers are unaware of the wider range of assessment tools they already use effectively almost every day, from ad hoc/informal conversations with students ('conferences' in the jargon) to spirited class debates (not lectures) that utilise skilful Socratic strategies, all of which are valid assessment tools in and of themselves. The problem is that I think these are seen as somehow inferior to a "proper" test/rubric. All this does is create a lose/lose scenario for the teacher and the student. Rather than focusing on tests and rubrics, wouldn't it be better for everyone if we were to embrace a much wider toolkit when it comes to assessment? To see them all as valid/powerful? Maybe that conversation/conference was so effective that adding a rubric or a test is not only unnecessary but possibly even counterproductive.

If you asked most teachers why they rely so heavily upon rubrics and tests, as opposed to all of the other powerful forms of assessment, I think you would find they would point to one sad fact: they feel they need paper with marks on it that they can attach a grade to, so they can point to it as hard evidence of their assessment judgement. While there is clearly a place for this kind of formal (usually summative) judgement, in my experience it is far too frequent and far too common. Teachers could do themselves, and their students, a favour by focusing on the goal of learning rather than the need to have a hard artefact as evidence of every stage of progress.

What if instead we were to focus on the goal? That is, as long as the assessment tools you use allow you to provide effective individual feedback to students, and enable them to progress in their learning to the point where they are improving compared to their previous level of competence (ipsative assessment), then the goal has been achieved! So why not work a little smarter and use a far more varied range of assessment tools? In so doing you create a classroom environment that is more dynamic, and far more effective for both the teacher and the student.

So what does this have to do with edtech?

From my perspective, a classroom that exploits a wide range of assessment tools is a much richer environment within which to integrate digital tools that can truly enhance and transform the way teachers teach and the way students learn, and demonstrate the extent to which students have mastered the skills, knowledge and understanding that are truly the point, not just in ways that can be measured quantitatively on another test or rubric. You don't have to look much further than an early childhood classroom to see this in action. Why? One thing these very young students can't do is demonstrate their understanding via tests or rubrics, which opens up a whole range of extremely rich, engaging ways of demonstrating skills, knowledge and understanding that would benefit many considerably older students.

04 April 2016

Kids, Concentration, Boredom, & Tech

Photograph: John Slater/Getty Images


Boredom is not a new problem; it is a condition that has, to a greater or lesser extent, been an aspect of human existence for eons. And yet it seems to me that a pervasive myth is developing, along the lines of assuming that boredom is the fault of computers, and that students who use computers are students who cannot concentrate. Articles like these are a case in point:

"Technology Changing How Students Learn, Teachers Say"

also

"Technology Creating a Generation of Distracted Students"


The general gist of the arguments could be summarised thus:


Teachers (from middle and high schools) say today’s digital technologies “do more to distract students than to help them academically.”

"There is a widespread belief among teachers that students’ constant use of digital technology is hampering their attention spans and ability to persevere in the face of challenging tasks, according to two surveys of teachers..."

"... roughly 75 percent of 2,462 teachers surveyed said that the Internet and search engines had a “mostly positive” impact on student research skills. And they said such tools had made students more self-sufficient researchers."

... nearly 90 percent said that digital technologies were creating “an easily distracted generation with short attention spans.”

... of the 685 teachers surveyed in the Common Sense project, 71 percent said they thought technology was hurting attention span “somewhat” or “a lot.”

That said, these same teachers remained somewhat optimistic about digital impact, with 77% saying Internet search tools have had a "mostly positive" impact on their students' work.

Arguments abound, although ones like this strike me as quite strange:

"This could be because search engines and Wikipedia have created an entire generation of students who are used to one-click results and easy-to-Google answers."


Wait. What?

You're saying that if you can get an answer to a question with one click, that is a bad thing? Sure, there will be times when you will have to do a lot more than one click, because you have not been able to get a satisfactory answer to the question. But... if I could get a good answer in one click, believe me, I would. If anything, access to the treasure trove of information that is the Internet makes it much easier to get a multiplicity of sources, rather than only one; much easier than I ever could with books. Yes, I said it.


If your students can get the answers to your questions with one click... You're asking the wrong kinds of questions, boring questions. Maybe try asking questions that they can't just google, or that are difficult to google?



So. To the hordes of disgruntled teachers who are so quick to blame technology for short attention spans, I have this to say.

Get better. Get creative.

If your kids are bored, that is because you are boring them; you are allowing them to be bored. Face it, move on, build a bridge, get over it, and use this as impetus to improve. As Dylan Wiliam says, "teaching is the hardest profession because you can always get better at it", and "a complaint is a gift" (although it won't feel like that at the time).

"The cure for boredom is curiosity. There is no cure for curiosity."

(Widely attributed to Dorothy Parker)

"by removing lecture from class time, we can make classrooms more engaging and human." 

"Why Long Lectures Are Ineffective" Salman Khan


It is unfair to blame technology for short attention spans… we (the human race, not just kids) have had short attention spans for many years; it's just that students are now less inclined to put up with it. Certainly the Time magazine article cites research from 1976, well before the advent of digital technology as we know it. I was a (bored) 6-year-old.


I know this may come as a huge shock to anyone who knows me, but I have always had a short attention span, and that predated computers by at least a decade... I am not the only one. Chances are many of them are in your class (and are also your students' parents).


In 1996, in a journal called the National Teaching & Learning Forum, two professors from Indiana University — Joan Middendorf and Alan Kalish — described how research on human attention and retention speaks against the value of long lectures. They cited a 1976 study that detailed the ebbs and flows of students’ focus during a typical class period. Breaking the session down minute-by-minute, the study’s authors determined that students needed a three- to five-minute period of settling down, which would be followed by 10 to 18 minutes of optimal focus. Then—no matter how good the teacher or how compelling the subject matter—there would come a lapse. In the vernacular, the students would “lose it.” Attention would eventually return, but in ever briefer pockets, falling “to three- or four-minute [spurts] towards the end of a standard lecture,” according to the report.


Just in case you didn't catch that. Let me just make that a little clearer:

10 to 18 minutes of optimal focus.

That's it.


So, what we need to do, instead of complaining, is get creative.


via technorati

Maybe, just maybe, boredom is nature's way of telling you that you need to change.

03 April 2016

To Code or Not to Code?

That is a good question - and one I am commonly asked by both parents and teachers.

Technically this is not code; it is script...

There is a LOT more to this than a simple yes-or-no answer, but my opinion is that I'm not convinced that encouraging kids to become coders (more accurately, computer programmers; 'coding' is really a slang term) is a great idea. I think they should learn to code, if they're keen, but only so they can understand it better, so they can be creative with it. You see, you can employ coders; they are a dime a dozen, all over the web. It's the creative 'big picture' aspect that is lacking, i.e. what to code, not so much how.

That said... it's hard to know what you can do if you don't know how. Basically, you don't need to be the best coder; you need to be good enough to really know what its potential is.
"Someday, the understanding of computational processes may be indispensable for people in all occupations. But it’s not yet clear when we’ll cross that bridge from nice-to-know to must-know." 
http://www.nytimes.com/2012/04/01/business/computer-science-for-non-majors-takes-many-forms


"But is it really crucial to be able to code? Many content producers use technology virtually every waking hour of their life, and they don't know a variable from an identifier, or an integer from a string. Personally, I'm conflicted: I have a technical background, but for most people I just don't see how being able to compile code is going to prove useful."  
http://m.gizmodo.com/5897020/is-learning-to-code-more-popular-than-learning-a-foreign-language
"Coding is not a goal. It’s a tool for solving problems. [...] However, much of the “learn to code” frenzy seems to spring from the idea that you can achieve fame and riches by starting a tech company and you need to actually code something first. Programming is not a get-rich-quick scheme. Even if you do hit the jackpot, the CEOs of successful tech companies do not spend a lot of time coding, even if they started out behind a keyboard. There are simply too many other tasks involved in running a company. So if coding is what you really love to do, you probably wouldn't want to be a CEO in the first place.."  
http://www.fastcolabs.com/3020126/no-you-dont-need-to-learn-to-code

Please don't advocate learning to code just for the sake of learning how to code. Or worse, because of the fat paychecks. Instead, I humbly suggest that we spend our time learning how to …
• Research voraciously, and understand how the things around us work at a basic level.
• Communicate effectively with other human beings.
These are skills that extend far beyond mere coding and will help you in every aspect of your life.  
http://gizmodo.com/5910497/please-dont-learn-to-code 

So don't believe the hype; there is no more need for this generation to learn to code than there was for the generations that preceded them to do the work of a car mechanic.

Clearly there is no shortage of people who want to code, in the same way that there was no shortage of people throughout the 20th century who wanted to become automobile engineers, and those who have the predilection will. The point is, it's not hard to act on it, to make it happen, and... if you can't, then coding is probably not an option for you.

Compare that to, say… learning the oboe; that's not quite so easy to learn if you only have a computer and an internet connection. But there are millions of people out there who do, honing their coding abilities every day, and they don't expect to be paid as much for it as you might think.

So - how do we learn this stuff?

All the people I know who are any use with IT and ICTs (yes, there is a difference) basically taught themselves (myself included). It's almost a rite of passage. My instinct tells me that the kind of kids who can code WILL code, and if they can't find ways to teach themselves using the plethora of resources online, then they probably haven't got what it takes. Despite the glowing 'FUN, FUN, FUN!' messages that proliferate from some quarters of the web, the truth is that if you want to code, really code, you will need to work hard and persevere; nothing worth having comes easy, and coding is no exception. Simple as that.

"Top companies expect you to know what a recent comp-sci graduate would know, which could include SQL vs. NoSQL databases, the time complexity of some algorithm, or how to implement binary search. ... opportunities are few and far between."

"While there are some excellent companies willing to hire driven and intelligent self-taught engineers, they lie in the minority. Many companies pass over candidates without a formal degree in computer science before reading on; the stigma of low experience is a hard one to break in any industry but especially in those involving technical abilities."

http://qz.com/193896/no-three-month-course-can-teach-you-how-to-code/

I have never been taught 'IT', but I had to teach myself HTML to design web pages, and ActionScript to create Flash animations. At its best, that is what things like coding, 'computer science' and subjects like DT teach kids: YOU can solve your own problems, and you can teach yourself how to do it. The WWWHWW of getting from A to B, even if it means going through D, H and X to get there. The first time.

That's another argument for coding: not so much as a skill for the workplace, but for the process, the rationale it demands. Here's a quote from my colleague Helen Leeming, who teaches IT in MS and HS, from an email exchange we had on this subject. This point about developing critical/analytical thinking through coding is powerful:

"It isn't the coding… its the critical thinking… they don't need to code any more than they need to be able to do quadratic equations - for most people either would be redundant the minute they walk out of school. But they do need to have stretched their minds, to have made their thoughts work in a different way, which both of those will. Almost none of them need to code (or indeed use a lot of what we teach them in school - ox bow lakes for example), but the ability to problem solve is essential. It could be taught through other things, it simply isn't in many cases… And people rarely choose to learn critical thinking unless they are an 'IT geek' and they are the ones that probably can already do it."

"I don't understand why people question that this needs to be taught because people won't be coders, while we still teach algebra and the periodic table to kids who will not be mathematicians or chemists. Education is not about learning a set of knowledge or practical skills that you can use later; it is about teaching you to think, to think in many different ways, to play with ideas in many different ways, and to have a toolbox of techniques to address puzzles or problems you meet later. Abstract, critical thinking is one of the tools…"
It should be remembered that one of the best ways to get to grips with the kind of logical thinking demanded by coding is by using spreadsheet functions, for example in Google Sheets, right there in the browser, and then moving on to writing your own formulae to solve basic mathematical problems; that, right there, is the basis of writing code. Start with a formula as simple as =A1+B1, then move on to things like IF functions:

=IF(A1<B1, "awesome", IF(B1<A1, "amazing", "a tie"))
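For comparison, here is how that same conditional logic might read in a beginner-friendly language like Python; a minimal sketch, with a and b standing in for cells A1 and B1:

# The spreadsheet IF formula above, expressed in Python.
def compare(a, b):
    if a < b:
        return "awesome"
    elif b < a:
        return "amazing"
    else:
        return "a tie"  # the case a two-argument IF would leave out

print(compare(3, 5))  # -> awesome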


So, my advice to potential coders would be: learn to walk before you run. Or, more precisely, learn to walk (Scratch), run (Stencyl) and jump (Alice); then you can really get creative (dance) with the source code:



All of the tools below are free, and come with great support materials, tutorials, and communities to get you from A to B, even if you have to travel via N and X.

Coding for kids

Some of the iPad Apps we use to introduce kids to coding

Here's a great set of Apps you can use to introduce your child to coding, even from Kindergarten. This is my suggested sequence of progression, from games that teach the kind of logical thinking needed for coding, to Apps that allow free-form creation:


  1. Daisy the Dinosaur
  2. Tynker
  3. Lightbot
  4. Move the Turtle
  5. Hopscotch
  6. Scratch Jr 

Do it yourself...

  1. Start with iPads to learn the basics of control and computer programming thinking, with Apps like Daisy the Dinosaur, Hopscotch and Move the Turtle. Apps like these use a drag-and-drop interface through which kids intuitively grasp the basics of objects, sequencing, loops and events by solving app challenges.
  2. Move to Code.org,  Scratch http://scratch.mit.edu/, or www.tynker.com 
  3. Progress to Stencyl http://www.stencyl.com/ for iOS App coding using a similar 'block' interface, or alternatively App Inventor
  4. Then download the Xcode App for free from the Mac App Store if they feel they are ready to actually use Xcode; there are many online tutorials that can help with this, such as this one.
  5. Try http://www.codecademy.com/ for learning a range of programming languages. 
  6. Then to Alice http://www.alice.org/ 

By then you should be ready for the source code; this site, hackerbuddy (http://hackerbuddy.com/), will help with this final stage... one-on-one mentoring for startup hackers.

… but even then, which language?

C
Python
JavaScript
PHP
C++
Java

or Swift/Objective-C (via Xcode) for coding iOS Apps

And there are many more ... http://langpop.com/
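And if you're wondering what a first taste of 'real' textual source code looks like, here is a minimal sketch in Python (one of the languages above) of the same loop-and-condition logic kids will have already met as blocks in Scratch:

# Loop-and-condition logic, Scratch-style, written as Python source.
for number in range(1, 11):
    if number % 2 == 0:
        print(number, "is even")
    else:
        print(number, "is odd")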

But I would imagine for most kids the biggest motivator would be to create an app using Xcode. You can port to Xcode from Stencyl, but you have to pay $150 to enable that feature; so you can learn for free, and you only need to pay when/if you're ready to put something into the marketplace. Clearly it is the desire to create 'Apps' that is driving the current resurgence of interest in coding. For more on this phenomenon, read this article.


We now also facilitate the UWCSEA coding community through our ECA programme for MS and HS students. If your child is in Primary and impatient to get going, learning Scratch and Stencyl will ensure they are more than ready by Grade 6; and of course from Grade 9 students have the option of following a course in Computing, all the way through to Grade 12 if they so choose. Middle School includes a module of coding through Lego Mindstorms in DT, and we offer IGCSE Computing, IGCSE IT, and IB Computer Science in High School.