
Wednesday, 22 February 2017

Whiplash (2014): A Review

Believe it or not, I have a schedule or agenda of sorts for this blog. However, I rarely actually stick to it because, you know, writing stuff is this blog's philosophy and, as a lecturer once told us, inspiration is stochastic. Or, at least, something like that (he was talking about how essays are invariably written in exams). But the thing is that I believe there's also some kind of "it comes to me" moment of greatness... that you could practise whatever with all the time in the world, but that truly inspired moment of Great Art was in there to start with and you just reached the technical level that would let it out. Which is the context and the philosophical frame of mind for this review of Whiplash.

Now, it must be said that I don't listen to jazz all that often any more. Indeed, "Everybody Wants to be a Cat" is probably my most-listened-to jazz song. Yet it remains that I don't think great jazz is made with drumming like you might get in "Wipeout" or "Trampled Under Foot". So was I really ever going to enjoy Whiplash, a film about jazz drumming? Probably not. As it turns out, though, making the main character a total dickhead was the main problem. But, you know, it is obvious why film critics liked it. Spoilers to follow, but we'll start with this quote from a review, The Uncomfortable Message in Whiplash's Dazzling Finale:
In Whiplash, jazz drummer Andrew endures a brutal, sustained campaign of bullying and abuse, both psychological and physical, at the hands of Fletcher, the conductor of his conservatory's prestigious studio band. He eventually washes out under the extreme pressure and, at the urging of his concerned father, anonymously gets Fletcher fired for abuse. In the final scene, Andrew ends up at Carnegie Hall subbing in for Fletcher's concert band. It's a final cruel ruse orchestrated by Fletcher, who wants to humiliate Andrew publicly by cueing him up to play the wrong music.
Well, I think basically all of this is wrong.

The way things start out is pretty simple: Andrew stays up late practising and a dude of importance happens to hear it. Indeed, from what I can tell, Andrew's playing so that Fletcher might hear. It then cuts to his normal band (class?) session where, you know, it turns out Andrew doesn't seem to get on with people. Or, at least, he has trouble associating with the dude he's the "understudy" for. But Fletcher turns up and rescues Andrew from being ordinary. The film doesn't say that then, but that's what Andrew thinks. He is immediately told to turn up to Fletcher's band session three hours before it is due to start. Why? Well, I would guess that Fletcher intended the extra time to be used to practise (assuming Andrew has drive), but Andrew just sits there instead.

Now, let's be clear, just wanting to be somewhere doesn't mean that you sign up to be abused. But how abusive is Fletcher? Throwing chairs at people is extreme but that's it. The other instances the author is presumably thinking of basically consist of Fletcher forcing the three drummers to repeat the same part of a particular song they're apparently out of tempo for (it all sounds the same to me). For hours, admittedly. None of them are, at this point, unbelievers... but maybe that's the point. Maybe Fletcher is abusing his status as a god among mortals. But really the dude's style consists of telling everyone they're gay in aggressive phrases. It's mean but not sociopathic. Which is where we get back to Andrew.

Does Andrew wash out? No, not at all. What he does is snap. You see, during his first concert as an understudy to Fletcher's original drummer, Andrew puts a folder down on a chair and it... vanishes. I thought it was stolen, but the rest of the film makes me think Andrew disappeared it, knowing that it would give him the opportunity to play. It does and he does. Indeed, Andrew interprets his own playing as his assumption of the core drummer's chair. He simply has to tell everyone about this. But it turns out Andrew, just a month in, is totally unable to be overshadowed. He can't let his cousins/his dad's friend's children overshadow him: the dinner has to be about Andrew and they must be denigrated... or, rather, put in their place (third division sports, a model UN). This is the first sign that the reason Andrew doesn't have friends isn't because he's shy or because he is lonely... but because the dude is a prick. More specifically, Andrew is an entitled prick and it is his seat. He is the Man. How dare the conductor/putter-together of the band even suggest that Andrew's seat is not his? And when things work out badly for Andrew... his bus breaks down, he leaves his drumsticks at the rental car agency, he gets hit by a truck, he screws over the entire band by insisting on playing, screws up (predictably)... he attacks Fletcher and gets expelled.

Which is, you know, when Andrew briefly tells the truth... it was his fault. But then he listens a bit longer and decides to say what they want him to say. That's not a good sign. He's not telling his story at the hearing, but rather a story that will get Fletcher dismissed. And Fletcher knows this.

You see, two characters are humanised in this film. The first is Fletcher. He gets a moment with a friend of his and the friend's daughter. It's strange, but we see it because Andrew sees it. Fletcher also gets a moment prior to the crash, when Andrew (having temporarily lost the temporary place he insists is his God-given right) bursts into Fletcher's office to find Fletcher visibly upset. Why? Turns out an ex-student of his has just committed suicide, but while we don't know that then, Andrew totally fails to notice anything amiss (except his own situation). The other character is Andrew's dad... who both recognises his son's dickness at the dinner party and is there for his son when he needs him to be there (the review does mention his fatherly concern). The point is... we are never given an opportunity to sympathise with Andrew because he's unsympathetic. He is Francis Urquhart.
Great art, or at least a great rendition, has been achieved, but at the total cost of the teen’s humanity. At the beginning of the film, he's obsessively driven and introverted, but relatably so; he works up the courage to talk to a girl he has a crush on, and kindles a brief if awkward relationship with her. He struggles with dinner party conversation. But as Fletcher begins to grind away at his confidence and sanity, Andrew withdraws further, breaking up with his girlfriend in robotic fashion and behaving more erratically until suffering a mini-nervous breakdown.
That's the review's take on Andrew's sympathetic nature. Having checked with Wikipedia... the breakup is after the entitled display in the office. It must be said, at this point, that Tanner and Connolly both take being unseated much better than Andrew. They're upset but are professional about it. Every single time. And, critically, they're not explaining the Achilles Question at dinner tables and arguing that Achilles was right (although, to be frank, I think he liked the concept of friends) on screen. (Achilles is also a dickhead.)
But that bravura ending—a hyper-masculine celebration of punishing dedication and success in a great battle of wills—is impossible to shake. As much as we've regarded Fletcher with horror throughout the movie, Andrew's ultimate achievement is that he finally impresses him, without caveat. Andrew is tragically wasting his effort on this sociopathic void of a man, but you can't help but be stirred by his superhuman effort all the same. Whiplash treads that uncomfortable line as tightly as possible and leaves the audience feeling a little queasy for admiring Andrew's victory, no matter how Pyrrhic it might be.
I think my dislike of Andrew hides the true horror of Fletcher from me, but Fletcher's the dude with friends and, even if he hides suicides from people, he is still visibly upset. Neither are traits of sociopaths. What is sociopathic is a total absence of empathy (see: Andrew) and an active disparagement of the idea of having friends (see: Andrew). Which means even if Fletcher is as truly monstrous as the review argues, Andrew's completely complicit and all too ready to believe Fletcher's stories about Tanner and Connolly's current activities. What enormous cost of Andrew's being was borne in order to end up where he should have been at the start: playing as part of something bigger? I don't know. Nicole, maybe? But he did lie to her in that last phone call... there was no "school" to complain about.

I can't drum. I can't play any instrument, actually. I can't read music, either. In short, I have literally none of the talent that Andrew displays. But while I may have ego-maniacal tendencies, at least I like having friends and can see how they're an answer to the Achilles Question. Neither is true of Andrew. Nor, indeed, is it true that Andrew is some Great Talent anyway. All those sessions. All that training. And what? A bunch of bloodied hands. Sure, he sometimes manages to get up to speed, but I feel that if he were really that good, he wouldn't have to bleed to get there. Just sayin'. Perhaps the honest assessment of the film is that both Andrew and Fletcher think themselves gods among men. Trouble is, Fletcher's the one who is close to being right. Andrew's all about dat hubris.

Tuesday, 21 February 2017

Video Is Bad For You

As a statistics student, one of the things you'll do is examine residual plots for signal -- patterns that indicate the model needs further work. However, as people, we identify more patterns than actually exist. This is why Spotify redid its "random" algorithm, why Dilbert has a comic with a "dude" saying 9 repeatedly in his capacity as a random number generator, and why I look at the pattern of my multi-choice answers when guessing (to see if it's something someone thinks is random; yes, this is problematic... am I assuming I am better at perceiving randomness? do I ignore that a computer has probably sorted it in this day and age? Point is, I still do it).
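To see how streaky genuine randomness looks, here's a minimal sketch in Python (the coin-flip framing and all the numbers are mine for illustration, not anything from a real shuffle algorithm): it asks how often 20 fair coin flips contain a run of four or more identical outcomes.

```python
import random

random.seed(1)  # any seed tells the same story

def longest_run(flips):
    """Length of the longest streak of identical outcomes."""
    best = current = 1
    for prev, nxt in zip(flips, flips[1:]):
        current = current + 1 if nxt == prev else 1
        best = max(best, current)
    return best

# 10,000 experiments of 20 fair coin flips each.
trials = [longest_run([random.choice("HT") for _ in range(20)])
          for _ in range(10_000)]
print(sum(run >= 4 for run in trials) / len(trials))  # ≈ 0.77
```

Roughly three-quarters of purely random sequences contain a streak that looks, to a human eye, like signal. That's the "too many patterns" problem in a few lines of arithmetic.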

Obviously this is a big problem if we're modelling something. If we're interested in causality, "fake patterns" will compromise our inferences. If we're interested in prediction, we could build a model of the data rather than the process -- data are assumed to come from a process -- and, hence, we're predicting a para-reality. But, from an evolutionary point of view, this tendency is very interesting because it seems very useful.
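For the prediction half of that, here's a quick sketch (Python again; the straight-line "process", the noise level and the degree-15 fit are all invented for illustration) of what modelling the data rather than the process looks like:

```python
import numpy as np

rng = np.random.default_rng(0)

# The process: a straight line. The data: that line plus noise.
f = lambda x: 2 * x + 1
x_train, x_test = rng.uniform(0, 1, 20), rng.uniform(0, 1, 20)
y_train = f(x_train) + rng.normal(0, 0.3, size=20)
y_test = f(x_test) + rng.normal(0, 0.3, size=20)

for degree in (1, 15):
    coeffs = np.polyfit(x_train, y_train, degree)
    err = lambda x, y: np.mean((np.polyval(coeffs, x) - y) ** 2)
    # The degree-15 model memorises the noise: training error shrinks,
    # but error on fresh draws from the same process typically blows up.
    print(degree, round(err(x_train, y_train), 3), round(err(x_test, y_test), 3))
```

The degree-1 model learns the process; the degree-15 model learns the data, i.e. the para-reality.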

Imagine, if you will, that you're out and about in the jungle. Now, this is a scary place to be for anyone who isn't equipped with fairly modern weapons (and, even then, in many respects it still is) but it's also a very busy sensory environment. As a species we're a great prey animal. Bad hearing. Crap sense of smell. Lots of mass, i.e. food. Limited capacity to run, climb or swim to safety. Oh, and we have no defensive weapons -- no canines, no claws and no power (a chimpanzee is a lot stronger than a human). Basically all we can do is notice a lot of stuff (mostly visually), link it together and give it a meaning. We can also easily pass on this meaning to other people.

One of the big ideas in statistics is the null. Basically, a statistician assumes innocence or boringness -- this thing did not do/is not associated with this cool/interesting thing. In other words, statistics is very cautious and (intellectually, not politically) conservative. But this is a taught characteristic, and the equivalence is taught too: what is conservative ultimately depends on a lot less than what is cautious does. The conservative reading is the meaningless one. Or, put another way, science generally assumes squares living in square worlds. But for our jungle walker, cats are real and it's a dog-eat-man world. Caution, there, means assuming meaning -- but in both situations it's about harm minimisation. Specifically, it's about the relative harm. If you're not cautious in the jungle, you'll ignore the noisy rabbits (saving energy) but you're probably going to be eaten by a tiger. If you're not cautious in academia, you could be "ahead of the curve" but realistically you're going to be caught out on closer examination. But thinking that "twig snap" = tiger isn't conservative, even if it is cautious. That's the point.
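The relative-harm point is just expected cost. A toy calculation (every number here is invented for the sake of the example):

```python
# Two rules for responding to a twig snap, under made-up numbers.
p_tiger = 0.01        # assume: almost every twig snap is a rabbit
cost_flee = 1         # energy wasted fleeing from a rabbit
cost_eaten = 10_000   # assume: being eaten is catastrophically worse

expected_cost_always_flee = cost_flee            # the "cautious" rule
expected_cost_never_flee = p_tiger * cost_eaten  # the "conservative" null

print(expected_cost_always_flee, expected_cost_never_flee)  # 1 vs 100.0
```

Even at 1% tigers, fleeing every time is a hundred times cheaper in expectation. Flip the cost asymmetry around -- in academia, the false "discovery" is the expensive outcome -- and the null-hypothesis rule wins instead.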

Now, this is an interesting point and it's probably worth dwelling on. But it also gives me a sense of deja vu -- that, perhaps, I've seen a video about, if not the stats points, then the evolutionary example. The thing is, you read the above a lot faster than you would've watched a video saying the same thing. If you don't believe me, time yourself reading a paragraph aloud and then silently. In fact, think about how much you learn when you're doing that. It is the same, right? A video of the above discussion "teaches" exactly the same content as the above discussion as is, right? Which is to say, the lesson I've been building to is that the information load of videos is pretty goddamn awful. Huh. But does this matter? Is it practically significant or just some "true but trivial" lesson paraded by pedants? Well, I rather think it is something we should be concerned about. Quite concerned, even. I think an example will help explain this.
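To put rough numbers on it (the rates below are common ballpark figures I'm assuming, not measurements of anything here):

```python
# Ballpark rates: silent reading vs ordinary spoken narration.
words = 600                    # a mid-length news article
silent_wpm, speech_wpm = 250, 150

print(words / silent_wpm)      # 2.4 minutes to read it yourself
print(words / speech_wpm)      # 4.0 minutes to have a video read it to you
```

And that's before you account for scanning, which reading allows and video doesn't.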

I used to have a clear morning routine that involved reading the news on TVNZ's website, the NZ Herald's and the BBC's. These days, though, it's months between visits to TVNZ. Why? Hopefully you're thinking videos -- although I also don't like the horizontal layout; indeed, I feel it actively harms my perusal of the site. But we need more details, yeah?

There is no way that I could ever have read all the articles on all three sites or, even, all the latest ones in each main category. But what I could do was exercise much more control over how I consumed information. A video can look pretty and professional (thus hiding the uncertainties so much more easily) but if I fast-forward it's like reading every other paragraph. If I need to speed up a search for "meaning" (the pattern of things I care about) with "written" articles (although, in truth, while I may be happy to call practically anything an article... videos aren't articles) I can scan the text. I don't understand or read all of it -- but I am able to spot patterns for meaning (signal indicating newsworthiness) and maybe slow down where appropriate (rather than just skipping material). I can also read the whole thing faster than the video plays, which also matters. (And if you're thinking about, say, playing at 1.2 times the speed, as opposed to fast forwarding, read on.)

We live, today, in a world concerned less about infotainment than it is about "fake news" and "alternative facts" (remember when infotainment was the Devil?). Indeed, we live in a meme-news era: I accidentally discovered the origin of the phrase "alternative facts" as the in-vogue expression via clickbait. But this misses the point -- the real news has to be entertainment too. In the past, this probably began with "if it bleeds, it leads" and then evolved into "human interest" delivery -- you know, things where, say, the bombing of Hiroshima would be reported on by cutting to [grandchild] of [granny visiting Hiroshima], when not glorified versions of "cat in tree" stories.* These days, while these things are still true, real news must now be about video. Video, it seems, is the final frontier.

I'm not saying that if people watched fewer videos, (other) people would be less concerned about fake news due to being less conned. However, I really am saying that in a world confronted by "fake news" a news-audience needs to process information individually. The truth is that it's not good enough to read half an article whilst listening to the "news" video playing in the other tab and thus get nothing from either (we suck at multi-tasking, live with it) or subconsciously meld them instead. It's not enough to be reliant on a medium unsuited to scepticism... a typo, I think, could really make you doubt a fake news piece, but how do you notice the equivalent in a video that just keeps rolling on? (Typos are indicative of shoddy editing, which suggests shoddiness, which suggests the reporting may be shoddy.) And, it must be said, practice makes perfect... the more you read of "real news" the better your sense of the variation in "normal" reporting ("fake news" being "pathological" reporting). But maybe it's hopelessly naive of me to think this. A lot of "fake news" is written, so maybe people who don't notice now aren't going to realise that it's outside the aforementioned variation. And maybe "fake news" has nothing to do with modes of consumption and people are right to analyse it from the echo chamber... i.e. modes of thought (e.g. believing what you already believed). But I obviously think that our current modes of consumption are conducive to the proliferation of spurious truths.

What to make of this? Am I some would-be enfant terrible yearning for a return to a lost era I never experienced, where the internet was dominated by the written word? Or have I stumbled across something of genuine interest? Could it even be that we need to consider what people are likely to read? That is, does it matter that comments sections in news articles generally follow smoothly from the article while with videos the comments are generally clearly separated? Does this change how often comments are used? Is the reason why no-one refers people to comments sections because no-one actually reads them? Could it therefore be that videos exacerbate and foster echo-chambers? Certainly, it's harder to reply to a video than it is to a written document: at least, without a transcript. Does this make videos the ideal vehicle to parade ideas that one doesn't want critiqued? These are important questions, and this is an important topic. Videos are bad for us. But sometimes they're fun to watch.

*An inconvenient truth about Campbell Live lost in the hooha surrounding its cancellation was that its so-called "hard reporting" was mostly this kind of story, plus infrequent actual hard reporting.

Sunday, 12 February 2017

Tone Policing

It seems, to me, to be the fashion to consider what the election of Donald Trump means. One aspect of this fashion is wondering how on earth Trump came to be elected in the first place and what these mechanisms mean going forwards. Well, the truth is that we should exercise a lot of caution in this regard because Trump shouldn't have won the recent US elections. I don't mean this in the sense that we shouldn't have voted for National in 2014. What I mean to say is that if the US operated under a well-designed system, the election campaigning would've happened across the country and there would be no demographic clustering that worked out well for Trump and poorly for Clinton. And then there was Comey. Naturally, given that these features are unlikely to change any time soon, Clinton supporters in particular have been wondering about what they will do next time around.

I think the above context should draw our attention to one quite important, but also unfairly controversial, idea: tone policing. Now, sure, tone policing most accurately describes attempts to side-track conversations into discussions about tone rather than substance, or even some sort of, I suppose, ad hominem attack. This is why you'll hear people talking about tone policing as being obsessed with shrill women. But tone policing actually gets at something broader than this and it needs to be separated out from online "discussions" of feminism and put in its own light. We need to recognise that while the tone police, those we think of when we hear the term, are problematic, what they're saying isn't bollocks. And it is trivial to see how this is the case. Here's a hypothetical example:
  • "Feminism is completely stupid: a bankrupt ideology committed to the principle of female dominion over men."
  • "OMG, what did I just read... soo wrong... it's like, I don't even.Google feminism dipshit."
  • "Wow, do you seriously think that talking to him like that is going to convince him that you're right?"
I think this is a believable example and, to be honest, I should probably try and find a real one, but it's late and I just want to get this topic off my chest. Before I do that, we need to slow things down and note three things about this example. One, the tone policer was not, in this case, the original speaker. Two, the original speaker is wrong (that really is a terrible conception of feminism, if a common one) and the "refuter" is right. Three, telling him to do a Google search is bad advice for three reasons. (I) Google's searches vary between users based on their usage: there is no naive Google search. (II) One is likely to find reference to conceptions of feminism that match the original speaker's views, if not outright endorsements. (III) People tend to ignore things that disagree with them and notice things that agree with them: self-directed research is not an answer. Is something like this the tone policer's point, though? No. The logic of tone policing is that how you choose to dress something up matters... there is significance to forms and modes as well as substance.

Whether or not you agree with the motives underlying tone policing, I think it is both a mistake and a common occurrence to allow the critique of the usage to colour the substance of the idea. That is, tone policing is generally very ironic in usage... there is some better way of encoding the message "be careful how you say what you mean because it affects the reception". In fact, betterness is what tone policing really boils down to in plain language terms: there is a better way of saying that (whatever that is). I think this idea that different expressions of the same meaning are unequal is very obvious, but maybe it isn't.

Notice that word "encoding" above: why did I use that particular word? Very simply, because that's the word that was used when we did the Communications Process Model in (I believe) Business 102. I'm not sure how much attention relevant academics give this model, but it must be said that it lines up with what I know about the world, so I believe in it. I also believe in examples. Take the expression "I'm pissed": what have I just told you? Am I horrendously drunk or enormously angry? Did you even think that there was ambiguity in that expression? In NZ, we use "pissed off" when we're angry because being "pissed" means being drunk. This is not necessarily true everywhere else. The abstract point of this example is that different socio-cultural contexts will lead to different interpretations of the same message. I think this is part one of why tone policing's logic is correct.

I look at tone policing through the consumer perception process because my "indoctrination" into the buzzword life has been very successful; my faith in the notion that consumers, that is people, only "perceive" a subset of the stimuli they're exposed to is due only to higher-level indoctrination. Basically, perception consists of several parts which I don't recall right now and am too tired to look up... the upshot is that you only perceive what you're made aware of. Have you ever been out shopping and realised that you've been zoning out the music? That's your not being aware of stimuli... more to the point, your constructing barriers to avoid letting stimuli in. This is also why you'll notice that many ads are louder than the surrounding programming or start off in weird ways: marketers use numerous techniques to get noticed. Sometimes they go too far and people don't get the message... maybe you try to sell something with sex but your viewer only remembers the pretty face and has no idea what product was being sold, something like that. In the context of tone policing, what this means is that you do have to be more than background to get heard, but what makes you stand out may be what gets the attention: not your message.

I think the third major reason to respect the logic of tone policing is that we're people and we make judgements. For instance, some people wonder if you're disadvantaged in a courtroom if you are unable to appear in a suit, or if pupils will pick up on socio-economic cues in mufti and bully each other. These judgements don't, of course, appear out of thin air: they're products of most of the things we've been exposed to. Which is why your grandmother might be shocked if you swear in your song but literally no-one else even notices (the above paragraph) and also why your grandmother might think less of you for doing so. But we're not really profiling here... we notice some behaviour that a person has, "know" a meaning for that behaviour and infer that the meaning applies to the person. Thus, if you say annoying things you become annoying, and if you say unreasonable things... you become unreasonable.

The last reason why I think tone policing's logic is true is simply about the sort of barriers that people construct... ask yourself, what does offending or enraging someone actually accomplish? Well, they're likely to think less rationally and entrench their positions... this is not so good if you're trying to convince them, but it is likely to make them look unreasonable... which may convince some third party of whatever your point of view is. On the other hand, the above discussion suggests that we may be viewed as dickheads or bullies, which makes us unreasonable instead, i.e. the bad guys. When we factor in all the reasons why argument tends not to convince people, it becomes all the more critical to realise just how damaging further barriers could be to mutual understanding. Is mutual understanding not our goal? It bloody well better be...

The case made here doesn't seek to defend tone policing per se, but it does seek to suggest that its logic matters. I also think that the substance of a message is important, and the idea of tone policing is really about realising that we can maximise the worth of our messages by paying attention to our tone. Perhaps even more important is actually going out and talking to people: no matter how slick your style or strong your substance, if they don't hear you, they're not going to listen.

Friday, 10 February 2017

The Magpie, Or BYOD: a contemporary tragedy

What do you spend most of your day doing? I have no idea. What I do know, though, is that I spend a hell of a lot of time doing two things. One, watching television (often via a laptop). Two, using a computer (by which I generally mean the internet). What's the point? Basically, that people have contact with computers, if they have them, outside the school environment. In fact, people have so much contact that they often manage to teach themselves computer languages/learn to code and really get to grips with technology in a way that transcends this xkcd comic.

xkcd
It stands to reason, then, that computer skills... especially given that the skills of the vast majority of teachers do not transcend the comic and that BYOD policies by and large are not associated with attempts to teach, say, coding more broadly... cannot be the rationale of BYOD. After all, if computer skills were the reason for BYOD, then BYOD would have to be the most ill-thought-out, ooh-shiny/magpie policy concocted since your two-year-old nephew tried to hide your keys by eating them. In fact, since policy makers aren't two-year-olds, they cannot possibly be so stupid as to think like that. However, I strongly suspect that they don't think at all (hence the inequitable nature of these policies), so I do mention it here. But assuming that "improving computer skills" is not the rationale, there are still big problems with the stated reasons.
Using digital technologies:
  • supports expanded community and international involvement in learning, both face-to-face and online
  • enables students to learn in relevant, real world 21st century contexts
  • allows students to learn, create, share, and collaborate anywhere, at any time
  • opens up a new world of resources for students, providing much more knowledge than any one teacher or school library could hope to
  • enables students to personalise their learning experience (recognising every student's strengths, talents, and needs, building on their identity, language, and culture)
  • helps build on students’ prior and current knowledge, needs, and interests
  • encourages greater collaboration between students, teachers, and school leaders
  • supports teachers to engage in blended, personalised professional, and peer collaboration.
All of these factors add up to more students being present to learn in the classroom and beyond – engaged, enjoying learning, and achieving better results.
That's TKI. This is a quote from a transcript about BYOD at one particular school.
The overall vision for BYOD here at Wairakei School is around making sure our children are connected, capable learners. For our school vision, the characteristics of the learner are visible in everything we do.
Well we believe one-to-one device is a more optimum way of learning. For one, the child gets to use it during class time, they get to take it back home and carry on with their learning.
Our students are really not bound by time and space because of the learning opportunities they have by owning their own device. It’s that ubiquity coming out. One of the reasons why we have one-to-one BYOD is that the device is personalised to them. The children are able to have their own unique login.
These are pretty stupid reasons, which only make sense if you're a magpie. Let's examine them, through the quote. (Fair warning: I think the TKI stuff is self-evidently stupid because what it is on about doesn't depend on BYOD.)

"making sure our children are connected, capable learners"

This is, effectively, the computer skills argument packaged in a manner that looks just enough like tech-speak buzzwords to convince people that it's something different.

What is a connected learner? Well, one must assume that this means a learner who sits at the centre of some sort of web, like a giant spider, and draws knowledge from a variety of different sources in essentially real time. Newsflash... this can and does (or should that be, did?) happen without BYOD. It is not predicated on BYOD. Thus, this is irrelevant in deciding whether or not BYOD is worthwhile. Actually, it is entirely possible that it is better if the sense of being connected is something that a pupil generates as a by-product of what they do... rather than seeing it as something that their school wants them to be. Do you see the difference?

Capable learners? Ah, we'll get back to this... the idea is clearly that BYOD makes people learn better.


"use it during class time, they get to take it back home and carry on with their learning"

Okay, leaving aside questions about:
  • whether that much screen time is actually good for developing children
  • whether or not this actually happens
I am left wondering how this is different from being able to take your schoolbooks home with you, or even how it differs from homework.

So, once again, BYOD leaves us with niggling doubts as to whether or not it results in worse outcomes (harm, here) and a claimed benefit which doesn't rely on BYOD.

Clearly not portable at all.
Oh, maybe the point is that they're using the device for other reasons, so it's not homework, and it avoids the "too much homework" problem too (n.b. too much homework is a real phenomenon). But what if this makes the pupil lose sight of the difference between free time and school time? And it still doesn't resolve the screen time issue. In fact, it probably makes it worse.

"not bound by time and space" 

Okay, I'd like to pretend this is talking about "the cloud" but I doubt whoever wrote this was thinking about that. The Cloud could actually be an advantage. After all, we could access several different subjects from one easily organised and reorganised computer, or via any computer, by storing the same things in the cloud. However, given that BYOD is about taking a specific device with you, BYOD is not about the Cloud, which really just becomes a backup in case of emergency. The Cloud is actually something that the old model of computer suites encouraged and developed... because multiple devices were actually used. This we might term flexibility through experience for the pupil: a counter-intuitive but actually deeply obvious point that BYOD loses.

"Not bound by time and space" is probably largely about the collaborative opportunities presented by technology (or if it isn't, it ought to be). Take Business 102. As part of that course we had to make a short three minute presentation. However, that really relied on being able to organise five other people, which proved (as with 101) to be difficult. I myself had to show up very late to one (self-organised) meeting due to an assignment being far more difficult than I had appreciated. One of the attempted solutions to this problem was Google Docs. I never actually used this but I saw it in action on two of the macs used by the other group members... two people could type in the same document at the same time and the changes were made in real time. That's pretty cool. It is, however, merely just an augmentation (maybe even substitution) of the age old practice of vivids and a piece of newsprint... it's not a new phenomenon, just a non-time and space bound version. Possibly, though, it could lead to more group efforts throughout the education system... there are lots of other ways to collaborate (a class blog, God knows why, being a popular suggestion).

At this point, it is probably worth noting that the evidence suggests that BYOD does not, in fact, improve outcomes. However, the counter-argument raised to this is that the substitution/augmentation level of technological integration/implementation is why this is so. That is, perhaps the reason why the evidence disagrees with BYOD is that what determines the outcomes is how people are learning... and this doesn't change if you switch between writing your notes in an exercise book and typing them up.


"the learning opportunities they have by owning their own device"

Devices, or at least computers, really do bring learning opportunities. For instance, typing represents a skill set distinct from dictation or writing (by hand) and it is, in fact, something that pupils will learn best through constant use. Why? Well, firstly, I think a course in typing is really only going to help you if you are willing to learn (and I am not sure this holds true of school children), but more important is the second reason: typing, writing and dictation are three distinct mindsets. In the same way that people who don't write aren't as good at writing, people who don't type aren't as good at typing. The trouble is, BYOD generally exists at the expense of the place where most people learn to write and, as I noted right at the start, people use computers of their own volition very easily. If you're paying attention, this is probably why the defence against the study is able to work... except it's clearly a two-edged sword, because the solutions don't involve writing experiences but trying to figure out how to have typed assessment. And, of course, maybe it's both... even with typed assessment the handwriters will do better than chance alone would suggest.

It must be said, however, that continual access to IT is really useful. Well, not so much if you have an addiction, but healthy use means that you are able to do work when you need to be doing it. A lot of people without access to IT or, perhaps more likely, the internet at home will be reliant on libraries to get their work done. This means, if you're a school pupil, that you'll basically have before school, morning tea, lunch and then after school only up until 8pm at the absolute latest. I don't know about you, but there were several internals I worked on until midnight or the wee hours (and a couple of uni assignments too). I was also, in general, able to work in a sustained fashion: public libraries will often allocate one only an hour at a time, and they're busy.

What we notice here, like so much about BYOD, is that BYOD is not required. Owning a computer/device does not require a BYOD policy. For instance, when I was at college we had, for basically the entire time, at least two computers for a three-person household. Right now, now that I am at uni, we have four laptops (one of which is admittedly crap) and a PC (although that is my brother's so no-one else uses it). In neither period was BYOD a thing that any of us lived with. Now, a lot of houses, admittedly due only to the crappiness of NZ and the general cost of tech here, are a lot less able to afford such extravagance. It is also true that price really does determine quality with computers. For instance, the cheapest of those laptops has effectively no storage by design and is very much intended for the cloud, so basically it demands more internet than something like my laptop, which has a decent hard-drive. It can do less, but I bought it for two things (the odd occasion when I want to have a computer on me at uni, and watching television) so that's okay. But if such a computer had to be the primary device? Hmm... There are serious equity issues that are just plain ignored by advocates of BYOD. Possibly because they overlap substantially with contemporary leftists.1

Dilbert
The equity issues are also manifest in another respect: internet access. A friend of mine from Ramarama only managed to get decent internet access a few years ago... and his family is definitely not from the "not well off" side of things. This raises very interesting questions about the delivery of education in rural areas... especially if the flipped classroom becomes a video-heavy orthodoxy. Mind you, the issue is now resolved, so maybe there is good rural internet access these days. But let's take a poverty angle: how common is limited internet access in New Zealand? That is to say, what proportion of households have internet? Well, I'm not sure. The percentage seems to be quite high, and was growing in 2014. But let's say that we get to 95% (the 2014 figures say 90 in 100 households have internet access): what does that actually mean? Well, basically, it means that 5% of households don't have internet access or, in other words, 5% of people are getting left behind... which, based on the 2012 figures, implies that 1.5% of households (probably about 25,000) can't afford internet. And I think it is reasonable to think the cost of internet is too high for about 30% of internetless households because, while it had risen from 2009, CPI and other figures suggest the cost has collapsed somewhat since 2011. So, how many children is that?

Okay, let's just clarify where some of my numbers are coming from by working in a list (there's a quick code check of the arithmetic after it):

  1. In 2012, 1.3 million households had internet access, or 80%. This implies that there are 1.625 million households in New Zealand.
  2. We then assumed that we get access to 95% of homes (this would be 5 percentage points more than was the case in 2014). But that also says there were 1.77 million households at that time, so the remaining 5% is 88,500 internetless homes. We'll work from here now.
  3. We then inferred that 30% of internetless homes would have no internet for cost reasons based on 2012 figures (for a few more details see above). This means 26,550 households can't afford internet access.
  4. In 2013, 28.3% of households did not contain families, which implies that children could live in at most 71.7% of those homes: 19,036.35 family households. (We're assuming, for instance, that no children live with adult siblings; which is stupid because we all know it happens.)
  5. In 2013, around 6 in 10 families contained children. This suggests that there are 11,421.81 internetless families with children in New Zealand.
  6. In (5) we also learn that 32.7% of families with children have one child (3,734.93), 29.8% have two (3,403.70) and 16.4% have at least three (1,873.18).
  7. This means we're talking about 3734.93187 + (3403.69938*2) + (1873.17684*3.1) children, assuming the average of the last group is 3.1 children (I pulled this number out of a hat). That's 16,349.18 or 16,349 children.
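If you'd rather see that as code than as a list, here's the same back-of-envelope chain in Python (all the percentages are the ones sourced above; nothing new is added):

```python
# Back-of-envelope: internetless children in NZ, using the figures above.
households = 1_770_000              # implied by the 2014 access figures
internetless = households * 0.05    # 88,500 homes if coverage reaches 95%
cant_afford = internetless * 0.30   # 26,550 homes without internet for cost reasons
family_homes = cant_afford * (1 - 0.283)   # 19,036.35 homes containing families
with_children = family_homes * 0.6         # 11,421.81 families with children

# One child: 32.7%; two: 29.8%; three-plus: 16.4% (average 3.1, out of a hat).
children = with_children * (0.327 * 1 + 0.298 * 2 + 0.164 * 3.1)
print(round(children))  # 16349
```

Same answer, and the assumptions (the 95%, the 30%, the 3.1) are at least laid out where they can be argued with.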

There are obvious problems here. I believe these numbers are correct under certain conditions. For instance, we imagine that being internetless is random, so there are equal proportions of internetless households in certain categories. We know that isn't true. We also assume that among internetless homes, those containing families with children are randomly distributed too. That's unlikely to be true as, among other things, children do raise costs and people are probably likely to make sacrifices for their children. How such forces come to bear on distributions is almost certainly unknown, and it definitely isn't known to me. Hopefully, though, that figure of roughly 16,000 is a conservative estimate (i.e. less than the real figure) because, as long as it isn't vastly larger than the real figure, if we perceive 16,000 to be too many children consigned to being left behind... an unfortunate phrase in education discussions due to the diffusion of US debates globally... we condemn the policy. 16,000 might not be relatively many people, but I couldn't live with 16,000 futures on my conscience, and if you can... well, I want you nowhere near educational policy. And I've got some substantially stronger words to say to you too...

More cartoons about poverty: some grossly ill informed, some intelligent

Now it must be said that education in New Zealand (or, indeed, the USA) is hardly equitable without the burden of BYOD. We have deciles, it is true, but while I roundly endorse the concept, it must be said that I have come around to the view that deciles haven't really managed to solve the issue... that to be from a poorer household and to have good school results requires resilience is simply not fair. Which is to say, getting rid of BYOD or stopping its uncaring advance is far from a solution. And if we ignore that most of the stuff about the benefits of BYOD is predicated on routine accessibility to the internet, it must be pointed out that a laptop could reduce the costs of stationery. But how much stationery does $100 buy? And how much of the large costs are from homework books and the like, which could well still have to be bought in digital form? How much laptop does $100 buy? And you ain't never finding a laptop that lasts for thirteen years. And who knows when we'll start with the BYOD... if we assume from year five forwards, we're probably looking at two laptops between useful lifespans and updates. The bad laptop I mentioned at the start of this post is just over a year old and practically toast because it turns out the HP Stream doesn't just have limited memory but has a memory problem. (Research before you buy: and don't have the bad luck to search for laptops on the day of a one-off sale you didn't know was happening, which throws you off kilter.)

Inequality in Education
Bring Your Own Device and the Dunning-Kruger Effect

The case I have made thus far, as I remember it (it's been written in at least two major chunks months apart), is fairly simple:

  1. Do no harm. Education is serious and we should be sure that some policy change is helpful.
  2. There is no real evidence that suggests that BYOD is additionally helpful and some, possibly explicable, evidence that it is harmful. This suggests we shouldn't use BYOD.
  3. The benefits people say BYOD brings are not predicated on utilising BYOD, i.e. they can be, and are, realised through other approaches.
  4. There is an inequality dimension to BYOD that should be enormously concerning in light of (1).
This last point is more complex, in the sense that I think the Dunning-Kruger effect is relatively easy to misunderstand and I have had to do my best to understand it from the internet. That is to say, the Dunning-Kruger effect describes the erroneous assessment of ability such that low-ability individuals over-rate their personal competency. I believe that we can substitute in knowledge for competency/ability if we so choose. But what has this got to do with BYOD? Put simply, I believe the Dunning-Kruger effect represents another example of BYOD providing no additional benefit and an additional harm.

One of the things that seems true about BYOD is that it is not intended to be used for, as an example, developing coding skills or the development of computer systems. In other words, regardless of whatever people say about, for instance, augmentation3, what is really going on is the substitution of learning about {traditional subjects} via {traditional methods3} with learning about {traditional subjects} via utilisation of {devices}. In other words, I see the ICT skills doing one thing: nothing. If you remember my critique of the idea of Millennials, you'll recall that we're talking about people whose educations took place in two quite distinct eras. Old Millennials born around 1980 went through a school system which would have had minimal computer use... they were in their early-mid teens when Windows 95 was released, for example. Young Millennials, theoretically me, in contrast are old enough to remember, maybe, daft Apple computers in computer suites and also fairly modern, "grunty" operating systems (e.g. Windows 7) in computer suites. There is no comparison whatsoever. But everyone always talks about the computer literacy of Millennials. And when you search the italicised in Google, what do you get? That's right, everyone goes on about how digitally native "Millennials" are, which basically means being unable to do anything that's beyond the everyday. They can use Facebook and maybe hook up an Xbox, but they're not really computer competent... even if we define competency as "adequate Excel skills" based on the first few hits of that search.

There is, naturally, a moral to the Millennial story for BYOD. When schools develop curricula they have to distinguish between "pupils can do what they want to do" on the computer and "pupils can do what is useful" on a computer. We're not even talking about "coding" stuff like being able to use the HTML aspects of Blogger when we're saying "useful", but stuff like sorting (although how useful sorting emails really is, is up for debate). Hell, HTML is probably too high a standard too: we're not even talking about the sorts of coding knowledge that will help pupils become statistics students dealing with the likes of R, SAS or Stata (basic, entry-level ideas). Will BYOD help at all?

The CLOUD
This is a very pessimistic overview of BYOD and I haven't really introduced any of the benefits of BYOD that aren't bollocks: airy-fairy statements that don't stand up under even cursory examination. That is, I am letting you believe, at last, that maybe TKI didn't present the best case for BYOD (and, hey, I am assuming, months down the track, that the links still work). And sure, as the above image probably also demonstrates, I am a bit sceptical of the holy awe the Cloud seems to invoke in some people but... I am capable of pointing out at least some of the tangible advantages of BYOD.
  1. Weight saving... I feel it is a rare college pupil who wanders through high school with less than a Chromebook's worth of weight.
  2. Freedom... no longer is work stuck in a 1B5 exercise book. You may well not remember anything as well but you can ctrl-F.
  3. Indeed, Chromebooks probably force you to have something like the above:4 permanent cloud-based storage of all your files... as long as you have internet, you're all good.
  4. Space-saving storage. Everything is there in just a few gigs (or even less) rather than eight or so magazine boxes (where the bulk of my uni stuff is kept: the above is mostly just assignments and any lectures I've recorded/important files). The flipside is that Mummy and Daddy are going to have a terribly difficult time figuring out if Johnny's deleted all his previous year's work (although, why would he? laziness will out).
  5. You don't have to up and move the entire classroom in order to use something like Language Perfect, which may or may not be better than completely traditional educational methods.
There are possibly other things, but thus far you get the impression that, at best, the constraints of using glorified keyboards will force pupils to be a bit more clever than just saving everything to their MacBook Pro and then driving off with the thing on the car roof (true story... unless it wasn't a Pro... in any case, someone told me they did this). And while there are tradeoffs to be observed in some of these things (e.g. insurance, or the relative abuse-survival of a 1B5 and a laptop) these really are good things. But, still, it doesn't really look like BYOD is going to do too much more to actually encourage pupils to get to grips with ICT, unless the curricula change. The one thing that BYOD can do the most to change, the single most transformative part of BYOD, is the potential to make something like "computing" some kind of mainstream subject. But it won't, because the teachers don't have the skills and because the principals don't have the curricula space... especially if teachers can't integrate even the "productivity" skills into their normal subjects because they don't have them either. Some do, of course, but not enough.

This is where the Dunning-Kruger effect comes in. Pupils raised on a steady diet of digital devices and digital delivery will have a poor idea of their understanding of technology. This is problematic if we want to talk about the pupil-as-citizen, i.e. someone who has to actually go out and vote. And, sure, I don't see why it doesn't apply anyway with non-BYOD pupils like my former self, whose typing skills come simply from doing a lot of typing. I don't know why being able to think on paper or think to dictate would open up the mental possibility of "maybe I don't know my way around the computer as well as I'd like to" unless, you know, encountering tasks that are trivial to do by hand (e.g. nested bullet point lists) makes one confront the limits of one's knowledge. Maybe that helps one learn things or maybe it doesn't. Point is, that's an additional problem that we have to think about, especially when considering how it interacts with the small-scale tangible benefits BYOD brings.

Aggressive and All About the Shiny (which is doubly amusing)
Summary


Am I a fan of BYOD? No, definitely not. This should be obvious from the title.

Am I convinced by the "marketing" as represented in the green material above? No, and nor should anyone else be.

Are there useful aspects to BYOD? Yes, of course.

Are there harmful aspects to BYOD? I believe I have identified four. One, it limits modes of thinking. Two, typing is not as efficient for note-taking as handwriting. Three, the Dunning-Kruger effect of over-rated technological competence is probably exaggerated by BYOD. Four, and arguably most critically, its inequality-creating facets have been buried, deep, deep down.

What matters more in the scheme of things? Some conveniences or the broader implications of, for example, having next to no experience of thinking in multiple contexts and situations? Convenience is so often a dirty word for a reason.

Am I opposed to technology in schools? Of course not. It's just that I think technology in schools, through a computer-suite mode of delivery (or, hell, if government stepped in and provided some standard device which was used intermittently), better serves everyone. Just because we can have BYOD and because it's some new, shiny thing doesn't mean that we ought to pursue it. Education is important! We should not do harm... and I think, very strongly (and hopefully convincingly), that BYOD does do harm. BYOD's proponents are magpies... drawn to shiny things and dangerous to those nearby...

1Actually, there is an argument to be made that BYOD would make owning such things more accessible, because then it would be much easier to go to WINZ and get a loan through them a la uniforms (and other school costs). However, I am not sure of the intricacies of that system and, in general, encouraging debt is a bad idea. I would greatly prefer the govt. to choose some sort of mid-range Windows laptop (because, despite what that xkcd above said, if you actually want to have options, you have Windows) and just make it available two questions asked. That is to say: a) how many children do you have? b) can we have proof of enrolment?

2And it is problematic that a website/paper trying to raise questions about inequity in our education system simultaneously allows a wildly offbase defence of "identity politics" which suggests trying to focus on socio-economic inequality is a means of keeping the "White Man" in control.

3And as we saw above, this can require wilful ignorance of standard practices "offline", when you look at the old vivid/newsprint concept. It also must be said that I rarely see whiteboards and their versatility brought up... perhaps because American media always use blackboards?

When I was at school, our lessons came in all shapes and sizes. We worked in groups, we worked in silence, we worked in pairs and we worked alone. We worked creatively and routinely. We made PowerPoints, poems, posters and "papers" and probably pretty much anything in between. We had computer-based lessons and wrote internals online, in test conditions, or delivered them orally/as performance. We did all sorts, and maybe I can't see what else there is to do because I can't think in a BYOD mode, or maybe, because we did all sorts, I see how BYOD hasn't really got the power to change anything... except for what it won't change... the curriculum, because it is seen in "marketing statements" not as a proper tool, not as a means to an end. BYOD is an end to its proponents. To that I say nay! Education is the end. BYOD is merely a means... and not a necessary means.


4That's not really Cloud storage if I'm being absolutely honest. I have four versions, basically, of that folder: two on this computer, that Google Drive and one on a USB stick. I save to the computer and intermittently update the others. If I did so every day/at the end of every session it'd be Cloud, but it's not. I must say, though, that I wish I had typed some more of my English internals... not having access to what I wrote feels like I've lost some of myself and, well, the thing is, if we'd been BYOD they'd probably have been typed to start with, so the problem would be avoided. Or, you know, maybe they should've given us back our internals.

Wednesday, 1 February 2017

Mandatory Te Reo in Schools

Some years ago now there was a discussion about whether or not Maori (that is, te reo) should be universally offered in schools. Today the Greens, who this article says agreed with the earlier policy suggestion (I think they started it), suggest that Maori become a compulsory subject in schools.

I find it much easier to agree with the earlier proposition, but my reasoning is completely awful. You see, I know exactly how it feels to not be able to speak an ancestral language, but that language is not te reo (I hardly ever call Maori "Maori" because of Cook Islands Maori). Thus, it is difficult for me to spin compulsory te reo as anything other than a death knell to the hopes of Dutch being more widely taught in New Zealand (in theory, some schools somewhere in the country offer it: we saw a resource book for it once in Year Eight). That's a great personal disappointment.

The reasons for teaching te reo are thus:

  • historical redress for policies that led to or actively encouraged language loss (this also applied with 1950s Dutch immigrants to New Zealand; I don't speak Dutch for different reasons)
  • New Zealand is a bilingual country (sorry NZ sign language)
  • bilingualism has noted and long-known benefits, and it doesn't really matter which languages are under discussion
Hopefully you can see the problems with teaching te reo from a political viewpoint. To the extent that bilingualism is part of the odious Bicultural lie, it may be resisted on grounds not that dissimilar to my own (i.e. some might interpret teaching te reo as an assault on, say, Samoan, Indian or Korean ethnic identity). Much more common, of course, is the resistance that relates to the fraught relationship between past New Zealand and present New Zealand. In truth, I am probably a little squeamish in this regard myself. As I wrote a long time ago, however, "it's not right that bigots should decide whether or not a country pursues a particular policy". What this does mean, however, is that there are hurdles when it's quite likely these "bigots" (who are probably mostly "squeamish" too, but we must recognise our complaints differ not in kind but in scale, hence the amalgamation) represent a substantial part of the electoral pool. This raises the question: have the Greens jumped the gun and offered an unelectable policy in an election they were poised to do well in? (I'll still vote for them, though. National's crap. Labour's racist* and the other parties are clones of National.)

There are two other criticisms that need to be discussed. Firstly, that we should choose a "useful" language. Yeah, sure, we'd probably be better off learning German or Mandarin or Spanish in the sense that these are three major languages in the trade and political world (let's be honest here, France pretends it's more important than it is in Europe: everyone knows Germany wears the pants, and has a bigger economy). However, our primary interest is in the general benefits of bilingualism... not in churning out row upon row of prospective MFAT employees. It is also the case that once you're bilingual, becoming trilingual gets a bit easier. In this sense, German! Spanish! Mandarin! or whatever other language is a red herring, especially considering which languages are most useful within New Zealand.

Secondly, there is Hekia Parata's objection: lol, choice. Well, that's exactly why she has to be the worst of the many bad ministers Key lumbered us with. The reason why Europeans are widely hailed for their language abilities is that everyone learns English from a young age. Problematically, Tasman didn't bother trying to develop a South Seas Netherlands, so we already speak English and are thus robbed of the great obvious option (no offence to Mandarin speakers, but Mandarin is fairly complex and not as global as English; Spanish is a more global, simpler Mandarin). In other words, the choice we have here is whether or not we want a policy that would actually work. And for that, it has to be compulsory. And that, plus the above paragraph, is all the reason in the world to agree with the Greens.**


New Zealanders! It is time to put aside our personal emotive responses and seize te reo Maori as the opportunity to reap the benefits of being bilingual. Compulsory instruction is the only way to succeed and we have, for too long, suffered under a language-teaching regime that results in few learners. Sure, there will be some issues (such as an absence of teachers) but such practical challenges can be overcome. This will not come at the cost of other subjects or personal freedom (cf. maths, English), and it offers us the opportunity to conduct much-needed reform in English language instruction. The one caution I have left is this: it needs to emphasise the language, not the culture. Do it, not for us, but for New Zealand's future. Te Reo Maori: The Investment Worth Making.

*To be fair, the Greens can be too, in exactly the same ways, but I rationalise that if I want the Real things to be done, it has to be the Greens or Labour, and voting for the Greens at least doesn't reward Little's Labour's blatant small-minded xenophobia.

**And, it must be said, emphasising these reasons is a reason not to feel squeamish, because the reasoning has nothing to do with colonialism and everything to do with "hey, there's already a language that is relatively widespread in schools". Emphasising this could also annoy people who want to emphasise the earlier two reasons.