Sunday, May 21, 2017

Concerning the Dark Souls of US presidencies

It has been said that the current president is the Dark Souls of US presidencies. Which, to be sure, has a certain ring to it, but it lacks the virtue of truth. Let's explore the issue for a spell.

Dark Souls is a series of games built around the notion of gradual player progression. The games might seem hard at first, but if you stick with them you learn how to overcome that difficulty and become good at what the games ask you to do. The difficulty is not mechanical - the challenges do not require superhuman reflexes or superior skills to overcome - but rather psychological. By failing, again and again, the player gradually learns what needs to be learnt. The reward for this application of patience is the opportunity to excel whenever new situations arise that require the very thing just learnt. It is the player leveling up, rather than the player character.

Meanwhile, in the background of all this character development, a world and its long tragic backstory are ever so subtly unfolding. It is not a simple backstory, where this happened after that, but a series of subtle implications about social relations and emotional states of mind. Complex social processes led to cascading catastrophic outcomes, which in turn sparked other social processes which -

It is a deep and complex backstory, and for the sake of brevity, it will all be ignored. Suffice to say that much of it is left unsaid, and that the player will have to piece it together from archeological fragments, old legends and features of geography.

From this description alone, you might see what I'm getting at. Gradual self-improvement through patience, slowly unfolding understanding of past events through contextual knowledge, and the characterization of subtle states of mind - none of these things applies to the current president, even with excessive use of shoehorns or cherrypickers.

There probably is a past president that would live up to the title of the Dark Souls of US presidencies. But that is a topic for another cycle.

Friday, May 19, 2017

My computer broke down, can you learn it?

With the recent update to Windows being in the news (in no small part thanks to a computer-eating virus which preys on non-updated versions), I've been thinking about how knowledge is situated. Which might seem like a strange connection to make, until you are confronted with this question:

"My computer broke down, can you fix it?"

This is a very common situation to find oneself in, especially if one has acquired a reputation for being able to fix computers. (Even if it only came about from helping someone change the background image that one time.) The knowledge required to navigate this situation is not, however, primarily related to computers. Mostly, it comes down to knowing the asker, their general level of computer literacy and the problems they've asked you to fix in the past. It is a very particular skill set, and over time you develop it through use and abuse.

The aforementioned recent update seems to have crash-landed a fair number of systems, if anecdotal evidence is anything to go by. This made me think about whether I could fix my system if it went down as well, and after poking around for a bit (and making an extra backup of all the things for good measure), I figured that I probably could, given time.

If someone were to ask me to fix the very same problem on their system, I probably couldn't. Not because of my admittedly limited skill in these matters, but because of the different situations in which the problem is situated. If it's just me and my broken computer, then I can take my time, tinker with it, fiddle with the knobs and overall do things that are not directly goal-oriented but which nevertheless get to the point eventually. It'd be a learning experience, albeit a terrifying one.

If it's not just me, then a whole host of other constraints and situationally specific conditions apply. For one thing, the asker might not have the patience for me learning on the job; they might want the situation dealt with and gone, and me taking my time is the opposite of that. There's also the added element of risk - tinkering is never 100% safe, and accidentally making the problem worse is the opposite of a solution. Being risk-averse is good, but it is also slow (yes, even slower), which overall is not conducive to getting things done in a brisk manner.

The point here is not that computers are fragile (though they are), but that knowing something is rarely a yes/no proposition. Oftentimes, we know something sufficiently well that if we were to try it out on our own we'd probably turn out all right, more or less. More often than not, the things we know stem from some first attempt that went in an orthogonal direction from well, but which nevertheless sparked the learning process that led us to where we are. We tinker, we fiddle, and eventually we figure things out.

Though, to be sure, having someone around who you can ask about these things as you go along learning speeds things up immensely.

Do be kind to their patient hearts.

Monday, April 3, 2017

Automated anti-content

So I was thinking about bots in microblogs today, and it occurred to me that they have the potential of being pure anti-content. A realization which, when stated in these terms, raises two questions. The first is "microblog, really?", and the second is "what is this anti-content you speak of?".

To answer the first question: yup, really. It's faster than describing the subset of social media defined by short messages visible for a short period of time, mainly in the form of scrolling down the screen in real time. Gotta go fast.

The second question is more interesting. "Content" is a word that describes some kind of stuff, in general. It doesn't really matter what it is - as long as it is something and can fit into a defined medium for a defined period of time, it is content. A person screaming into a mic for twenty minutes is content. It is as generic as it gets.

Anti-content, then. It is not generic, but it is also not original. An example would be the UTC time bot, which tweets the correct (albeit non-UTC) time once an hour. Another example is the TootBot, which toots every fifteen minutes. It is not content, but it is definitely something. You are not going to enthusiastically wake your friends in the middle of the night to tell them about the latest UTC update (though you might wake them about the toot bot), but you are going to notice these bots when they make their predictable rounds yet again.
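To make concrete just how little machinery anti-content requires, here is a minimal sketch of such a bot in Python. A hedge is in order: the post() function is a hypothetical stand-in for whatever the actual platform API looks like, and the real bots presumably differ in their details.

```python
import time
from datetime import datetime, timezone

def post(message: str) -> None:
    # Hypothetical stand-in for a real microblog API call;
    # printing keeps the sketch self-contained and runnable.
    print(message)

def run_time_bot(interval_seconds: int = 3600) -> None:
    # The entire logic of an anti-content bot: a clock, a loop,
    # and a perfectly predictable message, once an hour, forever.
    while True:
        now = datetime.now(timezone.utc)
        post(f"The time is {now:%H:%M} UTC.")
        time.sleep(interval_seconds)

run_time_bot()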

Anti-content is not content. But it is familiar.

The thing about humans is that they like what is familiar. It provides a fixed point of reference, a stable framework to build upon, and - not to be underestimated - something to talk about. Stuff happens, stuff changes, but you can rely on these things to remain the same. And because they remain as they are, they can be used to anchor things which have yet to become and/or need a boost to get going.

Or, to rephrase: you can refer to them without accidentally starting a fight. After all, they have nothing to say worth screaming about. They are anti-content. And they are a part of the community, albeit a very small part. They say very little, but every time you see them, they remind you of the familiar.

And now that you have read this, you will never look upon these automated little bots the same way again. Enjoy!

Tuesday, March 28, 2017

Free speech vs rational debate

An interesting thing about the most vocal defenders of free speech at all costs is that they often conflate free speech and rational debate. Which is a strange thing to do - if you argue something loudly and insistently, the least that could be expected of you is that you know what you are on about. Yet, somehow, free speech maximalists often show a brutal lack of understanding of the difference between rational debate and free speech.

To illustrate the difference, I shall describe a case where it is not rational to engage in public debate, and where the debate itself has detrimental effects on the society within which it takes place. The debate in question is whether it is the right course of action to exterminate a specific group of people.

For those who belong to this specific group, it is not rational to participate in such debates. The most immediate reason is that you might lose. No matter how unlikely, the mere possibility of losing is reason enough to stay clear of such debates. To the proponents of the extermination policy, your participation in the debate is an additional justification for their point of view. "They can't even defend themselves!" they'd claim, and then move from word to action. Perhaps not immediately, but eventually the final day would come.

The tragic part is that you would lose even if you won. If you won, it would most likely be because you gave reasons for why your extermination is a bad idea. These reasons might be good in and of themselves, but there would be a finite amount of them, and with enough journalistic efficiency these reasons could be summarized into a list. From the very moment the debate ended, this list would constitute the reasons society abstains from exterminating you.

The existence of such a list would constitute an opening for those who favor your extermination. One by one, the proponents could work to undermine these reasons, until they are no longer seen as sufficient for abstaining. The debate would reopen, and you would find yourself in a weaker position than the last time around. You would yet again have to defend your right to exist, and you would have to do it using an ever-shrinking range of possible arguments in your favor.

Needless to say, this process would continue until there are no reasons left. And then the proponents of your extermination would have won.

This is detrimental not only to the group targeted for extermination, but also to society as a whole. With each round of these debates, society would slip one step closer to enacting genocidal policies. Which, to any decent and moral person, is not a desirable outcome.

The rational thing to do in order to avoid such an outcome is simply not to have these debates. Exorcise them from public discourse, and keep them off the list of possible topics. Do not entertain the thoughts, shun those who persist in proposing them, ban them from polite conversation. Keep the opinion marginalized. No good outcome can come from having these debates, and thus the rational thing to do is simply not to have them.

Free speech maximalists want to have these debates anyway, in the name of free speech. But they conflate free speech with rational debate, and as you have seen, there is a very concrete case where these two things are mutually exclusive. If they are to be honest with themselves, they will eventually have to make a choice between one or the other.

If you began reading this post with the opinion that we should have these debates anyway, and still hold that opinion, then I want you to be fully aware of what you are proposing. I fully trust that you will, in your own time and on your own terms, make the rational choice.

Monday, March 6, 2017

What cyborg Harry Potter can teach us about teaching

After revisiting my recounting of my master's thesis, I realized that it is rather German. That is to say, it goes on at length to establish some general principle, but then doesn't bother to give examples of how this principle is realized. Which is a style of writing well suited for some purposes, but, let's face it, is also rather annoying. So let's contextualize this general principle for a spell, by relating fan fiction to the subject of history.

The general principle is that people learn by doing things that they are interested in doing. This happens automatically, without the addition of directed conscious effort. When someone does something, the doing of that thing places them in situations and frames of mind which facilitate the process of learning, and the more doing that takes place, the more learning subsequently follows. Being interested brings with it the propensity to do more of the thing, and to pay attention whilst doing it. It is, in most cases, a self-perpetuating process.

This is rather straightforward, and the biggest drawback of this line of thinking is that it takes too many words to convey, given how straightforward it is. You begin reading, work through the verbiage, and then conclude at the end that it would have been sufficient to just say "you learn by doing". Which is true, but it also goes to show how much effort you have to put in to convey something straightforward. In retrospect, it is obvious, but you have to go through the process before it becomes retrospectively obvious.

Thus, we have what we need to get to work: the general principle of learning by doing, and the notion of retroactive obviousness. Let's move on to fan fiction and the subject of history. Specifically, let's move on to how the notion of 'canon' relates to the teaching of history.

Canon, in the context of fan fiction, denotes a particular set of works which can be considered official or true (as far as fictional depictions are true). In the case of, say, Harry Potter, the books written by Rowling are canonical, and the specific words found within these books carry significance in that they are the source material from which all knowledge of the fictional universe is garnered. Any further discussion about the Harry Potter universe will have to take these books as written, and conform to the limits imposed by Rowling having written them in a specific way instead of another.

Or, to put it another way: it is canonical that Harry Potter is a wizard who attended Hogwarts, a school for magically proficient youngsters. It is, however, not canon that Harry at a young age underwent a series of radical medical procedures which replaced everything but his visible exterior with cybernetic machinery, and that he is a robot passing for a human child. The former is canon; the latter I just made up. Those who want to talk about what happened in the narrative universe of Harry Potter have to stick to what actually happened in the narrative - which is to say, the source material, as written.

Any particular work of fan fiction set in a particular narrative universe has to relate to the source material in various ways. The fan work has to cohere with the source material (i.e. be about wizard Harry rather than cyborg Harry), and it has to cohere enough that assumptions from and about the source material carry over to the fan work. The more closely a fan work coheres with the source material, the more interesting things it has to say about the canonical narrative universe.

This introduces an element of evaluation to the act of reading fan fiction (and even more so to writing it). The act of reading also becomes an act of comparing - does the fan work cohere with the source material, and if there are inconsistencies, where are they? A critical reader can move back and forth between the different texts to find out whether they cohere, contradict or - more interestingly - pose further questions about the source material that are revealed through the act of writing the particular work in question.

Whether or not a reader actually makes the effort to make such comparisons depends entirely upon their level of interest. But, as we stated at the top of this post, people do the things they are interested in, and it is by doing the things they are interested in that they end up learning what they actually learn.

Thus, those who are interested in fan fiction about Harry Potter will eventually learn the skills associated with comparing a fan work with canonical works, by virtue of following their interest. They will find out which works are considered canonical, which works are not canonical, and which works occupy the ambiguous gray areas between these two poles. They will also learn how to handle situations where canonical works disagree - such as when the books and their movie adaptations contradict each other. Which canonical authority takes precedence?

If you are a teacher of history, then these are the very questions you wish your students to engage with. Not about Harry Potter, mind, but about the general validity of narratives told about the past. Which works are canonical, which are not, and what do you do with all the gray sources in between? Which statements about the past can be substantiated with references to the source material, and which are but speculation? How do you position yourself as a critical reader with regards to the source material at hand? What do you do when you encounter a text about a historical equivalent of cyborg Harry? These are questions that practitioners of fan fiction engage with, albeit not always explicitly.

The pedagogical challenge that follows from this general principle - that learning follows from doing what you are interested in - is to identify what students are interested in and which skill sets they have developed in the course of following their interests. By doing this, a teacher can utilize the retroactive obviousness inherent in applying what a student already knows to new situations. Rather than restarting from square one, we do something more interesting.

Fortunately, everyone is interested in something. But that goes without saying.

Obviously.

Sunday, February 26, 2017

Roundabout canons

Every academic discipline has a canon. That is to say, a series of texts that most of those who are active in the field have read, or at least have some sort of working understanding of. The exact composition of these texts varies from field to field (and over time), but at any given moment you can be sure that there is a set of books most practitioners within a particular field of knowledge know about. The canon as a general category, whilst undefined in its particulars, still exists.

It is markedly more defined at local levels. It is especially defined at local sites of education, where there are syllabi that explicitly specify which texts are included in the mandatory coursework. Teachers are expected to know these texts well enough to teach them, and students are expected to read them well enough to mobilize some parts of their content through some sort of practice. Such as writing an essay on just what the mandatory texts have to say.

Invariably, there will be some students who are just not feeling it when it comes to going through the academic motions. Invariably, these students will turn to the internet for an easy way out. Invariably, some of these students will yoink a text from the internet and turn it in as if it were their own.

Thing is. If the texts and/or the subject matter remain the same over the years, patterns will emerge. Students will be faced with the same task of producing some work on a topic, and they will conduct the same web searches year after year. And, if general laziness is a constant, they will find the same first-page results and turn them in, unaware of their participation in an ever more established tradition. [A fun sidenote: I have a few blog posts which receive a boost in traffic twice a year, coinciding very closely with when their subject matter is taught at my local university.]

What I wonder is - how many times does a particular web-copied text need to be turned in before those in charge of grading start to recognize it? Or, phrased another way: how many iterations does it take for these easy-to-find texts to become part of the local canon?
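As an idle aside on the mechanics: a grader would not even need a good memory to notice, since recognizing verbatim resubmissions takes only a few lines of code. Here is a minimal sketch, under the assumption that past submissions are kept around as plain text; the function names are mine, purely for illustration.

```python
import hashlib
import re

def fingerprint(text: str) -> str:
    # Normalize aggressively - lowercase, strip punctuation, collapse
    # whitespace - so trivial edits don't hide a copied text.
    normalized = re.sub(r"\W+", " ", text.lower()).strip()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def recurring_submissions(submissions: dict[str, str]) -> dict[str, list[str]]:
    # Map each fingerprint to the students who turned it in,
    # keeping only the fingerprints submitted more than once.
    seen: dict[str, list[str]] = {}
    for student, text in submissions.items():
        seen.setdefault(fingerprint(text), []).append(student)
    return {fp: names for fp, names in seen.items() if len(names) > 1}
```

An exact fingerprint only catches verbatim copies - a serious tool would reach for shingling or some similarity measure - but for first-page search results turned in wholesale, verbatim is usually exactly what you get.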

A canon is wider than merely those lists found in official documents, such as syllabi. Informal inclusion is a very real phenomenon, and when a particular text keeps showing up again and again and again -

Now there is food for thought.

Wednesday, February 22, 2017

Postmodernism, a primer

There has been a lot of talk about postmodernism lately, and the only thing larger than the distaste for it is the confusion about what it actually is. While it might be tempting to label this as a postmodern state of things, it's not. It's just confused, and confusion is not postmodernism. The latter might lead to the former, but that is the extent of the connection between the two.

If you've ever read a textbook that in some way deals with postmodernism, then you've probably encountered the introductory statement that the word consists of two parts - post and modernism. Post- as a prefix means that whatever it is affixed to happened in the past. When it is affixed to modernism, we get a word that means "the stuff that happened after modernism". Modernism came first, then postmodernism - in that order.

There are two main reasons for including introductory remarks of this kind. The first is that it has become tradition and convention at this point, and it's easier to latch on to what has already been established than to be creative. The second is that you cannot treat postmodernism as an entity unto itself - it has to be understood in relation to what came before. If you do not understand modernity, you will not understand postmodernity. The one came from the other, and it could not have happened in any other way.

It is vitally important to underscore this intimate relationship. It is a historical progression which is not merely chronological - the tendencies and practices set in motion in the modern time period kept going in the postmodern time period. They are linked, similar and connected.

The modern project was (and is) one of enlightened critical thinking. Traditional institutions, mainly those of monarchies and churches, were no longer to be seen as the absolute authorities when it came to the truth. Instead of relying on ancient authorities (or very present authorities, as it were), the moderns wanted to rely on science and reason.

An example of this shift from ancient authority to a more modern way of thinking is Galileo and the notion that the Earth goes around the sun. Using the tools at hand, Galileo figured out that Earth is not the center of the solar system. The traditional authorities, who held that the Earth was in fact the center, did not agree, and much ado was made about it. In the end, you know how it all turned out.

This ambition to test things by means of science and reason wasn't limited to one person and one particular way of looking at things. Over time, it became the default mode for everything - everything could be questioned, measured, re-examined and put to the test. Those things that were found to not hold up to the standards of scientific testing were thrown out, and those things that did hold up were expanded upon.

The scientific implications of this are fairly obvious: you can get a whole lot more done if you are allowed to freely use the scientific method, without having to make sure everything you find corresponds to what the authorities want you to say. Science builds on science alone, and its findings are all the more robust for it.

The social implications, however, are less straightforward. If long-held beliefs about the cosmos as a whole could be questioned and challenged, then so could long-held beliefs about things of a smaller and more private nature. If the church was wrong about the Earth being at the center of the solar system, then it might also be wrong about marriage, sexuality, and other social institutions. Everything is up for questioning. Everything.

This process of questioning everything kept going, and over time more and more things that were once taken for granted were put to the task of defending themselves. Everything that was once solid melted away, and what came instead was something completely different. Where once kings and bishops ruled, there are now scientists and bureaucrats. And marketers.

Mind you, this is all part of modernity. This is the part that came before postmodernism became a thing. Postmodernism is what happened after this process had been around for a while and become the status quo.

The thing about questioning everything is that you can't really keep doing it forever. At some point, you arrive at the conclusion that some questions have been answered once and for all, and thus that there is no need to go back to them. You begin to take things for granted, and enshrine them as the way things are supposed to be. There are other, more important things to do than reinventing the wheel. There is an order to things and a tradition to consider, both of which are as they should be. The product of modernity is a new range of authorities which dictate what is to be taken for granted and what is to be questioned.

Postmodernism is a return to the very modern urge to question everything and make present institutions answer for themselves. It is, in essence, a return to the modern impulse to trust reason and science rather than tradition or authority - even if these very same traditions and authorities have used reason and science in the process of becoming what they are. But instead of asking whether the Earth revolves around the sun or not, it asks: why do we do the things we do the way we do them, and might there not be a better way to go about it?

Postmodernism happened after the modern project. Post-modernism. But it is still very modern. It is modernity turned upon itself.

If you, after having read this, are slightly more confused about postmodernism, then that is good. It will have primed you for this next statement:

Academics stopped talking about postmodernism some decades ago, and are baffled at its return to fame in news and popular culture.

As final words, I say only this: its resurgence is not postmodern. It is merely confusing.