Tuesday, December 27, 2016

Making translations work

There are certain advantages to having English as a second language (aside from the ever-present point that no one has English as their first language). One of them is that you have access to a whole realm of non-English thoughts and traditions, and can escape into it from the goings-on of the international realm. When the going gets rough, the locals go local.

And, of course, you have a brutally efficient means of encryption at the ready at all times. Just don't bother to translate, och ditt budskap blir obegripligt utan att du behöver anstränga dig alls ("and your message becomes incomprehensible without your having to exert yourself at all"). Very handy, very convenient.

One counter-intuitive advantage of belonging to a non-English language area is that books are translated into your language. To be sure, given sufficient fluency, it doesn't matter one way or the other whether a certain text is translated or not. It's still the same text, after all. Except for one subtle difference: the introductions.

It takes time and effort to translate a text, even if you are only mechanically flipping the words from one language to another. It takes even more time and effort to translate a text in such a way that context, intent, nuance, references and allusions find their way across. Most of that extra effort takes the form of someone who knows the subject matter being paid for their labor, meaning that the decision to translate something is both a matter of wanting the text to be translated, and being able to justify the expense of doing it.

Now, Swedish is not a huge language on the world stage, as you might imagine. Even more so since most Swedes know English anyway, and can just as easily pick up the original version for the same reading experience. The market of monolingual Swedes is not large enough to support just-because translations. Which brings us back to the justification of expense mentioned above: why do the work if it's all the same?

The Swedish answer has been to establish a long tradition of writing introductions to translated works. Long and comprehensive introductions, which touch upon most of the things a reader might or ought to know before heading into the text proper. When reading a translated work, you do not only get the work in and of itself - you also get yourself a proper grounding as to what kind of work lies before you. You are, for all intents and purposes, introduced. More so than those who read the original, untranslated work.

This is what marketing people call a selling point.

It is also something that those of you who are monolingual will never find out unless someone tells you about it. So, thus. See a need, fill a need. -

Thursday, December 22, 2016

Dress for the future you want, not the one you foresee

An understated aspect of my Discursive Anomalies is that they are not one-off affairs. I carry them with me, and try them out on things I encounter. They are, in a way, a toolbox. The nature of these tools or what situations they are meant to improve is as of yet unknown, and that is part of the point. When the time comes, the tools will be there.

Lately, I have been thinking back on the post about Jonathon Green's depiction of the 60s counterculture of Great Britain. It accomplishes something it really ought not to accomplish: by describing many contemporary constituent parts of a time period, without really piecing them together, it conveys a better sense of the times than a more integrated approach would. It is all nows: one now after another, juxtaposed in such a manner as to bring context through sheer numbers. It is not a point of view, but you end up with one nevertheless.

It is all very backwards, and all very straightforward. Integrated and holistic points of view are artifacts of hindsight, not readily available to those living in the moment. In the moment, there are only constituent parts, which disappear when we find ourselves with something more interesting to do.

I wonder what a similar depiction of our time would look like. What the distinguishing characteristics and vital constituent parts will turn out to be.

I suspect it would be a mixture of things we take for granted and things we cannot see due to being too close to them. The Trump election would most likely warrant a mention, alongside some massive landslide of a long-term change that happens on the other side of the world we have yet to see the ramifications of. The rattling of sabers on both sides of the old Cold War will probably be discussed as an ambient factor, but the real background tune of the future has every probability of being recorded in a suburb of an African town whose name we will never know. Perhaps meme culture will be a thing; perhaps it turns out a revived ancient tribal practice performs the same functions with far greater efficiency, sneaking in from the periphery.

History has a way of becoming those things that happened alongside those other things we paid attention to.

This state of things is a hopeful one. It implies that the world is not limited to what can be seen in the news. It also implies, through the same logic, that there are still surprises left in the world, ready to strike from so far out of left field that they cannot but be discursively anomalous.

It implies that we could be the one causing these unforeseen consequences, by engaging in some fit of passion that in hindsight turned out to be more important than we could have imagined.

That is a good future. We should prepare for it. -

Tuesday, December 6, 2016

If you can't beat them, beat them by joining them

The administration of the president-elect of the United States is in something of a hurry. Apparently, they didn't foresee the eventuality of actually becoming the administration of the president-elect of the United States, and thus didn't bother with the formality of specifying exactly who is in the administration of the president-elect of the United States.

That is over 4000 job positions, to be filled by early January. Or, to put it in a more manageable number: a hundred appointments to be made a day until the next presidency begins.
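
For the curious, the back-of-envelope arithmetic runs roughly like this (the exact deadline is an assumption on my part; the post only says "over 4000" positions and "early January", so "a hundred a day" is an order-of-magnitude figure):

```python
from datetime import date

# Rough figures from the post: over 4000 appointments, assumed here to
# be made between this post's date (Dec 6, 2016) and inauguration day
# (Jan 20, 2017).
positions = 4000
days_left = (date(2017, 1, 20) - date(2016, 12, 6)).days  # 45 days
per_day = positions / days_left
print(f"roughly {per_day:.0f} appointments per day")
```

Tighten the deadline to "early January" and the daily quota climbs well past a hundred.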

This is something of a pickle, to be sure. The usual way to go about these things is to begin months in advance to make sure the best people are placed in the right positions, with any number of checks and balances and procedures to facilitate the process. It isn't something that happens overnight, and being in the position to very soon have to literally make it happen overnight is not something to envy.

As you might imagine, this means that things have to be done faster than usual. If you can't imagine, try counting to a hundred, and then read a hundred names out loud. It takes a while just to enumerate the positions and the names that go with them, and the work has to be done at breakneck speed. There is bound to be something of a drop in quality of the process, and due to this, less than optimal choices will inevitably be made.

This means that being considered for a position is a very good thing these days. The speed at which the whole ordeal has to be completed brings with it the temptation to just pick a name from the pile of available names and make it official. Speed is of the essence, and the positions must be filled before the next presidency begins.

Fortunately for you - if you are a US citizen of somewhat good standing - it is very possible to apply for jobs in the next administration. And given the sped-up process described above, now might very well be the best possible moment to just send in an application and hope it sticks.

Just like a certain president-elect did.

If you find yourself thinking that you are not qualified for a cushy top government job - do not worry too much about it. You have a grasp of basic science and are a somewhat decent human being, which means that whoever you displace is a worse choice than you are. If we work on the principle of doing no harm, you will most definitely do less harm than someone who believes that the bible is literal truth and that climate change is a myth perpetuated to weaken the bargaining position of western nations. You have what it takes.

Also, I hear the healthcare benefits are to die for.

The application form can be found here. Make yourself known.

Sunday, October 30, 2016

Continuity, but not too much

The US presidential election will soon be over, and the world will sigh in relief. Finally, there will be something else in the news than how terrible the two candidates are, and we can all go back to the business of not knowing what Thanksgiving is or exactly how or when it is supposed to be celebrated.

Normality will return. And there will be another Clinton as a president.

Paradoxically, it will be more interesting to see what the Republican party makes of Trump after the election than during it. During it, they will have to manage a fine theological line of being loyal to the party but only in spirit. Afterwards, they'll have to construe the whole ordeal as some kind of discursive anomaly that only occurred due to aberrant circumstances. A freak accident. Something that, during the years to come, will be referred to as "that one time", the exception to normalcy.

Undoubtedly, there will be generous amounts of sophistry, retconning and outright lying to make it happen. One does not simply erase billions of dollars of brand promotion without effort. The next step in making America great again will, ironically, involve forgetting that very phrase.

Constructing a discursive anomaly also means constructing the thing it is anomalous to. If Trump is to be made a weird thing that cannot possibly happen again, some sort of Republican identity will have to be rediscovered or invented, and then presented as the sane, rational, absolutely non-Trump baseline. To erase the past and move on to a better future. Build a better mousetrap, as it were.

It is one of those inherently American traditions. Like Thanksgiving.

Monday, October 24, 2016

You love the Left

It is so tremendously handy. If anything stupid is ever uttered, someone from the Left uttered it. If something stupid happens, someone on the Left thought it a good idea. If something ever goes wrong, there is always some specter of the Left causing it.

This state of things solves the ever so tricky problem of finding someone to blame. It is after all something of a hassle to analyze the situation, understand the historical context and identify the motives of all the relevant moving parts. It is an even bigger hassle to admit that things are complex and more often than not ended up the way they are without anyone actually wanting them that way.

Blaming the Left is so much easier. Whatever it is, whatever has happened, whatever the situation. It was always the Left.

The only thing left is to admit it. You love the Left.

The alternative is, as you already know, an uncomfortable hassle.

Originally published May 26, 2015

Words about words about you

Looking back on the now not so recent Discursive Anomaly on plagiarism, I realized just how much of a multi-stage process source use is. And, moreover, how many stages there actually are to knowing what it is about. It's not as simple as knowing or not knowing, but rather a complex coming into one's own as a reading subject.

Here are some sketches of these stages. To give you something to think about.

The first stage is not being aware at all of the use or purpose of using sources. While I suspect humans are incapable of existing in this state in a more general sense (the phrase "but mom said" is in fact source use), in the context of writing they can and do exist without it. Reporting what someone else has written is not an intuitive concept, and like writing itself it has to be learned.

The next stage is knowing that sources can be used. Even if only rudimentarily, or mechanically. Or, as in many a case, that there's an expectation to put something like (Foucault 1975) at an appropriate-looking spot in the text. The text needs to relate to other texts somehow, or at least go through the motions of doing so.

This might seem like a trivial difference. The step from being unaware of something to being superficially aware of it is not a big one. But, as with many things, you have to start somewhere, and then gradually work through it. Even if the baby steps will look awkward in hindsight.

Next up is knowing that not all sources are good, and that some ought to be avoided. Simply having a source does not a well-grounded text make, and knowing what counts as a good source and what does not is a skill set all its own. The ability to look upon different texts and see what they have to offer to the specific context of one's own writing is a skill that takes time, practice and familiarity to grow.

These things are not made easier by different contexts drawing upon different bodies of knowledge. Sometimes, drawing upon Wikipedia is frowned upon, while at other times it is perfectly fine. It all depends, and finding out exactly on what it all depends (genre, tradition, situation, politics, policies, etc) is a slow and wordy process.

Next up is to summarize a line of argument. That is to say, to in some fashion paraphrase a text to give readers some insight into what it has to say. This goes beyond simply invoking the name (Foucault 1975) or saying that someone said something. Giving an account of what someone else has said, and working through the steps of it in a fair fashion, takes more work than it seems. It forces you as a reader to look closely at what the sourced text does, and to understand it well enough to give a fair account of it in your own words.

Texts do more than they seem to at first glance. Reading a text once and getting the gist of it is all well and good. But when read again, you'll find that the text makes all kinds of assumptions and uses a wide range of premises that your first glance didn't catch. Summarizing a text and conveying its core message means sorting through which of its parts are important and which are not. Figuring out what's what can sometimes take more time than might be reasonable to expect.

The point here is to take the strong points of someone else's argument and repurpose them in your own writing. No need to reinvent the wheel when you can borrow the schematics, as it were.

Next up is finding out that you do not have to agree with what you source. You can summarize it (as indicated above), and then go on to explain why you don't agree with it. Of course, simply saying that you do not agree with it is somewhat of a waste of verbiage - the fact that you have given a summary of what the other said means you can go into specific detail of why and how you don't agree. You can get real.

You still have to do the work of summarizing the other's line of argument in a fair and correct way, though. If you get it wrong, then the fact that they got it wrong first is lost upon closer inspection.

Next up is comparing and contrasting. That is to say, to summarize several texts and see how (or if) they relate to each other. The point of this is to put things and texts into context, and to make sure that this context is one of your own making. It is one of the hardest things to do, writing-wise, but if you can manage to source several texts in such a way that your own point of view comes across in the process, you have a power that is both immense and beyond comprehension.

Next up is whatever you well damn please. You can take texts and make them dance. Compare a beautiful passage here with a striking argument there, and see what interesting thought children they make.

I suspect they will be beautiful, striking and interesting by virtue of being yours. -

Sunday, October 9, 2016

Horse notes

I'm writing a thesis on horse_ebooks (because of course that's what I'd be doing), and one of the possible avenues of approach I'm investigating is Bakhtin's notion of genre. Because you are interested in the Horse, I'm going to share a few notes on this notion with you. To further a common understanding of the situation.

A classic understanding of communication and utterances is that someone wants to say something. They have some inner thought or emotion they wish to express, and in order to express it they turn to language. Using their understanding of grammar and their available vocabulary, they endeavor to produce some discourse that will hopefully carry the message across to the listener. It's a directed process, from one self to another.

Bakhtin is not a fan of this classic understanding. Rather, he proposes we understand communication in terms of genres. While it is true that communication takes place between individuals, it's not a question of one person talking directly to another person. Instead, it is a question of a person in a particular situation talking to others who are also in the same particular situation, and this situation has distinct and non-subtle effects on how the things being said are interpreted. The situation is as much a part of the communicative effort as the individuals in it.

A trivial example of this is a wedding ceremony. Everyone gathered has a certain understanding of what is going on, and unless something out of the ordinary happens, the situation will unfold as expected. Everyone knows the genre of wedding ceremonies, and this knowledge informs how those present understand what is occurring there and then. And, conversely, it would be weird if someone acted in a manner not in accordance with this genre.

Someone suddenly standing up and giving a rousing oration on the need to lower import taxes would be extremely out of place, and possibly cause a minor scandal. Whether or not there's actually a need to lower these taxes is beside the point - there's a wedding going on, after all.

This kind of situational awareness is not unique to weddings, to be sure. It goes for all social situations, in general. However, there are only a certain number of such situations, and most of them tend to resemble each other over time. They become genres, albeit informal ones, and the understanding of those present informs what can be said in future such situations. If you are able to mobilize an understanding of the relevant genres, you will be able to make things happen in future situations pertaining to them.

The next time you hear someone relate an anecdote about a coworker acting strange at work, they are giving an account of someone not understanding the genres at work. There is a certain expectation of how people ought to behave, and someone didn't act in accordance with these expectations. To amusing or confounding effect.

I imagine you might be thinking to yourself - how does this relate to the Horse? Which is a question both understandable and crucial.

Remember how Bakhtin wasn't a fan of the classic understanding of communication? How it's not about one person saying something in a void, but rather a process of shared understanding of specific situations?

This becomes relevant in the context of the Horse, as it becomes meaningless to analyze it in terms of semantics and intention. It does not try to convey some sort of message, and decoding what it might be intending to communicate is a pointless exercise. It is communication without a subject, as it were.

Yet, it has over a hundred thousand followers. Clearly, it accomplished something with its tweets. And my hunch is that Bakhtin's notion of genre as social expectations might help uncover what this is.

Tuesday, October 4, 2016

Grading EU on a curve

Technically, our local universities have a new grading system. This follows from the Bologna process, which aims to standardize grading systems across the EU (among other things). However, given the considerable autonomy our local universities have, the rate of implementation varies from university to university.

The reasons for this have little to do with the new grades in and of themselves, above and beyond the basic reluctance of institutions to change anything at all. Rather, it has to do with a peculiarity of the legal status given to the new grades, and what it means to be given a particular grade instead of another.

The old system had three grades: fail, pass and pass with distinction. The difference between passing and passing with distinction is often quite significant in terms of effort, and utterly irrelevant to anything at all outside of your sense of accomplishment. The important thing is whether you passed or not, and the range of important grades stops there.

This has implications for the legal status of these grades. Given that our local universities are government institutions, and that you cannot challenge decisions by government institutions that are in your favor, you cannot challenge the decision to give you a "pass" rather than a "pass with distinction". The difference between the two carries no legal weight, but passing the course is beneficial to you. Thus, since passing a course is a beneficial governmental decision, it cannot be challenged.

(It is not unheard of for students to intentionally fail a test they know they'd only get a pass on, in order to redo it later to ensure they'd pass with distinction. These minmaxing daredevils are few and far between, though.)

The grading system proposed by the Bologna process, however, has more steps in it, ranging from A to F(ail). This might seem like a minor point, and if your only aim is to get through the educational process in one piece, it is. However, since there are more steps in the new system, the legal status of any particular grade is slightly different compared to the old system.

Specifically, getting a B rather than, say, a C, is better all around. It makes a difference. It says something about you. Something that is left utterly implicit in the old system.

This means that it is possible to challenge grades given in the new system (given that they are not an A). And students do challenge grades, en masse. The universities can't revoke a grade due to a student being annoying, but they can raise a grade if badgered about it with sufficient paperwork. Thus, the paperwork commences.
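
The asymmetry described above can be condensed into a toy rule. This is a sketch of my own, not anything in the regulations; the grade labels are illustrative, and whether a failing grade can be appealed is not covered here - it only models the "favorable decision" logic for passing grades:

```python
# Toy model of the appeal asymmetry: under the old system a pass is
# already a maximally favorable decision in legal terms, while under
# the Bologna scale anything short of an A leaves a more favorable
# outcome on the table.
def can_challenge(grade: str, system: str) -> bool:
    if system == "old":
        # The pass/distinction difference carries no legal weight,
        # so a passing grade leaves nothing to appeal.
        return False
    if system == "bologna":
        # Every step short of an A can be badgered upward with paperwork.
        return grade != "A"
    raise ValueError(f"unknown system: {system}")
```

More steps on the scale means more grades that are not the top grade, and thus more paperwork.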

Equally thus, universities are dragging their feet in implementing the new grade system. Because being badgered with paperwork is a chore. An easily avoided chore.

If you want to understand the process of EU integration, all you have to do is to take the state of affairs described above, and multiply it across all the institutions of all the countries.

All of them.

Friday, September 23, 2016

Free speech, but as a public good

The core of free speech is that words have meaning. This is a point so obvious that it goes without saying, and it is thus all the more confounding that it seemingly has to be said. Not least in these perilous times, when the only thing keeping pace with the frequency of meaningless statements is their volume.

I need not remind you that one of the most frequent loudmouths has a fifty/fifty shot of becoming the next US president.

The classical arguments for free speech had very little to do with individual expression. They did not primarily expound the right and virtue of being an asshat in public. Rather, these classical arguments were primarily concerned with the positive effects of not having public debate limited by the public institutions that were the object of debate. Primary among such institutions were monarchies with varying degrees of absolute authority.

One of the positive effects of free speech is that the public, by partaking of the public debate, would have a reliable source of information to base their opinions on. After having read the arguments to and fro, for and against, the public would stand at the ready to mobilize their own rational faculties in their day-to-day democratic activities. Thus, the public is informed on the issues, understands who the actors are and what is at stake, and can get to work utilizing the best knowledge available to them. Whatever the sovereign might opine on the matter.

The hidden premise here is of course that the public debate is conducted properly and in good faith. Those participating are expected to bring clear understanding to bear on the issues of the day, and inform the reading public about what is at stake. The function of public debate is to educate and mobilize the public with regards to the issues of the day. If this function is not fulfilled, things go awry. The public bases its decisions and deliberations on poorer information than it ought, and the democratic decision-making process suffers because of this.

This is a radically different notion of free speech than the nihilistic freedom to unabashedly shout one stupidity after another in public. While some might feel good by roaring "RETARD!" at the top of their lungs on the town square, such roars do not contribute anything constructive to the public debate. Such antics do not create or convey an understanding of current issues, and it goes without saying that there are more interesting avenues of speech to explore.

The same goes for self-identified nazis. The issue at stake is not that the public has not heard what they have to say, and would be convinced if they but took the time to absorb more refined versions of nazi rhetoric. Quite the opposite: the arguments have already been presented, both in general and in their most specific implementation. Adding more speech will not further anyone's understanding of what is at stake. History has already rendered judgement on these matters, and there is little left to add.

With this in mind, it is hard to take self-proclaimed free speech provocateurs seriously. They are either disingenuous or know not of what they speak. In either case, nothing is gained by helping them spread their unrelenting ululations.

We know better now. It would insult generations of free speech advocacy not to have learnt these things by now.

Originally published August 10, 2016

Tuesday, September 20, 2016

The question of useful knowledge

Every once in a while, the question of the usefulness of knowledge rears its ugly head. Usually in the context of education, where it accompanies the question of what to teach the kids (and where to allocate limited monetary resources). At other times, as an undertone to the accusation that you are wasting your life by not learning something else - such as the art of accounting or lawyering.

The worst part about it - aside from the "can we get away with not spending money on this" and "why are you wasting your life" parts - is that the measure of what is useful is always a retroactive quality. You never know what will be useful to know until you are in the situation wherein it would be useful, and by then it is usually too late to learn fast enough to make a difference. The only way to be prepared is to have learnt these things ahead of time. You never know what will be useful until it is.

Moreover, the notion of useful knowledge is usually defined with a few specific situations in mind, ruling out most of the vastness of human experience as irrelevant. This is seen most clearly when it comes to educational regimes focused on preparing kids for work, to the exclusion of everything else. Everything is geared towards this one particular purpose, and those things not specifically geared towards this purpose are ignored.

What is the workplace utility of critical literacy, of knowing how to be a supporting family member, or of fluency in the arts?

What is the workplace utility of having read a poem that makes the experience of having a bad breakup more bearable?

When push comes to shove, the notion of useful knowledge always depends on just exactly who it is supposed to be useful for. More often than not, it tends to be someone who is not you. Their exact identity is shared on a need-to-know basis, and you do not need to know.

The question of whether knowledge is useful or not is never asked without hostile intent. Let there be no mistake about that.

Friday, September 9, 2016

Against content

Containerization is one of the forgotten obvious aspects of modern life. Containers are everywhere, and many modern cities have huge areas dedicated to their loading and unloading. Containers move hither and thither, but unless you are actively working with logistics, you are not likely to think about them other than as something that has always been there. Indeed, it would be very strange and disconcerting if they weren't there - the rhythm and ambiance of daily city life would be perturbed without them at the periphery of perception. Containers are as staple as the goods they contain, as it were.

Like malls, if you've seen one, you've seen them all. One container is identical to any other container, except for any external markings of corporate ownership. They are all the same size, weigh the same and handle the same. Which is the point. No matter where you are, you can pack things in containers and transport them anywhere else in the world. Wherever you go, there will be infrastructure ready to accept and process your container - since they are all identical.

The point of this standardization is to make it easier to move things around. Since the containers are all the same, it doesn't matter what happens to be inside them. Writ large, this means that the various trucks, trains and airplanes used to move things can be designed to move a certain number of containers, and set in motion once they have loaded the desired number. Moving any one container is the same as moving any other, and large amounts of content can be moved efficiently as it can all be processed through the same system, rather than in parallel systems that all move differently. One size fits all.

It might be surprising to find out that the process of containerization began rather recently, and that harbors, airports and train stations used to have trained crews on hand to load and unload different cargoes in the manners that suited them. Furniture had to be handled in a different manner than, say, foodstuffs, and each category of things had to have specialized infrastructure and institutionalized knowledge sets in place in order to be processed properly and efficiently. Which, as you might imagine, is more resource and labor intensive than having an all-encompassing system being able to process all the things.

This before-time is still in living memory, and there are plenty of stories of logistical mishaps to be told from those days. You have but to know whom to ask.

The reason for this text coming into being is not, however, the fascinating global process of logistical standardization in and of itself. Rather, it's how this same process has begun to happen in a more metaphorical way in the present. It can all be summed up in one singular word, and you will understand the significance of the above paragraphs once you see it:

Content.

The notion of content is problematic, to say the least. It assumes that all mediated things are, in some fashion, identical, and that the particulars of any given media artifact do not matter. Writing, movies, computer games, music - it's all content. In the standardized world of content delivery, it's all the same. All of human culture has been reduced to one singular ubiquitous gray goo, and the point of it all is not to distinguish one artifact from another, but to keep consumers busy with enough content to maintain a satisfactory profit margin.

This is a rather nihilistic view of culture, and if you spend too much time with it you end up thinking of your creative processes as content creation. You're not writing to express ideas or influence people; you're writing to give readers enough content to keep reading. You're not making music that will move souls and provide katharsis for a new age; you're filling out the minutes until you have enough content. You're not creating anything in particular, but rather a sustained generalized discursive noise that will keep your audience content - if you'll pardon the pun.

This is not to say that there aren't uses for such lines of thinking. Some things become easier to do once you realize that most of it is content - for example functional writing such as journalism or graduate theses. These things become less cumbersome to do once you realize that it's not about you, and that the main thing is getting words on a page. But it shouldn't be your only line of thinking about your creative processes, or even the main one. You're not doing what you do because you have to, but because you want to.

Content can be created by pressing record and screaming into a microphone for three hours. If we follow the logic of containerization of culture and ideas, we end up in a place where there really is no point to go those extra miles in order to say something in particular. When the aim is to fill out empty containers with content, anything goes. And it goes with expedient efficiency.

You're not a content creator. You're a writer, artist, game maker, musician - you're doing things in order to express something that wouldn't be expressed if it weren't for you. You're contributing to this world. You're a context creator.

What you do matters.

Keep at it.

Thursday, September 8, 2016

Indirect rationality

Humans are strange. They have the darnedest ways of absorbing new information, and the way it is presented to them more often than not matters more than the information in and of itself.

For instance: simply stating something in the most literal straightforward way possible is usually the least efficient method of getting the information across. One would think that it'd be a straightforward proposition, but it isn't. Even if the thing stated is both true and self-evident.

To be sure, if it was sufficient to say something once, a lot of problems would have been solved moments after the Sermon on the Mount.

On the other hand, indirect and circuitous methods of presenting information have a tendency to be far more effective than intuition would suggest. If the information is embedded in, say, a narrative framework with complex storytelling conventions and mechanisms acting out over a large number of pages, this very same information is absorbed with alacrity. Despite the massive overhead.

It might be tempting to attribute this to a failure of rational thinking, but that would be a failure of rational thinking. Rational thinking takes the situation as it is and uses it as a basis for further action, and the situation is that human beings think in terms of contexts and relations rather than singular statements presented in isolation. A rational approach to human beings would take this into account, and present information in need of presenting with an appropriate measure of indirectness, so as to give the contextual and associative thinking time to occur.

Thus, I present to you this following information, in the most straightforward manner possible:

Homeopathy doesn't work.

Wednesday, August 31, 2016

The theological issue of Donald Trump

The Republican Party has a problem. And it is a theological problem.

As you are well aware, they nominated a certain Donald Trump as their presidential candidate for the 2016 elections. They did this by the numbers, following all the steps and procedures laid out in advance for how a presidential candidate is supposed to be nominated. Everything in the nomination process went according to the rules, traditions and party spirit. Out of all possible candidates, the process ended up with Donald Trump as the candidate. The Republican Party nominated Donald Trump. Unequivocally.

There is no doubt with regards to this. Let's focus on theology for a moment.

If you are well versed in theology, you know that the precise nature of the Trinity is a topic that has garnered a non-trivial amount of attention over the centuries. There is only one God, with a capital G, but God also consists of three parts: the Father, the Son and the Holy Spirit. These three parts are all part of God, and encountering any of them is the same as encountering God, as they are all God.

However, this three-part structure is complicated by the fact that the three parts are not one and the same, but three distinct entities. While they are all connected, problems arise if you say they are all the same. While Jesus (the Son) is no doubt of a divine nature, He is not identical to God (the Father), and this has very important implications for His intervention into history. Depending on how you define the precise relationship between the two, the theological and historical figure of Jesus Christ takes on different meanings and connotations.

This might seem like a subtle point, and it is. But depending on who Christ is, you get different versions of Christianity. In one version, God appeared to humanity in the form of a man; in another, He became a man in order to administer the salvation of humanity. Both cases feature the same actors, but the exact significance of what they did differs.

Returning to the matter of the Republican Party, we see the same dynamic playing out both on and behind the scenes. Trump is, following the internal logic of the party, the presidential candidate, and as such he is of the party. But he is also, by all accounts, a bumbling buffoon who turns everything he talks about into an incoherent mess of contradictory nonsense. Which, by virtue of Trump being the party's presidential candidate, also becomes the expressed will of the party. The party chose Trump to express its views and beliefs, and Trump expresses them.

Thus, those in the party have to make every effort to ensure that the precise relationship between party and candidate is delineated in such a way that there is in fact a party left standing when this presidential race is over. And at all times the party functionaries have to be very subtle about how they make these delineations, since they must perform the contradictory tasks of being sufficiently loyal to the party orthodoxy, and heretical enough to not drag the party into sectarian obscurity. They cannot denounce Trump outright, since that would also mean denouncing the party, but they also can't support him, since he is in all things himself. He both is and is not the Republican Party.

It truly is a mess of biblical proportions.

I do believe thoughts and prayers are in order.

Monday, August 29, 2016

Trigger warnings and you

It truly is fascinating how trigger warnings have come to be the center of so much attention. In and of themselves, they are nothing to write home about - especially not in the context of large educational institutions facing a large number of students from many diverse backgrounds. The worst case scenario is that a student looks at the warnings, shrugs, and then moves on with their studies. It does not get more dramatic than that.

There seems to be a perception that putting trigger warnings on required reading somehow turns it into optional reading. That these warnings, somehow, enable students to pick and choose which readings are relevant to them, discarding those with the (in)correct label as somehow not required for their intents and purposes. That trigger warnings are some kind of "get out of reading free" cards.

Let there be no mistake: if you do not read and engage with the required coursework, you will fail the course so hard that the resulting draft sets off car alarms. No ifs or buts about it.

What trigger warnings do is allow students to prepare for what they are about to read. If a book contains depictions or descriptions of sexual violence, those who have a history with sexual violence can steel themselves in preparation for those passages. When students reach the relevant point of the book, it does not jump out at them and cause them to relive their past experiences by sheer force of surprise.

Students who do not have such past experiences can just ignore any such warnings and read on as usual. Or, better, read the book with the knowledge that it indeed contains sexual violence, and that the scenes depicted or described are not to be seen as innocent everyday acts. Students can thus become more observant and critical readers - skills that are highly relevant to cultivate in an academic setting.

Trigger warnings don't set themselves, of course. Someone has to (re)read the books in question and make a judgement call as to whether this or that warning is applicable. More often than not, this someone is in a teaching position. In the process of revisiting the coursebooks, they have to critically engage with the required reading and rethink what it is they are actually teaching. Thus, their understanding of the various aspects and nuances of the coursework is enhanced, and they become better able to aid future students. Even if they find that no warnings are necessary.

Given all this, it is hard to understand what the fuss is about. Unless, of course, those who shout the most are also those who are least likely to engage critically with the subject matter, and thus brutally ignore the fact that trigger warnings in no way make required reading optional. It would be ironic indeed if those who worried about trigger warnings turning required reading into optional reading, themselves treated the reading required to understand trigger warnings as optional reading.

It is only polite to assume they know better. -

Tuesday, August 23, 2016

Perfunctory writing

There are two ways to write short, topical blog posts. One is the brute-force approach; the other takes a slightly more indirect route.

The brute-force approach endeavors to take readers from a state of doxa (which is to say, knowing nothing) to some particular conclusion. In order to accomplish this, the text has to provide the necessary steps to get from here to there. Mostly in the form of providing necessary background information, and some logical reasoning using this information in order to move things along.

A perfunctory writer can combine brute force with minimalism, and provide just the barest minimum required to propel self-directed readers to the desired destination. Introduce the subject, the prerequisite information, the logical steps and the conclusion - done. Those who want to understand can glean what they need from these words, and those who want to cite it as evidence that the thing in question is an actuality can use the fact that it is posted on a blog to great effect. Mission accomplished.

The more indirect approach does not aim to convey the bare facts of the matter, but also a specific point of view to go along with these facts. Some additional context to make the general into a particular, placing it firmly alongside other things that are obviously in the same category. You tell it like it is, as it were.

The largest difficulty in keeping this short is that the main objective can only be accomplished by the way, in passing. You do not accomplish it by merely stating a particular thing and dropping the mic; the very point is that you, and you in particular, are there to provide some verbiage on the matter, reminding readers about your point of view on these things. It takes a sustained effort, but is ever so effective once momentum has built up.

With enough momentum, the posts write themselves in their predictability, ever so perfunctory. -

Thursday, August 18, 2016

Optimizing for the wrong situations

Optimizing for the wrong situations is a very common condition in the modern world. Not due to any fault of the individuals who happen to do it, but because of the sheer volume of complex interlocking systems in existence, and the impossibility of knowing them all well enough to avoid it.

An example is when a traveler hears about the very strict customs checks in a country they're about to go to, and makes profuse efforts to ensure everything is in order prior to arrival. Time and energy go into the preparations, in order to optimize for these checks. Then, when the day arrives, the customs official looks indifferently at the luggage, shrugs and waves them through with a bored gesture. Not just our traveler, but all travelers passing through.

These things happen all the time. You hear something, prepare exceedingly in accordance with what you heard, only to find out that the preparations were completely unnecessary. Even though you've spent weeks or months agonizing about this one particular thing.

Again, this is not due to any particular fault on your part. It's just that you didn't know the situation well enough to know that it's not a big deal.

This will happen to you, again and again. There is no real way to avoid it, other than to only do things you've already done before. Which, to be sure, is the mostest expression of optimizing for the wrong situations. -

Sunday, August 14, 2016

The great man theory of Wikileaks

There comes a time in the life of any organization where it has to choose: between the things the organization ostensibly stands for, and the people and circumstances that happen to be relevant to it.

In the case of Wikileaks, this time was years ago, and they made their choice.

They faced, in no uncertain terms, the choice whether to focus on the structural and systemic possibility of whistleblowing, or on those who happened to lead the organization at the time. They could have chosen the former, yet have continually doubled down on the latter over the years.

This is not a subtle distinction to make. Assange and his crew could have made a statement to the effect that he would step back and deal with things privately until the issue was resolved, leaving the day to day operations of whistleblowing and media coordination to those left in the organization. Instead, they chose to turn Wikileaks into an Assange-focused organization, rather than a whistleblowing one.

What's interesting is that there are those who to this day continue to insist that the person of Assange is more important than the structural and systemic possibility of whistleblowing. They insist so fervently that they actively aid in sacrificing every shred of credibility Wikileaks once had in an effort to see their man go free. Instead of keeping the lines of communication open and trustworthy, they prefer to drag it all into the mud and make everything slower and dirtier for it.

It is not a wise choice. But if it is the one you're making, I ask only one thing: is it worth it?

Friday, July 29, 2016

Speaking of educational efficiency

Having majored in education is an endless series of surprises. Not only do you get to learn things about education, you also get to learn how to treat people who have not learnt about education and who are not wont to learn about education.

Since you have an education about education, people come to you with questions about education. It's a reasonable thing to do, tautologically speaking - if education works, then the effects of education should be present in those with education. Conversely, if education didn't work, then seeking out those educated about education would be the very definition of folly.

You do not think these things before majoring in education. Because that's what they teach education majors.

This might seem like a roundabout introduction to a text on the efficiency of education, but it isn't. It is at the heart of the question of what it means for education to be "efficient", and just what education is supposed to be efficient at.

It is very common in everyday discourse (political and otherwise) to depict educational practices as in some way deficient or inefficient, and to propose alternate practices which just happen to alleviate or eliminate these flaws that were presented moments ago. Everyone wants efficiency, and if someone with a confident demeanor says that something is more efficient than what we've already got, then it's probably true.

Otherwise they wouldn't be so confident, right?

Thing is, being efficient is not a standalone quality. There is always an implicit "at something" at work whenever the word is used, and being able to discern what this "something" is, is a key part of being able to tell generic confidence from actual, trustworthy knowledge.

Economists, who for some reason are extremely keen on applying generic confidence in political debates about education in particular, are prone to use statistics to show that education has become more and/or less efficient. The typical example is to compile the numbers on grades and how they've fluctuated over time. A move in any direction is interpreted as signs of (in)efficiency, and boldly proclaimed as such.

There are several peculiar unexamined assumptions at work here. One is that higher average grades guarantee some quality of excellence within an individual, and that educational systems that produce higher average grades also produce more of this excellence. Higher grades thus equal higher efficiency, which is the desired result.

The obvious counterpoint to this is of course to just give everyone the highest grades possible, thus guaranteeing maximum efficiency. Which is false on the face of it (although a case can be made that the absence of grades might have positive effects on learning environments), but it highlights the next unexamined assumption.

The second assumption is that this quality of excellence is a uniform quality that expresses itself equally in every student. It might seem counterintuitive, but it follows from the assumption that grades in and of themselves measure the same thing across individuals. It's subtle, but it obscures differences between individuals and - more crucially - curricula.

A third assumption is that the current curricula are adequate with regards to producing the desired outcome. The fact that the desired outcome is left undefined during the argumentation does not leave the assumption empty - it simply leaves it in the default mode, which is the current policies in play. Unless you specifically frame a different desired outcome, you're more or less forced to agree with the standard documents and their valuations of different skill sets and competencies.

Status quo has the advantage of actually being implemented, and does not need further explication to keep being implemented. Quite the opposite.

As you can see, you very quickly get into these deep nested layers of assumptions that follow from previous assumptions, ad infinitum. Sorting out what is assumed where, and how these assumptions interact with others on the various levels, takes both dedication and hard work, and an explication of any given boast that x is more efficient than y is bound to take up exponentially more verbiage than anyone really ought to read. Generic confidence is a powerful tool, able to compress a whole world of effort into a brief "it's all very simple, really".

But not to worry. We can deal with it. We are taught these things as education majors.

It's super effective.

Tuesday, July 12, 2016

An oppositional read of Ghostbusters

The new Ghostbusters movie is out and about, and there is no lack of reactions to it. From what I've gathered, those who saw it without any particular expectations (other than the busting of ghosts) kinda liked it but had some minor complaints here and there. Others, however, claim that the movie ruined their childhood, and that they now face years of heavy duty therapy to recover from the loss of their past selves.

While the phrase "ruined my childhood" is hyperbolic in the extreme in this context - it's hard to imagine how such a thing could occur because of a ghost busting movie - it does however open up some interesting possibilities. If we take it as possible for a movie to ruin a childhood, it should logically follow that an equal and opposite effect is possible. If childhoods are somehow open to retroactive alterations, then it ought to be possible to produce movies that in some way enhance these very same childhoods.

An opposite Ghostbusters, as it were.

This line of reasoning opens up a whole range of therapeutic treatments of many actually existing shitty childhoods. Indeed, avid entrepreneurs might want to get to work right away on retroactively prophylactic cinema products, before the market is flooded with happy memories and fondly remembered daydreams of the future we now ended up in.

I can't wait to see it happen.

Thursday, June 30, 2016

What even is remain?

Recently, I did some very late editing to an earlier post, the one on modern ruins. Like all of my translations, it suffers from the fact that it is a rework of a piece I wrote when I had less language. Revisiting these pieces means noticing that they do other things in the text than they do in retrospect. The temptation is always there to edit them so that grammar and sentiment align.

I keep them around for the sake of archaeological preservation. No sense pretending the past isn't affected by learning and personal growth. Old writing always happened before the learning that came afterwards.

Thing is. It is possible to look upon contemporary architecture as modern ruins, and read it as such. Certain time periods had certain architectural norms and standards, and built accordingly. These norms and standards have mostly faded away, but the buildings remain, and with an astute enough eye it's possible to read past sentiments off the walls. Sometimes literally - either by design or later additions, such as graffiti - but mostly in implicit terms. Either the writing is on the wall, or the wall is the writing.

It always amazes me how much can be conveyed through architecture. It's never just about keeping the roofs supported and the walls upright. A whole aesthetic is conveyed by just standing around. This is the way things are, the walls say. Because they are.

There is a literacy to these things. Knowing the mindset and zeitgeist of the times that built the buildings around you lets you decode them more skillfully. You can see the optimistic bureaucratic 70s peering at you from the brick boxes, and the 00s from the confused rectangles that looked worn the day after the construction crews left. The past is on display. It remains.

These buildings weren't meant to be ruins. They were meant to perform functions in the present. To house, to store, to home. To present, as it were.

Nowhere is this as apparent as in abandoned buildings. Every part has a specific function, a designated task to be performed within. The fact that it hasn't, and hasn't for a long time, only underscores this intentionality. Dusty conveyor belts convey more than dust. Past design insists itself. It remains.

The question does insist itself. How far-fetched would it be to propose that not all ruins are physical, and that some are ideological, political, social?

It is a question to live by.

Sunday, June 26, 2016

Some inspiring words about brexit

Before the brexit election happened, I wrote a blog post about post-electoral practices. About the inherent absurdity of showing up the day after an election, dressed in electoral garments and keeping on the electoral shenanigans as if nothing had happened. The point was, as you might imagine, to point out the seemingly inherent ridiculousness of such a course of action.

And then to turn it on its head and suggest a more long-term, virtuous approach to electoral campaigning, where you can actually just keep going after election day without changing too much. The main point being that if your company is both pleasant and convincing, there'd be no need for electoral excesses. You could just comfortably speak your convictions and have listeners accept them by sheer force of personality. Quintilian style.

But then the election happened, and about five minutes after it became clear that leave won, its chief campaigners instareversed and declared everything one big lie. They never meant it, they only suggested possibilities, the impression that they actually wanted to leave is mistaken.

As electoral excesses go, this sure takes the cake. All the cakes. Especially those cakes that are lies.

Let the liars get the just deserts they deserve.

This whole ordeal underscores the point I was trying to make, though. There are a lot of elections going on, and you're likely to be involved in some of them. If something is to be salvaged from this epic clusterfuck of an election, let it be this: campaign in such a way that you can keep going with pride and confidence afterwards, regardless of outcome. And, more importantly, that you can remain on good terms with those who happen to disagree with you in the matter of who or what to vote for.

I'm sure you can understand the rationale behind this, seeing its opposite on prominent display.

The post I originally wrote turned out to be in exceptionally bad taste, given that it assumed a remain victory. But but. There are still things to be learnt and salvaged from this mess. -

Saturday, June 25, 2016

Some uninspiring words about brexit

Brexit happened. No one really wanted it to happen. Not those who set it in motion, not those who for months shouted about it with alacrity and, least of all, those who voted for it on a lark. But despite everyone, here it is, and it opens up a vast range of possible futures that are slightly less optimal than remaining.

One possible future is that some arcane part of the British body politic mobilizes and declares the whole thing null and void. It's unlikely, but it could happen.

Another possible future is that things tumble around for a bit, and then settle into a situation akin to that of Switzerland or Norway. While not formally part of the EU, they're not not parts of it either, and for most intents and purposes the difference is metaphysical.

A third possible future is that things break down completely, and the EU starts to treat the UK much like it treats Syria. No one is allowed to enter, and those who try are discouraged in the most direct of ways. Economic cooperation is severed completely, and the Brits are left to fend for themselves on their imperial isle. No matter how miserable they become.

The future that will actually take place is likely to fall somewhere in between. But it is worth noting that the EU contains the possibility of both possibilities, as it were. And any analysis of the EU as it actually exists - with and without the UK - needs to take both possibilities into account. The EU is both, in potentiality and actuality.

As it stands, brutal negotiations await those unfortunate government officials who have to hammer out the details. The EU has the capacity and incentive to not give two flying figs about the plight of the British people, in an effort to discourage further withdrawals from the union. As such, they will play the hardest balls right in the faces of the UK officials, who neither wanted nor prepared for the task foisted upon them by their supposed leaders.

It is not a pretty sight, and I do not envy those involved.

Especially not the Syrians.

Thursday, June 16, 2016

When rhetoricians and pedagogues clash

I operate by the rule of three. If I see something referenced three times, I look it up. If three persons, independently of each other, ask me to do something, I give this something serious consideration. It's a simple rule that leads to interesting new avenues, and if you ever find yourself bored, try it out.

The reason you're reading this is that three persons, independently of each other, asked about the BA thesis in education (Swedish: pedagogik) I outlined in passing the other day. So here goes. The most technical document you're likely to encounter on this here blog. If you're not an educator or a rhetorician, you might find this somewhat orthogonal to your interests. It would be quite alright to skim this one.

(If you're entering from Twitter and lived through the Thesis Livetweetage, be aware that this is not the memetic fanfic online literacy MA thesis of awesome you've heard so much about. That one is still under evaluation, and will hopefully pass soon.)

My writing partner (bless 'em) and I wanted to analyze an ancient series of rhetorical exercises, the progymnasmata, and how these relate to modern educational practices. Our reasoning was that there seemed to be some very tangible skillsets taught by this series, and that it would be enlightening to see if and where these same skillsets were taught today. And, subsequently, if any insights could be brought to bear from knowing these ifs and wheres.

Which, as you might imagine, is a very broad question, in need of being brought down to earth in terms of scope and analytic feasibility. Specifically, two things needed to be operationalized: the skillsets taught by the progymnasmata, and those skillsets taught in schools today. Only by breaking these skillsets down to their basic elements (the skills that make up the set, as it were) could we compare and contrast them with sufficient detail to say anything interesting about them.

At first, we thought that simply looking at the various exercises outlined in the progymnasmata would be sufficient to get a handle on what they're supposed to teach. However, we soon found that while they certainly built up to something, this something was underarticulated. After being frustrated by lack of context, we realized we'd better get some context. This we found in the form of Quintilian, the famous Roman rhetorician and educator, who both used the progymnasmata and had a whole philosophy regarding what they're supposed to teach.

(A technical rhetorical note: it is both possible and common to apply the progymnasmata [or variants thereof] as a kind of vocational training, where those who undergo the exercises emerge afterwards with newfound abilities to give presentations and suchlike. If all you want to do is to make sure your employees/students are able to give interesting talks, then it works well towards that end. However, this is somewhat barren in terms of comparative educational insights. Thus, we brought in bad boy Quintilian.)

I'm going to skip the tedious details about how we settled upon the five key skills taught by Quintilian's version of these exercises, and get straight to them. These are as follows: critical thinking, the ability to actualize oneself as a social subject, the fast organization of information, to have a good orientation in literature, and to establish good and enduring habits early on. The first three follow from the exercises, and the last two are emphasized by Quintilian. They all point towards the same goal: to become a good orator. Or, to quote: "we are to form, then, the perfect orator, who cannot exist unless he is above all a good man".

Being able to think critically means to not take things at face value, and to see things from multiple points of view. The ability to actualize oneself as a social subject is a complex matter, but it can be summarized as knowing what to say to whom in order to get results (especially in matters of state and law, the traditional arenas of rhetoric). The fast organization of information relates to being creative in finding things to say (topoi). Having a good orientation in literature means both to have read the classics, and to constantly be on the lookout for new nuggets of insight (or turns of phrase) to use when the rhetorical need arises. The good habits - the virtues, if you will - are rather straightforward.

Together, these five skills form what Quintilian named a hexis. This is more than just the ability to do something - it's more akin to having your whole way of thinking based on or shaped by an ability to do something. Learning something - in this case rhetoric and speaking in public - is not just a "learn and forget" kind of thing. You emerge a different person from the experience of learning, and by virtue of this you apply your new insights automatically to all aspects of your life. The knowledge is integrated into your character, and thus you know it intimately.

When Quintilian says that a good rhetor must also be a good person, this is part of what he means. Simply learning some detail or aspect is not sufficient. The art of rhetoric has to become an integrated, instinctive response to new situations - only then has the educational process been effective. You are not just someone who knows rhetoric - you are a rhetor. Your whole being as a person is involved.

(Another technical rhetorical note: the other part of this - the good part - has to do with the conditions of persuasion. If you are not a good person, and use your rhetorical powers to evil ends, those you try to convince will remember this in the future, and so become less inclined to listen to you. Conversely, if you at every point try to do what's good, this too will be remembered. Being a good person means you're in good standing with your peers, and your words thus carry more weight. The counterexample, of course, being Donald Trump.)

If you've followed so far, you have probably gathered why we sought to find out whether anything of this remains in contemporary educational practices. Both in terms of the explicit aim of educating good/virtuous persons, and in terms of the deep kind of knowing emphasized throughout. Regardless of the subject to be taught, there might be advantages to this line of thinking when applied to contemporary educational institutions.

Fortunately for my writing partner and me, we didn't have to ponder the morass of actually implementing or changing anything (at least not within the context of the thesis). Instead, we moved on to the second part of our analysis: the state of our current educational institutions.

Of course, this could stand with some narrowing and specification. Examining the entirety of a school system isn't very expedient, so we had to settle for some part of some aspect of it. We chose the time-honored strategy of reading the manual, so to speak: curriculum analysis. Specifically, we read those parts that relate to the subject of Swedish. (The subject is very similar to English in English-speaking contexts. However, since English is also a subject in Swedish schools, calling it "English" would confuse things.)

Since the Swedish school system recently got a brand new curriculum, we decided to compare the old version with the new version, using the framework built from reading Quintilian. I'll spare you the tedious details of how we went about this, and get right to the interesting parts.

The curriculum of '94 (aka the old version) focused on the acquisition of "skills" or "abilities" [the term "förmågor" can go either way]. The curriculum of '11 (aka the new version) focused on the successful internalization and application of strategies. This might seem a subtle difference, but it has dramatic effects when applied in educational practice. For example, the ability to utilize a library and its resources is different from applying a strategy of utilizing the library. The former implies some sort of familiarity and affinity with the library as an institution; the latter implies that libraries are one strategically viable option among many equally viable options.

Now, this is not to say that abilities are better than strategies, educationally speaking. But depending on which framework you use, you end up emphasizing different aspects of the problem at hand, and different ways of solving it. Being aware of the strengths and weaknesses of your framework gives you the option to compensate for the weaknesses and play into the strengths.

One of the strengths of Quintilian's hexis approach is that it's comprehensive. It teaches, and it teaches well, and brings with it the potential of further deep learning based on the learner's own interests. It has a corresponding drawback of being time-intensive, and requiring a not insignificant amount of dedication from both teacher and learner. You end up immersed in the subject matter, but it takes a while to get there; being oriented in literature only happens one book at a time. (This approach also brings with it the risk of alienating you from your peers, but that's another thesis.)

The abilities approach gives you the understanding you need to act in certain situations, such as in the library example above. It gives you what you need to move along, should interest motivate you. However, if such interest doesn't motivate, you end up with a partial understanding of a particular situation, without the appropriate context to act on this understanding. Such as, say, having a library card and a mechanical understanding of how to use it, but not really knowing or caring as to why you'd want to use it. Or, indeed, why libraries exist.

The strategies approach has the advantage of being fast and efficient. It identifies the desired outcome and the way(s) to get there, and it's easy to control if the educational goal has been achieved. Particularly when the situation only requires that you teach one particular thing. It does, however, run the risk of becoming fragmentary. Clever learners can game the system by quickly identifying the desired strategies, performing them well enough to pass muster, and then forgetting all about it. (Interestingly, both studying AND teaching for the test follow this pattern. But, again, another thesis.)

Rhetorically speaking, this would be the place to write that we ended up advocating a particular approach over all others. But, being an exploratory thesis, we didn't arrive at clear-cut, fire-and-brimstone solutions or conclusions. The point was to be able to think about these things in clearer and more nuanced terms. There are advantages and disadvantages to each approach, and each is used in different settings depending on which learning outcome is sought. Some of them within traditional educational institutions, some without.

By sharing these findings with you (all three of you who asked), I hope you'll be able to ask better questions about education in the future. More so, I hope that it has become clearer that education isn't just one standardized thing that can be performed better or worse, and that the goal of educational policy isn't to choose the option that performs better (by any arbitrary definition of better) than the others. There are nuances to these things, and perhaps - just perhaps - bad boy Quintilian still has a thing or two to teach us.

Thank you for reading.

Tuesday, June 14, 2016

Don't say that all lives matter

At times like these, a temptation grips those who want to comment on recent events. A sentiment. An urge to proclaim, in solidarity with all of humanity, that all lives matter.

If you feel this, in any way, shape or form: don't. Resist. Abstain.

It's not a wrong sentiment to have, but it's a wrong sentiment to express. It does not help. At best, it adds nothing to the discussion, and only wastes the time it takes to express it. At worst, it causes all kinds of additional harm and strife, which is the opposite of what you wanted to achieve by expressing it. Expressing it will not improve the situation.

The reason for this comes down to one word: bandwidth.

Social situations have a very limited amount of bandwidth. Only a few things can be socially processed at once. In the cool, calm and collected moments between situations, things can be compared and contrasted and analyzed through larger perspectives. But when the situation unfolds, only a few key items can be processed at a time. The fewer, the faster and better.

Saying "all lives matter" in a situation where some lives clearly matter less than others introduces a whole range of issues that cannot be processed there and then. There simply isn't enough bandwidth. Insisting that some of this limited resource be diverted is both counterproductive and - dare I say it - rude.

There is a time and a place for all things. Sometimes, tact and discretion are in order.

Sunday, June 12, 2016

Ordinary people Twitter is not a slur

Recently, I've begun to think more and more about this phrase. Ordinary people twitter is not a slur. As a slogan, it lacks the necessary brutal directness of impact. As a subtle statement, it creeps up on you and surprises you when you least expect it.

In order to understand this phrase, it is necessary to understand what this mythical "ordinary people twitter" is. Who are these people, and what do they want?

Thing is. It is more of a negative identity than anything else. That is to say, it's easier to say what it is not in order to gradually approach an understanding of it, than to approach it head on with a declarative statement such as "ordinary people twitter is".

Ordinary people twitter does not have an emergency strategy for when hundreds of angry young men emerge in your mentions and threaten to spill over to your friends and family. It does not have such strategies for the very good reason that it does not need them. The thought of needing these strategies is absurd on the face of it - yet there it is.

But what do you do when your notifications are all about how much people hate you? What do you do when your family texts you to say that strange people are calling them? What do you do when they are outside your home, after learning the address from a public posting?

If you scratch your head in confusion at these questions and their relevance to twitter - congratulations. You are a solid member of ordinary people twitter.

The phrase "ordinary people twitter is not a slur" is a very nostalgic statement. It reminds us of a time when you could post that you were getting a sandwich and get at most one fav. Nothing much happened, and that was okay. You posted ordinary things about ordinary things, and that was that. Life moved on.

Being a member of ordinary people twitter is not a bad thing. It is a good thing.

But it suggests the extraordinary nature of the situation the rest of us find ourselves in. The extraordinary twitter. Those who keep a watchful eye open for people acting in explicit bad faith, and live with the awareness that twitter admins - working on the assumption that ordinary people twitter is the only twitter - won't do much of anything to help when the raging hordes come hording.

This is not a healthy state of mind. And it is not a healthy state of things. For anyone involved.

Saturday, June 11, 2016

How to speedrun Twitter in ten easy steps

1. Don't be yourself. You're a nobody, and nobodies are not interesting.  Be someone else. Better yet: be a concept. The more immediately recognizable a concept, the better. The aim is not to be a genuine expression of anything related to you; the aim is to speedrun twitter, as fast as possible.

2. Be interesting. Latch on to some existing or emerging trend, and position yourself in relation to it. Become a source of information, amusement or familiarity. Enthuse your followers. Whatever you do, don't be boring.

3. Be about something. A thing. One singular, well defined, utterly overanalyzed thing. Stick to this thing religiously. Think the thing. Be the thing. Tweet the thing.

4. Be relevant. Stay on top of things. Help your followers stay on top of things. Generate a feedback loop that makes your followers help you stay on top of things. Become an up-to-date resource. Be a go-to source.

5. Be useful. Provide tangible benefits to your followers. Be a voice worthy of being listened to. Post relevant links. Make your followers know more by virtue of following you. Help them along.

6. Be partisan. Being impartial and neutral takes a lot of time and effort, and has a poor return on investment. Take a stance and stick to it. Loudly and often.

7. Get into a lot of fights, but never participate. Getting into fights generates a lot of attention. Participating in fights generates a lot of badwill. Tweak this to your advantage.

8. Retweet like a mofo. Creating things on your own takes time, energy and effort, and chances are you'll only create one or two things at a time. Retweeting, however, takes less effort and even less time. Plus, there's a lot of interesting and useful things out there in need of more retweets. Make it happen.

9. Don't be afraid to meme. Memes are your friend. Everyone likes memes. Memes.

10. Be afraid to be boring, uninteresting and irrelevant. Make every effort to avoid these things. There is simply no time to waste, and being either of these things wastes more time than you or your followers have.

Friday, June 10, 2016

Identity politics for everyone

At times, I hear people decrying the horrors of identity politics. It's optional, they say. It distracts from real political issues, they say. Dividing people up is divisive, they say.

Okay then.

Let's reverse things. Let's remove all pretenses of division and focus on a unifying aspect that cuts across all intersectional barriers. Let's get back to basics.


A fundamental part of citizenship is that any given citizen has the same rights and obligations as any other citizen. It doesn't matter who you are, where you were born or what you do for a living. The laws are uniformly applied across the board, and there are no distinctions between citizen A and citizen B. Both are citizens, and both are equal before the law.

Simple, easy, undivisive. It cannot become less identitarian than that.


Complications arise when a particular group of citizens demand that they be treated like all other citizens. That their rights, owed to them by virtue of their status as citizens, are respected and enforced in practice, rather than just on paper. Is this an instance of identity politics, or just a simple assertion of citizenship?

The difference may appear subtle, but it has clear consequences for how such assertions are treated. If it is seen as identity politics, it is usually scoffed at and ignored, regarded as of little consequence. If it is seen as a proper assertion of citizenship, it is seen as the right thing to do and a correction of injustice.

The question is: how to tell the difference between the one and the other?

You wouldn't want to make a mistake and scoff at a group of citizens claiming their legitimate rights, now would you? That would divide the citizenry into two groups - those who have rights and those who do not - and that would be the opposite of an undivided whole. You would end up with identity politics, even as you try to keep it off the table while focusing on the real issues.

Identity politics is tricky like that.

Wednesday, June 8, 2016

Performing critiques of religion

The honorable Rothstein recently wrote a debate article, which was given the title "Religion does not contribute to a better society". Which to be sure is the main thesis of the article, and the main argument for this thesis is that it can be shown using statistics. The argument ends there, without going into specifics, but we have to be understanding of the limited space afforded by such articles.

The honorable Bengtson wrote a reply to this article, given the title "Religion does not exist in general!". The main thesis is that the concept of "religion" is about as wide as the Atlantic Ocean, and that it follows from this that it's hard to draw conclusions about it. That is not to say that it cannot be done, but the concept has to be used in a more specific and explicit manner before embarking on such conclusionary endeavors.

To use an analogy: both football and Starcraft are sports. There are similarities between them. There are also differences, and these differences are of a nature that those things that apply to Starcraft do not automagically also apply to football, and vice versa. It is possible to pontificate on sports in general, but it helps everyone involved to specify whether the discoursing is related to Starcraft, football or some other sport. Just to keep everyone on the same page, as it were.

Before things get heated, I want to apologize to any potential readers with strong religious feelings about sports. Just in case.

We live in a time where many are engaged in criticism of religion. Or, rather, what they think is criticism of religion. Specifically against Islam, which for reasons inexplicable is deemed more in need of criticism than other religions. "It must be allowed to criticize Islam!" they bellow repeatedly, and it's hard to deny that the feelings surrounding this issue are both strong and upset.

But. Do they understand what they mean when they use the words "criticism of religion"?

As stated above, "religion" as a concept is both unspecific and unwieldy. The same goes for the concept of "critique" - even more so since it's one of the least understood concepts of our time.

To simply bellow "ISLAM IS A SHITTY RELIGION THAT DOES NOT BELONG HERE" is not to perform critique. It's just uncouth, inarticulate and headless. Nevertheless, there is no shortage of people who refer to such bellowing when they say they have to be allowed to criticize Islam. Far be it from me to imply that these people are uncouth, inarticulate and headless, but they are indeed wrong.

Critique is something that takes time. And space. Literally. To critique something is to analyze this something (preferably in detail), to relate this something to something else (which preferably also is analyzed in detail), and to then proceed to describe the similarities, differences and points of contact between the two. All the while keeping the readers informed of the steps taken by the analysis, with the aim of having conveyed an understanding of both the analysis and the things analyzed. The purpose of critique is not to find faults and flaws, but to convey an understanding of the thing critiqued - an understanding that includes such faults and flaws.

Which, as you might imagine, requires many words to perform properly. Critiques and understandings are not done in a hurry.

Those who want to critique Islam have a formidable challenge ahead of them. First, they have to grok Islam, its contexts, its core values and its everyday practices. Then they have to build a framework to relate and compare this understanding to. Then begins the hard work of comparing, relating and contrasting, all the while presenting these efforts in such a way that one's understanding of Islam, the framework and eventual conclusions are made explicit to the reader.

It would not be unfair to propose that those who energetically claim their right to criticize Islam do not have this in mind. They do not have criticism in mind at all. They have a completely different verb in mind.

But, if you ever meet someone who energetically claims such a right, point them towards this post. To give them a sportsmanlike chance to say what they actually mean.

Originally published March 3, 2015

Tuesday, June 7, 2016

The most useless knowledge in academia

A while ago, I completed my BA in education. It is, by far, one of the least useful degrees imaginable.

Now, don't get me wrong. It's useful in terms of marketable skills, personal growth and insights into the mysteries of being human. All the good jazz you expect from a degree. But.

There is a but.

The thing about education is that everyone has opinions about it. What it should be, how it should be conducted and what the end results of it ought to be. Everyone, from all walks of life, from all political camps, from all everything. Everyone has opinions. Everyone.

The thing about these opinions is that they are very rarely based on any particular knowledge about education. Or, rather, they are based on very particular pieces of knowledge, without much context to support them. Just to keep things in balance, this lack of context is made up for by an overflow of emotion and passion when it comes to discussing the issue.

Just the one, mind. One issue at a time.

The intuitive thing to do when these issues come up is to try to provide some context. Use that education to do some educating about education, as it were. More often than not, however, the passions are such that any attempt to educate will be met with fierce resistance and fiery disagreement. It discourages further attempts on the subject.

Which leads to interesting situations when things like trigger warnings, safe spaces and campus politics come up. These things could be used as launch pads for discussions on curriculum analysis, pedagogic philosophy or the role of educational institutions in contemporary society. They could be. But they aren't.

The thing is, of course, that these enthusiasts are not willing to learn. That's not the reason they engage in discussions about these issues, nor the reason they want to be seen publicly as engaged in discussions about these issues. Most discussions about education, it turns out, are not actually about education, but about broader issues that just happen to find purchase in popular perceptions about education.

Thus, knowing things about education is pretty much useless in such discussions. It's beside the point. It's like bringing a knife to a gunfight - no matter how fine the point is, it's just not relevant to the situation. And knowing a degree's worth about education is a degree of uselessness.

It does, however, save you from engaging in useless fights with posturing know-nothings. Which is a win in and of itself, no matter the subject.

And you get to brag that your BA thesis was all about how Quintilian's philosophy of education relates to modern day curricula, and the importance of remembering that the role of education is to teach the young ones to actualize themselves as social subjects. Good times all around.

Thursday, June 2, 2016

The information complexity of bee sexuality

Recently I began to see people ambiently talking about bee sexuality. Which, as you might imagine, made me go wtf, until I stumbled upon the context (apparently, worker bees are all female, and Bee Movie got it wrong). Upon finding this out, the wtf factor disappeared, and so did my interest in the matter. But it did get me thinking about information processing.

Information processing happens in iteration cycles. The information differs from case to case, but the general process is the same every time, with up to five stages if the information is complex enough.

The first stage is the wtf stage. You have encountered something, and have no reference points for what it might be. The thing just exists, an intrusion into your ordinary mode of understanding the world around you. There are things that make sense, and there are things that do not. This thing is clearly in the latter category.

The second stage is the huh stage. You've been given or acquired some context to the thing, and started to make sense of it. You still don't understand it, but whenever you encounter it again, you can confidently go "huh, I've seen this before".

The third stage is the exploratory stage. You've begun to understand the thing, and are exploring the possibilities afforded by it. Thoughts that follow the lines of "if, then" are starting to enter your head, and you try it out just to see if the thens then. Just to see if you've actually understood the thing, and to satisfy your emerging curiosity.

The fourth stage is the experimental stage. You've grasped the thing, and now try to relate it to other things previously grasped. Using your accumulated body of knowledge, you try to find where the thing belongs and where it does not, and where it would produce interesting results if introduced. Some of your experiments will succeed, others will fail, some will fail spectacularly.

The fifth stage is the meh stage. You've understood the thing, done the thing, done the permutations of the thing, and know where to apply it to best effect. In short, you're rather bored with it, and can do it in your sleep or mindless working hours if called upon to do so.

Of course, this is not a thing that happens once and then never again. It happens all the time, all around us. Different people are at different stages, and that which engenders a wtf reaction in one person is a meh to another person. Nothing is static - everything is constantly processed.

The things to look out for are the iteration cycles. While these stages are pretty agnostic to the online/offline divide, the online has the advantage of faster iteration cycles. Things can go from wtf to meh faster than you think, and more things can undergo this transition in parallel than you imagine. Which means that, left to its own devices, the online can produce some spectacularly fast mehs, and generate demand for very particular wtfs that seem very far from the offline experience.
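For the programmatically inclined, the five stages can be sketched as a trivial state machine. This is a playful illustration only; the stage names are lifted from the descriptions above, and the rest (the class names, the one-way transition function) is my own framing:

```python
from enum import Enum


class Stage(Enum):
    """The five stages of processing a new piece of information,
    ordered from first contact to total boredom."""
    WTF = 1           # no reference points for the thing at all
    HUH = 2           # some context; recognition without understanding
    EXPLORATORY = 3   # trying out "if, then" possibilities
    EXPERIMENTAL = 4  # relating the thing to prior knowledge
    MEH = 5           # fully absorbed, and therefore boring


def next_stage(stage: Stage) -> Stage:
    """Advance one iteration cycle; MEH is terminal."""
    if stage is Stage.MEH:
        return Stage.MEH
    return Stage(stage.value + 1)


def cycles_to_meh(stage: Stage) -> int:
    """How many iteration cycles remain before boredom sets in."""
    return Stage.MEH.value - stage.value
```

The point of the sketch is the asymmetry it makes visible: the machine only moves forward, and different people (or different media environments) simply run `next_stage` at different speeds, which is why one person's wtf is another person's meh.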

So the next time you stumble upon discussions of bee sexuality, remember this post. Introducing it to the context might produce some interesting results.

Tuesday, April 12, 2016

My endorsement in the 2016 US presidential election

Sometimes people ask me who I'd vote for in the US presidential election. I always answer the same way: whoever wins will become the president.

This tends to confuse the askers rather than enlighten them. Yet, as a foreigner who is utterly unaffected by US domestic politics, it is my position. No matter who wins, they will become president, and the US foreign policy will remain unaffected. The drone killings will continue and the occupations of randomly chosen countries will go on. Guantanamo will remain open.

The institutional setup of the US is such that it doesn't really matter who's at the top. The trends and forces that eventually result in the state apparatus doing what it does are largely autonomous and uncaring. They might affect those in close proximity to a particular situation - such as, say, the person holding office - but the momentum built up by the sheer weight of institutional determinism makes the overall picture utterly predictable. No matter who this person might be, or how their new life situation suits them.

Thus, I'm not too invested in the comings and goings of the electoral shenanigans. There are only so many hours in a day, and so very many things to do in them.

But I must say that the dude that ran for president in 2008 would make a rather neat president. He said some neat things about hope and change. I wonder what happened to him.

Sunday, March 13, 2016

How the Holocaust could happen again

The scariest thing about the Holocaust is not that millions of people were killed in it. Which, to be sure, is a scary thing in and of itself - anything that kills millions is scary - but even scarier is the weapon used to kill all these people. And the brutal industrial efficiency with which this weapon was applied.

It should be pointed out that the Holocaust was not a sudden impulse amongst a few maniacs with supercharged ambitions. Such impulses and ambitions can be found in abundance, in more people than one would hope, but most of these people never get around to acting on them. And those who do kill comparatively few - although every death is a tragedy, even the most ruthlessly efficient mass shooter can only shoot so many before running out of bullets. No matter how megalomaniac a crazed individual gets, it is physically impossible for them to shoot millions of people.

The Nazis began their holocausting using guns. Line up the Jews and shoot them. Bang. One bullet, one dead Jew. Easy mathematics. After a while, though, they found that this brought along unexpected difficulties - bad publicity and poor working conditions. It's hard to keep a populace sedate when parts of it are visibly marched to a nearby field and shot, and the logistics of dead bodies is not a problem to be underestimated, especially when it comes to optics. Worse is the logistics of those who have to execute these policies - they break psychologically, and have to be replaced with new bodies. Seeing mass death is devastating in the extraordinary circumstances of the front lines, and even worse on the home front. If the final solution were to be sustainable, it had to utilize more efficient means.

They found more efficient means. They used them systematically and with comprehensive documentation. They mobilized enormous resources in order to find even more efficient means of utilizing these means, and then set to work using them, with the implacable pace of a looming Monday morning.

They weaponized the ordinary working day, and the workers who tried to live their ordinary workaday lives.

It's incredibly difficult to give a civilian a weapon and ask them to shoot another unarmed person who's incapable of defending themselves. Most people have an inbuilt instinct that screams that such an act is the most wrong they could possibly do, and even soldiers have to undergo hard training to suppress this instinct - even when it comes to shooting at enemy soldiers who are shooting at them. The instinct not to kill is a fundamental part of the human psyche, and those who want to exterminate a large number of humans have to take this into account. A more rational division of labor was needed to get the job done.

With this in mind, the Nazis separated their victims from the societies they were to be eradicated from. First in the form of ghettos, and then in the form of concentration camps. When the victims were removed from the ordinary lives of ordinary people, they could be processed with brutal efficiency without causing either public relations disasters or workplace hazards. The old expression "out of sight, out of mind" becomes somewhat morbid when what's not seen is the routinized killing of millions.

Routine was the main component of the Holocaust. It takes time to read millions of names (if you begin now, you'll be done in a week or so, depending on whether you need to sleep), and it takes even longer to put these names out of existence. It cannot be improvised, and has to be carefully planned. It has to be administered. Trains must be kept rolling, employee registers updated, budgets balanced. It requires a whole range of boring activities that in and of themselves are neither interesting nor deadly.

The Nazis mobilized large parts of the civilian workforce to make the Holocaust happen, one small task at a time. Repairing and maintaining trains might seem a morally neutral job in and of itself, but not when these trains have destination Auschwitz. Sitting in a gray office and administering employee paperwork might seem like the least bloodthirsty thing imaginable, but even SS troops had to be paid. Budgets are among the most boring things in existence, yet the notion of being bored to death becomes terrifying when one reads the planned expenditures on concentration camps.

These are but three examples of tasks that needed to be performed to keep the Holocaust going. Most of these tasks didn't demand more than that they were fulfilled with a modicum of enthusiasm and with modest efficiency. They didn't demand any particular ideological conviction, and as long as they were done, the killing could continue. As long as ordinary people kept going to their ordinary jobs under ordinary working conditions, the Holocaust could add another working day to its balance sheets. Production is proceeding on schedule and within budget.

Under these conditions, the mass killings could continue out of sight, all the while the civilians continued to try to eke out their lives in the shadow of the war. Ordinary people could continue to go to work, catch movies at the cinema and live to the best of their abilities, most of the time without even thinking about Jews (except when mentioned in the propaganda). They had other things to do, and when they after the end of the war said they didn't know what was going on, this was more often than not because they actually did not know. It was not a part of their lived experience, and those things that are out of sight are also out of mind.

The scariest thing about the Holocaust is not that millions died in it. The scariest thing is that a whole society was re-focused to mass-producing death on industrial scales, and that the means of producing these deaths was the accumulated efforts of millions of ordinary citizens' daily labor. Its infrastructure was the boring routine 9-5 workday, writ large.

It becomes even scarier when we remember that "an honest day's work" is still regarded as a positive thing. Going to work, performing one's duty and earning one's keep are positively charged phrases. There are many governments in the world with the explicit aims of putting more people to work. Arbeit macht frei.

When it is said that the Holocaust could happen again, this should not be understood in terms of the continued hatred against Jews and other groups. It should be understood in terms of ordinary, honest people being put to work in the infrastructural and administrative sections of another industrial mass killing, and that this can be done without them noticing. Worse, that the inevitable whistleblowers exposing these killings will be lost among the absurd amounts of other terrible things happening in the world. What's one more hard-to-confirm rumor when there's a deadline to meet?

The only thing necessary for evil to triumph is ordinary people not doing anything in particular. One workday at a time.

Your alienation begins now.

Originally published January 30, 2016