Thursday, July 27, 2017

In the mood for some discourse

The two most recent discursive anomalies share a theme. That theme is, somewhat unexpectedly, mood. Or, put another way: the way reading a particular text makes you feel, and how that feeling affects your thoughts.

In case you are reading in the future, the two anomalies in question are the ones about Hyde and Booth. Since texts are always retroactively present, you can sneak over to read them without missing a beat. Go on. These words will still be here.

Mood is an underrated concept. Sometimes it is dismissed outright, as part of the overall category of 'feelings'. At other times, it is seen as a distraction from the main point of interest, e.g. 'not being in the mood', 'being in a bad mood'. There is a tendency to see mood as something that happens beside the point, as if reality happens without you while you are distracted by these irrelevant moods of yours.

Besides being both rude and bordering on gaslighting, these takes have the additional drawback of being wrong.

Booth is perhaps most explicit in his discussion of moods. One of his premises is that the reason you keep reading a particular text - a romance novel, a cartoon, a crime novel - is that you want more of whatever it is you are reading. The point is not to see if the lovers stick together, what the punchline might be or whodunnit, but to extend the present experience of reading, whatever it might be. The act of reading the text puts you into a certain (albeit at times intangible) mood, and it is this mood that fiction provides. Far from being a side point, mood is for Booth the express purpose of reading. And, by extension, of writing: to create an artifact in the world that conveys the kind of mood the author is interested in conveying, and thus to create an opportunity to explore this mood - both by experiencing it through reading, and through the creative act of criticism.

If you are a podcast listener, you might have experienced a peculiar kind of sensation: that of listening to people talk about something you are utterly uninterested in, but find the discussion itself fascinating and worthwhile. This is the mood Booth writes about; the state of mind the act of partaking of something puts you in, regardless of what the subject matter happens to be.

When Booth says that books are friends, this is what he means. You can pick them off the shelves and read for a while, and be comforted by their company; they raise your mood, as friends are wont to do. His approach to criticism is this: if what you have written can provide good company, then it has merit, and writing should strive to attain such merit. To be good company.

Hyde approaches the same theme from another angle, that of rhetoric and philosophy. Moods are not just something that happens while reading, but are the guiding principle behind our thoughts and actions. If we like the places we inhabit - dwell, in his word - we will act towards them in certain ways, presumably with the intention to preserve and decorate these places. If we do not like them, the mood will be different, and our actions will follow suit. Mood is what motivates us: thus understanding mood means understanding ourselves and our place in the world.

The punk aesthetic can be understood in this light. It defines itself against the status quo and seeks to rebel against it. The point is to be something different than what is on offer by the powers that be. The fact that it is seen as ugly and vulgar by those who are attuned to the mood of the times is one of punk's express aesthetic purposes, and only adds to the appeal of those who share the sentiment.

Hyde maintains that seeing mood as a guiding principle places a certain ethical responsibility on us as discursive actors in the world. When we write something, we do not simply convey a certain number of facts in a certain order and with a certain degree of accuracy - we also convey a mood. More so when engaging in public speaking, as our presence defines the mood in the room with regard to the subject matter discussed. What we say and how we say it matters, and it falls upon us to think about our impact on those who listen.

Taken together, these two variations on the theme of mood give us a foundation on which to build further thinking about critical reading and writing. At its most basic, it allows us to ask what mood a particular artifact puts us in or is written to foster. It also allows us to reflect on our own writing, and ask ourselves if we convey the appropriate mood alongside what we want to say. At its most simple, thinking about moods this way asks us to pay attention, and to act on what we see.

More indirectly, the notion of mood gives us an opening to understand why certain people like certain works or genres. There is no shortage of writers and podcasters who do little else but repackage things that have already been said elsewhere, but who add the element of mood. Being able to understand that it is this mood that draws their audience allows us to understand why they do what they do - 'they' being both audience and authors.

A benign example is why readers like the rapt wittiness of someone like Jane Austen; the way she depicts social interactions and relations is a very distinct kind of mood indeed. On a less pleasant note, many partake of racist media just for the sake of the mood therein: hearing someone else talk about the negroes and their decadent ways gives permission to maintain that mood and mode of thinking. Keeping mood in mind allows us to understand - and critique - these things in a more interesting way.

Closer to home, it also opens the door to understanding home decoration. The point is not simply to look good, but also to suggest a certain mood. A sidenote, to be sure, but one meant to suggest the general applicability of these things.

I suspect that both works discussed above might be slightly obscure to the general reader. Booth published The Company We Keep in 1988, and Hyde's anthology The Ethos of Rhetoric came out in 2004. I also suspect that, should you have stumbled upon these books in the wild, you might not have found them particularly interesting - they are both, in a way, intended for specialized audiences. While the point of writing discursive anomalies about a particular thing is to encourage readers to pick up these things and read for themselves, in this case the point is more to convey the general mood of these two books. To introduce you to a concept you might otherwise miss.

But, then again: that is the point of most writing about writing.

Monday, July 24, 2017

Human-level intelligences and you

There has been much ado over the years about computers becoming as intelligent as humans. Several goals have been set up and surpassed, and for each feat of computer engineering we have learnt that intelligence is a slippery thing that requires ever more refined metrics to accurately measure. Beating a human in chess was once thought a hard thing to do, but then we built a computer that could do it - and very little besides it. It is a very narrowly defined skill being put to the test, and it turns out intelligence is not the key factor that determines victory or defeat.

Fast forward a bit, and we have computers giving trivia nerds a run for their money in Jeopardy. Turns out intelligence isn't the defining factor here either, on either side. For computers, it's all a matter of being able to crawl through large amounts of available data fast enough to generate a sentence. For humans, it's a matter of having encountered something in the past and being able to recount it in a timely fashion. Similar tasks, indeed, but neither requires intelligence. Either the sorting algorithm is optimized enough to get the processing done on time, or it is not. Either you remember that character from that one soap opera you saw years and years ago, or you do not.
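
To make the mechanical nature of this concrete, here is a toy sketch of the computer's side of the task. It is emphatically not how the actual Jeopardy machine worked - that system was far more elaborate - only an illustration of retrieval without understanding, with a hypothetical answer function and made-up clue/answer pairs:

    # Toy "trivia answerer": return the stored answer whose clue shares the
    # most words with the question. Speed and coverage decide the outcome;
    # no understanding is involved anywhere.
    def answer(question, corpus):
        question_words = set(question.lower().split())
        best_answer, best_overlap = None, 0
        for clue, stored_answer in corpus:
            overlap = len(question_words & set(clue.lower().split()))
            if overlap > best_overlap:
                best_answer, best_overlap = stored_answer, overlap
        return best_answer

    # Made-up data for the sake of the example.
    corpus = [
        ("character from a soap opera aired years and years ago", "that soap opera character"),
        ("capital of a country on the Scandinavian peninsula", "Stockholm"),
    ]
    print(answer("which character from that soap opera was it", corpus))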

The win condition is clearly defined, but the path to fulfilling it does not require intelligence proper. It can go either way, based on what basically amounts to a coin toss, and however you want to go about defining intelligence, that probably is not it.

The question of computers becoming as intelligent as humans has ever so gradually been replaced with an understanding that computers do not have to be. In the case of chess, a specialized dumb computer gets the work done; the same goes for other tasks, with similar degrees of dumb specialization. Get the dumb computer to do it really, really well, and the job gets done.

If all you need is a hammer, build a good one.

A more interesting (and more unsettling) question is when a human becomes as intelligent as a human. This might seem somewhat tautological: 1 = 1, after all. Humans are human. But humans have this peculiar quality of being made, not born. As creatures of culture, we have to learn the proper ways to go about living, being and doing. And - more to the point - we can fail to learn these things.

Just what "these things" are is a matter of some debate, and has shifted over the years. A quick way to gauge where the standards are at any moment in time would be to look at the national curricula for the educational system of where you happen to be, and analyze what is given importance and what is not. There are always some things given more attention than others, some aspect promoted above others. And, at the core, some things are deemed to be of such importance that all citizens need to know them. Some minimum of knowledge to be had by all. Some minimum level of intelligence.

And there are always a number of citizens who do not qualify. Who are not, for any given definition of intelligence, up to it.

When does a human become as intelligent as a human?

Friday, July 14, 2017

Some words on media permanence

It is a strange thing about media artifacts that some of them age well, while others do not. Some can be forgotten for decades, only to find a new audience willing and able to engage with them. Others can not be revived as easily, and are thus consigned to reside only in the memories of those who were there at the time.

To be sure, this applies to things that are not media artifacts, too. Things happen, and after they have happened you were either there or you were not, and your memories of the event are shaped accordingly. It is a very important aspect of the human condition.

But the point of media artifacts is that it is possible to return to them at a later date. They are supposed to have some sort of permanence - it is a key feature. Books remain as written, pictures as pictured, movies as directed. It would be a substantial design flaw if these things did not last.

Though, then again, some things do not last. Books fall apart, movies fade, hard drives crash. Entropy is not kind to supposedly eternal things. Look upon these works, ye mighty.

But. All these things aside: some media artifacts age well, and some do not. Some can be readily introduced to new audiences, while others remain indecipherable mysteries even upon close encounter. There is a difference, and it is very distinct from the question of whether or not we're trying to jam a VHS tape into a Betamax player.

This difference can be clearly seen if we contrast Deep Space Nine and Babylon 5. Despite being from roughly the same time, belonging to the same genre and sharing a non-insignificant portion of plot elements, one of these television series is instantly accessible to contemporary audiences while the other is not. Though it pains me to say, it takes a non-trivial effort on the part of those who are not nostalgically attached to Babylon 5 to view it with contemporary eyes. A certain sensibility has been lost, and the gloriously cheesy CGI effects turn into obstacles to further viewing. Surmountable obstacles, but obstacles nonetheless.

The same goes for computer games. I imagine that, should we use the Civilization series as a benchmark, there would be different cutoff points for different audiences. For my part, the first iteration is unplayable, and I suspect many of my younger peers would balk at Civilization 2. I also have fears that 3 or even 4 might be too much of a learning curve for those who were not there to remember it. Not because the games are inherently impossible to play, but because the contemporary frameworks for how games are supposed to work (and how intuitive user interfaces are supposed to be) have shifted between then and now.

A certain sensibility has been lost.

It would be a mistake to label this development as either good or bad. The young ones have not destroyed theater by their use of the lyre, despite all accounts to the contrary. These changes are simply things that have happened, and they have to be understood as such. Moreover, this is something to take into account as yet another generation grows up in a society overflowing with media artifacts, old and new.

Some of these artifacts will constitute shared experiences, while others will not. Such is the way of these things.

Saturday, July 8, 2017

Care for future history

These are strange times.

Since you're living in these times, the above statement is probably not a surprise to you. In fact, it might very well be the least surprising statement of our time. Especially if you happen to have a presence on Twitter, and even more so if this presence is in the parts where the statement "this is not normal" is commonplace, or where a certain president makes his rounds. The two are related, in that the former refers to the latter: it is a reminder and an incantation to ensure that you do not get used to these strange new times and start to see them as normal.

These times are not normal. These times are strange.

In the future, there will doubtless be summaries and retrospectives of these times. More than likely, these will be written with academic rigor, historical nuance and critical stringency. Even more likely, all the effort put into making them so will be rendered moot by this simple counterquestion:

Surely, it wasn't that strange?

We can see this future approaching. Less strange times will come, and frames of reference will be desensitized to the strangeness of our time. In a future where it is not common for presidents to tweet at the television as if encountering the subject matter for the first time, the claim that there once was such a president will seem extraordinary.

Surely, it wasn't that strange?

It behooves us - we who live in these strange times - to leave behind cultural artifacts that underline and underscore just how strange these times were. Small nuggets of contemporaneity that give credence to the strangeness we ever so gradually come to take for granted. Give the future a clear indication that, yep, there is a before and an after, but not yet, and we knew it.

It is the implicit challenge of our time.

Better get to it.

Sunday, May 21, 2017

Concerning the Dark Souls of US presidencies

It has been said that the current president is the Dark Souls of US presidencies. Which, to be sure, has a certain ring to it, but it lacks the virtue of truth. Let's explore the issue for a spell.

Dark Souls is a series of games built around the notion of gradual player progression. The games might seem hard at first, but if you stick with them you learn how to overcome that difficulty and become good at what the games ask you to do. The difficulty is not mechanical - the challenges do not require superhuman reflexes or superior skills to overcome - but rather psychological. By failing, again and again, the player gradually learns what needs to be learnt. The reward for this application of patience is the opportunity to excel whenever new situations arise that require the very thing just learnt. It is the player leveling up, rather than the player character.

Meanwhile, in the background of all this character development, a world and its long, tragic backstory are ever so subtly unfolding. It is not a simple backstory, where this happened after that, but a series of subtle implications of social relations and emotional states of mind. Complex social processes led to cascading catastrophic outcomes which in turn sparked other social processes which -

It is a deep and complex backstory, and for the sake of brevity, it will all be ignored. Suffice to say that much of it is left unsaid, and that the player will have to piece it together from archeological fragments, old legends and features of geography.

From this description alone, you might see what I'm getting at. Gradual self-improvement through patience, slowly unfolding understanding of past events through contextual knowledge, and the characterization of subtle states of mind - none of these things are applicable to the current president, even with excessive use of shoehorns or cherrypickers.

There probably is a past president that would live up to the title of the Dark Souls of US presidencies. But that is a topic for another cycle.

Friday, May 19, 2017

My computer broke down, can you learn it?

With the recent update to Windows being in the news (in no small part thanks to a computer-eating virus which eats non-updated versions), I've been thinking about how knowledge is situated. Which might seem like a strange connection to make, until you are confronted with this question:

"My computer broke down, can you fix it?"

This is a very common situation to find oneself in, especially if one has acquired a reputation for being able to fix computers. (Even if it only came about from helping someone change the background image that one time.) The knowledge required to navigate this situation is not, however, primarily related to computers. Mostly, it comes down to knowing the asker, their general level of computer literacy and the problems they've asked you to fix in the past. It is a very particular skill set, and over time you develop it through use and abuse.

The aforementioned recent update seems to have crash-landed a fair number of systems, if anecdotal evidence is anything to go by. This made me think about whether I could fix my system if it went down as well, and after poking around for a bit (and making an extra backup of all the things for good measure), I figured that I probably could, given time.

If someone were to ask me to fix the very same problem on their system, I probably couldn't. Not because of my admittedly limited skill in these matters, but because of the different situations in which the problem is situated. If it's just me and my broken computer, then I can take my time, tinker with it, fiddle with the knobs and overall do things that are not directly goal-oriented but which nevertheless get to the point eventually. It'd be a learning experience, albeit a terrifying one.

If it's not just me, then a whole host of other constraints and situationally specific conditions apply. For one thing, the asker might not have the patience for me to learn on the job; they might want the situation dealt with and gone, and me taking my time is the opposite of that. There's also the added element of risk - tinkering is never 100% safe, and accidentally making the problem worse is equally the opposite of the solution. Being risk-averse is good, but it is also slow (yes, even slower), which overall is not conducive to getting things done in a brisk manner.

The point here is not that computers are fragile (though they are), but that knowing something is rarely a yes/no proposition. Oftentimes, we know something sufficiently well that if we were to try it out on our own we'd probably turn out all right, more or less. More often than not, the things we know stem from some first attempt that went in an orthogonal direction from well, but which nevertheless sparked the learning process that led us to where we are. We tinker, we fiddle, and eventually we figure things out.

Though, to be sure, having someone around who you can ask about these things as you go along learning speeds things up immensely.

Do be kind to their patient hearts.

Monday, April 3, 2017

Automated anti-content

So I was thinking about bots in microblogs today, and it occurred to me that they have the potential to be pure anti-content. A realization which, when stated in these terms, raises two questions. The first is "microblog, really?", and the second is "what is this anti-content you speak of?".

To answer the first question: yup, really. It's faster than describing a subset of social media defined by short messages visible for a short period of time, mainly in the form of scrolling down the screen in real time. Gotta go fast.

The second question is more interesting. "Content" is a word that describes some kind of stuff, in general. It doesn't really matter what it is - as long as it is something and can fit into a defined medium for a defined period of time, it is content. A person screaming into a mic for twenty minutes is content. It is as generic as it gets.

Anti-content, then. It is not generic, but it is also not original. An example would be the UTC time bot, which tweets the correct (albeit non-UTC) time once an hour. Another example is the TootBot, which toots every fifteen minutes. It is not content, but it is definitely something. You are not going to enthusiastically wake your friends in the middle of the night to tell them about the latest UTC update (though you might wake them about the toot bot), but you are going to notice them when they make their predictable rounds yet again.
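
I have not seen the source code of either bot, but the general shape of such a thing is probably not far from the sketch below: post on a fixed schedule, forever. The post function here is a stand-in for whatever platform library a real bot would call to send a tweet or a toot.

    import time
    from datetime import datetime, timezone

    def post(message):
        # Stand-in for a real client library call (tweet, toot, etc.);
        # for the sketch, we just print the message.
        print(message)

    def run_clock_bot(interval_seconds=3600):
        # Announce the current time, then sleep until the next round.
        while True:
            now = datetime.now(timezone.utc)
            post("The time is currently {:%H:%M} UTC.".format(now))
            time.sleep(interval_seconds)

    if __name__ == "__main__":
        run_clock_bot()

That is more or less the whole of it - a loop, a clock and a message - which is rather the point.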

Anti-content is not content. But it is familiar.

The thing about humans is that they like what is familiar. It provides a fixed point of reference, a stable framework to build upon, and - not to be underestimated - something to talk about. Stuff happens, stuff changes, but you can rely on these things to remain the same. And because they remain as they are, they can be used to anchor things which have yet to become and/or need a boost to get going.

Or, to rephrase: you can refer to them without accidentally starting a fight. After all, they have nothing to say worth screaming about. They are anti-content. And they are a part of the community, albeit a very small part. They say very little, but every time you see them, they remind you of the familiar.

And now that you have read this, you will never look upon these automated little bots the same way again. Enjoy!