The two most recent discursive anomalies share a theme. That theme is, somewhat unexpectedly, mood. Or, put another way: the way reading a particular text makes you feel, and how that feeling affects your thoughts.
In case you are reading in the future, the two anomalies in question are the ones about Hyde and Booth. Since texts are always retroactively present, you can sneak over to read them without missing a beat. Go on. These words will still be here.
Mood is an underrated concept. Sometimes it is dismissed outright, as part of the overall category of 'feelings'. At other times, it is seen as a distraction from the main point of interest, e.g. 'not being in the mood', 'being in a bad mood'. There is a tendency to see mood as something that happens beside the point - as if reality carried on without you while you were distracted by these irrelevant moods of yours.
Besides being both rude and bordering on gaslighting, these takes have the additional drawback of being wrong.
Booth is perhaps most explicit in his discussion of moods. One of his premises is that the reason you keep reading a particular text - a romance novel, a cartoon, a crime novel - is that you want more of whatever it is you are reading. The point is not to see whether the lovers stick together, what the punchline might be or whodunnit, but to extend the present experience of reading, whatever it might be. The act of reading the text puts you into a certain (albeit at times intangible) mood, and it is this mood that fiction provides. Far from being a side point, mood is for Booth the express purpose of reading. And, by extension, of writing: to create an artifact in the world that conveys the kind of mood the author is interested in conveying, and thus to create an opportunity to explore this mood - both by experiencing it through reading, and through the creative act of criticism.
If you are a podcast listener, you might have experienced a peculiar kind of sensation: that of listening to people talk about something you are utterly uninterested in, yet finding the discussion itself fascinating and worthwhile. This is the mood Booth writes about; the state of mind that the act of partaking of something puts you in, regardless of what the subject matter happens to be.
When Booth says that books are friends, this is what he means. You can pick them off the shelves and read for a while, and be comforted by their company; they raise your mood, as friends are wont to do. His approach to criticism is this: if what you have written can provide good company, then it has merit, and writing should strive to attain such merit. To be good company.
Hyde approaches the same theme from another angle, that of rhetoric and philosophy. Moods are not just something that happens while reading; they are the guiding principle behind our thoughts and actions. If we like the places we inhabit - dwell in, to use his word - we will act towards them in certain ways, presumably with the intention to preserve and decorate these places. If we do not like them, the mood will be different, and our actions will follow suit. Mood is what motivates us: thus understanding mood means understanding ourselves and our place in the world.
The punk aesthetic can be understood in this light. It defines itself against the status quo and seeks to rebel against it. The point is to be something different from what is on offer from the powers that be. That punk is seen as ugly and vulgar by those attuned to the mood of the times is one of its express aesthetic purposes, and only adds to its appeal for those who share the sentiment.
Hyde maintains that seeing mood as a guiding principle places a certain ethical responsibility on us as discursive actors in the world. When we write something, we do not simply convey a certain number of facts in a certain order and with a certain degree of accuracy - we also convey a mood. More so when engaging in public speaking, as our presence defines the mood in the room with regard to the subject matter discussed. What we say and how we say it matters, and it falls upon us to think about our impact on those who listen.
Taken together, these two variations on the theme of mood give us a foundation on which to build further thinking about critical reading and writing. At its most basic, this allows us to ask what mood a particular artifact puts us in, or was written to foster. It also allows us to reflect on our own writing, and to ask whether we convey the appropriate mood alongside what we want to say. At its simplest, thinking about moods this way asks us to pay attention, and to act on what we see.
More indirectly, the notion of mood gives us an opening to understand why certain people like certain works or genres. There is no shortage of writers and podcasters who do little more than repackage things that have already been said elsewhere, but who add the element of mood. Understanding that it is this mood that draws their audience allows us to understand why they do what they do - 'they' being both audience and authors.
A benign example is the way readers enjoy the wit of someone like Jane Austen; the way she depicts social interactions and relations is a very distinct kind of mood indeed. On a less pleasant note, many partake of racist media purely for the sake of the mood therein: hearing someone else talk about the negroes and their decadent ways gives permission to maintain that mood and mode of thinking. Keeping mood in mind allows us to understand - and critique - these things in a more interesting way.
Closer to home, it also opens the door to understanding home decoration. The point is not simply to look good, but also to suggest a certain mood. A sidenote, to be sure, but one that hints at the general applicability of these things.
I suspect that both works discussed above might be slightly obscure to the general reader. Booth published The Company We Keep in 1988, and Hyde's anthology The Ethos of Rhetoric came out in 2004. I also suspect that, had you stumbled upon these books in the wild, you might not have found them particularly interesting - they are both, in a way, intended for specialized audiences. While the point of writing discursive anomalies about a particular thing is usually to encourage readers to pick these things up and read them for themselves, in this case the point is more to convey the general mood of these two books. To introduce you to a concept you might otherwise miss.
But, then again: that is the point of most writing about writing.
Thursday, July 27, 2017
Monday, July 24, 2017
Human-level intelligences and you
There has been much ado over the years about computers becoming as intelligent as humans. Several goals have been set up and surpassed, and with each feat of computer engineering we have learnt that intelligence is a slippery thing, requiring ever more refined metrics to measure accurately. Beating a human at chess was once thought a hard thing to do, but then we built a computer that could do it - and very little else besides. It is a very narrowly defined skill being put to the test, and it turns out intelligence is not the key factor that determines victory or defeat.
Fast forward a bit, and we have computers giving trivia nerds a run for their money in Jeopardy. It turns out intelligence isn't the defining factor here either, on either side. For computers, it's all a matter of being able to crawl through large amounts of available data fast enough to generate a sentence. For humans, it's a matter of having encountered something in the past and being able to recount it in a timely fashion. Similar tasks, indeed, but neither requires intelligence. Either the sorting algorithm is optimized enough to get the processing done on time, or it is not. Either you remember that character from that one soap opera you saw years and years ago, or you do not.
The win condition is clearly defined, but the path to fulfilling it does not require intelligence proper. It can go either way, based on what basically amounts to a coin toss, and however you want to go about defining intelligence, that probably is not it.
The question of computers becoming as intelligent as humans has ever so gradually been replaced with an understanding that computers do not have to be. In the case of chess, a specialized dumb computer gets the work done; the same goes for other tasks, with similar degrees of dumb specialization. Get the dumb computer to do it really, really well, and the job gets done.
If all you need is a hammer, build a good one.
A more interesting (and more unsettling) question is when a human becomes as intelligent as a human. This might seem somewhat tautological: 1 = 1, after all. Humans are human. But humans have this peculiar quality of being made, not born. As creatures of culture, we have to learn the proper ways to go about living, being and doing. And - more to the point - we can fail to learn these things.
Just what "these things" are is a matter of some debate, and has shifted over the years. A quick way to gauge where the standards are at any moment in time would be to look at the national curricula for the educational system of where you happen to be, and analyze what is given importance and what is not. There are always some things given more attention than others, some aspect promoted above others. And, at the core, some things are deemed to be of such importance that all citizens need to know them. Some minimum of knowledge to be had by all. Some minimum level of intelligence.
And there are always a number of citizens who do not qualify. Who are not, for any given definition of intelligence, up to it.
When does a human become as intelligent as a human?
Friday, July 14, 2017
Some words on media permanence
It is a strange thing about media artifacts that some of them age well, while others do not. Some can be forgotten for decades, only to find a new audience willing and able to engage with them. Others cannot be revived as easily, and are thus consigned to reside only in the memories of those who were there at the time.
To be sure, this applies to things that are not media artifacts, too. Things happen, and after they have happened you were either there or you were not, and your memories of the event are shaped accordingly. It is a very important aspect of the human condition.
But the point of media artifacts is that it is possible to return to them at a later date. They are supposed to have some sort of permanence - it is a key feature. Books remain as written, pictures as pictured, movies as directed. It would be a substantial design flaw if these things did not last.
Though, then again, some things do not last. Books fall apart, movies fade, hard drives crash. Entropy is not kind to supposedly eternal things. Look upon these works, ye mighty.
But. All these things aside: some media artifacts age well, and some do not. Some can be readily introduced to new audiences, while others remain indecipherable mysteries even upon close encounter. There is a difference, and it is very distinct from the question of whether or not we're trying to jam a VHS tape into a Betamax player.
This difference can be clearly seen if we contrast Deep Space Nine and Babylon 5. Despite being from roughly the same time, belonging to the same genre and sharing a non-insignificant portion of plot elements, one of these television series is instantly accessible to contemporary audiences while the other is not. Though it pains me to say, it takes a non-trivial effort on the part of those who are not nostalgically attached to Babylon 5 to view it with contemporary eyes. A certain sensibility has been lost, and the gloriously cheesy CGI effects turn into obstacles to further viewing. Surmountable obstacles, but obstacles nonetheless.
The same goes for computer games. I imagine that, should we use the Civilization series as a benchmark, there would be different cutoff points for different audiences. For my part, the first iteration is unplayable, and I suspect many of my younger peers would balk at Civilization 2. I also fear that 3 or even 4 might present too steep a learning curve for those who were not there to remember them. Not because the games are inherently impossible to play, but because the contemporary frameworks for how games are supposed to work (and how intuitive user interfaces are supposed to be) have shifted between then and now.
A certain sensibility has been lost.
It would be a mistake to label this development as either good or bad. The young ones have not destroyed theater with their use of the lyre, despite all accounts to the contrary. These changes are simply something that has happened, and they have to be understood as such. Moreover, they are something to take into account as yet another generation grows up in a society overflowing with media artifacts, old and new.
Some of these artifacts will constitute shared experiences, while others will not. Such is the way of these things.
Saturday, July 8, 2017
Care for future history
These are strange times.
Since you're living in these times, the above statement is probably not a surprise to you. In fact, it might very well be the least surprising statement of our time. Especially if you happen to have a presence on twitter, and even more so if this presence is in the parts where the statement "this is not normal" is commonplace, or where a certain president makes his rounds. The two are related, in that the former refers to the latter: it is a reminder and an incantation to ensure that you do not get used to these strange new times and start to see them as normal.
These times are not normal. These times are strange.
In the future, there will doubtless be summaries and retrospectives of these times. More than likely, they will be written with academic rigor, historical nuance and critical stringency. Even more likely, all the effort put into making them so will be rendered moot by this simple counterquestion:
Surely, it wasn't that strange?
We can see this future approaching. Less strange times will come, and frames of reference will be desensitized to the strangeness of our time. In a future where it is not common for presidents to tweet at the television as if encountering the subject matter for the first time, the claim that there once was such a president will seem extraordinary.
Surely, it wasn't that strange?
It behooves us - we who live in these strange times - to leave behind cultural artifacts that underline and underscore just how strange these times were. Small nuggets of contemporaneity that give credence to the strangeness we ever so gradually come to take for granted. Give the future a clear indication that, yep, there is a before and an after, but not yet, and we knew it.
It is the implicit challenge of our time.
Better get to it.