There is a new Star Wars on the loose. With it comes the division of the entire human race into two categories, as radical as it is universal: those who have seen it, and those who have not seen it. The gulf between these categories is immense and absolute; there are no in-betweens.
Except, of course, in the case of spoilers.
Given that I at one point was a media studies major, spoilers are utterly irrelevant to me. There is no "right" way to consume media - there are only degrees and ways of paying attention. Knowing how something ends prior to seeing it does nothing to change the experience. Everything important lies in how the medium is being used (and sometimes abused). The narrative aspects are a part of that, but there are many other parts of equal importance, and a movie is at all times the interplay between all of its parts.
To be sure, there are movies that rely heavily on surprise endings. Good ones are described in terms of subverting expectations; bad ones in terms of deus ex machinas. If a movie does not hold water once the surprise has been foretold, it is not spoiled - it was always-already a bad movie. We do not rewatch favorite movies because they surprise us, but because they are good company. If a movie is bad company, it will be thus even if someone already told you the butler did it. The goodness and badness lies not in your having knowledge - it lies at the level of production, geometry and acting.
Very few viewers found themselves disliking the recent remake of the Orient Express because they knew how it ended. The enjoyment and/or dissatisfaction lies elsewhere.
Looking around on social media tells me that this is not a view widely held. There are people posting spoilers, people yelling at the aforementioned group for posting spoilers, and people decrying the posting of spoilers in general. It is something of a trending topic, especially in relation to the new Star Wars movie. Posting spoilers is framed as akin to murdering the movie, a sin above and beyond the pale. Friendships have ended over it.
It is interesting to note this difference in perspectives on media. On the one hand, there is the view that spoilers are irrelevant. On the other, the view that spoilers are everything. Both are valid experiences of being human. The fact that both views can coexist and seldom interact with each other tells us something about this world we live in.
I do not know exactly what it tells us. But it would be nice if someone posted a spoiler of it. -
Monday, December 18, 2017
Monday, December 11, 2017
Connecting the recent developments of Patreon and bitcoins
These last few days have been intense, online-wise. Patreon did what they did, and the price of bitcoins soared way above the limits of reason and sanity.
It is tempting to see these two things as connected. It is even more tempting to connect them. Because it is a very easy thing to do.
The thing about what Patreon did is that it underscores the need for what bitcoin supporters claim bitcoins do. Patreon gave - gives - everyone the opportunity to donate money at people without too much fuss, and it provided a social vehicle for accepting these donations. At the heart of Patreon's raison d'etre we find the sending and receiving of money.
In short, online transfers of money are kind of a big deal. For Patreon and bitcoin both.
If we go back to the olden days of bitcoin evangelism, we find that the emphasis was much more on the crypto than on the currency part of cryptocurrency. It would be possible to perform transactions in secret, without the prying eyes of government surveilling every transaction. You didn't have to justify why you used your digital moneys the way you did - you could do what you want with them. Including donating them to others for no particular reason whatsoever. No borders, no taxes, no limits, no donation fees.
There is no reason bitcoins could not have evolved to fill a role similar to the one Patreon fills: donations freely given to those who are deemed worthy of them. There could have been an active crowdfunding culture within the bitcoin community.
But there isn't. And there can't be.
The reasons are manifold, but they all revolve around the fact that bitcoins fundamentally do not work as money. The recent dramatic rise in the value of bitcoins, with values eclipsing $16 000, only serves to underscore this fact: if you bought something with bitcoins a week or two ago, you would have lost out on this increase. The deflationary nature of bitcoins means that any use of them that is not getting more bitcoins is an irrational use. Buying things with bitcoins is always a losing proposition; the only winning move is to sit on them until their value inevitably rises.
The most extreme example is the bitcoin pizza, bought for 10 000 bitcoins; the estimated value of that pizza is now $137,408,583.
Moreover, the wildly fluctuating value of bitcoins makes it hard to price things. You might try to sell a pair of socks for what currently seems a reasonable price, only to discover mere hours later that it now amounts to thousands of dollars. Sellers cannot set prices, buyers cannot gauge whether the prices that are set actually make sense, and the usual market mechanisms determining prices are in effect nullified. Prices carry no information, and in conjunction with the deflationary process mentioned above, this makes using bitcoins as currency a wager at best and a guaranteed loss at worst.
Adding to all this is the cost of conducting bitcoin transactions. Turns out there is in fact a transaction cost to bitcoins, aptly named a fee. Sending money without paying the fee will either take a long time, or simply fail. The minimum fee is 0.00001 bitcoins, or $1.52; more if you want the transaction to complete with any degree of certainty and/or quickness. It is not unheard of for fees to reach the twenty dollar mark.
Needless to say, buying a ten dollar pizza for thirty dollars is the opposite of a good deal.
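To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch (in Python, purely for illustration). The $16 000 exchange rate, the ten dollar pizza and the twenty dollar fee echo the figures above; the week-later exchange rate is a hypothetical assumption, there only to show the opportunity-cost logic.

```python
# Back-of-the-envelope arithmetic for buying a pizza with a volatile,
# fee-bearing currency. The week-later price is a hypothetical assumption;
# the other figures echo the post above. None of this is market data.

BTC_PRICE_USD = 16_000        # exchange rate at the time of purchase (from the post)
PIZZA_PRICE_USD = 10          # sticker price of the pizza
TX_FEE_USD = 20               # transaction fee (the "twenty dollar mark")
PRICE_NEXT_WEEK_USD = 20_000  # hypothetical exchange rate a week later

# What you actually pay at the register, in dollars.
total_cost_usd = PIZZA_PRICE_USD + TX_FEE_USD

# How many bitcoins leave your wallet today.
btc_spent = total_cost_usd / BTC_PRICE_USD

# What those same coins would have been worth had you simply sat on them.
foregone_usd = btc_spent * PRICE_NEXT_WEEK_USD

print(f"Paid today:          ${total_cost_usd:.2f}")
print(f"BTC spent:           {btc_spent:.6f}")
print(f"Worth a week later:  ${foregone_usd:.2f}")
# With these assumptions: the $10 pizza costs $30 at the register, and the
# coins spent on it would have been worth $37.50 a week later.
```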
The list goes on. The bottom line is that bitcoins do not work as money, and by extension that they cannot work as a replacement for Patreon.
At this point, I suspect that there might be a non-zero amount of readers going: why does any of this even matter? What is the connection between bitcoins and Patreon?
The trivial answer is that there is no connection. The more interesting answer is that by juxtaposing these two things with each other, we find out something useful. On the side of Patreon, we have an actually existing real use case in the world, which might very soon be in need of a replacement; on the side of bitcoins, we find an utter fucking failure to be even theoretically relevant to this use case.
This has implications for the "currency" part of cryptocurrency. Given that I am not part of the bitcoin community, I leave it to those readers who are to grapple with these implications as best they can.
Friday, December 8, 2017
The economic utility of dry feet
Patreon has made an announcement about some upcoming changes to their fee structure, and this has caused quite a stir. To understate it slightly, these changes are somewhat unpopular and inexplicable to creators and patrons alike. I expect there to be continued discussions about these changes in the days to come.
In a strange chain of associations, this made me think about the economics-related term 'helicopter drops'. In short, a helicopter drop consists of giving everyone a one-shot amount of money, in order to stimulate the economy. The main component of an economy is people spending money on things, and they cannot spend money they do not have. Thus, ensuring that everyone has slightly more money than before would subsequently ensure that they spent more, which would have ripple-effects all through society, as the increased economic activity would spur even more economic activity.
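The ripple-effect reasoning here is the textbook spending multiplier. Here is a minimal sketch of it (in Python, purely for illustration); the marginal propensity to consume is an assumed parameter, not an empirical estimate, and the model ignores everything a real economist would have to worry about.

```python
# Toy illustration of the ripple effect: each dollar dropped is partly spent,
# that spending becomes someone else's income, part of which is spent again,
# and so on. The marginal propensity to consume (MPC) is an assumption.

def spending_triggered_by_drop(drop: float, mpc: float, rounds: int = 50) -> float:
    """Sum the successive rounds of spending set off by a one-shot drop."""
    total = 0.0
    spending = drop * mpc          # first round: recipients spend a share of the drop
    for _ in range(rounds):
        total += spending
        spending *= mpc            # each round, a share of the previous round is re-spent
    return total

drop = 5_000   # the modest drop mentioned below
mpc = 0.8      # assumed: 80 cents of every extra dollar gets spent

print(round(spending_triggered_by_drop(drop, mpc), 2))
# The series converges towards drop * mpc / (1 - mpc): with these assumptions,
# roughly $20 000 of additional activity from a $5 000 drop.
```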
Further along the chain of associations, this made me think about the many things we postpone doing due to a lack of funds. Not because they aren't necessary, but because we cannot comfortably afford to do them quite yet. By pushing these things into the future, we ensure that the money we do have can be spent on the things that are acutely necessary, rather than long-term necessary. A slight discomfort in the present is the price to pay for being ready to face the future, when the time comes.
My shoes are an example of this. There are holes in them, and my feet get wet every now and again. They are broken, but it is not critical, and I can squeeze another month of use out of them if I mind my step.
There are any number of similar examples, most of which we have stopped thinking about due to having gotten used to them. I suspect that a modest helicopter drop, in the range of some $5000, would be funneled directly into the equivalent of new shoes. It would not make anyone rich or change the fundamental structure of society, but it would ensure that people had fewer holes in their shoes. The benefits would be impossible to measure with econometrics, but they would at the same time be immeasurably tangible to those involved.
I do not know if this thought is useful to you, but here it is.
Thursday, December 7, 2017
Learning for uneducated people
The academic discipline of Education is caught in a weird place. On the one hand, the powers that be want it to be a handmaiden to the educational system, providing it with ever more refined and efficient tools. On the other hand, it is seen by other academics as a handmaiden to the educational system, and thus understood as a specialized local field of knowledge, akin to accounting; it is something that takes a certain degree of skill and knowledge to perform, but it does not translate into academic credibility.
This might seem a subtle difference, and in some ways it is. It mostly depends on who you're arguing with at a particular moment. Which, as you might imagine, changes everything.
When arguing with the powers that be, the issues that come up tend to focus on budgets, more specifically the cutting of them if particular results are not delivered on time. Be it in relation to the international measurements that are conducted regularly - such as PISA - or some political debate touching upon education that rages at the time, there are always demands to give backing in some form. Questions such as "how can we teach our kids better so we will win the next round of measuring?" or "what do you have that can support our current political position on educational policy that we made up yesterday?" are frequently thrown our way, and not responding appropriately is budgetary bad news bears.
When arguing with other academics, two challenges emerge. One is to remind them that we exist, and another is - as mentioned - to convince them we're not just mere technicians and managers of the bureaucratic beast that is the educational system. Most attempts at either are usually met with annoyance, indifference, or some interesting combination of both which defies classification.
This peculiar state of things means that it is particularly difficult to assert the academic autonomy of the discipline. Part of being autonomous is that others recognize you as such. The powers that be have no interest in that, given that they only ever ask for input in relation to nudging the educational system (or the discourse about it) in this direction or that. Other academics have no propensity to acknowledge it either, seeing as they do not interest themselves in the educational system, which effectively shuts down any interest in the discipline before it arises. It is, as the saying goes, a tough crowd.
Thing is. Education is not, in fact, about education. It is about learning.
This difference is anything but subtle. Lowercase e education as an activity is something that takes place in a defined span of time at a defined location. It's something that happens in school. It's a process you go through, and then you are done. Sometimes you know more afterwards, sometimes you do not. It depends.
Learning, though. Learning can happen anywhere at any time, and in fact does happen everywhere at all times. It is the main way human beings interact with the world: some sort of sensation happens at them, and is subsequently processed into memory. Next time this same sensation is encountered, the previous experience is used as a reference point for how to proceed. Learning occurs everywhere.
An example is someone starting a new job. On the surface level, one might assume that what they learn is how to perform that job - the logistics of getting it done and the terminology that goes along with it. But that is not all that is being learnt. The learning process also involves noting who the coworkers are, how they relate to each other and their work, which things are proper and which are not, which values are (implicitly and explicitly) endorsed, and so on in a long list of impressions and sensations. A new person in a workplace does not simply learn how to do the job, but also an entire way of being in the world.
Understanding how this learning process works allows you to better understand what happens when things go wrong. Or when things go right. If someone doesn't get with the program, then you can analyze the situation and pinpoint where in the process the mismatch happened. Conversely, if someone learns the ropes faster than expected, then you can identify the thing that went right and try to replicate it with future new employees.
The focus here is not on individual capacities. A "smart" person can fail to fit in, and a "dumb" person can learn the ropes at record speed, depending on the social circumstances of the workplace in question. Learning happens when sensations occur, and sometimes this sensation can consist of a social environment (such as a workplace) communicating that you belong here - or do not belong here. Getting the message is very much dependent on which message is being sent, and many people decide that a particular career is not for them after learning that they are not welcome within it.
These are the kinds of things we study in capital e Education. Yet this is hard to convey, since so many have gotten the message that Education is merely a handmaiden to the educational system. -
Wednesday, November 29, 2017
The best book you ever read
No book is ever as good as that one you read as a teenager. You probably remember it - that one which you picked up and just couldn't stop reading, which then formed the basis of your emotional core for years to come. You read it once, and then probably several times afterwards, each time reinforcing its imprint upon your very being.
How would one go about finding another such book?
One approach might be to look at that first important book, to see if it has any particular qualities that distinguish it from other books. It is easier to find things when you know what to look for, after all.
Thing is. Upon returning to the book of one's youth, there is a non-zero risk that one might discover it to be less impressive than it is in memory. The years between then and now have included many things - books, experiences, life events, deaths - which put things in perspective and change one's outlook on things. There is a risk that, upon returning, the book turns out to be the most bland, generic, run-of-the-mill piece of prose there ever was.
This does not diminish its value or the validity of your experiences. It does, however, draw attention to the importance of context. When a book is read is as important as what is in it: in the hands of a young person in search of meaning, any book can become an ontological and emotional foundation.
If you happen to have kids of your own, the thought of leading them towards a similar book might have occurred to you. This, again, raises the question of how to find such a book, and how to introduce it.
Simply telling them to read something might do the trick. Sometimes, life happens in straightforward ways.
More often than not, though, it will be something unexpected. They will pick up a book, read it, and - wham - that's the one. There is no telling which one it is, but that's the one it is now, until they become old enough to remember that book they read as a teenager.
The key, then, is to give them ample opportunity to stumble upon a good book. Keep your home well-stocked with good books, and allow access to them at all times. Play the odds. Make it more likely that the book they stumble upon is something by, say, Gloria Anzaldúa rather than by - I shudder to think - Ayn Rand.
Life is full of surprises, strange turn of events and curious edge cases. Sometimes, it is no accident that we stumble upon them. -
Thursday, November 16, 2017
Evolutionary psychology for the masses
There are a non-zero amount of people who proclaim themselves to be adherents of evolutionary psychology. More often than not, those who are most vocal about this tend to follow up with the least interesting statements possible. Preferably about how some arbitrary gender attribute found today goes way back to primal times; for instance that women wear high heels because something something biology.
This seems to me something of a wasted opportunity. There is a great buildup - the human organism evolved over millions of years to a very specific set of environmental and social circumstances, and this has implications for how it works today - and all that backstory is wasted on making an observation about the present condition that doesn't even hold water if you have more than a passing knowledge of history and/or fashion. You do not need to invoke millions of years of gradual adaptation to be wrong - there are more direct and efficient routes to achieve that end.
A more interesting take is that the aforementioned gradual adaptation adjusted humans to a certain set of conditions, and that the modern circumstance ain't it. The disconnect between what is and what our evolutionary gestalt expects to be, is bound to create a not-insignificant amount of discomfort in actually existing human beings, and addressing this discomfort ought to be a non-trivial part of evolutionary psychology. If nothing else, it would be a more useful take than attempting to reinforce increasingly outmoded gender stereotypes.
But then again.
What could we expect from barely evolved monkeys?
Saturday, November 11, 2017
Small logistics
There are a large number of small things that are easy to learn, yet which at the same time are utterly impossible to figure out. If someone shows them to you, they look like the easiest thing in the world, but if you have to speedlearn them on your own, difficulties ensue.
A dramatic example of this is a young man finding himself in the situation of having to unclasp a bra. It is a very small thing indeed, and the logistics involved can be performed without much thought, and yet. Difficulties ensue. Possibly also a non-zero amount of fumbling.
Similar (possibly, but not always, less dramatic) instances of small logistics occur just about everywhere, most of them having become so routine it takes an act of effort to notice them. Computer interfaces, what to say when ordering fast food, the art of performing an academic citation - these are all instances of small logistics where the knowing of how to get it done has merged into the back of one's mind. Once upon a time you had to learn these things, before they became obvious.
It pays off to pay attention to these things. Not only do you become aware of what you are (quite literally) doing, but you also gain the opportunity to think about other ways of doing these very things. And, if you notice someone not quite knowing how to move things along, the insight into just what they need to learn for future reference.
It's the little things, as the saying goes.
Friday, November 10, 2017
Count me in
It's been a hectic couple of weeks at the university, and there has been little time for writing. Or, rather, there has been too much writing, and a body can only use a keyboard for so many hours a day.
Which is another way of saying that if you wonder where the posts are, they went into methodology papers. Science stuff, you know.
One of the recurring themes in my particular course is that the distinction between qualitative and quantitative science really does not make sense any more. There are different paradigms, to be sure, but the dividing line is not between qual and quant, and they can more often than not be combined to create new insights about various things. It is somewhat counterproductive to think of these things as completely separate entities which only rarely interact, when they do in fact interact more often than not. It is also counterproductive to get into arguments about whether one is better than the other, when the simple truth is that sometimes there is a need for the one and sometimes the other.
Which, to be sure, is a very sociology thing to say. But it rings true.
Here is something to mess up the categories. Imagine a thousand deep interviews, conducted at length, with follow-ups as needed. Imagine then that the results of these interviews are (through some procedure of quantification) condensed into a series of graphs. Would that be a qualitative or quantitative study?
If your thought process is "I wish we had those kinds of resources", you are ahead of the game.
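For the sake of illustration, here is a toy sketch of what such a procedure of quantification might look like (in Python). The themes and the interview data are invented; an actual project would rest on a far more elaborate coding framework, but the division of labor is the same: qualitative judgement in, numbers out.

```python
# Toy sketch: qualitative interviews are coded by hand into themes, and the
# coded results are then counted so they can be turned into graphs.
# The themes and data below are invented purely for illustration.

from collections import Counter

# Each interview, after close reading, is tagged with the themes the
# interviewee touched upon - a purely qualitative judgement call.
coded_interviews = [
    {"id": 1, "themes": ["job insecurity", "family"]},
    {"id": 2, "themes": ["family", "housing"]},
    {"id": 3, "themes": ["job insecurity", "housing", "family"]},
    # ...imagine 997 more of these
]

# The quantitative step: count how often each theme occurs across the corpus.
theme_counts = Counter(
    theme for interview in coded_interviews for theme in interview["themes"]
)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
# The resulting graphs are quantitative, but every number in them rests on a
# qualitative act of interpretation.
```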
Here is another category-disturbing thought. When designing surveys, a traditionally quantitative endeavor, the aim is usually to get some numbers out of it. But in order to ensure that the numbers actually mean anything, a lot of thought has to go into the questions. The respondents only have the words on the questionnaire to work with, and thus those words have to be crafted very carefully to avoid confounding factors. This is a task that requires a non-trivial amount of careful attention, empathy and understanding. In order to get something quantitative out of the ordeal, a qualitative approach has to be baked into the process.
Then there is the whole thing with getting people to actually answer the darn things. Turns out just handing them out willy-nilly is less effective than one might think.
A third category-bender is, surprisingly enough, what has happened in physics. As the units of analysis have become smaller, we run into non-trivial limitations of the hardware used to measure things. On the one hand, this is countered by building ever larger instruments (atom smashers take up a surprisingly large space). On the other hand, this is also countered by admitting that subatomic processes simply do not make sense to human beings, and the admission that we will have to think long and hard about this in order to even know what we are knowing.
As the common refrain among physicists goes: it does not make sense, you just get used to it.
These three examples might be interpreted as arguments for the supremacy of the qualitative method. But that would be to try to answer the wrong question. Determining whether one is better than the other is slightly beside the point if you will end up using both of them anyway. What is more interesting is what it means that this distinction is insufficient in describing the actual work of actual scientists, and what other line of thinking we might replace it with.
To be sure, we have interesting times ahead of us.
Saturday, October 14, 2017
The application of memories
Sometimes, you stumble upon a song you haven't heard in a while, and go "oh yeah, I remember this, this exists". It sparks a memory of times past, and of the emotional equilibrium (or lack thereof) that went along with them. It might be a strong memory, or a passing one. Either way, the memory chord is struck.
Most of the time, nothing much comes of it. You just remember the memory, and then move on. It is the way of things. The world is big and contains many memories.
Sometimes, you stumble upon a song from an artist you only ever heard the one song from. Out of curiosity, you decide to check if there were any other songs made back in the day, and if they are anything like what you've heard so far. After some listening, you discover that there are and that they aren't. In fact, the rest of the artist's production is nothing like that one song; it is an unexplored field of newness that awaits personal discovery.
At times, this is how new favorite artists are found.
To be sure, this process has been made simpler through systems of file sharing - whether they be Spotify or discography torrents. Any time you remember something, the option is always there to round up everything this person has ever done and peruse it. All that is needed is a memory, and a name.
It is one of those things that is easy to take for granted. But it is useful, nonetheless.
Saturday, October 7, 2017
Lies to live by
There are things you have read which have profoundly changed your mind and the way you think. More often than not, these things you have read are wrong.
This is not meant as an accusatory or derogatory statement. It is just the nature of texts - they are wrong about things, and flawed in the ways in which they are right. It goes with being an imperfect medium.
Still. You did read these things, and they did change your mind. They must have done something right.
The thing about texts is that they do not have to be perfect. Or even right or wrong. They have to mobilize what you know into new thoughts, new directions and - possibly - new ways of living.
All fiction is wrong. All fiction is a lie.
But that's okay.
Thursday, October 5, 2017
Let's talk about that new Star Trek thingy
There is a new Star Trek on the loose.
I have not seen it. But I have seen people talk about it, and on numerous occasions these fine folks have said - independently of each other - that while there are more important things to talk about than Star Trek, they are now going to talk about it.
When things happen many times independently of each other, the ol' pattern recognition sets in. Something seems to be going on, and it seems to be going on whilst everyone is thinking about something else. This something going on needs to be interrogated, if only to find out just what it is. It might be important.
(To be sure, it is possible to note this as an example of writing in the presence of enemies. But that's another thought.)
The notion that Star Trek is not important is a strange one. As a cultural institution, it has built the foundation for many imaginations, both public and private. It is no exaggeration to say it is a part of a shared cultural heritage - the themes and mythologies spawned from it have had an impact far greater than mere intuition would suggest. It has been a fixed cultural point of reference for generations (in canon and in real time), inspiring countless young minds to do what they do and go where they went. In terms of sheer cultural impact, Star Trek is a big one.
Thus, new iterations of Star Trek are important by virtue of their connection to old iterations. In present terms, it is important through the sheer fact that millions of people are watching it and discussing it - it becomes a part of the overall zeitgeist. In the longer term, it becomes important as a reference point (for critics and fans alike): in the old Star Trek they did x, but in the new one they did y, and this is significant of cultural change z.
This means we cannot attribute these assertions that there are more important things to talk about than Star Trek, to Star Trek. There is something else going on here.
To be sure, there are a non-zero amount of other important things to talk about. Climate change, the rapid transformations of modernity and - not least - the totality of the political situation in the US loom large as important other things. The sheer amount of clusterfucks (actual or potential) that exist in the world are sufficient to make mere lived experience seem trivial and unimportant, and thus discussions thereof follow suit.
Thing is. All we have is lived experience, and denying ourselves the opportunity to talk about it would be detrimental. Even if it happens to be what we thought about the new Star Trek series.
It is a sign of hope that people do talk about Star Trek after having made the disclaimer about there being more important things to talk about. It means there is still a humanity left to explore those final frontiers. -
Friday, September 22, 2017
As it stands, we are in a hurry to stand still
Here is a process, probably familiar to you:
Some person of note makes a remark. This remark is problematic, and since there are many people of the opinion that problematic things are not to be left unexpounded, there is a flurry of activity to expound the problematic nature of this remark. Given that any statement is an invitation to further statements, further statements occur, some of them insightful, some of them problematic. And since a problematic statement cannot stand either unopposed or unexpounded, things compound.
You have seen this happen. Most likely online, but probably offline too.
In these situations, new topics of discussion are introduced, with varying degrees of relation to the problematic remark. Suddenly, everyone is abuzz about something, and even if you did not think you would ever have an opinion about it, you all of a sudden do. It is easy to be caught up in the moment, and the moment has a tendency to extend itself for longer than one would initially suspect.
Expounding takes time, after all. If it could be done in a hurry, it wouldn't need doing; it'd be a done thing.
Thing is. Discourse produced under these circumstances tends to consist of local responses to local statements, rather than global considerations. This goes with the conversational nature of the situation - everyone involved is talking to everyone involved, making things very involved. Attempts to sort things out afterwards have to go ever backward, in order to ascertain what any particular statement responded to, and what prompted that earlier statement, and so on. Statements do not stand by themselves; quoted out of context, they will read very differently than in context. (Let's avoid the temptation to ponder the meaning of being quoted out of context in context.)
The short of it is that writings produced under these circumstances have a limited shelf-life, and the long-term return on emotions invested will probably not make up for any temporary intensity. If the goal is to leave a lasting impression, this is not the way.
Consider these words from the Invisible Committee:
Power is now immanent in life as it is technologically organized and commodified. It has the neutral appearance of facilities or of Google’s blank page. Whoever determines the organization of space, whoever governs the social environments and atmospheres, whoever administers things, whoever manages the accesses—governs men. Contemporary power has made itself the heir, on the one hand, of the old science of policing, which consists in looking after “the well-being and security of the citizens,” and, on the other, of the logistic science of militaries, the “art of moving armies,” having become an art of maintaining communication networks and ensuring strategic mobility. Absorbed in our language-bound conception of the public thing, of politics, we have continued debating while the real decisions were being implemented right before our eyes. Contemporary laws are written in steel structures and not with words. All the citizens’ indignation can only end up butting its dazed forehead against the reinforced concrete of this world.
This, too, is a process that is probably familiar to you. Even more so now, as you cannot unsee it once becoming aware of it. It shall stand in the way, as it were.
Monday, September 18, 2017
Reverse identity politics
Strangeness is yet again afoot. This particular strangeness I suspect many of you readers have taken note of on occasion, seeing as it is the topic of the day and has been for many days.
It concerns the liberal subject.
Since there is considerable confusion with regard to the exact meanings of the words "liberal" and "subject", it would only be prudent to define their combination. Lest we be lost in the vagaries of the English language.
The liberal subject is what is visible to the bureaucracies of the liberal state. In an immediate sense, this takes the form of tax records, hospital journals, criminal records or other documents that in some way depict a person. In a less immediate sense, it is a person as it is defined in the code of laws that governs the land: their rights, their obligations and - most importantly - who they as citizens are supposed to be.
This is something other than who they are, in any sense of self or identity. It is a purely discursive construct, and exists only as an aggregate of small fragments that combine to make up a whole. One regulation here, one regulation there; small things. But these fragments add up, and the construct that emerges has real-life implications.
An example of this is the set of rights that a person has to actively claim in order to actually benefit from them. There are any number of these, from the municipal to the national. In theory, all you as a person have to do is fill out the proper paperwork and possibly do some talking to some bureaucrats, and then it's all yours. You as a liberal subject have it within your power to mobilize the apparatus of state on your behalf in this regard, should you but choose to do so.
You as an actual person more than likely have no idea that these rights even exist, and there is a non-zero chance that no one in your neighborhood does either. You as an actual person have a limited knowledge of the finer points of the legal print that surrounds you, and frankly I suspect that you have better ideas about how to spend your life than reading every rule and regulation there is on some just-in-case basis.
But.
The difference between you as a liberal subject and you as an actual person does not exist as far as the liberal state goes. Who you are outside the rules and regulations literally does not matter - it does not exist, it is not visible, and it is not a proper justification for action of any kind. The actual you that walks around, breathes and has impressions of the world - does not exist. In the eyes of the law, you are a citizen. No more, no less.
This means that any failure on your part to act in the prescribed manner is your fault. Even if you had no idea you were supposed to do it, or were utterly oblivious to the fact that doing it existed as a possibility in the world.
Or, phrased slightly differently: if you did not claim your right, you actively chose not to claim it. Ignorance is no excuse.
I imagine that actual you might have objections to this state of things. Good. It means that you perceive the strangeness that is afoot.
Welcome to modernity, citizen.
Monday, September 11, 2017
What Mastodon needs (and then some)
Mastodon participation requires non-trivial levels of literacy.
I need you to look at this statement. It is not a condemnation, it is not an accusation; it is merely a statement of fact. An important statement.
If we look closely at the statement, we see that it includes five components, which can be parsed thusly:
Mastodon
participation
requires
non-trivial levels [of]
literacy
Depending on which part we choose to emphasize, the statement will take on different implications. I suspect that the most immediate reading is to emphasize the "non-trivial levels", and hurry to the conclusion that we need to take action to lower these barriers to participation. While this is by no means a wrong conclusion - removing barriers to participation is seldom wrong - it is not the only conclusion.
Let's look at the statement as a whole. What does it mean that Mastodon requires non-trivial levels of literacy?
It means that you have to be able to read, and be able to read well, in order to get things done. Not only do you need to be able to look at words and know what they mean - you have to be able to look at who is saying them, when and why, and from all this contextual information piece together what is going on. Above all this, you have to navigate the situation - both as it stands at any particular moment, and in a more general overarching sense - in order to figure out how to appropriately respond to what's going on.
Not to put too fine a point on it: this is a non-trivial amount of literacy.
Depending on where we place our emphasis, we end up with different questions and different calls to action. What does it mean for Mastodon to require something? What even is Mastodon, and who gets to define it? What does participation mean, and how do we organize it? Does literacy include the capacity to code?
This post is not meant to answer any questions, or even to pose them in anything resembling a comprehensive fashion. Rather, it serves as something to anchor your thoughts on as the Mastodon project toots forward. And as a reminder that:
Mastodon participation requires non-trivial levels of literacy.
Saturday, September 9, 2017
No, actually, it is both sides
Sometimes, I get into brutally one-sided fights with individuals who want to argue that capitalism is better than communism. I say one-sided, since they are interested in arguing and I'm not, and I shift the brunt of the emotional labor involved in keeping an argument going squarely upon them. This is a patently unfair move, to be sure, but if you seek me out specifically to reiterate a high school debate, it's an unfairness brought upon yourself; unlike both communism and capitalism, it is within your individual power to avoid this particular structural unfairness.
A more interesting approach to the capitalism/communism divide is to see them both as possible manifestations of modernity, with shared roots, shared symptoms and shared absurdities. Modernity could go either of these ways (and possibly others as well), and we are now armed with a century of empirical data to study and learn from. Declaring either alternative to be 'better' and ending one's analysis there is a failure to engage with the data; it's ideology.
Sometimes, my non-participation in these fights is interpreted as an ideological proclamation. Since I refuse to partake in these small moments of grandstanding against communism, I must be on team communism. And thus, they unleash the killer question, the question to end all questions:
Do you want to live like they did in the Soviet Union?
Funny you should ask.
There is a non-trivial number of stories emerging from the US right now about how the current megastorms (Harvey, Irma) are impacting ordinary everyday citizens. Some are about price gouging, which is to say the process predicted by neoclassical economics wherein it becomes more expensive to survive the worse things get. Those stories are not surprising; economists have long referred to this as the cost of doing business. More surprising, however, are the stories of individuals fleeing the oncoming megastorms - and subsequently getting fired for not showing up to work.
If your frame of reference is that capitalism is better than communism, then you will be ill equipped to discuss this state of things. It makes no sense on the face of it to penalize workers for evacuating in the face of a storm encompassing whole states; the words force majeure spring to mind. No reasonable person would expect ordinary people to have to stay and die in the face of overwhelming natural forces for the sake of a contractual agreement. Those kinds of suicidal heroics for symbolic causes are the stuff of war legends, not of everyday workaday business as usual.
To be sure, die-hard ideological capitalists would probably not be surprised to hear of these things if they were told it happened in the Soviet Union. But it is happening now, in the United States, the self-avowed bastion of capitalist free enterprise. Why do we see the same disregard for individual liberty in both instances?
If you view communism and capitalism as two possible variations of the same overarching historical tendency, then these stories become less confusing. Seen in the light of increased bureaucratization and the insistence that formal rules trump informal actualities (e.g. megastorms), it makes sense. We may not agree with the practice of expecting employees to stand and die for companies that spend more money lobbying against increased minimum wages than it would cost to simply pay those wages, but we have a framework for understanding that these demands do not spring from nothing. There are historical trends and forces at work, and you do not have to have read Kafka to understand them.
But it helps.
Thursday, August 31, 2017
Discursive notches
There is a strange process afoot, which I suspect is easier to describe than to explain. In its most basic form, it goes something like this:
Someone has an online presence, most commonly in the form of a content creator. They describe themselves as rational, skeptical and free-thinking, often with an undertone of anti-authoritarianism. They position themselves in opposition to conservatives on a number of issues, for instance when it comes to the role of religion in politics. Their god-terms (to wit) are science, rationality and skepticism, with the corresponding devil-terms of religion and tradition.
Fast-forward a couple of years, and things have radically changed. While there might be lingering traces of the skeptical roots, the overall tone and messaging have shifted. If the tone was polemical before, it has now intensified and become increasingly specific. The prior focus on denouncing anti-scientific sentiments has been replaced with denouncing leftist SJW feminists, wherever these may be found. Similarly, the notion of free-thinking has been replaced with what can only be called a liturgy: there are a number of stock phrases used almost verbatim by members of the community.
The transition from the one type of person to the other seems contingent to me. Out of all the possible developmental paths things could have taken, this one underwent the formality of actually happening. Things could have been different, but they are not.
The question posed by this state of things is: why? What led these self-avowed critical thinkers to join the relentless chant against the so-called SJWs?
A less obvious question is why those who, today, display interest in the skeptical line of thinking tend to follow the same trajectory as those who did years ago. What compels them to undertake the same journey, even though the present-day discourse bears little resemblance to the source material? What discursive notches are at play?
There is a strange process afoot, which I suspect is easier to describe than to explain. A first step in explaining it is to notice it.
Saturday, August 19, 2017
Intersectional lines of flight
In the most recent anomaly, I use the concept of international supply chains to illustrate the possibilities of intersectional analyses. It is both a joke and an illustration: a joke in that it is not a concept you would expect to see in a text on intersectionality, and an illustration in that there is no real reason why it could not be included in an intersectional analysis. One would have to make a case for including it, but that goes for every other methodological aspect as well, so it is not unique in that regard.
There are always more potential analyses than actualized ones. This is due to the fact that it is easier to come up with ideas than to go through the months-long, painstaking process of gathering and processing the data. There really is nothing stopping anyone from saying "hey, we should analyze x in the light of y" - the only effort involved is having the idea in the first place. And ideas are plentiful.
If you've read your Feyerabend, you can have ungodly amounts of fun generating ideas for potential analyses about the most counterintuitive objects from the most unexpected of angles. Indeed, if you've read your Giddens, you have seen it in action; that famous introduction sure is effective in showing how coffee is not just a beverage but also a social institution, a major economic commodity, a marker of social status, and a whole host of other things condensed (and percolated) into one singular thing. There are no real limits to how many approaches you can use - in theory and in mind.
In practice, there are limits about. Some limits are related to energy - you only have so much of it. Some limits are related to genres and conventions - you are expected to follow the written and unwritten rules for how to go about things. Some limits are related to empirical applicability - some approaches simply will not work.
The first kind of limit is absolute. The second one is negotiable.
Among those who for whatever reason oppose the notion of intersectionality, it is common to make reference to the third kind of limit. "Atoms do not have genders", they might say, implying that an intersectional analysis of physics is impossible. More specifically, they imply that the objective (and thus scientific) ontic universe cannot be understood using the methods and concepts of the social sciences, and that true scientists should be left alone to pursue their important work unperturbed.
They are usually perturbed when presented with an intersectional analysis of how 'objectivity' is a gendered concept with roots in imperialist colonial practices, and thus cannot be used uncritically to convey what they want to convey. The fact that this is a successful application of intersectional analysis is shoved aside by the assertion that no, it isn't.
Thus, we find ourselves back at the second kind of limit. Genres and conventions.
If you read enough about intersectionality, you will eventually come across appeals to include animals in the overall roster of categories. In its mildest forms, this pans out as arguments to strengthen animal protection laws; if it is unethical to let humans suffer, then surely it is unethical to let other forms of life suffer, too. In more radical forms, we find militant veganism (though, to be sure, it is likely militant vegans found their way to where they are by other routes than methodological considerations). Somewhere between these positions, there is a point where it becomes unstrategic to include animals in your analysis.
It is not difficult to come up with intersectional analyses which include animals. For instance: there is a class (or, perhaps more fittingly, caste) system in place with regard to animals. Some animals (dogs, cats) are pets, and kept around the house. Some animals are slaves to be exploited to the fullest extent of their biology (mutated, deformed fowl who live their lives in dark factories). Some animals are poached for their alleged medicinal properties (tigers, elephants). Some animals are national symbols (bald eagles). I probably do not need to flesh out the differences to successfully convey that there is something to be learnt by performing an analysis along these lines. Or that international supply chains might be involved somehow.
But.
It is unstrategic to perform such analyses. They do not get funded, for one. They also do not tend to be read with a sense of delighted gratitude; more often than not they are dismissed as prattling sentimental nonsense, along with their authors. There are limits to what a serious participant in contemporary discourse can say, and it is solid strategy to be aware of these limits.
Indeed, these very limits are themselves rewarding to perform an intersectional analysis of. I would go so far as to say it is a good idea. -
Friday, August 18, 2017
Who and what to know
A while back, I was attending a social gathering where people came, discussed for a while, and left. There was no fixed topic of discussion, nor any purpose other than the sheer getting together and talking. It was a fluid situation.
At one particular moment, those present got to talking about family relations and relatives. There were old folks present (persons in their sixties and upwards) who talked about their relatives and relations in terms of individuals. The reference points went along these lines: he was the one who was married to her, and they had that fancy car, remember? or: remember the old man who lived on that hill back in the days - he had a nephew, who married this other person who ran that store, and so on.
For those listening in on the conversation without knowing (and thus not remembering) these particular facts or persons, this line of describing who's who will remain a work in progress. More information is required about the nature of marriages, cars, hills and other aspects of local historical memory to make sense of it all. It is a situated knowledge about a specific cast of characters, and the only way to really become someone in the know would be to stick around long enough to become situated.
After a while of establishing who's who, someone asked one of the young persons present if they knew the children of those discussed. As it turned out, they did, in a way. They knew of these persons, but had never really interacted in any significant fashion. The most succinct summation of the situation put it thusly: oh yeah, him. He was in B, so I never talked to him.
This is a distinctly different way of relating to social relations. The B in this case refers to an administrative subdivision of school populations - 6A, 6B or 6C. These are all sixth grade, but for purposes of keeping group sizes manageable, divided into three groups. Referring to B as a known fact implies knowing these administrative subdivisions and their social implications, which is a radically different way of organizing who's who than the individual-to-individual approach outlined above.
The old folks present did not know the specific implications of the letter B. But, being old and wise, they picked up the gist that this letter somehow meant that the individuals in question did not know each other, and continued the discussion armed with this new nugget of contextual information.
The difference between young and old in this case is not subtle. In fact, it seems taken right out of some introductory textbook on sociology, the kind that describes the gradual expansion of bureaucracy into more and more aspects of our lives. The old ones thought in terms of individuals; the young ones in terms of administrative subdivisions. It was, in a single moment, a crystallization of modernity.
It was a strange moment, and I have pondered it ever since.
Thursday, July 27, 2017
In the mood for some discourse
The two most recent discursive anomalies share a theme. That theme is, somewhat unexpectedly, mood. Or, put another way: the way reading a particular text makes you feel, and how that feeling affects your thoughts.
In case you are reading in the future, the two anomalies in question are the ones about Hyde and Booth. Since texts are always retroactively present, you can sneak over to read them without missing a beat. Go on. These words will still be here.
Mood is an underrated concept. Sometimes it is dismissed outright, as part of the overall category of 'feelings'. At other times, it is seen as a distraction from the main point of interest, e.g. 'not being in the mood', 'being in a bad mood'. There is a tendency to see mood as something that happens beside the point, and that reality happens without you while you are distracted by these irrelevant moods of yours.
Besides being both rude and bordering on gaslighting, these takes have the additional drawback of being wrong.
Booth is perhaps most explicit in his discussion of moods. One of his premises is that the reason you keep reading a particular text - a romance novel, a cartoon, a crime novel - is that you want more of whatever it is you are reading. The point is not to see if the lovers stick together, what the punchline might be or whodunnit, but to extend the present experience of reading, whatever it might be. The act of reading the text puts you into a certain (albeit at times intangible) mood, and it is this mood that fiction provides. Far from being a side point, mood is for Booth the express purpose of reading. And, by extension, writing; to create an artifact in the world that conveys the kind of mood the author is interested in conveying, and thus creating an opportunity to explore this mood - both by experiencing it through reading, and by the creative act of criticism.
If you are a podcast listener, you might have experienced a peculiar kind of sensation: that of listening to people talk about something you are utterly uninterested in, but find the discussion itself fascinating and worthwhile. This is the mood Booth writes about; the state of mind the act of partaking of something puts you in, regardless of what the subject matter happens to be.
When Booth says that books are friends, this is what he means. You can pick them off the shelves and read for a while, and be comforted by their company; they raise your mood, as friends are wont to do. His approach to criticism is this: if what you have written can provide good company, then it has merit, and writing should strive to attain such merit. To be good company.
Hyde approaches the same theme from another angle, that of rhetoric and philosophy. Moods are not just something that happens while reading, but are the guiding principle behind our thoughts and actions. If we like the places we inhabit - dwell, in his word - we will act towards them in certain ways, presumably with the intention to preserve and decorate these places. If we do not like them, the mood will be different, and our actions will follow suit. Mood is what motivates us: thus understanding mood means understanding ourselves and our place in the world.
The punk aesthetic can be understood in this light. It defines itself against the status quo and seeks to rebel against it. The point is to be something different than what is on offer by the powers that be. The fact that it is seen as ugly and vulgar by those who are attuned to the mood of the times is one of punk's express aesthetic purposes, and only adds to the appeal of those who share the sentiment.
Hyde maintains that seeing mood as guiding principle places a certain ethical responsibility on us as discursive actors in the world. When we write something, we do not simply convey a certain number of facts in a certain order and with a certain degree of accuracy - we also convey a mood. More so when engaging in public speaking, as our presence defines the mood in the room with regard to the subject matter discussed. What we say and how we say it matters, and it falls upon us to think about our impact on those who listen.
Taken together, these two variations on the theme of mood give us a foundation on which to build further thinking about critical reading and writing. At its most basic, it allows us to ask what mood a particular artifact puts us in or is written to foster. It also allows us to reflect on our own writing, and ask ourselves if we convey the appropriate mood alongside what we want to say. At its most simple, thinking about moods this way asks us to pay attention, and to act on what we see.
More indirectly, the notion of mood gives us an opening to understand why certain people like certain works or genres. There is no shortage of writers and podcasters who do little else but repackage things that have already been said elsewhere, but who add the element of mood. Being able to understand that it is this mood that draws their audience allows us to understand why they do what they do - 'they' being both audience and authors.
A benign example is why readers like the rapt wittiness of someone like Jane Austen; the way she depicts social interactions and relations is a very distinct kind of mood indeed. On a less pleasant note, many partake of racist media just for the sake of the mood therein: hearing someone else talk about the negroes and their decadent ways gives permission to maintain that mood and mode of thinking. Keeping mood in mind allows us to understand - and critique - these things in a more interesting way.
Closer to home, it also opens the door to understanding home decoration. The point is not just simply to look good, but also to suggest a certain mood. A sidenote, to be sure, but I want to imply the general applicability of these things.
I suspect that both works discussed above might be slightly obscure to the general reader. Booth published The Company We Keep in 1988, and Hyde's anthology The Ethos of Rhetoric came out in 2004. I also suspect that, should you have stumbled upon these books in the wild, you might not have found them particularly interesting - they are both, in a way, intended for specialized audiences. While the point of writing discursive anomalies about a particular thing is to encourage readers to pick up these things and read for themselves, in this case the point is more to convey the general mood of these two books. To introduce you to a concept you might otherwise miss.
But, then again: that is the point of most writing about writing. -
Monday, July 24, 2017
Human-level intelligences and you
There has been much ado over the years about computers becoming as intelligent as humans. Several goals have been set up and surpassed, and for each feat of computer engineering we have learnt that intelligence is a slippery thing that requires ever more refined metrics to accurately measure. Beating a human in chess was once thought a hard thing to do, but then we built a computer that could do it - and very little besides it. It is a very narrowly defined skill being put to the test, and it turns out intelligence is not the key factor that determines victory or defeat.
Fast forward a bit, and we have computers giving trivia nerds a run for their money in Jeopardy. Turns out intelligence isn't the defining factor here either, on both sides. For computers, it's all a matter of being able to crawl through large amounts of available data fast enough to generate a sentence. For humans, it's a matter of having encountered something in the past and being able to recount it in a timely fashion. Similar tasks, indeed, but neither requires intelligence. Either the sorting algorithm is optimized enough to get the processing done on time, or it is not. Either you remember that character from that one soap opera you saw years and years ago, or you do not.
The win condition is clearly defined, but the path to fulfilling it does not require intelligence proper. It can go either way, based on what basically amounts to a coin toss, and however you want to go about defining intelligence, that probably is not it.
The question of computers becoming as intelligent as humans has ever so gradually been replaced with an understanding that computers do not have to be. In the case of chess, a specialized dumb computer gets the work done; the same goes for other tasks, with similar degrees of dumb specialization. Get the dumb computer to do it really, really well, and the job gets done.
If all you need is a hammer, build a good one.
A more interesting (and more unsettling) question is when a human becomes as intelligent as a human. This might seem somewhat tautological: 1 = 1, after all. Humans are human. But humans have this peculiar quality of being made, not born. As creatures of culture, we have to learn the proper ways to go about living, being and doing. And - more to the point - we can fail to learn these things.
Just what "these things" are is a matter of some debate, and has shifted over the years. A quick way to gauge where the standards are at any moment in time would be to look at the national curricula for the educational system of where you happen to be, and analyze what is given importance and what is not. There are always some things given more attention than others, some aspect promoted above others. And, at the core, some things are deemed to be of such importance that all citizens need to know them. Some minimum of knowledge to be had by all. Some minimum level of intelligence.
And there are always a number of citizens who do not qualify. Who are not, for any given definition of intelligence, up to it.
When does a human become as intelligent as a human?
Friday, July 14, 2017
Some words on media permanence
It is a strange thing about media artifacts that some of them age well, while others do not. Some can be forgotten for decades, only to find a new audience willing and able to engage with them. Others can not be revived as easily, and are thus consigned to reside only in the memories of those who were there at the time.
To be sure, this applies to things that are not media artifacts, too. Things happen, and after they have happened you were either there or you were not, and your memories of the event are shaped accordingly. It is a very important aspect of the human condition.
But the point of media artifacts is that it is possible to return to them at a later date. They are supposed to have some sort of permanence - it is a key feature. Books remain as written, pictures as pictured, movies as directed. It would be a substantial design flaw if these things did not last.
Though, then again, some things do not last. Books fall apart, movies fade, hard drives crash. Entropy is not kind to supposedly eternal things. Look upon these works, ye mighty.
But. All these things aside: some media artifacts age well, and some do not. Some can be readily introduced to new audiences, while others remain indecipherable mysteries even upon close encounter. There is a difference, and it is very distinct from the question of whether or not we're trying to jam a VHS tape into a Betamax player.
This difference can be clearly seen if we contrast Deep Space Nine and Babylon 5. Despite being from roughly the same time, belonging to the same genre and sharing a non-insignificant portion of plot elements, one of these television series is instantly accessible to contemporary audiences while the other is not. Though it pains me to say, it takes a non-trivial effort on the part of those who are not nostalgically attached to Babylon 5 to view it with contemporary eyes. A certain sensibility has been lost, and the gloriously cheesy CGI effects turn into obstacles to further viewing. Surmountable obstacles, but obstacles nonetheless.
The same goes for computer games. I imagine that, should we use the Civilization series as a benchmark, there would be different cutoff points for different audiences. For my part, the first iteration is unplayable, and I suspect many of my younger peers would balk at Civilization 2. I also have fears that 3 or even 4 might be too much of a learning curve for those who were not there to remember it. Not because the games are inherently impossible to play, but because the contemporary frameworks for how games are supposed to work (and how intuitive user interfaces are supposed to be) have shifted between then and now.
A certain sensibility has been lost.
It would be a mistake to label this development as either good or bad. The young ones have not destroyed theater by their use of the lyre, despite all accounts to the contrary. These changes are simply something that have happened, and have to be understood as such. Moreover, it is something to take into account as yet another generation grows up in a society overflowing with media artifacts, old and new.
Some of these artifacts will constitute shared experiences, while others will not. Such is the way of these things.
Saturday, July 8, 2017
Care for future history
These are strange times.
Since you're living in these times, the above statement is probably not a surprise to you. In fact, it might very well be the least surprising statement of our time. Especially if you happen to have a presence on twitter, and even more so if this presence is in the parts where the statement "this is not normal" is commonplace, or where a certain president makes his rounds. The two are related, in that the former refers to the latter: it is a reminder and an incantation to ensure that you do not get used to these strange new times and start to see them as normal.
These times are not normal. These times are strange.
In the future, there will doubtless be summaries and retrospectives of these times. More than likely, these will be written with academic rigor, historical nuance and critical stringency. Even more likely, all the effort put into making these retrospectives rigorous will be made moot by this simple counterquestion:
Surely, it wasn't that strange?
We can see this future approaching. Less strange times will come, and frames of reference will be desensitized to the strangeness of our time. In a future where it is not common for presidents to tweet at the television as if encountering the subject matter for the first time, the claim that there once was such a president will seem extraordinary.
Surely, it wasn't that strange?
It behooves us - we who live in these strange times - to leave behind cultural artifacts that underline and underscore just how strange these times were. Small nuggets of contemporaneity that give credence to the strangeness we ever so gradually come to take for granted. Give the future clear direction that, yep, there is a before and an after, but not yet, and we knew it.
It is the implicit challenge of our time.
Better get to it.
Sunday, May 21, 2017
Concerning the Dark Souls of US presidencies
It has been said that the current president is the Dark Souls of US presidencies. Which, to be sure, has a certain ring to it, but it lacks the virtue of truth. Let's explore the issue for a spell.
Dark Souls is a series of games built around the notion of gradual player progression. The games might seem hard at first, but if you stick with them, you learn how to overcome that difficulty and become good at what the games ask you to do. The difficulty is not mechanical - the challenges do not require superhuman reflexes or superior skills to overcome - but rather psychological. By failing, again and again, the player gradually learns what needs to be learnt. The reward for this application of patience is the opportunity to excel whenever new situations arise that require the very thing just learnt. It is the player leveling up, rather than the player character.
Meanwhile, in the background of all this character development, a world and its long tragic backstory are ever so subtly unfolding. It is not a simple backstory, where this happened after that, but a series of subtle implications of social relations and emotional states of mind. Complex social processes led to cascading catastrophic outcomes which in turn sparked other social processes which -
It is a deep and complex backstory, and for the sake of brevity, it will all be ignored. Suffice to say that much of it is left unsaid, and that the player will have to piece it together from archeological fragments, old legends and features of geography.
From this description alone, you might see what I'm getting at. Gradual self-improvement through patience, slowly unfolding understanding of past events through contextual knowledge, and the characterization of subtle states of mind - none of these things apply to the current president, even with excessive use of shoehorns or cherrypickers.
There probably is a past president that would live up to the title of the Dark Souls of US presidencies. But that is a topic for another cycle.
Friday, May 19, 2017
My computer broke down, can you learn it?
With the recent update to Windows being in the news (in no small part thanks to a computer-eating virus which eats non-updated versions), I've been thinking about how knowledge is situated. Which might seem like a strange connection to make, until you are confronted with this question:
"My computer broke down, can you fix it?"
This is a very common situation to find oneself in, especially if one has acquired a reputation for being able to fix computers. (Even if it only came about from helping someone change the background image that one time.) The knowledge required to navigate this situation is not, however, primarily related to computers. Mostly, it comes down to knowing the asker, their general level of computer literacy and the problems they've asked you to fix in the past. It is a very particular skill set, and over time you develop it through use and abuse.
The aforementioned recent update seems to have crash-landed a fair number of systems, if anecdotal evidence is anything to go by. This made me think about whether I could fix my system if it went down as well, and after poking around for a bit (and making an extra backup of all the things for good measure), I figured that I probably could, given time.
If someone were to ask me to fix the very same problem on their system, I probably couldn't. Not because of my admittedly limited skill in these matters, but because of the different situations in which the problem is situated. If it's just me and my broken computer, then I can take my time, tinker with it, fiddle with the knobs and overall do things that are not directly goal-oriented but which nevertheless get to the point eventually. It'd be a learning experience, albeit a terrifying one.
If it's not just me, then a whole host of other constraints and situationally specific conditions apply. For one thing, the asker might not have patience with me learning on the job; they might want the situation dealt with and gone, and me taking my time is the opposite of that. There's also the added element of risk - tinkering is never 100% safe, and accidentally making the problem worse is equally the opposite of the solution. Being risk-averse is good, but it is also slow (yes, even slower), which overall is not conducive to getting things done in a brisk manner.
The point here is not that computers are fragile (though they are), but that knowing something is rarely a yes/no proposition. Oftentimes, we know something sufficiently well that if we were to try it out on our own we'd probably turn out all right, more or less. More often than not, the things we know stem from some first attempt that went in an orthogonal direction from well, but which nevertheless sparked the learning process that led us to where we are. We tinker, we fiddle, and eventually we figure things out.
Though, to be sure, having someone around who you can ask about these things as you go along speeds up the learning immensely.
Do be kind to their patient hearts.
"My computer broke down, can you fix it?"
This is a very common situation to find oneself in, especially if one has acquired a reputation for being able to fix computers. (Even if it only came about from helping someone change the background image that one time.) The knowledge required to navigate this situation is not, however, primarily related to computers. Mostly, it comes down to knowing the asker, their general level of computer literacy and the problems they've asked you to fix in the past. It is a very particular skill set, and over time you develop it through use and abuse.
The aforementioned recent update seems to have crashlanded a fair number of systems, if anecdotal evidence is anything to go by. This made me think about whether I could fix my system if it went down as well, and after poking around for a bit (and making an extra backup of all the things for good measure), I figured that I probably could, given time.
If someone were to ask me to fix the very same problem on their system, I probably couldn't. Not because of my admittedly limited skill in these matters, but because of the different situations in which the problem is situated. If it's just me and my broken computer, then I can take my time, tinker with it, fiddle with the knobs and overall do things that are not directly goal-oriented but which nevertheless gets to the point eventually. It'd be a learning experience, albeit a terrifying one.
If it's not just me, then a whole host of other constraints and situationally specific conditions apply. For one thing, the asker might not have the patience with me learning on the job; they might want the situation dealt with and gone, and me taking my time is the opposite of that. There's also the added element of risk - tinkering is never 100% safe, and accidentally making the problem worse is equally the opposite of the solution. Being risk-averse is good, but it is also slow (yes, even slower), which overall is not conducive to getting things done in a brisk manner.
The point here is not that computers are fragile (though they are), but that knowing something rarely is a yes/no proposition. Mostentimes, we know something sufficiently well that if we were to try it out on our own we'd probably turn out all right, more or less. More often than not, the things we know stem from some first attempt that went in an orthogonal direction from well, but which nevertheless sparked the learning process that led us to where we are. We tinker, we fiddle, and eventually we figure things out.
Though, to be sure, having someone around who you can ask about these things as you go along learning speeds things up immensely.
Do be kind to their patient hearts.
Monday, April 3, 2017
Automated anti-content
So I was thinking about bots in microblogs today, and it occurred to me that they have the potential to be pure anti-content. A realization which, when stated in these terms, raises two questions. The first is "microblog, really?", and the second is "what is this anti-content you speak of?".
To answer the first question: yup, really. It's faster than describing a subset of social media defined by short messages visible for a short period of time, mainly in the form of scrolling down the screen in real time. Gotta go fast.
The second question is more interesting. "Content" is a word that describes some kind of stuff, in general. It doesn't really matter what it is - as long as it is something and can fit into a defined medium for a defined period of time, it is content. A person screaming into a mic for twenty minutes is content. It is as generic as it gets.
Anti-content, then. It is not generic, but it is also not original. An example would be the UTC time bot, which tweets the correct (albeit non-UTC) time once an hour. Another example is the TootBot, which toots every fifteen minutes. It is not content, but it is definitely something. You are not going to enthusiastically wake your friends in the middle of the night to tell them about the latest UTC update (though you might wake them about the toot bot), but you are going to notice them when they make their predictable rounds yet again.
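For the curious, the whole genre boils down to a handful of lines. Here is a minimal sketch of such a bot - the post_status function is a hypothetical stand-in for whatever posting call the platform actually provides, not any real API:

```python
# A minimal sketch of an "anti-content" bot. post_status() is a hypothetical
# placeholder for the platform's real posting call (tweet, toot, and so on);
# swap it out for the genuine article if you want it to go anywhere.

import time
from datetime import datetime, timezone


def post_status(text: str) -> None:
    # Placeholder: print instead of actually posting anywhere.
    print(f"[posted] {text}")


def run_time_bot(interval_seconds: int = 3600) -> None:
    # Announce the current time, once an hour, forever.
    while True:
        now = datetime.now(timezone.utc)
        post_status(f"The time is {now:%H:%M} UTC.")
        time.sleep(interval_seconds)


if __name__ == "__main__":
    run_time_bot()
```

No cleverness required; the predictability is the whole point.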
Anti-content is not content. But it is familiar.
The thing about humans is that they like what is familiar. It provides a fixed point of reference, a stable framework to build upon, and - not to be underestimated - something to talk about. Stuff happens, stuff changes, but you can rely on these things to remain the same. And because they remain as they are, they can be used to anchor things which have yet to become and/or need a boost to get going.
Or, to rephrase: you can refer to them without accidentally starting a fight. After all, they have nothing to say worth screaming about. They are anti-content. And they are a part of the community, albeit a very small part. They say very little, but every time you see them, they remind you of the familiar.
And now that you have read this, you will never look upon these automated little bots the same way again. Enjoy!
Tuesday, March 28, 2017
Free speech vs rational debate
An interesting thing about the most vocal defenders of free speech at all costs is that they often conflate free speech and rational debate. Which is a strange thing to do - if you argue something with loudness and extreme forwardness, the least that could be expected from you is that you know what you are on about. Yet, somehow, free speech maximalists often show a brutal lack of understanding of the difference between rational debate and free speech.
To illustrate the difference, I shall describe a case where it is not rational to engage in public debate, and where the debate itself has detrimental effects on the society within which it takes place. The debate in question is whether it is the right course of action to exterminate a specific group of people.
For those who belong to this specific group, it is not rational to participate in such debates. The most immediate reason is that you might lose. No matter how unlikely, the mere possibility of losing is reason enough to stay clear of such debates. To the proponents of the extermination policy, your participation in the debate is an additional justification for their point of view. "They can't even defend themselves!" they'd claim, and then move from word to action. Perhaps not immediately, but eventually the final day would come.
The tragic part is that you would lose even if you won. If you won, it would most likely be because you gave reasons for why your extermination is a bad idea. These reasons might be good in and of themselves, but there would be a finite number of them, and with enough journalistic efficiency these reasons could be summarized into a list. From the very moment the debate ended, this list would constitute the reasons society abstains from exterminating you.
The existence of such a list would constitute an opening for those who favor your extermination. One by one, the proponents could work to undermine these reasons, until they are no longer seen as sufficient reasons for abstaining. The debate would reopen, and you would find yourself in a weaker position than last time around. You would yet again have to defend your right to exist, and you would have to do it using an ever shrinking range of possible arguments in your favor.
Needless to say, this process would continue until there are no reasons left. And then the proponents of your extermination would have won.
This is detrimental not only to the group targeted for extermination, but also to society as a whole. For each round of these debates, the society would slip one step closer to enacting genocidal policies. Which, to any decent and moral person, is not a desirable outcome.
The rational thing to do in order to avoid such an outcome is to simply not have these debates. Exorcise them from public discourse, and keep them out of the realm of possible topics. Do not entertain the thoughts, shun those who persist in proposing them, ban them from polite conversation. Keep the opinion marginalized. No good outcome can come from having these debates, and thus the rational thing to do is to simply not have them.
Free speech maximalists want to have these debates anyway, in the name of free speech. But they conflate free speech with rational debate, and as you have seen, there is a very concrete case where these two things are mutually exclusive. If they are to be honest with themselves, they will eventually have to make a choice between one or the other.
If you began reading this post with the opinion that we should have these debates anyway, and still hold that opinion, then I want you to be fully aware of what you are proposing. I fully trust that you will, in your own time and on your own terms, make the rational choice.
Monday, March 6, 2017
What cyborg Harry Potter can teach us about teaching
After revisiting the recounting of my master's thesis, I realized that it is rather German. That is to say, it goes on at length to establish some general principle, but then doesn't bother to give examples of how this principle is realized. Which is a style of writing well suited for some purposes, but, let's face it, is also rather annoying. So let's contextualize this general principle for a spell, by relating fan fiction to the subject of history.
The general principle is that people learn by doing things that they are interested in doing. This happens automatically, without the addition of directed conscious effort. When someone does something, the doing of that thing places them in situations and frames of mind which facilitate the process of learning, and the more doing that takes place, the more learning subsequently follows. Being interested brings with it the propensity to do more of it, and to pay attention whilst doing it. It is, in most cases, a self-perpetuating process.
This is rather straightforward, and the biggest drawback with this line of thinking is that it takes too many words to convey, given how straightforward it is. You begin reading, work through the verbiage, and then conclude at the end that it would have been sufficient to just say "you learn by doing". Which is true, but it also goes to show how much effort you have to put in to convey something straightforward. In retrospect, it is obvious, but you have to go through the process before it becomes retrospectively obvious.
Thus, we have what we need to get to work: the general principle of learning by doing, and the notion of retroactive obviousness. Let's move on to fan fiction and the subject of history. Specifically, let's move on to how the notion of 'canon' relates to the teaching of history.
Canon, in the context of fan fiction, denotes a particular set of works which can be considered official or true (as far as fictional depictions are true). In the case of, say, Harry Potter, the books written by Rowling are canonical, and the specific words found within these books carry significance in that they are the source material from which all knowledge of the fictional universe is garnered. Any further discussion about the Harry Potter universe will have to take these books as written, and conform to the limits imposed by Rowling having written them in a specific way instead of another.
Or, to put it another way: it is canonical that Harry Potter is a wizard that attended Hogwarts, a school for magically proficient youngsters. It is, however, not canon that Harry at a young age underwent a series of radical medical procedures which replaced everything but his visible exterior with cybernetic machinery, and that he is a robot that passes for a human child. The former is canon, the latter I just made up. Those who want to talk about what happened in the narrative universe of Harry Potter have to stick to what actually happened in the narrative - which is to say, the source material, as written.
Any particular work of fan fiction set in a particular narrative universe has to be related to the source material, in various ways. The fan work has to cohere with the source material (i.e. be about wizard Harry rather than cyborg Harry), and it has to cohere enough that assumptions from/about the source material carry over to the fan work. The more closely a fan work coheres with the source material, the more interesting things it has to say about the canonical narrative universe.
This introduces an element of evaluation to the act of reading fan fiction (and even more so to writing it). The act of reading also becomes an act of comparing - does the fan work cohere with the source material, and if there are inconsistencies, where are they? A critical reader can move back and forth between the different texts to find out whether they cohere, contradict or - more interestingly - pose further questions about the source material that are revealed through the act of writing the particular work in question.
Whether or not a reader actually makes the effort to make such comparisons depends entirely upon their level of interest. But, as we stated at the top of this post, people do the things they are interested in, and it is by doing the things they are interested in that they end up learning what they actually learn.
Thus, those who are interested in fan fiction about Harry Potter will eventually learn the skills associated with comparing a fan work with canonical works, by virtue of following their interest. They will find out which works are considered canonical, which works are not canonical and which works occupy ambiguous gray areas between these two poles. Or how to handle situations where canonical works disagree - such as when the books and their movie adaptations contradict each other. Which canonical authority takes precedence?
If you are a teacher of history, then these are the very questions you wish your students to engage with. Not about Harry Potter, mind, but about the general validity of narratives told about the past. Which works are canonical, which are not, and what do you do with all the gray sources in between? Which statements about the past can be substantiated with references to the source material, and which are but speculation? How do you position yourself as a critical reader with regards to the source material at hand? What do you do when you encounter a text about a historical equivalent of cyborg Harry? These are questions that practitioners of fan fiction engage with, albeit not always explicitly.
The pedagogical challenge that follows from the general principle - that learning follows from doing what you are interested in - is to identify what students are interested in and which skill sets they have developed in the course of following their interests. By doing this, a teacher can utilize the retroactive obviousness inherent in applying what a student already knows to new situations. Rather than restarting from square one, we do something more interesting.
Fortunately, everyone is interested in something. But that goes without saying.
Obviously.
Sunday, February 26, 2017
Roundabout canons
Every academic discipline has a canon. That is to say, a series of texts that most of those who are active in the field have read, or at least have some sort of working understanding of. The exact composition of these texts varies from field to field (and over time), but at any given moment you can be sure that there is a set of books most practitioners within a particular field of knowledge know about. The canon as a general category, whilst undefined in its particulars, still exists.
It is markedly more defined at local levels. It is especially defined at local sites of education, where there are syllabi that explicitly specify which texts are included in the mandatory coursework. Teachers are expected to know these texts well enough to teach them, and students are expected to read them well enough to mobilize some parts of their content through some sort of practice. Such as writing an essay on just what the mandatory texts have to say.
Invariably, there will be some students who are just not feeling it when it comes to going through the academic motions. Invariably, these students will turn to the internet for an easy way out. Invariably, some of these students will yoink a text from the internet and turn it in as if it were their own.
Thing is. If the texts and/or the subject matter remains the same over the years, patterns will emerge. Students will be faced with the same task of producing some work on a topic, and they will conduct the same web searches year after year. And, if general laziness is a constant, they will find the same first-page results and turn them in, unaware of their participation in an ever more established tradition. [A fun sidenote: I have a few blog posts which receive a boost in traffic two times a year, which coincides very closely with when their subject matter is taught at my local university.]
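If one wanted to make that pattern visible, a back-of-the-envelope sketch would suffice. The essays below are invented purely for illustration, and the normalization is deliberately crude; this is not a description of any real anti-plagiarism tool:

```python
# A rough sketch of noticing the same web-copied text turning up again and
# again: normalize each submission, hash it, and count the repeats.
# The example submissions are made up for illustration only.

import hashlib
from collections import Counter


def fingerprint(text: str) -> str:
    # Collapse whitespace and case so trivial edits don't hide a copy.
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()


def recurring_submissions(submissions: list[str], threshold: int = 2) -> Counter:
    # Count identical fingerprints and keep those handed in at least `threshold` times.
    counts = Counter(fingerprint(s) for s in submissions)
    return Counter({fp: n for fp, n in counts.items() if n >= threshold})


if __name__ == "__main__":
    essays = [
        "The canon, broadly speaking, is...",
        "the canon,   broadly speaking, is...",  # same text, sloppier copy-paste
        "An essay someone actually wrote themselves.",
    ]
    print(recurring_submissions(essays))
```

Somewhere around the second or third identical fingerprint, one suspects, is where informal canonization begins.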
What I wonder is - how many times does a particular web-copied text need to be turned in before those in charge of grading start to recognize it? Or, phrased another way: how many iterations does it take for these easy-to-find texts to become part of the local canon?
A canon is wider than merely those lists found in official documents, such as syllabi. Informal inclusion is a very real phenomenon, and when a particular text keeps showing up again and again and again -
Now there is food for thought.
Wednesday, February 22, 2017
Postmodernism, a primer
There has been a lot of talk about postmodernism lately, and the only thing larger than the distaste for it is the confusion about what it actually is. While it might be tempting to label this as a postmodern state of things, it's not. It's just confused, and confusion is not postmodernism. The latter might lead to the former, but that is the extent of the connection between the two.
If you've ever read a textbook that in some way deals with postmodernism, then you've probably encountered the introductory statement that the word consists of two parts - post and modernism. Post- as a prefix means that whatever it is fixed to happened in the past. When it is fixed to modernism, we get a word that means "the stuff that happened after modernism". Modernism came first, then postmodernism - in that order.
There are two main reasons for including introductory remarks of this kind. The first is that it has become tradition and convention at this point, and it's easier to latch on to what has already been established than to be creative. The second is that you cannot treat postmodernism as an entity unto itself - it has to be understood in relation to what came before. If you do not understand modernity, you will not understand postmodernity. The one came from the other, and it could not have happened in any other way.
It is vitally important to underscore this intimate relationship. It is a historical progression which is not merely chronological - the tendencies and practices set in motion in the modern time period kept going in the postmodern time period. They are linked, similar and connected.
The modern project was (and is) one of enlightened critical thinking. Traditional institutions, mainly those of monarchies and churches, were no longer to be seen as the absolute authorities when it came to the truth. Instead of relying on ancient authorities (or very present authorities, as it were), the moderns wanted to rely on science and reason.
An example of this shift from ancient authority to a more modern way of thinking is Galileo and the notion that the Earth goes around the sun. Using the tools at hand, Galileo figured out that Earth is not the center of the solar system. The traditional authorities, who held that the Earth was in fact the center, did not agree, and much ado was made about it. In the end, you know how it all turned out.
This ambition to test things by means of science and reason wasn't limited to one person and one particular way of looking at things. Over time, it became the default mode for everything - everything could be questioned, measured, re-examined and put to the test. Those things that were found to not hold up to the standards of scientific testing were thrown out, and those things that did hold up were expanded upon.
The scientific implications of this are fairly obvious: you can get a whole lot more done if you are allowed to freely use the scientific method, without having to make sure everything you find corresponds to what the authorities want you to say. Science builds on science alone, and its findings are all the more robust for it.
The social implications, however, are less straightforward. If long-held beliefs about the cosmos as a whole could be questioned and challenged, then so could long-held beliefs about things of a smaller and more private nature. If the church was wrong about the Earth being at the center of the solar system, then it might also be wrong about marriage, sexuality, and other social institutions. Everything is up for questioning. Everything.
This process of questioning everything kept going, and over time more and more things that were once taken for granted were put to the task of defending themselves. Everything that was once solid melted away, and what came instead was something completely different. Where once kings and bishops ruled, there are now scientists and bureaucrats. And marketers.
Mind you, this is all part of modernity. This is the part that came before postmodernism became a thing. Postmodernism is what happened after this process had been around for a while and become the status quo.
The thing about questioning everything is that you can't really keep doing it forever. At some point, you arrive at the conclusion that some questions have been answered once and for all, and thus that there is no need to go back to them. You begin to take things for granted, and enshrine them as the way things are supposed to be. There are other, more important things to do than reinventing the wheel. There is an order to things and a tradition to consider, both of which are as they should be. The product of modernity is a new range of authorities which dictate what is to be taken for granted and what is to be questioned.
Postmodernism is a return to the very modern urge to question everything and make present institutions answer for themselves. It is, in essence, a return to the modern impulse to trust reason and science rather than tradition or authority - even if these very same traditions and authorities have used reason and science in the process of becoming what they are. But instead of asking whether the Earth revolves around the sun or not, it asks: why do we do the things we do the way we do them, and might there not be a better way to go about it?
Postmodernism happened after the modern project. Post-modernism. But it is still very modern. It is modernity turned upon itself.
If you, after having read this, are slightly more confused about postmodernism, then that is good. It will have primed you for this next statement:
Academics stopped talking about postmodernism some decades ago, and are baffled at its return to fame in news and popular culture.
As final words, I say only this: its resurgence is not postmodern. It is merely confusing. -
Friday, February 17, 2017
All is good that is good
It is often said that it is impossible to argue about taste. De gustibus non est disputandum. Some people like some things, other people like other things, and no amount of arguing is going to change this one indisputable state of things. This is where it is at, and thus here we are.
Nevertheless, we often find ourselves in situations where we want to convey why we like something. In matters of literal taste, the argument is simple: just present the person we want to convince with a tasting of the good stuff, and let the taste buds do their thing. Either we succeed or we do not; the outcome depends entirely on factors outside our control. Regardless of outcome, the attempt was made.
When it comes to more abstract things, such as music or writing, a similar approach is also available. Give someone a tasting of the music and writing, and see how they react. Either they get it, and your work is done, or they don't get it, and -
It is possible you at this point want to argue why that thing you like is good. Why the poem your friend is utterly indifferent to is actually amazing, why that song owns the sky and everything below it - why they should like it, too.
This situation presents something of a problem. If you really really like something, then its awesomeness is so self-evident and obvious that it is difficult to find some means of reducing it to mere words or communicative motions. No discursive gesture would convey just how good it is, and attempts to convey it anyway often stray into unrelated territories, causing confusion or disagreement. Which, one might reasonably assume, is the opposite of what you wanted to accomplish.
A first move from here might be to simply state that you like the thing. This may or may not be useful information to the other person - it all depends on your particular relationship and suchlike. But it provides a baseline for further attempts to convey the goodness.
A second move might be to say that someone else likes the thing. Preferably, this third person is someone you both like and acknowledge as someone whose opinion matters. If they like it, then there's got to be something to it, right?
A third move might be to make a more generalized claim about mass (or niche) appeal. If it's famous, then it must be good, or it wouldn't be famous; if it's niche, then it must also be good, as it is an expression of the virtues of the niche.
As lines of argument go, these are rather flawed. But they are also very common. They are human.
Thing is. Giving reasons for why things are good or bad is hard. There are no readily available frameworks for it, and those frameworks that do exist require a non-trivial amount of effort to get into. Most of them hide behind camouflage strategies such as the name "literary critique", and get progressively more invisible from there.
Maybe the proper thing to do is to cut our friends some slack. Give them the benefit of the doubt when their eyes get that enthusiastic gleam. -
Wednesday, February 8, 2017
A thought
The strange thing about thoughts is that most of them are irrelevant. You think them, they flow through the mechanisms of cognition, and then nothing. Nothing comes of it. In the grand scheme of things, whatever thought happened in those irrelevant moments could be replaced by any other thought, and nothing would have changed. Thoughts occupy time, and that is about all they do.
Except, of course, when they do more than that.
Thing is. Most thoughts are never recorded. They happen, take place, and are gone. Some of them are important, some are irrelevant, some would make a difference if only they were jotted down somewhere.
But we never get around to thinking we ought to record them. And then they are gone.
Just thought I'd remind you that you still have the option.