Monday 20 July 2015

Virtual Travel

I went to a family gathering this weekend, about two hundred miles from the places I call 'home'. This is actually the first time this decade I've been that far from home (I don't get about much). The journeys there and back were seven and five hours respectively in a car, and were the longest car journeys I've been on in the same period.

Travel is starting to become a theme of my writing about video games. Many of my favourite in-game moments have to do with travelling through virtual worlds. A big part of my excitement for upcoming games like Tales of Zestiria, Xenoblade Chronicles X and Final Fantasy XV is that they offer vast new worlds to explore.

By contrast, I really don't like to do anything in the real world that's vaguely analogous to my virtual explorations. I don't like hiking, I don't like driving (or being a passenger, since I can't legally drive myself anywhere), and I generally don't like to travel. There are a variety of reasons, but the biggest is quite simply that real travel is a lot of effort.

I want to see mountains like the ones my dad likes to climb without getting sore feet. I want to experience the vast sweeps of landscapes like the ones we drove through this weekend (some of which were very pretty) without the steadily-hardening crick in the base of my spine. I want to float through the world as effortlessly and intangibly as a videogame camera.

And I thought until this weekend that there was no harm in this, provided I kept to my lane and didn't get too whiny when I actually do have to travel. But Actually Travelling, and looking at the landscapes I travelled through with the same critical eye I've been training to look at videogames with, put me ill at ease.

Games, by the limits of their technology and the demands of their audience, compress distance. Sure, you can walk for some hours in a straight line without touching the sides of some recent games. But you could walk the real world for the same length of time and in many places not even leave the valley you start in. I found myself wondering, as we drove over the crest of a hill and the horizon retreated on Friday afternoon, how dishonest it is to indulge in this, and how harmful.

It would be a pretty shallow critique to say that the shortening of distance in games is straightforwardly misrepresentative and creates harmful expectations of travel - no better than tired old arguments that the mere presentation of violence is sufficient to induce people to be more violent. It's more complex than that.

But games are so often about the mastery of space, the individual eventually rising to an effortless, unchallenged mobility. There's no better example of this than the 'airship moment' in JRPGs, the point at which you've explored most of the world on foot and, to avoid forcing you to retread old ground, you get a tool that allows you to hop or float to wherever you want, bypassing even the abstractions that are supposed to add labour and time back into the earlier compressed journeys.

This is all a bit unfocussed and musing-y (which is why it's here and not on my actual games blog). The problem probably has more to do with the way games construct mastery than the way they handle travel and distance. But this weekend, looking out at the same sweeping vista for twenty miles and realising it would take ten times as long to walk it on foot as in-game, I had a moment of very sharp discomfort. I hope I'll be able to hold that in mind as I develop my critical ideas on this theme.

Wednesday 1 July 2015

Feels and Reals


'Reals over feels!' and variants thereof have become a slogan for the internet right in the ongoing shitstorm over whether 'objective' journalism is a thing. Simplistic as it is, it's an expression of an ideology with roots in the work of some much-celebrated twentieth-century philosophers (at least, those in the English or anglocentric tradition), and indicative of a subtle shift they engineered in how we use language about truth and reality.

The general meaning of 'Reals over feels!' is that one kind of proposition, understood as impersonal and objective, should be considered more true, or more worthy of consideration, than another, understood as personal and subjective. Statements about objects independently of some personal perspective are considered accurate; statements about how something seems to some perspective are inaccurate or invalid.

In the journalistic field, the problem with this is that all journalism, no matter how diligent, is perspectival. This issue generalises, though; there are, quite simply, no non-perspectival facts. This is not the same as saying that there are no facts. How we arrive at the idea that there are objective facts despite the truth of this claim is a topic to which I'll return later.

Let's start with the basics. We have known since Descartes that the only sure foundation for knowledge is conscious experience – that the only thing that absolutely cannot be doubted is the current content of our conscious fields. I may be only hallucinating that I sit in front of a computer right now, or the image of the computer that I see may be the product of a Matrix-like simulation (or, in Descartes' scenario, a trick played by an evil demon), but I cannot be mistaken that I seem to see the computer; I cannot be mistaken that a computer appears before me.

Absolutely every other thing we take ourselves to know is known by inference, and every means of inference we have, we know to be periodically fallible[1]. So however reliable a claim of knowledge beyond immediate awareness may be, we know there must still be some small chance that it is wrong.

Why emphasise this? Well, as the physical sciences developed in the wake of Descartes, the gulf between direct awareness and the theories of the sciences widened dramatically. The seventeenth century gave us cells, the nineteenth atoms and the twentieth quanta, so tiny that any conventional notion of being directly conscious of them goes out the window.

My point is not to dismiss the theories of science, not at all; just to recall where they come from. What I want to repudiate is the relatively recent philosophical contention that because direct awareness is of appearances that often do not correspond closely to the equivalent deliveries of scientific theory, it is inherently misleading.

The difference is subtle. No-one denies that appearances can be deceiving. The question is whether appearances are inherently deceiving. The shift from the former claim to the latter was accomplished in philosophy largely during the rise of Bertrand Russell, and the fall or at least pushing-aside of the British Idealists and Continental Phenomenologists his dominance replaced.

Idealism (in this, rather than the political, context) is the metaphysical and/or epistemological theory that the world, or at least our knowledge of it, is fundamentally based on experiential/conscious facts. Phenomenology is a philosophical method that requires one to start from what is observed most directly, explain that, and then build on the explanations. Both these positions have clear roots in the idea that conscious awareness comes first.

Russell, along with friends like G.E. Moore and disciples like the young A.J. Ayer, held that these approaches had given rise to obscure and absurd metaphysical systems, convoluted theories that were of little use in actually explaining things. It's true that the phenomenologies and idealisms of the 19th century were complex, but then so is the world[2]. Russell in particular held logical clarity to be the most important virtue of philosophical systems, and was willing to ignore or bury a great many issues that would not submit to logic-based treatments[3].

What all the theories discussed so far, including those of Russell and Moore, have in common is that they are attempts to explain the relationships between the experiences of different people. We take my experience, now, of sitting in front of a computer, to be accurate precisely because if you came and took my seat, you would have a similar experience, and because if I come back to this room later today and sit in this same seat, I will have another, similar, experience.

The standard early modern philosophical explanation of this consistency would be that there is an object, the computer, in this room that produces computer-like experiences for any who sit in front of it. With modern science, however, we know that the object in this room can't be simply described as 'a computer'; it's an immensely complex structure of polymers and electronics, each themselves complex structures of molecules, which are made up of atoms, which are in turn complex structures of subatomic particles and fields that can be described mathematically and logically but not terribly intelligibly.

This sets up a huge discrepancy between the experience of sitting at the computer and the scientific description of what's going on (it gets even worse if you try to factor in a scientific description of me, and/or the process of perception). The tendency of the (anglocentric) philosophers of the 20th century was to argue that this meant the experience was deceptive and false, while the scientific description was accurate and true.

But as I argued here, the scientific description is actually much less useful than the experiential one in most cases. Our knowledge, the everyday stuff that enables us to find our way to the shops and so on, is overwhelmingly experiential in character.

And this hints at another reason for preferring the experiential to the scientific; where scientific knowledge is useful, it is only because of some effect on experience that it enables us to generate. The microscopic precision that allows Intel to inscribe GHz CPUs on a postage stamp is only worth achieving because computers enable wondrous new experiences, whether that means exploring the Mushroom Kingdom or establishing personal relationships that stretch around the globe or even just being able to do your own accounting without needing pages and pages of maths paper.

In truth, all values – not just the emotional or aesthetic, but every kind of utility as well – are values only from some human[4] perspective. Feels are reals, both in the general sense that perspectival facts are real (because they are the only kind of fact), and in the specific sense that emotional perspectives are important, because it is from those emotional perspectives that the values which make anything we do worthwhile spring.

The only question that remains is why, if all this is correct, some people are so convinced that there is an 'objective' perspective, one that is right above all others. I said above that the issue is about the relationship between experiences; Russell and his colleagues came to see the explanation for the relationship as more fundamental than the experiences it relates, but there is another process at work here too.

To examine the relationship among experiences generally, you must have a set of experiences to generalise from. Ideally, as indeed the theory behind the scientific method suggests, this sample will be representative; if it is not, there is a much bigger chance of missing something important. In practice, you cannot include experiences of which you know nothing.

And the philosophers I've discussed here had relatively narrow ranges of experience to draw on. Descartes (like other influential early modern philosophers such as Locke, Hume and Kant) lived most of his life in and around the courts of early modern Europe. Russell and his cronies were ensconced in the ivory towers of British academia (and I can tell you from personal experience just how narrow the windows there are).

Not only did these men have a limited range of experiences to draw on, they either had or have subsequently gained a great deal of influence to pronounce with. Their positions, social class, shared ethnicity and so on have made them Great Men with Important Views; people who have differing opinions seem unimportant by contrast. This actually applies to their historical opponents every bit as much as to marginalised people today; one hardly hears the names of Bradley and Meinong in philosophy classrooms anymore.

The tendency of self-proclaimed logical thinkers to exclude dissenting opinions from both history and contemporary debate should by this point be sadly familiar. It's a self-reinforcing process; when dissent has already been shut out once, it is much easier to dismiss a second time. People clinging to Russell's model now, a hundred years down the line, may not even realise how trapped they are in it[5]. Open-minded reflection on the views of people from different backgrounds and demographics is the only antidote.

In summary, the idea of an 'objective fact' is a mirage. The only indubitable propositions are subjective in character. The philosophical models that allow us to link them together into a coherent world are at best intersubjective, a negotiation shaped by social pressures much more than 'purely intellectual' considerations (if there are any such things).





[1] It's worth stressing at this point that awareness is not purely sensory – memory is a kind of awareness, so your memory of an event (though not the event itself, if it is in the past) may serve as the foundation for some knowledge, or at least reasoning.

[2] There's an interesting comparison here with the current state of quantum physics. Its more phenomenological elements – the mechanics that describe and predict actual measurements – are the most accurate science yet developed by man, but the interpretations that seek to explain why those relationships exist... well, here's the Wikipedia page on interpretations of quantum mechanics. Just count how many different interpretations there are; don't try to wrap your head around them all.

[3] This Spockish attitude persists today in the myriad ways our culture insists on quantifiability and computability. Things (like emotions) which are messy to compute tend to be regarded with suspicion.

[4] In the interests of inclusion, this should be 'appropriately human-like perspective', really. We want to be able to extend values and valuation to sentient aliens, sophisticated animals and so on.

[5] Which doesn't excuse their lack of awareness, since they're also likely to have more spare time and money to support self-reflection with than other social groups.

Wednesday 6 May 2015

On Voting

I will be voting in the general election. I will probably vote in every general election in my adult lifetime (this will actually only be my second - I missed out by a month in 2005). I vote in every local election, too. I was going to write a post this week with a thorough and general defence of democratic participation, appealing to cynics and anarchists alike.

But I'm not sure that's helpful, particularly given how miserable the system's current offerings are. I don't have quite the same conviction regarding the importance of voting that I used to; the reasons for my remaining conviction (which is still, obviously, fairly strong) are weaker, narrower, more personal.

I used to argue, when challenged to defend centralised government, that the global scale of contemporary problems like overpopulation and climate change would require a global coordination only possible through centralisation, and that a purely bureaucratic centralisation was at least as dangerous as one with some element of democracy.

But, quite without realising it, I betrayed this argument completely when writing The Second Realm. It's not really emphasised or investigated in the story, but governance in The Second Realm (actually in the First Realm) is localised, with only the very lightest central coordination. Society functions as a network of small communities each communicating with and mutually supporting its neighbours.

Granted, it's a much smaller society with very different problems to ours and a surplus of natural resources, but apparently I don't (universally) believe central, hierarchical governance is necessary. I can at least imagine us surviving without it.

Maybe, then, voting won't always be necessary. I'm still voting this time round, though, for a whole bunch of well-trodden reasons; because the parties aren't all the same, and with a population of 60 million to work with even small differences may improve lives for lots of people; because my abstention would be read by the mass media as apathy, which makes my skin crawl; because over longer terms than the parliamentary, there's at least some reason to think that many small voices add up.

It would be disingenuous to overlook the privilege of my upbringing in this, of course; part of the reason I'm voting, and that I don't feel completely hopeless about it, is that I've grown up with the idea that my voice will be heard and will make a difference. At quite a deep level I'm not inclined to see voting as futile.

But that's also a form of optimism, and it's possible - even important - to be optimistic without being naive.

Tuesday 31 March 2015

Boiled Potatoes and the Analytic Method, part 7

I found myself in need of counselling last year. The counselling I received was extremely helpful, but it's only in the intervening time, as I've started to study critical perspectives from gender and race discourse in depth, that I've been able to understand the wider context of my difficulties. These approaches emphasise connectedness; the marketing of children's toys, for example, contributes to a domestication of women that in turn commodifies their sexuality and devalues their consent, leading to rape culture.

By contrast, the idiom of 'analytic philosophy', the tallest and remotest of the academic ivory towers, to which I've given a decade of my life and all my adulthood, puts detachment and abstraction foremost. It was detachment and abstraction - an overdose of both - that led me to counselling. What follows is a reflection on that journey.

In part 1, I discussed the specific experience that led me to seek counselling.

In part 2, I talked about a lack of emotional sensation that I discovered during my counselling sessions.

In part 3, I blamed everything on boiled potatoes (and allowing my everyday life to become too bland).

In part 4, I surveyed the rise of analytic philosophy and attempted to show how it rejects the spiritual and the emotional.

In part 5, I evaluated analytic philosophy and the limits of its conception of meaning.

In part 6, I identified the limit that the analytic method places on discourses of morality and responsibility. 

Part 7: What Pieces Are You So Scared Of?

I wasn't expecting to write this part in quite the mode I'm in at the moment. I've been feeling generally pretty positive and upbeat so far this spring, and was looking forward to rounding this series out with a similarly cheerful summation on the theme of healing and embracing a life that values emotional sensation.

But I had a bad weekend in a handful of little ways that left me feeling a bit on the low side. As ever when I get on a downer, I started to pull back from things, and especially from people. Anxiety sets in, loading every potential encounter with a hundred disaster scenarios.

There's a numbing process that's part of this, too. It's a defensive reflex, I think, shutting down the mechanisms of self-regard and self-care that identify the problem to avoid having to think about it. We're supposed to solve problems by disinvesting, stepping outside ourselves to look at them 'objectively'. This is supposed to make solutions clearer and less clouded by emotion. But sometimes the problem is the emotion, more than anything else.

In my head, at least, this sits side-by-side with the analytic method. They present themselves to me as the same process. For years I have embraced them as one, and identified all sorts of objective solutions to my problems - limited budget, for example, or shared living environments that aren't well cared for, or (when I was still living at home) the fact that my parents insist on listening to the radio news four times a day, making it completely inescapable.

The real problem, though, is and has always been the denial of inner sensation, the failure to attend to so many important dimensions of well-being, the determination to rise above 'meat'. I am starting to learn, though. Slowly, I'm thawing out.

It starts, perhaps predictably in my case, with music. Music has always offered the most purely emotional experiences of my life - I don't have the theoretical knowledge to analyse it the way I can tackle novels, films and now to a certain extent also video games. It's in music that I'm normally closest to engaging bodily - while I'm a terrible dancer, I'm also basically incapable of standing still when there's music playing.

And I have some incredibly talented musician friends. Look, I know no-one ever takes my music recommendations, but click that last link and listen to Sam's most recent album. Seriously, it's not long, and the last track is the first piece of music in a decade to bring tears to my eyes. It's five minutes that I can get completely lost in. Sometimes it's good to be lost.

Sometimes getting lost is exactly what I need. Some problems don't need the analytic distance of the cartographer - the map is clear, the map is the problem, the map shows you all too clearly what stands between you and the shining horizon. The map tells you what the walk is like, but sometimes you need to stop thinking about that and walk anyway. That's the point at which the map can't tell you anything useful.

Monday 23 March 2015

Boiled Potatoes and the Analytic Method, part 6

I found myself in need of counselling last year. The counselling I received was extremely helpful, but it's only in the intervening time, as I've started to study critical perspectives from gender and race discourse in depth, that I've been able to understand the wider context of my difficulties. These approaches emphasise connectedness; the marketing of children's toys, for example, contributes to a domestication of women that in turn commodifies their sexuality and devalues their consent, leading to rape culture.

By contrast, the idiom of 'analytic philosophy', the tallest and remotest of the academic ivory towers, to which I've given a decade of my life and all my adulthood, puts detachment and abstraction foremost. It was detachment and abstraction - an overdose of both - that led me to counselling. What follows is a reflection on that journey.

In part 1, I discussed the specific experience that led me to seek counselling.

In part 2, I talked about a lack of emotional sensation that I discovered during my counselling sessions.

In part 3, I blamed everything on boiled potatoes (and allowing my everyday life to become too bland).

In part 4, I surveyed the rise of analytic philosophy and attempted to show how it rejects the spiritual and the emotional.

In part 5, I evaluated analytic philosophy and the limits of its conception of meaning.

Part 6: On Taking Responsibility

The concept of moral responsibility has been at the heart of my journey through analytic philosophy. The first philosophical system I encountered which inspired and moved me was existentialism, a position that has moral responsibility as its foundation and centrepiece. This stands in direct opposition to the determinism which characterised much of 20th-century analysis.

Science has long been thought to promise a perfect system for predicting human behaviour (I choose my words carefully here, since few practising scientists have embraced this belief - it belongs more to the realm of 'popular' or at least establishment commentary). It's a classic modernist tenet, and for a while scientific discoveries did seem to be progressing in that direction. Neuroscience and psychology made great strides through the nineteenth century and into the twentieth.

Still, as early as 1942, Isaac Asimov could acknowledge, with his invention of 'psychohistory' in the Foundation short stories, that a truly determinist understanding of human behaviour was out of reach, prohibited by the fundamentally probabilistic character of quantum physics. This is not to claim that prediction of human behaviour is impossible, only that it can never be done with complete certainty.

Philosophers, who have been arguing with Laplace's demon for two centuries now, were slower to catch on. Even ten years ago, when I was in my first year at university, hard determinism was still discussed as a plausible theory, rather than merely a far-fetched possibility. So great was my determination (hah) to hold onto moral responsibility that I once refused to read an assigned article because of its determinist slant, which is about as defiant as I've ever been towards a teacher.

Determinism and scientism suit the analytic approach. They are theories of absolute knowledge and certainty, of everything in its place, clear and predictable. In denying the possibility of free will, they deny the meaningfulness of the aesthetic, reducing emotions, beliefs and principles to the purely causal.

This outlook has persisted despite the eventual demise of hard determinism. The philosophers who would have been determinists in a previous generation now begrudgingly begin their papers with 'we know that hard determinism is false, but...' and go on to argue that quantum randomness leaves the defender of free will no better off.

The point is not entirely without merit. Fundamental randomness does not guarantee a meaningful freedom of will. Free will theorists have long held that free will is a necessary condition of moral responsibility. The best they can claim from quantum theory is the existence of a narrow sliver of space in which freedom of the will might hide.

More insidiously, the post-determinists have targeted moral responsibility itself, even as free will theorists began to abandon the connection between will and responsibility (the resulting positions are myriad, and better covered in detail elsewhere). The essence of the new determinist argument concerns motivation, understood as whatever mental state in an agent results in their action.

An agent is morally responsible for an action, the argument goes, if their action is a product of a motivation in an appropriate way (that is, not subject to hypnosis or other control). Motivations, though, are products of the agent's character, and said character is a product of the agent's birth and upbringing. If we are to hold agents responsible for their actions, then, it seems that we must hold them responsible for their upbringing and their ancestry. This, the post-determinists argue, is absurd.

And, on the face of it, it does sound absurd. A person cannot literally be responsible for their own birth - this would distort time itself. This argument, the causal argument, seems to present a profound challenge to the existence of moral responsibility.

And yet... Let us come at this from another angle. Critical theories, such as Marxism, feminism and queer theory, recognise differences among birth circumstances as important social phenomena. The concept of privilege is vital to understanding these models, and their well-grounded demands for social justice.

These days, it is common to hear reactionaries crying that it is not their fault they were born male, or white, or middle class, or straight, or cisgendered, or able-bodied, or neurotypical. Strictly, they are not wrong - but then, you will find no serious feminist arguing that they are. What the reactionaries are doing, though, is relying on the same simplistic, causal understanding of moral responsibility as the post-determinist analysts.

Responsibility for the circumstances of our birth, and for the privileges and attitudes that flow from them, is something we take. It is not something we are born to, nor something we are morally entitled to ignore. The essence of maturity, of adulthood, is making this transition; this is the sense in which children are innocent.

Practically speaking, the act of taking responsibility consists in critical self-reflection, the willingness to examine our own behaviour and the attitudes which condition it, and the seeking of ways to change them where appropriate. It is the act of taking seriously our relations, both structural and specific, to others, rather than viewing ourselves as isolated particles predestined to bang into one another with whatever arbitrary results a crude social physics dictates.

Theoretically, taking responsibility requires detaching responsibility from the purely causal, embracing the messy illogicality of a putatively free choice to escape the fist of determinism. The result is not a neat theory; it has little of the clarity that analysis craves. But it is honest and liberating, and above all else it allows a hope for general, meaningful change that the determinist mindset can never offer.

(part 7)

Monday 16 March 2015

Boiled Potatoes and the Analytic Method, part 5

I found myself in need of counselling last year. The counselling I received was extremely helpful, but it's only in the intervening time, as I've started to study critical perspectives from gender and race discourse in depth, that I've been able to understand the wider context of my difficulties. These approaches emphasise connectedness; the marketing of children's toys, for example, contributes to a domestication of women that in turn commodifies their sexuality and devalues their consent, leading to rape culture.

By contrast, the idiom of 'analytic philosophy', the tallest and remotest of the academic ivory towers, to which I've given a decade of my life and all my adulthood, puts detachment and abstraction foremost. It was detachment and abstraction - an overdose of both - that led me to counselling. What follows is a reflection on that journey.

In part 1, I discussed the specific experience that led me to seek counselling.

In part 2, I talked about a lack of emotional sensation that I discovered during my counselling sessions.

In part 3, I blamed everything on boiled potatoes (and allowing my everyday life to become too bland).

In part 4, I surveyed the rise of analytic philosophy and attempted to show how it rejects the spiritual and the emotional.

Part 5: Aesthetics and Anaesthetics

I only recently made the etymological connection between 'aesthetics' and 'anaesthetics', but it's hardly an earthshaking revelation. Aesthetics is (roughly) the study of art, a fundamentally sensory thing; anaesthetics make us numb, insensate. The common Greek root, aisthēsis, originally means perception.

It would not be too far wide of the mark to describe analytic philosophy as anaesthetic. Above all else, what analytic philosophy denies is the subjective. It is the search for objective answers to the grand philosophical questions. The whole analytic construction of 'rationality' opposes the value of personal perspectives, appealing to a transcendent reason which may or may not bear any real connection to the divine intellect of the early modern or classical rationalists.

But analytic philosophy undoubtedly has its advantages. The detachment it advocates can be absolutely crucial for some debates. It's particularly important when responding to criticism; one cannot, after all, take up the point of view of another while clinging to one's own. There are other ways to develop the ability to detach, but practice in the analytic method is a particularly effective and pure one.

(Note: it's far from perfect, as anyone who's ever pricked the ego or threatened the funding of an academic can attest).

And the analytic tradition in philosophy has real triumphs to its name, too; the systems of formal logic developed in the first half of the twentieth century are not just a huge step forward over their arcane predecessors. They are legitimately powerful tools of reasoning, at least within the limit of Gödel's theorem, and underpin much of modern computing.
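As a purely illustrative aside (a sketch of mine, not anything from the original essay): the mechanical character of these formal systems is easy to demonstrate. A propositional equivalence such as De Morgan's law can be verified by blind enumeration of truth values, the same brute-force idea that underlies modern circuit verification and SAT solving.

```python
from itertools import product

def equivalent(f, g, variables=2):
    """Check whether two propositional formulas agree on every
    assignment of truth values to their variables."""
    return all(f(*vals) == g(*vals)
               for vals in product([True, False], repeat=variables))

# De Morgan's law: not (p and q) is equivalent to (not p) or (not q)
lhs = lambda p, q: not (p and q)
rhs = lambda p, q: (not p) or (not q)

print(equivalent(lhs, rhs))  # True: the equivalence holds for all assignments
```

No insight or judgement is involved; the machine simply checks all four cases, which is exactly the kind of power (and the kind of limit) the analytic tradition's logical systems offer.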

Another important product of the analytic tradition, one that is rather more complicated to endorse, is its discourse on meaning. This is usually what definitions of analytic philosophy centre on, but the analytic discourse on meaning is almost exclusively linguistic - it concerns words and sentences, spoken and written. In aesthetics, on the other hand, languages are only a small subset of things that mean (the first part of this video has a pretty robust introduction to some of these ideas, referencing the omega of analytic philosophy, Wittgenstein).

And in aesthetics, meaning is a very different beast to the meaning of the analysts. It is lived, experienced, bodily, not a clinical study of how words point to things in the world. Analysts have devoted a great deal of work to establishing what it means to say something exists; in aesthetics, the question is simply 'is it felt?'

The modern technophile's - my - obsession with transcending 'meat' (as William Gibson perfectly put it in Neuromancer) is born of this analytic understanding of meaning, thought and reason. We disdain bodily hedonism for the 'higher pleasures' of the mind, and in doing so fail to realise that our 'higher pleasures' are really just contempt for other ways of seeing the world, other tools that are in their own way as valuable and in many ways richer than those we have learned.

Aesthetic comprehension, in a way, is a much more basic part of the human condition than analytic. This, perhaps, explains some of our disdain; a baby can feel, but only a sophisticated adult can 'really think'. That we can believe this while yearning for our lost, or innocent, or joyful childhoods is a testament to the spectacular power of the (archetypally white, male etc.) privileged ego.

(part 6)

Monday 9 March 2015

Words Matter

(content warning: discussion of ableist terms, reference to other slurs)

I changed the URL and title of this blog yesterday, to remove the ableist slur 'stupid'. I apologise wholeheartedly for not doing this sooner and for failing to treat this issue with the gravitas it deserves until now.

The rest of this post is addressed to anyone who thinks this is making mountains out of molehills, or that I needn't have bothered making the change.

Let me start with the obvious: words matter. They have power. I'd be a pretty poor writer if I didn't believe that. And power is always dangerous - not necessarily always harmful, but always accompanied by the danger of causing harm.

Words can become harmful in lots of ways, but one of the most serious is when used to justify (or in the justification of) harmful policies. We rightly regard racial slurs like the n-words as harmful because of their association with governmental policies and societal patterns of slavery and segregation - policies and patterns with costs both measurable (in death and injury figures) and immeasurable (in lost human potential and complex, oppressive legacies).

Why, then, is 'stupid' harmful? It is, after all, a very common word, and one not normally connected to any great opprobrium.

The simplest answer to this is a direct comparison with racist language. Constructions of intelligence have sometimes been used as viciously as constructions of race to justify policies every bit as horrible. The eugenics movements of the early 20th century are the best examples of this - in the 20s and 30s, several 'developed' nations, America prominent among them (Britain seriously debated similar laws), forcibly sterilised people who failed to meet certain standards of 'intelligence' (usually measured with IQ tests - the exact purpose and value of which remain controversial to this day). More famously, Nazi Germany sent people to concentration camps and even gas chambers on 'intelligence' grounds as well as racial.

It's generally good policy to not throw around as insults words that the Nazis used to justify genocide.

One final thing: I want to point out how easy it is to overlook this issue. When I started this blog, I was twenty-three, already in possession of a master's degree and well on my way to a doctorate - hardly able to claim general ignorance, and yet I had no idea that my choice of phrase (a reference to Bill Clinton's famous campaign slogan from 1992) could be harmful. Worse, I was literally working in disability support for students at the time - none of my (actually quite limited) training had addressed this issue.

And it gets even worse than that, because a year or two later I was working with a student whose course included modules of disability studies and special educational needs. There were several lectures about ableist language, including specific problems with the language of intelligence, and I still didn't see a problem with my own blog title. It's very easy to dismiss issues when they require you to change.

Learning to rid my everyday vocabulary of words like 'stupid' - to put them in the proscribed category where they belong - is not easy. But there are plenty of better words, both as insults and to refer to things that are strange/absurd. We can - and should - live without words that are imbued with such harm.

Here's a great resource for examining ableism generally, and here's their excellent collection of articles addressing specific ableist terms.

Thursday 5 March 2015

41.62MB

'IT WILL CHANGE YOUR LIFE', thundered a friend of mine on Twitter when I said something about finally getting a smartphone. I took the plunge in January, at last feeling I had enough spare cash - over a long enough time-frame - to make keeping up with a 24-month contract a safe bet. I looked forward to joining the truly modern part of the modern age, the edge where we're beginning to bleed into cyberpunk, the networked species.

And yet, here at the end of my first monthly billing cycle, I've used barely 2% of my 2GB data limit - 41.62MB, to be precise. Obviously, part of that is that I'm new to this device and don't really know what it can do, so I'm not yet using it for many of the things it could be.
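(The 'barely 2%' does check out, for what it's worth - a back-of-the-envelope sketch, assuming the provider counts a gigabyte as 1024MB:)

```python
# Back-of-the-envelope check of the usage figure quoted above.
# Assumes the 2GB allowance is counted in binary units (1GB = 1024MB).
used_mb = 41.62
limit_mb = 2 * 1024

percentage = used_mb / limit_mb * 100
print(f"{percentage:.2f}% of the allowance used")  # prints "2.03% of the allowance used"
```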

That's comforting, but it's the minor part of this issue. The perspicuous truth is sadder; I'm simply not mobile enough, in my day-to-day life, to get much out of mobile computing. I spend the majority of my time within ten feet of a high-powered PC with a cabled connection to a fibre broadband router that gives me download speeds in the region of 9MB/s. When I'm out of the house, I'm walking to places, most of which are workplaces of one or other kind.

A mobile phone cannot change a stationary life. And while the extent to which I don't get out much is perhaps a bit disheartening, it's equally true that I get most of the things that other people do on their smartphones on my PC. Crucially, it's when I'm on my PC that I'm most connected to the rest of the world.

That's what really matters with mobile communications technology, after all - how much more communication it enables. Being romantic and optimistic, we could say it's how much closer together it brings us, the potential to blur the edges not just of communities but ultimately of individuals as well. Talking about the rise of analytic philosophy last week, I mentioned Leibniz; his philosophical system, the 'monadology', posits human minds/spirits as the building blocks of the universe, with space, time and everything that fills them emerging out of the phenomenal (sensory/felt) tensions between us. I've always liked that image as a way of thinking about humans in networks.

I feel that way even when I feel disconnected from those networks. And maybe that's where the greater sadness resides in my current situation (I realise, writing that, that this is all terribly self-pitying, so sorry, I guess). If I already have all the benefits of mobile technology - something I can no longer deny - then I can't blame any disconnect on technological barriers.

And indeed, I am trying to reach out more, to engage more, to communicate whether from my desk or my pocket. So while the smartphone itself isn't going to change my life, it might yet prompt me to make some changes.

If nothing else, I've been able to spend hours laughing at this game about an enormous, hilariously fragile fish.

Wednesday 25 February 2015

Boiled Potatoes and the Analytic Method, part 4

I found myself in need of counselling last year. The counselling I received was extremely helpful, but it's only as, in the intervening time, I've started to study critical perspectives from gender and race discourse in depth that I've been able to understand the wider context of my difficulties. These approaches emphasise connectedness; the marketing of children's toys, for example, contributes to a domestication of women that in turn commodifies their sexuality and devalues their consent, leading to rape culture.

By contrast, the idiom of 'analytic philosophy', the tallest and remotest of the academic ivory towers, to which I've given a decade of my life and all my adulthood, puts detachment and abstraction foremost. It was detachment and abstraction - an overdose of both - that led me to counselling. What follows is a reflection on that journey.

In part 1, I discussed the specific experience that led me to seek counselling.

In part 2, I talked about a lack of emotional sensation that I discovered during my counselling sessions.

In part 3, I blamed everything on boiled potatoes (and allowing my everyday life to become too bland).

Part 4: A History of Bertrand Russell's History of Western Philosophy

Bertrand Russell's A History of Western Philosophy is a landmark text. Russell's position as its author - author of one of the most influential histories of philosophy - is a testament to his stature and import in the first half of the twentieth century. If anyone is the father of 'analytic' philosophy, it is Russell; at the very least, he was the first patriarch of its fractious family.

History is written by the victors.

Russell's career was built, founded, on the strength (or at least the success) of his attacks on the philosophies that preceded him; the British idealism of his teachers, and the late phenomenology of Brentano and Meinong that paralleled it. By the end of the 1920s, analytic philosophy was well-established, with Russell at its head.

His opponents were not just defeated, they were dead; Meinong died in 1920, at 67. Bradley, greatest of the British idealists, hung on until 1924. Analytic philosophy delivered triumph after triumph in logic and language, most notably in modernising formal systems for logic which had languished in an Aristotelian mode long into the Enlightenment. Since those formal systems underpin the computation sustaining this blog post, we can hardly reject the analytic approach outright.

But it bears asking what was lost to its triumph. Analytic philosophy is a cold, clinical thing, characterised by abstraction, a devotion to clarity pursued by stripping an object of any context that might introduce ambiguity. This is the mindset that numbed my body to serve my mind. This is the approach that relegates emotion to a backwater, nothing more than a hazard to reason.

The archetypal rationalists of the early modern period - Descartes, Spinoza and Leibniz - would have had no truck with this division. For them, there was no great conflict between mind and spirit (mind and body might be a different matter, but body-as-pertaining-to-felt-emotion would have been spiritual to them, not 'merely animal' if there was such a thing). Their tradition, and the work of those who inherited it, from Kant all the way down to Bradley and Meinong, is one of unified, harmonious worlds in which things can only be understood as they are in relation to one another.

It is very hard, when tackling the metaphysics of the post-Leibnizians, not to chuckle, not to view their spirituality as naive, archaic, a product of a 'less enlightened era' in which people still believed in woolly notions and lacked clarity of thought. It is easy to see these men as clinging to religion in the face of marching progress. To do so is, at the very least, to overlook how many of them flirted with outright heresy in challenging the established religions of their times; Spinoza was outright excluded from the Jewish community of Amsterdam, and their sanction against him stands to this day.

While it would be presumptuous of me to present this as an account of the origins of modern critical thought, there are definite links; Marx and Freud, for example, both draw on ideas from Hegel which are fundamentally legacies of Leibniz - ideas that are political, economic and psychological cognates of the metaphysics of Bradley and Meinong. Marx in particular went on to influence a broad range of modern critiques not just in matters of economic class but also the discourse around race, gender, sexuality and disability.

Even the fact that, in anglocentric culture, we view 'philosophy' as something esoteric and removed from daily life can be attributed to analytic philosophy, a product of a simplistic and privileging attitude to the academy and 'academics'. What I hope I have shown, or at least plausibly suggested, is that philosophy is lived, is at the foundation of how we live, is stitched through life and culture in a way that is shaped by but also helps shape everyone who participates in it. The shape it has fitted me into has not been kind, and I am in so many ways one of the fortunate ones.

(part 5)

Thursday 19 February 2015

Boiled Potatoes and the Analytic Method, part 3

I found myself in need of counselling last year. The counselling I received was extremely helpful, but it's only as, in the intervening time, I've started to study critical perspectives from gender and race discourse in depth that I've been able to understand the wider context of my difficulties. These approaches emphasise connectedness; the marketing of children's toys, for example, contributes to a domestication of women that in turn commodifies their sexuality and devalues their consent, leading to rape culture.

By contrast, the idiom of 'analytic philosophy', the tallest and remotest of the academic ivory towers, to which I've given a decade of my life and all my adulthood, puts detachment and abstraction foremost. It was detachment and abstraction - an overdose of both - that led me to counselling. What follows is a reflection on that journey.

In part 1, I discussed the specific experience that led me to seek counselling.

In part 2, I talked about a lack of emotional sensation that I discovered during my counselling sessions.

Part 3: The Problem with British Food

Boiled potatoes are non-food. Without either flavour or texture, they are sustenance without experience, matter without properties, as close to the Lockean idea of the bare particular (no, that's not a euphemism, though I've just realised I missed out on a hell of a joke lecturing about them last week) as occurs in real life.

At least, they are when I cook them. I'm aware that various interesting things can be done with boiled potatoes, but I've never had much success when trying. It all seemed more effort than the marginally-improved results were worth.

I ate a lot of boiled potatoes during my PhD years. Money was tight, and I am a coward in the kitchen. Boiled potatoes are a very safe option for student cooking - it's not like they can get any blander from being overcooked, right? Yes, I could have mixed things up sometimes with rice or noodles, but that would have meant keeping rice and/or noodles in stock - more diversity of food means more money spent.

And I didn't really care that they were bland. I viewed eating - everything related to sustenance, basically - as a chore, something to be minimised. That doesn't just mean the simplest cooking possible, it also means the least attention-demanding food. The blandness itself became a kind of virtue, a way of reacting against my limited means; 'I can't afford good food? Well I DON'T CARE, SO THERE!'.

(Sidebar: I wasn't poor - in all sorts of structural ways, from parental support to a fees grant without which I wouldn't even have been able to start the PhD, I was well-off. But I was strapped for cash on a day-to-day basis for most of the four-and-a-half years).

Lots of other elements of my daily routine were similarly, deliberately anaemic. I didn't care about them. I cared about the things that I thought 'enriched' my life - my work, my studies, my writing, music and gaming. All those things did, of course, greatly enrich my life. They all mattered to me, and still do.

But the quotidian stuff isn't meaningless, and one of the things I learned in counselling was how much I couldn't 'rise above it'. Quite the opposite, in fact - it dragged me down. Initially, I clung to rigid domestic routines to keep my budget under control, a strategy that worked but at a cost. The routine itself began to be the object of my clinging, though, and therein lay the problem.

When the disruption of decorating began to stress me out last summer, I initially identified my shattered routine as the cause of my mounting anxiety. I felt that if I could just get things back in order, I would stabilise. Only after the discomfort had almost boiled over into meltdown did I start to think that perhaps the routine itself - a rigid sequence of bland, boiled-potato nonexperiences whose only value to me was their place in the order - might be the problem.

I'm not actually eating much more healthily these days (and indeed, I'm still eating some of the same stuff - no more boiled potatoes, though). But I do try to think about what I'd like to eat before making decisions about buying meals. It wasn't hard to start developing actual preferences again.

(part 4)

Monday 9 February 2015

Piano

Sometime in the next month or so, it will be twenty years since I had my first piano lesson. That, I think, is the point at which I can reasonably say I first played the instrument (or indeed any instrument), rather than just sitting at it and poking keys to extract sounds.

There was a piano at home before I was born, so I grew up with it there as a piece of furniture. I don't remember ever not being allowed to play it, though obviously my efforts at a very young age were at best unsophisticated. The family collection of 'embarrassing/endearing stories about Rik's childhood' includes several of my 'compositions'. Whatever my ambitions, I was no Mozart.

I asked for piano lessons from a pretty young age; my parents didn't cave until I was seven. Probably wise, since I was a pretty faddy, impetuous child, and it was to be at least a decade before I stopped resenting having to practise daily.

Thinking about it, I really don't have many memories that I can clearly point to as coming from before I started learning piano. That's not claiming any miraculous memory-enhancing powers for music, just that my recollection of being younger than seven is pretty scattered.

What I'm getting at is that I've been a pianist for a long time - that part of my self-image is very deeply ingrained. It might have petered out for me when I left home and my parents' piano, but I asked for a portable, digital piano for my 18th birthday to take to university with me. The entire family clubbed together, to the tune of £800, to make sure I had a decent model.

Even that piano will have been mine for a decade this summer. She's sat behind me right now, and I still play pretty much every day (I'm - very slowly - working my way through learning Mussorgsky's 'Pictures at an Exhibition', and after over a decade I've got about a third of it down). And playing has shaped my life in a lot of ways that might not be obvious.

It's not just that I love music, understand some part of how music is constructed and produced, enjoy creating music and find solace in the sounds. It's not just that some of my most important social relationships are and have been musical (pretty much the only way in which I 'get out of the house' these days is going to gigs).

It's also that the way I learn is shaped by a musical paradigm - regular, consistent practice, stepping up one cautious level at a time. I do not thrive when thrown in the deep end. I approach almost all tasks like performances, with meticulous preparation, often to a fault. It can drain my confidence and feed my anxieties, sometimes, since life often doesn't offer much preparation time, but it has its upsides too, when it works.

There's also the fact that twenty years of training my fingers to be clever and independent has real benefits (yeah, make your jokes - honestly, they give my sex life far more credit than it deserves). I never had to learn to touch-type; I just kinda picked it up as I went along. I never need to look at the keyboard anymore. Seven of the letter keys on this keyboard have had their markings rubbed completely off by time and I only struggle when I have to stop and think about where I'm putting my hands.

Manual dexterity shapes a lot of my attachment to video games as well. I get a real kick out of the way my hands climb around a controller in the flow of play. My favourite games tend to be those where the interface is slick enough that I feel like my fingers are extending into the game world, the game character's contortions a manifestation of my own prestidigitation.

I don't really have a message or an argument today. Just 'yay piano', I guess. That'll do.

Tuesday 3 February 2015

Boiled Potatoes and the Analytic Method, part 2

I found myself in need of counselling last year. The counselling I received was extremely helpful, but it's only as, in the intervening time, I've started to study critical perspectives from gender and race discourse in depth that I've been able to understand the wider context of my difficulties. These approaches emphasise connectedness; the marketing of children's toys, for example, contributes to a domestication of women that in turn commodifies their sexuality and devalues their consent, leading to rape culture.

By contrast, the idiom of 'analytic philosophy', the tallest and remotest of the academic ivory towers, to which I've given a decade of my life and all my adulthood, puts detachment and abstraction foremost. It was detachment and abstraction - an overdose of both - that led me to counselling. What follows is a reflection on that journey.

As for what boiled potatoes have to do with anything? Wait and see... 

In part 1, I discussed the specific experience that led me to seek counselling.

Part 2: A Body with No Answers

I'm not going to go through everything I discussed in counselling. Not all of it is relevant, a great deal of it is probably extremely tedious, and the conclusions are likely obvious to all except the protagonist. My counsellor, Jules, was brilliant at drawing me out, getting me to reflect on myself without too much criticality. She didn't try to diagnose or explain, but let me draw my own conclusions and thus internalise each successive realisation.

I learned - perhaps it would be better to say 'reinterpreted' - a lot about myself in those five hours of discussions, but the standout experience is one that happened several times. When I was struggling, either for words or in discomfort, Jules would ask 'How are you feeling right now?' I never had an immediate answer.

In fact, I didn't really have an answer at all. Feelings are embodied things - they happen in the 'gut', the 'heart', sometimes the spine or the back of the neck. Jules would ask me, and (the first few times) specifically direct my attention to bodily sensation. I would frown, expecting an immediate answer (who doesn't know how they're feeling at a given moment?). When that didn't happen, I would interrogate my body, a technique I've learned for fiction writing.

And there would be nothing there. There were physical sensations - the chair, sometimes a headache or a dry throat, ordinary itches or aches - but no emotional ones. What I could identify of my emotions - usually a sense of dread about where a question might lead, how I might be pressured to change my behaviour - were 'head' things, and not sensory. It was the racing-thought, future-chasing anxiety seeded by stereotypes of therapeutic exercises ('Feeling lonely, you say? Okay, GO INTO TOWN AND START ASKING RANDOM STRANGERS FOR A HUG'), something that for all its unpleasantness is almost entirely mind, not body.

Trying to describe the silence in place of expected sensation is difficult at the best of times. I managed to be intellectually disturbed by the solid flatness of my chest - not cold or hard, like stone, just... there, like a well-plastered, plain-painted wall - but couldn't even feel afraid of it.

Occasionally, on the cusp of some realisation, there would be a vertiginous moment, a yawning, teetering on the edge of a bigger, more daunting perspective. That, at least, was a sensation, though mainly around the crown of my skull, sometimes spilling into my eyes as a headrush. It was all I ever managed to report to Jules.

I was self-reflecting the way I'd learned to reflect on everything else - Analysis, with a capital, historical A, a clinical process of standing outside an idea, surgically peeling away its context, tracing each vein and neuron one at a time. There's a time and place for that, perhaps, even when the idea is your own self, but it cannot, must not, be your only paradigm for thinking.

(part 3)

Tuesday 27 January 2015

Boiled Potatoes and the Analytic Method, part 1

I found myself in need of counselling last year. The counselling I received was extremely helpful, but it's only as, in the intervening time, I've started to study critical perspectives from gender and race discourse in depth that I've been able to understand the wider context of my difficulties. These approaches emphasise connectedness; the marketing of children's toys, for example, contributes to a domestication of women that in turn commodifies their sexuality and devalues their consent, leading to rape culture.

By contrast, the idiom of 'analytic philosophy', the tallest and remotest of the academic ivory towers, to which I've given a decade of my life and all my adulthood, puts detachment and abstraction foremost. It was detachment and abstraction - an overdose of both - that led me to counselling. What follows is a reflection on that journey.

As for what boiled potatoes have to do with anything? Wait and see...

Part 1: To Paint a Comfort Zone, First You Must Destroy It

First, the journey itself, or at least the closing chapter of it. This, by the way, is not a dramatic or melodramatic story. Probably it's quite underwhelming. It has no histrionics, no blubbering collapses, and the longest redemptive journey involved walking round the corner from my department building to the university's counselling service.

Proportionately to that, it starts with decorating. Having made this rather optimistic post about how my bout of decorating last summer might go, things actually went pretty well for most of the process. The schedule was met, and by the Sunday of the week after that posting, I'd finished all the decorating work. All that remained was the carpet, which was to be delivered and fitted, along with a carpet for the adjacent bedroom, on the Monday.

And then, about lunchtime on Sunday, we spotted that the boiler, which is in the other bedroom, had leaked a few spots of water from what looked like a badly-corroded valve.

Obviously, there was no way we were going to put a new carpet into a room where a boiler might need a valve replacing (where, indeed, the whole boiler might turn out to need replacing - it's a pretty old one, though - *touch wood* - still reliable). And it was a Sunday, so reaching the carpet fitter to discuss arrangements with him was going to take a while.

I can't quite put into words how I felt about this (more on this point in a later part). But to resort to tired metaphors, a stone sank into my gut. My chest felt tight, and I found my jaw clenching a lot. Even thinking about the emotional state I was in then is making me feel a bit hollow now. In retrospect, it should have been a warning, but I was a little too self-absorbed to notice (if that even makes sense - too self-absorbed to notice my own emotional state?)

But it gets worse, because I wasn't the person dealing directly with any of the people who needed to be contacted about the carpet and the boiler. All that was handled by one of my housemates, the one whose bedroom had the boiler in it. I tried not to pester her, I promise I tried, but it still got to the point that I almost drove her to tears by passing my stress onto her.

Perhaps oddly, it was the break in tension that brought matters to a head. When she finally managed to get confirmation from the carpet fitters that they would be happy to come and fit just my carpet on the Monday, and do the other one at a later date, it was my expression of relief that finally pushed her to tell me to back off.

I spent the next fifteen minutes shivering in my temporary bedroom, fighting off a panic attack. A mild one, by the standards of some I've had. It was half an hour or more before I even managed to apologise.

AND EVEN THEN, I was only thinking about maybe seeking counselling, not really sure what I should be seeking counselling for. Being a rationalist is no guarantee of always being rational; being a lover of wisdom is no guarantee of always being wise. These revelations have a significant role to play in what's to come, but for now suffice it to say that I was eventually convinced to make good on the counselling idea.

(part 2)

Tuesday 6 January 2015

Everyday sexism (that I am guilty of) part 2

Actually, this time it's not just sexism - it's every other dimension of privilege as well.

I'm working on a lengthy and complicated thing about white male identity and 'gamers' - my identity, basically. What I'm trying to do with it is address self-identified gamers who are defensive of our identity on the grounds that it's the only thing we have. I examine why it's possible to feel this way, and how to think more broadly about our identity.

But it's really hard to do that without feeling embattled. 'Gamer' is an identity with a lot of really toxic associations. 'White' and 'male' are even worse, both having a long history of oppression and brutality. The urge is always there to get defensive, to rationalise or try to explain away my association with those identities. It's the urge to mansplain, whitesplain etc. (I'm not sure that 'gamesplain' is a thing yet, or just regarded as a combination of 'all the above') - call it xsplaining in general.

The problem with xsplaining is difficult to state succinctly. It's most problematic when a privileged person butts into a conversation about a problematic pattern of privileged behaviour to explain why it happens - even if not done in an explicitly abusive way, this reinforces existing power dynamics by demanding that every conversation be limited by our comfort. It also equates our discomfort with the actual harm suffered by other groups, which is dismissive of their experience as well as flat-out inaccurate.

Another problem is in demanding 'they' solve 'our' problems - the attitude of 'if you don't like it, you tell us what to do'. We're adults. If someone criticises us, we've got to be able to take responsibility for that. Before demanding specific attention from someone - adding to the burden you've already imposed on them - do some googling, or at least some self-reflection, to try to understand the problem.

This goes doubly for issues of identity. The piece I'm writing is an attempt to collect some criticism of 'gamer' and develop from that a better model of the identity. I don't agree 100% with everything I'm quoting, so there is some editorialising, but my primary purpose is not to refute or dispute those criticisms; it's to identify what we can learn from them.

So I have to be very careful of where I'm pointing my arguments. How often do I have to check for xsplaining? Every. Damn. Sentence. That's really what I want to get at here (as with last time out); this isn't something to only worry about occasionally. It's not even limited to times when you're actively engaging with someone from a different background (though that's when it's at its absolute most important).

It's so hard to resist the urge to make excuses, to haggle, to move from addressing the problem to denying it. And this is in an article specifically addressed to our concerns - I'm not trying to join an existing debate (though I am responding to one). It's even harder when engaging with people 'live'. But you can't learn or grow while rationalising; xsplaining serves your ego at the expense of your mind - not to mention at the expense of other people's peace of mind.