Wednesday, February 10, 2016

Theory: In Fiction, Curiosity Is Equal to Conflict

I've come to the conclusion that curiosity is as important as conflict in storytelling.

First, consider genre fiction. What would the British murder mystery be, without curiosity? Or consider what William Gibson said:
I wanted the reader to feel constantly somewhat disoriented and in a foreign place, because I assumed that to be the highest pleasure in reading stories set in imaginary futures.
Mysteries run on the "whodunnit" question. Science fiction runs on a more ambient curiosity, diffused to the setting rather than localized in a very specific piece of the plot. You're constantly trying to find out how this future setting differs from your relatively mundane reality. Curiosity drives horror fiction as well; imagine a horror story which started out like this:
There's a very specific type of monster that a lot of people don't know about. It's invulnerable to bullets, so shooting at it won't help you, but it's vulnerable to fire, so if you set it on fire, you'll be fine. It's nocturnal, so you might not be able to tell how big it is when you see it; fortunately, we can tell you that it's about eight feet tall, but only weighs about a hundred pounds. It attacks seemingly random individuals on a seemingly random schedule. However, there's a simple principle which allows you to predict whom it will attack, and when.
That would not be an effective horror story. It's more like an animal control manual. Every time you get the facts, the monster gets less scary. When the attacks seem random, that's terrifying. When you can call them ahead of time, they're not. This fundamental fact is the reason why horror video games can degrade into action video games which merely have unsettling artwork: once you understand the monster's mechanics, it's less of a monster, and more just an ugly problem.

The way horror uses curiosity sits in the middle, between the very diffuse way sci-fi uses it and the very concentrated way mystery does. With mystery, you want an exact piece of the plot. With sci-fi, you want the world around the story. And with horror, you're never given enough information to imagine solving the problem until the characters are trapped somewhere the solution is out of reach. But all three of these genres require unanswered questions to operate.

Everything I've ever read on narrative has said that conflict is essential. I've never seen anything that acknowledged the role of curiosity, or the balancing act you have to perform between revealing too little and revealing too much.

The thing that made me absolutely certain that curiosity is as fundamental and essential as conflict was the television adaptation of The Expanse. As an avid fan of the books, I enjoyed the first few episodes despite their many flaws, but grew more and more frustrated with the show's inferiority to the books. I re-read the first two books just to get the taste of the show out of my mouth, and then I began re-reading the first book again.

This time, I've set up a spreadsheet and I'm filling it out chapter by chapter. The spreadsheet tracks what questions are raised in each chapter, what questions are answered, and — perhaps most importantly — what question each chapter ends on. Because in my re-reading, I noticed that "end on a question" seems to be a core organizing principle in these books. Most chapters end on cliffhangers, and a chapter which doesn't end on a cliffhanger will still at least end on a question.

There's also a column in my spreadsheet for "box within a box," because — to use JJ Abrams's term — The Expanse novels don't just have you constantly wondering "what's in the box?" Nearly every time you find out, what's inside the box is another box, and you usually make that discovery right at the end of a chapter. The books switch protagonists on a chapter-by-chapter basis, and every chapter opens by addressing some of the questions raised in the last chapter which "starred" that particular protagonist. A typical chapter answers a previous question, raises a low-stakes question, and then opens up several new questions, amping up the stakes until it hits a cliffhanger, at which point the chapter ends.
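If you wanted to track this in code instead of a spreadsheet, a minimal sketch might look like the following. (All the field names and example questions here are my own invention, not taken from the books or from my actual spreadsheet.)

```python
from dataclasses import dataclass, field

@dataclass
class ChapterLog:
    """One spreadsheet row: the question bookkeeping for a single chapter."""
    pov: str                                      # which protagonist "stars" in this chapter
    raised: list = field(default_factory=list)    # questions opened here
    answered: list = field(default_factory=list)  # earlier questions resolved here
    ends_on: str = ""                             # the question the chapter closes on
    box_in_box: bool = False                      # did an answer immediately reveal a new mystery?

def open_questions(chapters):
    """Questions raised so far that no chapter has yet answered."""
    raised, answered = set(), set()
    for ch in chapters:
        raised.update(ch.raised)
        answered.update(ch.answered)
    return raised - answered
```

A quick sanity check on a manuscript would then be `all(ch.ends_on for ch in chapters)` — the "end every chapter on a question" rule expressed as one line.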

It's a very addictive experience, and it's a cycle which continues throughout the book. The Expanse novels use these Matryoshka stacks of boxes within boxes as a propulsion mechanism, driving you from the end of one chapter into the beginning of the next, making these books extremely difficult to put down. Typically, when a new Expanse novel comes out, I read the whole thing in less than a day, putting aside just about everything else in my life.

I don't write as much fiction as I'd like, so I probably won't have time to apply this insight until 2017. But whatever I write next is going to steal a simple rule from The Expanse: end every scene on a question.

Is Twitter Optimizing For Users Who Even Exist?

A widely dreaded new Twitter feature became a reality today, but it's optional.
You follow hundreds of people on Twitter — maybe thousands — and when you open Twitter, it can feel like you've missed some of their most important Tweets. Today, we're excited to share a new timeline feature that helps you catch up on the best Tweets from people you follow.

Here's how it works. You flip on the feature in your settings; then when you open Twitter after being away for a while, the Tweets you're most likely to care about will appear at the top of your timeline – still recent and in reverse chronological order.
It's good to see that Twitter's notoriously ever-changing and tone-deaf management is listening, a little, for a change. But there are obviously better things Twitter could be doing with its energy here, and by Twitter's own reasoning, this only solves problems for a subset of its user base:
You follow hundreds of people on Twitter — maybe thousands — and when you open Twitter, it can feel like you've missed some of their most important Tweets.
How big is that subset? Who has this problem?

Let's assume for the sake of argument that "important Tweets" is even a meaningful phrase, that a tweet which is important can exist, and that capitalizing "Tweet" is an honest example of clear writing. It's obviously a deliberate attempt to avoid trademark genericization, but let's just pretend that anyone but Twitter employees, anywhere in the universe, ever capitalizes "tweet," and that "important Tweets" are a thing which really exists.

Let's give Twitter all this bullshit that they're trying to get away with, and then just ask: does their argument even make sense under its own false assumptions? Who out there is bummed that they missed an important "Tweet" because they follow thousands of people on Twitter?

Friday, January 22, 2016

DJ Arx: Jigsaw Shadow

A new drum and bass track from my alter ego.

Friday, January 15, 2016

Twitter, The Invisible Razorblade Tornado

I got into a Twitter argument with somebody today because I tweeted a link to their tweet, with some commentary. Not going to link up any more details, because I don't want to have random Twitter fights. But the commentary was mild, acknowledging a minor point of contention which might have been raised by a subset of my followers. It existed purely to fend off these minor points of contention. It might have done that job; I don't know. I do know that the person who posted the original tweet took it as an attack of some kind, and responded furiously with I think four new tweets going into incredible detail about how she didn't owe anybody a list of caveats and exceptions in the context of a 140-character microblogging format. While this assertion was true, it also seemed batshit insane.

Here's the thing.

This person was a black woman, and (as you already know) she was posting opinions on the Internet. I've never been a black woman posting opinions on the Internet myself, but everything I've ever read on the subject, by those who have had that experience, strongly suggests that being a black woman posting opinions on the Internet means you encounter ferocious, hateful criticism with literally every tweet you make. So even though this woman's response seemed batshit insane to me, in the context of how she interacted with my tweets, it was probably a completely reasonable misunderstanding on her end. It was a batshit insane way to interact with me, in my opinion, but I very strongly suspect that it was a completely reasonable way for her to interact with her timeline.

There are two things to think about here. The first is that, when you build social software, you're building proxy objects that people interact with instead of interacting with human beings. The second is that Twitter's failure to fully consider the consequences of this fact has led Twitter to become an automatic gaslighting machine. Step one, you're subjected to ferocious hatred. Step two, you encounter a very mild point of disagreement. Step three, in context, this mild disagreement looks like more ferocious hatred, so you respond, quite reasonably, with fierce defiance. Step four, the person who mildly disagreed is now receiving a wildly disproportionate degree of fierce defiance for no readily apparent reason. So they decide that you're fucking nuts. Step five, they tell you you're fucking nuts.

Boom. You have now been gaslighted by a completely sincere and previously disinterested individual who, up to the time of the unintentional gaslighting, harbored no ill intention towards you whatsoever. And this cycle repeats all the fucking time. In this way, if you're subject to harassment on Twitter, Twitter's terrible lack of insight into its own social affordances automatically converts random disinterested people into a crowd of gaslighters.


If you encounter this kind of seeming paranoia on Twitter, please keep in mind, you may be communicating with a completely reasonable person who is trapped inside an invisible tornado of razorblades (with apologies to Adam Wiggins, who used to have a blog with the same name, and who I'm stealing a phrase from). Obviously, this is a story of how I failed to resist the incentives that drive this terrible automatic gaslighting machine, and became part of the problem, rather than part of the solution. But I hope it can serve as some kind of mea culpa, and some kind of warning or cautionary tale, both for anybody else on Twitter, and anybody else in the business of creating software. The past few years have really demonstrated that failing to think through the social affordances of a platform, and failing to listen to your users when they report unintended side effects, can have absolutely terrible consequences.

DJ Arx: Rise Up (Video)

I made a music video for the moombahton track I put out last week.

Tuesday, January 12, 2016

DJ Arx: Rise Up

My alter ego DJ Arx put out a new moombahton track.

I like moombahton a lot. It has a lot of the lurid sound design and overall cartoonishness of brostep, aka American dubstep — which seems to have more in common with chiptunes and happycore than its source, British dubstep — but runs on a sexier beat and is pretty much the only EDM subgenre which really feels like it was designed for dancing.

As you can see from the embed, DJ Arx remains faceless for the moment, but one thing at a time.

Depression Quest

The award-winning game that sparked a bazillion sea lions is, at least in its web incarnation, a beautiful little experiment, a throwback to the mid-90s, before the dot-com hustle began in earnest - the days of alt.adjective.noun.verb.verb.verb, when the web was spare and tiny, yet filled with bizarre experiments blurring the lines between poetry, fanzines, and hypertext. The thing it reminds me most of is a Carl Steadman piece, a weird sort of requiem for a failed relationship in the form of an alphabetical catalog.

It also vaguely reminds me of the small interactive fiction scene, which started with text games like Adventure and Zork, and still continues today with fun little toys like Lost Pig (And Place Under Ground), where you play a dim-witted orc who wanders into a dungeon by accident, or the Machiavellian Varicella, where you play a Venetian palace bureaucrat out to seize control of a kingdom.

It's virtually impossible to escape awareness of the weird festival of hatred and threats which accreted around the main developer of Depression Quest, yet it's actually quite easy for the game itself to sail completely under the radar. This is kind of backwards, to say the least. If you're interested in this kind of thing, it's worth it to play the game for a minute.

It's kind of just a Choose Your Own Adventure with some musical accompaniment and some very simple, primitive stats relating to your depression: how severe it is, whether you're taking any medications for it, whether or not you're in therapy, and what effect the therapy is having. The text is kind of enormous.

At first, I did my best to read every word, and make choices in character. The depression got worse, and there's a lovely sincerity to the game, which, unfortunately, meant that the character's in-game hopelessness started seeping into me in real life, as the player. So I switched strategies, skimmed the text, and made my choices not based on how I felt the character would react, but what seemed like the right thing to do. My reasoning was, "fuck it, this is going to be negative, might as well get through it feeling good about how I handled it."

That, of course, might actually be the point of the game. Every time I did it, the depression eased up. Winning the game is actually really easy - do the right thing, even if it seems like it'll be hard or risky for the character.

I wrestled with depression during my teens and early 20s, although I don't know if it ever got as severe as true clinical depression. Maybe it was this memory, maybe it was the writing, maybe it's just the years of acting classes turning me into someone very emotional, but I actually had a hint of tears in my eyes when I got to the end of Depression Quest and won.

Certainly, this game is not for everybody, as a variety of intense overreactions (to say the least) have already very conclusively shown, and on a programming level, all it really consists of is text and links. However, if you like good writing, it's pretty great, in a small and modest way.

Monday, January 4, 2016

Paul Graham Doesn't Write Essays

The noted weenis Paul Graham wrote a pair of blog posts yesterday which have drawn celebrated, accurate, and well-deserved rebuttals. But nearly every person who disagreed with Mr. Graham has persisted in indulging the man's pretensions by referring to his blog posts as essays. Even people who urged Mr. Graham to check his truly towering and gigantic levels of privilege accorded him the privilege of referring to his blog posts as essays.

He does not write essays. And Mr. Graham has enough privilege. You don't need to afford him even more. Stop fucking doing it.

Paul Graham first caught attention with his writing by publishing a book of what were arguably essays. At least, the book had a bunch of chapters, and no predominant theme, so calling it a book of essays was good enough. In this book, he included a chapter called The Age of the Essay, in which he argued that his style of writing would come to define our age (which I sadly must admit might be true) and further that his chapters were essays (which is questionable). He never published another book of essays, but he began referring to his subsequent inferior, rambling blog posts as essays as well.

I'm willing to concede that the chapters in his book, Hackers and Painters, were indeed essays, or at least close enough. But I first noticed how dishonest it is to call his blog posts essays when I prepared a dramatic reading of his worst writing ever, the blog post Let The Other 95% Of Great Programmers In. This blog post is absolutely not an essay, by Mr. Graham's own definition.

In The Age of the Essay, Graham argues that schools teach you to only write essays about English literature, rather than just about any topic. (I'm very glad to say that this was certainly not true of my education.) He then continues:
The other big difference between a real essay and the things they make you write in school is that a real essay doesn't take a position and then defend it...

Defending a position may be a necessary evil in a legal dispute, but it's not the best way to get at the truth...

The sort of writing that attempts to persuade may be a valid (or at least inevitable) form, but it's historically inaccurate to call it an essay. An essay is something else...

Essayer is the French verb meaning "to try" and an essai is an attempt. An essay is something you write to try to figure something out.
In Let the Other 95% of Great Programmers In, Mr. Graham takes a position and defends it. There is not the slightest hint of exploring a question or trying to figure anything out. He knew what his conclusion would be, and he made an argument. That blog post was a polemic, not an essay.

Note also that a polemic on a blog is usually called a fucking blog post, not a polemic, because most motherfuckers don't even know what the word polemic means.

The two posts he wrote recently, which pissed so many people off, were not essays either. They were very obvious propaganda pieces.

And when somebody writes a propaganda piece on their blog, you might, in a subtle analysis, refer to it as a propaganda piece, but your default term for it should be fucking "blog post."


This person is a BLOGGER. He asserts an undeserved and arrogant level of privilege when he asks you to speak of the posts on his blog as essays, rather than blog posts. But that's just being rude, not dishonest. When he blogs polemics and propaganda pieces, asks you to refer to those polemics and propaganda pieces as if they were essays, even when they are not — EVEN BY HIS OWN DEFINITION — and you comply, then you are just handing away shit-tons of privilege to somebody who already has far more than enough.


Wednesday, December 30, 2015

Coping With What Apple's Become

A few years ago, I got annoyed with my iPhone for some reason. It might have been that the center button died, or it might have been the design catastrophe known as iOS 7, but either way, I'd had enough. I switched to a 1990s-style flip-phone which I bought at Best Buy for maybe $10. People kept giving me funny looks, so last year I bought an iPod Touch to see if I could tolerate iOS. The good news: I can tolerate it, on a $200 device. The bad news: I wouldn't pay a penny more. The 90s flip-phone is less irritating, to me. No tragically broken design, and no iTunes. The flip-phone's design is crap too, of course, but it's not tragic crap from a company that should know better; it's just cheap crap. My phone cost me less than a visit to my local comic book store usually does, so I don't mind.

Also, I use the iPod Touch as a social media quarantine device; its main purpose is to isolate social media onto just one machine, so that, if I want to concentrate, I can put that machine away, or go somewhere else and leave it at home. I still occasionally use Twitter on my laptop, but only because I've got a new project where I'm tweeting images I've created in Cinema 4D once every day. I've ordered a Lightning SD card reader for my iPod Touch; when I get it, my plan is to use an SD card to transfer files off the laptop onto the iPod Touch, move the Cinema 4D project off of Twitter and onto Instagram, and permanently hosts-ban Twitter on my laptop.
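A hosts-ban is just a handful of null-routed entries appended to the hosts file. Here's a minimal sketch; the path is parameterized so it can be tested safely, and the domain list is my guess at what you'd want to block, not an exhaustive one.

```python
def hosts_ban(path, domains):
    """Append 0.0.0.0 entries (plus www. variants) for each domain to a hosts file,
    so the browser resolves them to nowhere."""
    with open(path, "a") as f:
        f.write("\n# social media quarantine\n")
        for d in domains:
            f.write(f"0.0.0.0 {d}\n")
            f.write(f"0.0.0.0 www.{d}\n")

# On a real machine this targets /etc/hosts and needs root:
# hosts_ban("/etc/hosts", ["twitter.com", "t.co"])
```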

I prefer this nano-sneakernet approach to Dropbox; I won't use Dropbox because of their privacy policies, their irritating interface, and their connection to the prominent war criminal Condoleezza Rice. And I don't use iCloud, either, not just because of all the horror stories on Twitter and elsewhere of it destroying people's music collections, but also because that happened to somebody I know personally.

Recently my external hard drive for iTunes died. As a result, my music ownership fractured across several devices. I'm scared to sync anything to or from the old iPhone, because I'd risk iTunes doing something stupid to my music in the absence of the expected drive. (In fact, that's the real problem; Apple software's gotten so aggressively stupid that I just don't trust it any more, not even with utterly unremarkable, basic tasks.) So I'll probably have to write some software which manually pulls the audio files off the device. That software will also have to rename the files, since Apple obfuscates the names, but that's not hard; I've solved that problem before. Meanwhile, everything I buy on Beatport is on my main laptop; everything I buy on iTunes is on my iPod Touch, or at least on one of my two iPads. (Both iPads run iOS 6, btw, because I just haven't been able to get over the awfulness of iOS 7 and up.)
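The rescue is essentially a copy-and-rename pass over the obfuscated folder tree. Reading the real artist and title would take a tag-reading library (mutagen, say); this sketch sidesteps that by assuming you already have the metadata as a dict keyed by obfuscated filename, which is the part of the problem that isn't interesting anyway.

```python
import shutil
from pathlib import Path

def rescue_library(src_dir, dest_dir, tags):
    """Copy audio files out of an obfuscated iTunes-style folder tree,
    renaming each to 'Artist - Title.ext' using a
    {obfuscated_name: (artist, title)} mapping."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    for f in Path(src_dir).rglob("*"):
        if f.name in tags:
            artist, title = tags[f.name]
            # copy2 preserves timestamps; the original stays untouched
            shutil.copy2(f, dest / f"{artist} - {title}{f.suffix}")
```

Copying rather than moving means a bug can't eat the only surviving copy of anything, which, given the circumstances, seems prudent.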

Consequently, I'm just about guaranteed to pick up a ten-year-old iPod on eBay and switch to Swinsian.

About a year ago, I discovered that a guy I know in Los Angeles had switched from Apple to Linux — I think Ubuntu — and taken his whole company with him. They have just one person using OS X now, and that only for QA reasons. I worry that sooner or later, I'll have to make the same move. But I can't — not really. It would only solve my OS problems in dev work and social media. I make music using Ableton Live, I do 3D modeling and animation in Cinema 4D, I make videos in After Effects, I make images in Photoshop and Illustrator, and I make books in InDesign. Moving away from Apple isn't feasible for most of these areas.

I'm really not sure what to do about this, and it's very frustrating.

Update: I have an old MacBook Air running Snow Leopard. I keep it on Snow Leopard, because it's got an old 32-bit app that isn't good enough to upgrade, but is good enough to keep. Just needed that box today, for the first time in months; the recent App Store certificate expiration fiasco broke the app that I needed. The new version doesn't support Snow Leopard, of course, because it's ancient. It also carries a price of $30, and what I get for that $30 is the restoration of functionality which Apple improperly disabled. If Apple robs me of this money, I'll carry on living, but I can hardly call it a consensual exchange.

Monday, December 28, 2015

Soon To Celebrate My Tenth Year Of Not Being An Apple Developer

Been thinking about making this for a while. Finally sat down and drew it.
Some years, I went through this cycle several times. Maybe even monthly, when the iPad was brand new.

The things that give me pause are hopefully pretty obvious.
The things that make my eyes glow while little hearts and stars flutter around me are probably all equally obvious too. With Swift becoming open source, it's not inconceivable that the cycle might end. But it's been a stable pattern for almost a decade now. In fact, Swift becoming open source seems more like the kind of thing which would extend the cycle's lifespan than the kind of thing which would bring it to a happy conclusion. I'm pretty sure I've got another ten years of this ahead of me.