Thursday, October 16, 2014

GlideRoom: Hangouts Without The Hangups

Many years ago, a friend of mine took a picture of me with a dildo stuck to my face.

The worst thing about this picture is that I didn't have it on my hard drive. I found it via Google Images. But the best part is it took me at least 3 minutes to find it, so, by modern standards, it's pretty obscure. Or at least it was, until I put it here on my blog again.

(Actually, the worst thing about this image is that I just found out it's now apparently being used to advertise porn sites, without my knowledge, consent, or participation.)

Anyway, back in the day, this picture went on Myspace, because of course it did. And eventually the friend who took this picture became a kindergarten teacher, while I became a ridiculously overrated blogger. That's not just an opinion, it's a matter of fact, because Google over-emphasizes the importance of programmer content, relative to literally any other kind of content, when it computes its search rankings. And so, through the "magic" of Google, the first search result for my former photographer's name - and she was by this point a kindergarten teacher - was this picture on my Myspace page.

She emailed me like, "Hi! It's been a while. Can you take that picture down?"



And of course, the answer was no, because I hadn't used Myspace in years, and I didn't have any idea what my password was, and I didn't have the same email address any more, and I didn't even have the computer I had back when Myspace existed. Except it turned out that Myspace was still existing for some reason, and was maybe causing some headaches for my friend as well. I have to tell you, if you're worried that you might have accidentally fucked up your friend's career in a serious way, all because you thought it would be funny to strap a dildo to your face, it doesn't feel awesome.

(And by the way, I'm pretty sure she's a great teacher. You shouldn't have to worry that some silly thing you did as a young adult, or in your late teens, would still haunt you five to fifteen years later, but that's the Internet we built by accident.)

So I went hunting on Myspace for how to take a picture down for an account you forgot you had, and Myspace was like, "Dude, no problem! Just tell us where you lived when you had that account, and what your email address was, and what made-up bullshit answers you gave us for our security questions, since nobody in their right minds would ever provide accurate answers to those questions if they understood anything at all about the Internet!"



So that didn't go so well, either. I didn't know the answers to any of those questions. I didn't have the email address any more, and I had no idea what my old physical address was. I would have a hard time figuring out what my current address is. Probably, if I needed to know that, I might be able to find it in Gmail. That's certainly where I would turn first, because Google has eaten my ability to remember things and left me a semi-brainless husk, as most of you know, because it's done the same thing to you, and your friends, and your family.

Speak of the devil - around this time, Google started pressuring everybody in the fucking universe to sign up for Google Plus, Larry Page's desperate bid to turn Google into Facebook, because who on earth would ever be content to be one of the richest people in the history of creation, if Valleywag stopped paying attention to you for five whole minutes?

My reaction when Google's constantly like, "Hey Giles, you should join Google Plus!"



Since then, my photographer/teacher friend fortunately figured out a different way to get the image off Myspace, and I made it a rule to avoid Google Plus. Having had such a negative experience with Myspace, I took the position that any social network you join creates presence debt, like the technical debt incurred by legacy code - the nasty, counterproductive residue of a previous identity. So I was like, fuck Google Plus. I lasted for years without joining that horrible thing, but I finally capitulated this summer. I joined a company called Panda Strike, and a lot of us work remote (myself included), so we periodically gather via Google Hangouts to chat and convene as a group.

But just because I had consented to use Hangouts, that didn't mean I was going down without a fight.

When I "joined" Google Plus, I first opened up an Incognito window in Chrome. Then I made up a fake person with fake biographical attributes and joined as that person. Thereafter, whenever I saw a Google Hangouts link in IRC or email, I would first open up an Incognito window, then log into Google Plus "in disguise," and then copy/paste the Hangouts URL into the Incognito window's location textfield, and then - and only then - enter the actual Hangout.

This is, of course, too much fucking work. But at least it's work I've created for myself. Plenty of people who are willing to go along with Google's bullying approach to selling Google Plus still get nothing but trouble when they try to use Hangouts.

Protip: don't even tolerate this bullshit.

Imagine how amazing it would be if all you needed to join a live, ongoing video chat was a URL. No username, no password, no second-rate social network you've been strong-armed into joining (or pretending to join). Just a link. You click it, you're in the chat room, you're done.

Panda Strike has built this site. It's called GlideRoom, and it's Google Hangouts without the hangups, or the hassle, or indeed the shiny, happy dystopia.



Clicking "Get A Room" takes you to a chat room, whose URL is a unique hash. All you do to invite people to your chat room is send them the URL. You don't need to authorize them, authenticate them, invite them to a social network which has no other appealing features (and plenty of unappealing ones), or jump through any other ridiculous hoops.



We built this, of course, to scratch our own itch. We built this because URLs are an incredibly valuable form of user interface. And yes, we built it because Google Plus is so utterly bloody awful that we truly expect its absence to be a big plus for our product.

So check out GlideRoom, and tweet at me or the team to let us know how you like it.

Tuesday, October 14, 2014

How Much Of "Software Engineering" Is Engineering?

When you build some new piece of technology, you're (arguably) doing engineering. But once you release it into the big wide world, its path of adoption is organic, and sometimes full of surprises.

Quoting Kevin Kelly's simultaneously awesome and awful book What Technology Wants, which I reviewed a couple days ago:

Thomas Edison believed his phonograph would be used primarily to record the last-minute bequests of the dying. The radio was funded by early backers who believed it would be the ideal device for delivering sermons to rural farmers. Viagra was clinically tested as a drug for heart disease. The internet was invented as a disaster-proof communications backup...technologies don't know what they want to be when they grow up.

When a new technology migrates away from its intended use case, and thrives instead on an unintended one, you get something like the runaway success of an invasive species.

In programming, whether you say "best tool for the job" or advocate your favorite One True Language™, you have an astounding number of different languages and frameworks available to build any given application, and their distribution is not uniform. Some solutions spread like wildfire, while others occupy smaller niches within smaller ecosystems.

In this way, evaluating the merits of different tools is a bit like being an exobiologist on a strange planet made of code. Why did the Ruby strain of Smalltalk proliferate, while the IBM strain died out? Oh, because the Ruby strain could thrive in the Unix ecosystem, while the IBM strain was isolated and confined within a much smaller habitat.

However, sometimes understanding technology is much more a combination of archaeology and linguistics.

Go into your shell and type man 7 re_format.

DESCRIPTION
Regular expressions (``REs''), as defined in IEEE Std 1003.2 (``POSIX.2''), come in two forms: modern REs (roughly those of egrep(1); 1003.2 calls these ``extended'' REs) and obsolete REs (roughly those of ed(1); 1003.2 ``basic'' REs). Obsolete REs mostly exist for backward compatibility in some old programs; they will be discussed at the end.


This man page, found on every OS X machine, every modern Linux server, and probably every iOS or Android device, describes the "modern" regular expressions format, standardized in 1988 and first introduced in 1979. "Modern" regular expressions are not modern at all. Similarly, "obsolete" regular expressions are not obsolete, either; staggering numbers of people use them every day in the context of the find and grep commands, for instance.
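If the jargon seems backwards, the difference is at least easy to demonstrate. A quick sketch, using grep (which speaks "obsolete" REs by default) and grep -E (which speaks "modern" ones):

    # "obsolete" (basic) REs: intervals need backslashes; ?, +, and | aren't special
    echo 'color colour' | grep -o 'colou\{0,1\}r'

    # "modern" (extended) REs: the same pattern, egrep-style
    echo 'color colour' | grep -Eo 'colou?r'

Both commands match both spellings. The only thing that changed was the punctuation.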

To truly use regular expressions well, you should understand this; understand how these regular expression formats evolved into sed and awk; understand how Perl was developed to replace sed and awk, but instead became a very popular web programming language in the late 1990s; and further understand that because nearly every programming language creator acquired Perl experience during that time, nearly every genuinely modern regular expression format today is based on the format from Perl 5.
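You can watch that evolution in miniature. Here's the same flavor of work in three generations of tooling, applied to a hypothetical dates.txt full of YYYY-MM dates:

    # ed-style "basic" REs: groups and intervals only work escaped
    sed 's/\([0-9]\{4\}\)-\([0-9]\{2\}\)/\2\/\1/' dates.txt

    # awk speaks "extended" REs natively: + works without backslashes
    awk '{ gsub(/[0-9]+-[0-9]+/, "DATE"); print }' dates.txt

    # Perl 5: \d, {4}, alternate delimiters - the format nearly everything now imitates
    perl -pe 's|(\d{4})-(\d{2})|$2/$1|' dates.txt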

Human languages change over time, adapting to new usages and stylings with comparative grace. Computer languages can only change through formal processes, making their specifics oddly immortal (and that includes their specific mistakes). But the evolution of regular expression formats looks a great deal like the evolution which starts with Latin and ends with languages like Italian, Romanian, and Spanish - if you have the patience to dig up the evidence.

So far, I have software engineering including the following surprising skills:
  • Exobiology
  • Archaeology
  • Linguistics
There's more. There's so much more. For example, you need to extract so much information from the social graph - who uses what technologies, and what tribes a language's "community" breaks down into - that it would be easy to add anthropology to the list. You can find great insights on this in presentations from Francis Hwang and Sarah Mei.

Sunday, October 12, 2014

Kevin Kelly "What Technology Wants"

Kevin Kelly's What Technology Wants advocates the idea that technology is an adjunct to evolution, and an extension of it, so much so that you can consider it a kingdom of life, in the sense that biologists use the term. Mr. Kelly draws fascinating parallels between convergent evolution and multiple discovery, and brings a ton of very interesting background material to support his argument. However, I don't believe he understands all the background material, and I almost feel as if he's persuading me despite his argument, rather than persuading me by making his argument.

So I recommend this book, but with a hefty stack of caveats. Mr. Kelly veers back and forth between revolutionary truths and "not even wrong" status so rapidly and constantly that you might as well consider him to be a kind of oscillator, producing some sort of waveform defined by his trajectory between these two extremes. The tone of this oscillator is messianic, prophetic, frequently delusional, but also frequently right. The insights are brilliant but the logic is often terrible. It's a combination which can make your head spin.

The author seems either to consider substantiating his arguments beneath him, or simply to be unfamiliar with the idea of substantiating an argument in the first place. There are plenty of places where the entire argument hinges on things like "somebody says XYZ, and it might be true." No investigation of what it might mean instead if the person in question were mistaken. This is a book which will show you a graph with a line which wobbles so much it looks like a sine wave, and literally refer to that wobbling line as an "unwavering" trend.

He also refers to "the optimism of our age," in a book written in 2010, two years after the start of the worst economic crisis since the Great Depression. The big weakness in my oscillator metaphor, earlier, is that it is an enormous understatement to call the author tone-deaf.



Then again, perhaps he means the last fifty years, or the last hundred, or the last five hundred. He doesn't really clarify which age he's referring to, or in what sense it's optimistic. Or maybe when he says "our age," the implied "us" is not "humanity" or "Americans," but "Californians who work in technology." Mr. Kelly's very much part of the California tech world. He founded Wired, and I actually pitched him on writing a brief bit of commentary in 1995, which Wired published, and that was easily the coolest thing that happened to me in 1995.

Maybe because of that, I'm enjoying this book despite its flaws. It makes a terrific backdrop to Charles Stross's Accelerando. It's full of amazing stuff which is arguably true, very important if true, and certainly worth thinking about, either way. I loved Out Of Control, a book Mr. Kelly wrote twenty years ago about a similar topic, although of course I'm now wondering whether I was less discerning in those days, or if Mr. Kelly's writing went downhill. Take it with a grain of salt, but What Technology Wants is still worth reading.

Returning again to the oscillator metaphor, if a person's writing about big ideas, but they oscillate between revolutionary truths and "not even wrong" status whenever they get down to the nitty-gritty details, then the big ideas they describe probably overlap the truth about half the time. The question is which half of this book ultimately turns out to be correct, and it's a very interesting question.

Shell Scripting: Also Essential For Animators

I'm taking classes in the motion graphics and animation software Adobe After Effects. It needs a cache, and I've put its cache on an external hard drive, to avoid wasting laptop drive space. But I sometimes forget to plug that hard drive in, with the very annoying result that After Effects "helpfully" informs me that it's using a new cache location. I then immediately quit After Effects, plug in the hard drive, re-launch the software, and re-supply the correct cache location in the application's preferences.

Obviously, the solution was to remove After Effects from the OS X Dock, which is a crime against user experience anyway, and replace the dock's launcher icon with a shell script. The shell script only launches After Effects if the relevant hard drive is present and accounted for.

("Vanniman Time Machine" is the name of the hard drive, because reasons.)

Thursday, October 9, 2014

I've Created A Monster



I built it in After Effects and Photoshop, working from a Digital Tutors tutorial by Kori Valz.

Sunday, October 5, 2014

Backstory For An Anime Series

Many think there is only one Kanye. They are mistaken. There is a Kanye East. There are Kanyes North and South. On the day which the prophets have spoken of, the world will be ready, and the lost Kanyes of legend will return. A great evil will threaten the realm, and the four Kanyes will merge as one to form a Kanye Voltron, and fight fiercely and with great valor for the future of all humanity.

Friday, October 3, 2014

One Way To Understand Programming Languages

I'm learning to play the drums, and I got a good DVD from Amazon. It starts off with a rant about drum technique.

The instructor mentions the old rule of thumb that it's best to avoid conversations about religion and politics, and says that he thinks drum technique should be added to the list. He says that during the DVD, he'll tell you that certain moves are the wrong moves to make, but that any time he says that, it really means that the given move is the wrong move to make in the context of the technique he's teaching.

He then goes on to give credit to drummers who play using techniques that are different from his, and to say that it's your job as a drummer to take every technique with a grain of salt and disavow the whole idea of regarding any particular move as wrong. Yet it's also your job as a student of any particular technique to interpret that technique strictly and exactly, if you want to learn it well enough to use it. So when you're a drummer, the word "wrong" should be meaningless, yet when you're a student, it should be very important.

Programming has this tension also. If you're a good programmer, you have to be capable of understanding both One True Way fanaticism and "right tool for the job" indifference. And you have to be able to use any particular right tool for a job in that particular tool's One True Way (or choose wisely between the options that it offers you).

Thursday, September 25, 2014

Agile Is Overripe

Haters Welcome


I wrote a blog post criticizing Scrum, and a bunch of people read it. A lot of people seemed to be talking about it too. I started regularly seeing 50+ notifications when I signed into Twitter, which was a lot for me.

There weren't a lot of people defending Scrum. Most of the tweets looked like this:



The tweets which did defend Scrum mostly looked like this example of the No True Scotsman fallacy:

I've seen this from people who are old enough to know better, including one Agile Manifesto co-author, so it's entirely possible there's a little war afoot in the world of Scrum, over how exactly to define the term. Sorry, Scrum hipsters, but if there is indeed such a war, you either are losing it, or (more probably) you already lost it, years ago. I'm going to use the term as it's commonly understood; if you have an issue with the default understanding of the term, I recommend you take it up with Google, Wikipedia, scrumalliance.org, scrummethodology.com, and so on and so forth. I don't care enough to differentiate between Scrum Lite and Scrum Classic, because they both taste like battery acid to me.

However, I did get one person - literally only one person - telling me that Scrum actually works, and that includes planning poker:



(As it happens, it's someone I know personally, and respect. Everyone should watch his 2009 CUSEC presentation, because it's deep and brilliant.)

Another critic ultimately led me to this blog post by Martin Fowler, written in 2006:

Drifting around the web I've heard a few comments about agile methods being imposed on a development team by upper management. Imposing a process on a team is completely opposed to the principles of agile software, and has been since its inception...

a team should choose its own process - one that suits the people and context in which they work. Imposing an agile process from the outside strips the team of the self-determination which is at the heart of agile thinking.


I'm hoping to find out more, later, about what it's like when you're on a Scrum team and it actually works. To be fair, not every Scrum experience I've had has been a nightmare of dysfunction; I just think the successes owe more to the teams involved than to the process. And regarding Fowler's blog post, a lot of the people who endorsed my post seemed to do so angrily. So I would guess that many, many of these "fuck yeah" tweets came from people who had Scrum imposed on them, rather than choosing it. And therefore I think both of these areas of criticism are worth listening to.

However, of all the criticisms of my blog post that I saw, literally every single one overlooked what is, in my opinion, my most important criticism of Scrum: that its worst aspects stem from flaws in the Agile Manifesto itself.

Quoting the original post:

I don't think highly of Scrum, but the problem here goes deeper. The Agile Manifesto is flawed too. Consider this core principle of Agile development: "business people and developers must work together."

Why are we supposed to think developers are not business people?

...

The Agile Manifesto might also be to blame for the Scrum standup. It states that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation." In fairness to the manifesto's authors, it was written in 2001, and at that time git log did not yet exist. However, in light of today's toolset for distributed collaboration, it's another completely implausible assertion...

In addition to defying logic and available evidence, both these Agile Manifesto principles encourage a kind of babysitting mentality.


Sorry, Agile, I am in fact both a business person, and a developer, at the same time. Since my business involves computers, being competent to use them only supercharges my business mojo. This is how I achieve the state of MAXIMUM OVERBUSINESS.



More seriously, I recently started a new job at a company called Panda Strike; our CEO convinced me that the real value in the Agile Manifesto was that it facilitated a change in business culture which was actually inevitable due to a technological shift which happened first.

Moore's Law Created Agile


Agile development replaced waterfall development, an era of big design up front. In waterfall development, you gather requirements, write a spec, get approval on the spec, build your software to match that spec, then throw it over a wall to QA, and only show it to your users once you're done. It's important to realize that big design up front powered a ton of incredible success stories, including putting astronauts on the moon, plus nearly everything in software before the late 80s or early 90s, with the possible exception of the Lisp machine.

I don't want to bring back that era, but to be fair, we lost some things in this paradigm shift. And I think it's pretty easy to imagine how rapid prototyping, iterative development, and YAGNI might all be inappropriate for putting astronauts on the moon. That kind of project wouldn't fit a "design as you go" mentality. It would look like something out of The Muppet Show, except people would die.

In the very early days of computing, you'd spend a lot of time working out your algorithm before turning it into a stack of punch cards, because you wouldn't get a lot of chances to run your code; any error was very expensive.



Big design up front made an enormous amount of sense when the machinery of computing was itself enormous also. But that machinery isn't enormous any more, and hasn't been enormous for a long time. According to someone who's done the math:

a tweaked Motorola Droid is capable of scoring 52 Mflop/s which is over 15 times faster than the 1979 Cray 1 CPU. Put another way, if you transported that mobile phone back to 1987 then it would be on par with the processors in one of the fastest computers in the world of the time, the ETA 10-E, and [those] had to be cooled by liquid nitrogen.

Like all benchmarks, however, you need to take this one with a pinch of salt... the underlying processors of our mobile phones are probably faster than these Java based tests imply.


In between the day of the Cray supercomputer and the modern landscape of mobile phones which can run synthesizers good enough for high-profile album releases and live performances, there was the dawn of the personal computer. As the technology got smaller, faster, and cheaper, Moore's Law rendered a whole lot of management practices obsolete. Development cycles of two entire years were common at the time, but new teams using new technology could churn out solutions in months rather than years. PowerBuilder developers launched a revolution underneath COBOL devs starting around 1991, in the same way Rails developers later dethroned Java starting around 2005, once it became possible to build simple web apps in minutes rather than months.



In our lifetimes, it may become possible for software-generating software to churn out new apps in seconds, rather than minutes, and if/when that occurs, the culture of the tech industry (which, by then, may be equal to the set of all industries) will need to change again. It's hard to see that far with accuracy, but as far as I know, there are basically just two ways a business culture can transform: evolution and persuasion. Evolution is where every business which ignores the new reality just fucking dies.


Persuasion is where you come up with a way to sell a new idea to your boss. This is pretty much what the Agile Manifesto was for. In the early days of Agile, the idea that your boss would force it on you was a contradiction in terms. Either you forced it on your boss, or it just didn't happen at all.

Obviously, times have changed. Quoting Dave Thomas, one of the Agile Manifesto's original authors:

The word "agile" has been subverted to the point where it is effectively meaningless, and what passes for an agile community seems to be largely an arena for consultants and vendors to hawk services and products.

So I think it is time to retire the word "Agile."


Epic Tangent: Ontology Is Overrated


One of the best tech talks I've ever heard, "Ontology Is Overrated" by Clay Shirky, covers a related topic. It's ancient in web terms, hailing from all the way back in 2005, when Flickr and del.icio.us were discovering the incredible power of tagging, something we now take for granted. The talk includes an interpretation of why Google crushed Yahoo, during the early days of Web search engines. A sea change in technology brought with it a philosophical sea change, which Yahoo ignored - even going so far as to re-establish obsolete limitations - and which Google exploited.


I'll summarize the talk, since text versions don't appear to be online any more. You can still read a summary, however, or download the original audio, which I definitely recommend. It's a talk which stuck with me for almost ten years, and I've heard and given many other talks during that time.

When you look at the Dewey decimal system, which librarians use for storing books on shelves, it looks like a top-down map of all ideas. But it fails very badly as a map of all ideas. Its late 19th-century roots often become visible.

Consider how the Dewey decimal system categorizes books on religion, in 2014:
  • 200 Religion
  • 210 Natural theology
  • 220 Bible
  • 230 Christian theology
  • 240 Christian moral & devotional theology
  • 250 Christian orders & local church
  • 260 Christian social theology
  • 270 Christian church history
  • 280 Christian denominations & sects
  • 290 Other & comparative religions
Asian religions get the number 299, and they have to share it with every tribal and/or indigenous religion in Australia, Africa, and the Americas, as well as Satanism, Discordianism, Rastafarianism, and Pastafarianism. Buddhism, however, shares the number 294 with every other religion which originated in India. So at best that's a number and a half, out of 100 available, for Asian religion, and all associated topics. Asia contains about 60% of the world's population.

As a map of all the ideas about religion, this is horribly distorted, but it's not actually a map of ideas about religion. It's really just a list of categories of physical books in the collections of American libraries.

Before Google existed, Yahoo first arose as a collection of links, and soon grew large enough to be unwieldy - at which point, Yahoo hired an ontologist and categorized its links into 14 top-level categories, creating in effect a Dewey decimal system for the web. But Yahoo innovated a little, bringing in an element of Unix. If you clicked on the top-level category "Entertainment," you'd get a "Books@" link, where the little "@" suffix served to indicate a symlink. Clicking that would land you in "Books and Literature," a subcategory of "Arts," because according to Yahoo, "Books" were not really a subcategory of "Entertainment."



Librarians use a similar workaround in their systems, namely the fractional decimals which indicate subcategories, so you can say (for example) that a book is about Asia, and about religion. These workarounds are inevitable, because (for example) books can be both literature and entertainment. Or, to be more general, categories are social fictions, and to put a book about Asian religion in the Asia category, rather than the religion category, is to say that its Asian-ness is more important than its religion-ness. The hierarchical nature of ontology means it always imposes the priorities of whichever authority or authorities created the hierarchy in the first place. But with a library, you have an excuse, because a physical book can only be in one place at a time. With web links, there's no excuse.

So, rather than applying this legacy physical-shelf-locating paradigm to a set of web pages, Google allowed you to simply search the entire web. You could never expect librarians to pre-construct a subcategory called "books which contain the words 'Minnesota' and 'obstreperous,'" but Google users in 2005 could work with exactly that subcategory any time they wanted. Flickr and del.icio.us took these ideas much further, creating ad hoc quasi-ontologies by allowing users to tag things however they wanted, and then aggregating these tags, and deriving insight from them.

(Today, unfortunately, you might not get results containing both "Minnesota" and "obstreperous" if you searched Google for those words. Google's lost a tremendous amount of signal through its use of latent semantic indexing to detect synonyms, and to other, similar compromises. This diminishes the Google praise factor in Shirky's talk, but doesn't harm his overall argument in any important way. What does suggest a possible need for revision is the emergence of filter bubbles, where companies try to pre-emptively derive user-generated categories, and then confine you to them, based on what category of user they estimate you to be. Filter bubbles thus impose a new kind of crowd-sourced ontology, which holds serious dangers for democracy.)

Anyway, although this was a fantastic talk, the main point I want to make is that Google defeated Yahoo here by recognizing the whole concept of ontology for the unnecessary, inessential historical relic that it was. Google even briefly used the DMOZ project, an open-source categorization of everything on the web - yes, this actually existed, and it started life with the name Gnuhoo, because of course it did - but dumped DMOZ because nobody even used it when they could just search instead. Ontology is overrated, and Yahoo's failure to recognize that cost them an enormous market.

The Agile Manifesto existed because developers and consultants had begun to recognize that many ideas in tech management were unnecessary, inessential historical relics. Although it opposed these ideas, it didn't even argue that they should be thrown out entirely, just that they were overrated.



Remember, waterfall development reigned supreme. The Agile Manifesto did a great thing in improving working conditions for a lot of programmers, and in achieving new success stories that would have been impossible under the old paradigm. But I can't praise the Agile Manifesto for tearing down the status quo without also acknowledging that over time, it has become the new status quo, and we will probably have to tear it down too.

Synchrony Is The New Ontology


The most obvious flaw in the Agile Manifesto is the claim that face-to-face conversation is the best way for developers to communicate. It's just not true. There's a reason we write code onto screens, rather than dictating it into microphones. Face-to-face communication has a lot of virtues, and there are certainly times when it's necessary, but it's not designed to facilitate extremely detailed changes in extremely large code bases, and tools which are designed for that purpose are often superior for the task.



Likewise, I don't want to valorize a tired and harmful stereotype here, but there's a lot of development work where you can go days without needing to talk to anyone else for more than a few moments.

In many industries, companies just do not need to have synchrony or co-location any longer. This is an incredible development which will change the world forever. Do not expect the world of work to look the same in 20 years. It will not.


It's not just programming. Overpriced gourmet taco restaurants no longer need locations.

In 2001, when the Agile Manifesto was written, Linux was already a massive success story for remote work and asynchronous development. But it was just one such story, and somewhat anomalous. In 2014, nobody on the web is building a business without open source. And since just about every open source project runs on remote work and asynchronous development, there are very, very few new technology companies today which do not already depend on the effectiveness of remote work and async dev: these businesses would fall apart without their open source foundations, and those foundations were built with remote work and async dev.

The bizarre thing about most companies in this category, however, is that although they absolutely depend on the success of remote work and async dev, and although they absolutely and literally could not exist without the effectiveness of remote work and async dev, they nonetheless require their employees to all work in the same place at the same time.

Consider that GitHub's a distributed company where a lot of people work remote. Consider also that a lot of startups run development entirely through GitHub. This means a lot of CTOs will happily bet their companies on libraries and frameworks developed remotely, and a product which was developed remotely, yet they don't do remote dev when it comes to running their own companies.

Yahoo put ontology onto its web links simply because it never questioned the common assumption that if you want to navigate a collection of information, you do so by organizing that information into a hierarchy.

Why do tech companies have offices?

In this case, the Agile Manifesto just went stale. It's just a question of the passage of time. The apps, utilities, and devices we have for remote collaboration today are straight-up Star Trek shit by 2001 standards.



In 2001, when the Manifesto was written, you could argue against Linux as a model for development in general. Subversion was still new. Java (developed at a corporation, inside office buildings) was arguably superior to Perl, which was probably the best open source alternative at the time. There weren't profitable, successful companies built this way. You could call Linux a fluke. But we have profitable, successful, remote-oriented companies today, and legions of successful open source projects have validated the model as well.

A software development process that doesn't acknowledge this technological reality is just silly.

Two-year development cycles and big design up front were to 1990s programming as ontology was to 1990s web directories. They were ideas that had to die and Agile was right to clear them away. But that's what synchrony and co-location are today, and the Agile Manifesto advocates in favor of both.

And this synchrony thing isn't the only problem in the Agile Manifesto. I may blog in future about the other, deeper problems in the Manifesto; I already covered the "businesspeople vs. developers" problem in the Scrum post.

Monday, September 22, 2014

A Pair of Quick Animations

Five seconds or less, done in Adobe After Effects.

I made the music for this one. No special effects, just shapes, luma masks, and blending modes.



This one is mostly special effects.


Wednesday, September 17, 2014

Why Scrum Should Basically Just Die In A Fire

Conversations with Panda Strike CEO Dan Yoder inspired this blog post.

Scrum, the Agile methodology allegedly favored by Google and Spotify, is a mess.

Consider story points. If you're not familiar with Scrum, here's how they work: you play a game called "Planning Poker," where somebody calls out a task, and then counts down from three to one. On one, the engineers hold up a card with the number of "story points" which represents the relative cost they estimate for performing the task.

So, for example, a project manager might say "integrating our login system with OpenAuth and Bitcoin," and you might put up the number 20, because it's the maximum allowable value.



Wikipedia describes the goal of this game:

The reason to use Planning Poker is to avoid the influence of the other participants. If a number is spoken, it can sound like a suggestion and influence the other participants' sizing. Planning Poker should force people to think independently and propose their numbers simultaneously. This is accomplished by requiring that all participants show their card at the same time.

I have literally never seen Planning Poker performed in a way which fails to undermine this goal. Literally always, as soon as every engineer has put up a particular number, a different, informal game begins. If it had a name, this informal game would be called something like "the person with the highest status tells everybody else what the number is going to be." If you're lucky, you get a variant called "the person with the highest status on the dev team tells everybody else what the number is going to be," but that's as good as it gets.

Wikipedia gives the alleged structure of this process:
  • Everyone calls their cards simultaneously by turning them over.
  • People with high estimates and low estimates are given a soap box to offer their justification for their estimate and then discussion continues.
  • Repeat the estimation process until a consensus is reached. The developer who was likely to own the deliverable has a large portion of the "consensus vote", although the Moderator can negotiate the consensus.
  • To ensure that discussion is structured; the Moderator or the Project Manager may at any point turn over the egg timer and when it runs out all discussion must cease and another round of poker is played. The structure in the conversation is re-introduced by the soap boxes.
In practice, this "soap box" usually consists of nothing more than questions like "20? Really?". And I've never seen the whole "rinse and repeat" aspect of Planning Poker actually happen; usually, the person with lower status simply agrees to whatever the person with higher status wants the number to be.

In fairness to everybody who's tried this process and seen it fail, how could it not devolve? A nontechnical participant has, at any point, the option to pull out an egg timer and tell technical participants "thirty seconds or shut the fuck up." This is not a process designed to facilitate technical conversation; it's so clearly designed to limit such conversation that it almost seems to assume that any technical conversation is inherently dysfunctional.

It's ironic to see conversation-limiting devices built into Agile development methodologies, when one of the core principles of the Agile Manifesto is the idea that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation," but I'll get to that a little later on.

For now, I want to point out that Planning Poker isn't the only aspect of Scrum which, in my experience, seems to consistently devolve into something less useful. Another core piece of Scrum is the standup meeting.



You probably know this, but just in case, the idea is that the team for a particular project gathers daily, for a quick, 15-minute meeting. This includes devs, QA, project manager(s), designers, and anyone else who will be working to make the project succeed, or who needs to stay up-to-date with the project's progress. The standup's designed to counter an even older tradition of long, stultifying, mandatory meetings, where a few people talk, and everybody else loses their whole day to no benefit whatsoever. Certainly, if you've got that going on in your company, anything which gets rid of it is an improvement.

However, as with Planning Poker, the 15-minute standup decays very easily. I've twice seen the "15-minute standup" devolve into half-hour or hour-long meetings where everybody stands, except for management.

At one company, a ponderous, overcomplicated web app formed the centerpiece of the company's Scrum implementation. Somebody had to sit to operate this behemoth, and since that was an informal privilege, it usually went to whoever could command it. In other words: management.

At another company, the Scrum decay took a different route. As with the egg timer in Planning Poker, Scrum standups offer an escape clause. In standups, you can defer discussion of involved topics to a "parking lot," which is where an issue lands if it's too complex to fit within the meeting's normal 15-minute parameters (which also include some constraints on what you can discuss, to prevent talkative or unfocused people from over-lengthening the meeting).

At this second company, virtually everything landed in the parking lot, and it became normal for the 15-minute standup to be a 15-minute prelude to a much longer meeting. We'd just set the agenda during the standup, and the parking lot would be the actual meeting. These standups typically took place in a particular person's office. Since arriving at the parking lot meant the standup was over, that person, whose office we were in, would feel OK about sitting down in their own, personal chair. But the office wasn't big enough to bring any new chairs into, so everyone else had to stand. The person whose office we were always in? A manager.

Scrum's standups are designed to counteract an old tradition of overly long, onerous, dull meetings. However, at both these companies, they replaced that ancient tradition with a new tradition of overly long, onerous, dull meetings where management got to sit down, and everybody else had to stand. Scrum's attempt at creating a more egalitarian process backfired, twice, in each case creating something more authoritarian instead.

To be fair to Scrum, it's not intended to work that way, and there's an entire subgenre of "Agile coaching" consultants whose job is to repair broken Scrum implementations at various companies. This is pure opinion, but my guess is that's a very lucrative market, because as far as I can tell, Scrum implementations often break.


I recommend just skimming the first few seconds of this.

Scrum's ready devolution springs from major conceptual flaws.

Scrum's an Agile development methodology, and one of its major goals is sustainable development. However, it works by time-boxing efforts into iterations of a week or two in length, and refers to these iterations as "sprints." Time-boxed iterations are very useful, but there's a fundamental cognitive dissonance between "sprints" and "sustainable development," because there is no such thing as a sustainable sprint.


This man's pace is probably not optimized for sustainability.

Likewise, your overall list of goals, features, and work to accomplish is referred to as the "backlog." This is true even on a greenfield project. On day 1, you have a backlog.

Another core idea of the Agile Manifesto, the allegedly defining document for Agile development methodologies: "working software is the primary measure of progress." Scrum disregards this idea in favor of a measure of progress called "velocity." Basically, velocity is the number of "story points" successfully accomplished divided by the amount of time it took to accomplish them.

As I mentioned at the top of the post, a lot of this thinking comes from conversations with my new boss, Panda Strike CEO Dan Yoder. Dan told me he's literally been in meetings where non-technical management said things like, "well, you got through [some number] story points last week, and you only got through [some smaller number] this week, and coincidentally, I noticed that [some developer's name] left early yesterday, so it looks pretty easy who to blame."

Of course, musing, considering, mulling things over, and coming to realizations all constitute a significant amount of the actual work in programming. It is impossible to track whether these realizations occur in the office or in the shower. Anecdotally, it's usually the shower. Story points, meanwhile, are completely made-up numbers designed to capture off-the-cuff estimates of relative difficulty. Developers are explicitly encouraged to think of story points as non-binding numbers, yet velocity turns those non-binding estimates into a number they can be held accountable for, and which managers often treat as a synonym for productivity. "Agile" software exists to track velocity, as if it were a meaningful metric, and to compare the relative velocity of different teams within the same organization.

This is an actual thing which sober adults do, on purpose, for a living.

"Velocity" is really too stupid to examine in much further detail, because it very obviously disregards this whole notion of "working software as a measure of progress" in favor of completely unreliable numbers based on almost nothing. (I'm not proud to admit that I've been on a team where we spent an entire month to build an only mostly-functional shopping cart, but I suppose it's some consolation that our velocity was acceptable at the time.)

But, just to be clear, one of velocity's many flaws is that different teams are likely to make different off-the-cuff estimates, as are different members of the same team. Because of this, you can only really garner anything approaching meaningful insight from these numbers if you compare the ratio of estimated story points to accomplished story points on a per-team, per-week basis. Or, indeed, a per-individual, per-week one. And even then, you're more likely to learn something about a team's or individual's ability to make ballpark estimates than their actual productivity.
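If you're determined to stare at these numbers anyway, the arithmetic is at least trivial. Given a hypothetical sprints.csv with columns for team, week, estimated points, and completed points, the per-team, per-week ratio is a one-liner:

    # sprints.csv (hypothetical): team,week,points_estimated,points_completed
    awk -F, '{ printf "%s, week %s: %.0f%% of estimate completed\n", $1, $2, 100 * $4 / $3 }' sprints.csv

Even then, remember what you're measuring: estimation skill, not productivity.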

Joel Spolsky has an old but interesting blog post about a per-individual, velocity-like metric based on actually using math like a person who understands it, not a person who regards it as some kind of incomprehensible yet infallible magic. However, if there's anything worth keeping in the Agile Manifesto, it's the idea that working software is the primary measure of progress. Indeed, that's the huge, hilarious irony at the center of this bizarre system of faux accountability: with the exception of a few Heisenbugs, engineering work is already inherently more accountable than almost any other kind of work. If you ask for a feature, your team will either deliver it, or fail to deliver it, and you will know fairly rapidly.

If you're tracking velocity, your best-case scenario will be that management realizes it means nothing, even though they're tracking it anyway, which means spending money and time on it. This useless expense is what Andy Hunt and Dave Thomas termed a broken window in their classic book The Pragmatic Programmer - a sign of careless indifference, which encourages more of the same. That's not what you want to have in your workplace.



Sacrificing "working software as a measure of progress" to meaningless numbers that your MBAs can track for no good reason is a pretty serious flaw in Scrum. It implies that Scrum's loyalty is not to the Agile Manifesto, nor to working software, nor high-quality software, nor even the success of the overall team or organization. Scrum's loyalty, at least as it pertains to this design decision, is to MBAs who want to point at numbers on a chart, whether those numbers mean anything or not.

I've met very nice MBAs, and I hope everyone out there with an MBA gets to have a great life and stay employed. However, building an entire software development methodology around that goal is, in my opinion, a silly mistake.

The only situation I can think of where a methodology like Scrum could have genuine usefulness is on a rescue engagement, where you're called in as a consultant to save a failing project. In a situation like this, you can track velocity on a team basis to show your CEO client that development's speeding up. Meanwhile, you work on the real question, which is who to fire, because that's what nearly every rescue project comes down to.

In other words, in its best-case scenario, Scrum's a dog-and-pony show. But that best-case scenario is rare. In the much more common case, Scrum covers up the inability to recruit (or even recognize) engineering talent, which is currently one of the most valuable things in the world, with a process for managing engineers as if they were cogs in a machine, all of equal value.

And one of the most interesting things about Scrum is that it tries to enhance the accountability of a field of work where both failure and success are obvious to the naked eye - yet I've never encountered any similarly elaborate system of rituals whose major purpose is to enhance the accountability of fields which have actual accountability problems.



Although marketing is becoming a very data-driven field, and although this sea change began long before the Web existed at all - Dan Kennedy's been writing about data-driven marketing since at least the 1980s - it's still a fact that many marketers do totally unaccountable work that depends entirely on public perception, mood, and a variety of other factors that are inherently impossible to measure. The oldest joke in marketing: "only half my advertising works, but I don't know which half."

And you never will.


YouTube ads have tried to sell me a service to erase the criminal record I don't have. They've reminded me to use condoms during the gay sex that I don't have either. They've also tried to get me to buy American trucks and country music, neither of which will ever happen. No disrespect to the gay ex-convicts out there who do like American trucks and country music, assuming for the sake of argument that this demographic even exists, it's just not my style. Similarly, Facebook's "targeted" ads usually come from politicians I dislike, and Google's state-of-the-art, futuristic, probabilistic, "best-of-breed" ads are worse. The only time they try to sell me anything I even remotely want is when I've researched something expensive but decided not to buy it yet. Then the ad follows me around every web site I visit for the next month.


Please buy it. Please. You looked at it once.

Even in 2014, marketing involves an element of randomness, and probably always will, until the end of time.

Anyway, Scrum gives you demeaning rituals to dumb down your work so that people who will never understand it can pretend to understand it. Meanwhile, work which is genuinely difficult to track doesn't have to deal with this shit.

Why?

I don't think highly of Scrum, but the problem here goes deeper. The Agile Manifesto is flawed too. Consider this core principle of Agile development: "business people and developers must work together."

Why are we supposed to think developers are not business people?

If you join (or start) a startup, you may have to do marketing before your company can hire a marketing person. The same is true for accounting, for sales, for human resources, and for just about anything that any reasonable person would call business. You're in a similar situation if you freelance or do consulting. You're definitely in a better position for any of these things if you hire someone who knows what they're doing, of course, but there's a large number of developers who are also business people.

Perhaps more importantly, if you join or start a startup, you can knock the engineering out of the park and still end up flat fucking broke if the marketing people don't do a good job. But you're probably not going to demand that your accountants or your marketing people jump through bizarre, condescending hoops every day. You're just going to trust them to do their jobs.

This is a reasonable way to treat engineers as well.

By the way, despite that little Dilbert strip a few paragraphs above, my job title at Panda Strike is Minister of Propaganda. I'm basically the Director of Marketing, except that to call yourself a Director of Marketing is itself very bad marketing when you want to communicate with developers, who traditionally mistrust marketing for various reasons (many of them quite legitimate). This is the same reason the term "growth hacker" exists, but as a job title, that phrase just reeks of dishonesty. So I went with Minister of Propaganda to acknowledge the vested interest I have in saying things which benefit my company.

However, despite having marketing responsibilities, my first act upon joining Panda Strike was to write code which evaluates code. I tweaked my git analysis scripts to produce detailed profiles of the history of many of the company's projects, both open source and internal products, so that I could get a very specific picture of how development works at Panda Strike, and how our projects have been built, and who built them, and when, and with which technologies, and so on.
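None of this requires anything fancier than git itself plus standard Unix filters. These one-liners are a sketch of the approach, not my actual scripts, and the CoffeeScript glob is just an example:

    # who has done the committing, overall and per language
    git shortlog -sn
    git shortlog -sn -- '*.coffee'

    # commit volume by month, to see when a project was actually alive
    git log --date=short --format='%ad' | cut -c1-7 | sort | uniq -c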

As an aside, I first developed this technique on a Rails rescue project. It was the first thing I did on the project, but the CTO, having an arrogant and aloof attitude, had no idea. So on my first day, after I did this work, he introduced me to the rest of the team, telling me their names, but nothing else about them. But I recognized the names from my analysis of the git log. I noticed that the number one JavaScript committer had a cynical and sarcastic expression, that most of the team had three commits or less, and that the number one Ruby committer wasn't anywhere in the building.

This CTO who had told me nothing then said to me, "OK, dazzle me." As you can imagine, I did not dazzle him. I fired him. (Or, more accurately, I and my colleagues persuaded his CEO to fire him.)


Anyway, the whole point of this is simple: there's absolutely no reason to assume that a developer is not a business person. It's a ridiculous assumption, and the world is full of incredibly successful counterexamples.

The Agile Manifesto might also be to blame for the Scrum standup. It states that "the most efficient and effective method of conveying information to and within a development team is face-to-face conversation." In fairness to the manifesto's authors, it was written in 2001, and at that time git log did not yet exist. However, in light of today's toolset for distributed collaboration, it's another completely implausible assertion, and even back in 2001 you had to kind of pretend you'd never heard of Linux if you really wanted it to make sense.

Well-written text very often trumps face-to-face communication. You can refer to well-written text later, instead of relying on your memory. You can't produce well-written text unless you think carefully. Also, technically speaking, you can literally never produce good code in the first place unless you produce well-written text. There are several great presentations from GitHub on the value of asynchronous communication, and they're basically required viewing for anybody who wants to work as a programmer, or with programmers.

In fact, GitHub itself was built without face-to-face communication. Basecamp was built without face-to-face communication as well. I'm not saying these people never met each other, but most of the work was done remote. Industrial designer Marc Newson works remote for Apple, so his work on the Apple Watch may also have happened without face-to-face communication. And face-to-face communication plays a minimal role in the majority of open source projects, which usually outperform commercial projects in terms of software quality.

In addition to defying logic and available evidence, both these Agile Manifesto principles encourage a kind of babysitting mentality. I've never seen Scrum-like frameworks for transmuting the work of designers, marketers, or accountants into cartoonish oversimplifications like story points. People are happy to treat these workers as adults and trust them to do their jobs.

I don't know why this same trust does not prevail in the culture of managing programmers. That's a question for another blog post. I suspect that the reasons are historical, and fundamentally irrelevant, because it really doesn't matter. If you're not doing well at hiring engineers, the answer is not a deeply flawed methodology which collapses under the weight of its own contradictions on a regular basis. The answer is to get better at hiring engineers, and ultimately to get great at it.

I may do a future blog post on this, because it's one of the most valuable skills in the world.



Credit where credit's due: the Agile Manifesto helped usher in a vital paradigm shift, in its day.