Chuck Jones is a vengeful god
Given my work schedule of late, this recent XKCD comic seems especially poignant. Pursuit of unattainable perfection? Why yes, I am working with CSS.
Just for one day though. Apologies for the missing entry yesterday and the lame entry today. Time is still tight, so I’ll just throw out a link to 5 Questions Season Two of Heroes Had Better F#@king Answer.
Unlike a certain show about people stranded on a mysterious island that we won’t name, by the end of its first season NBC’s hit series Heroes had managed to neatly wrap up the vast majority of its plot threads and running storylines. The cheerleader was saved; the sword was retrieved; and the exploding man was stopped. We didn’t watch the finale of the mystery island show that we’re not naming, but we wouldn’t be surprised if Locke was left speechless by the sight of Patrick Duffy in the shower. Had it all been a dream?
Some questions I have: Will they finally just get rid of Ali Larter’s dumbass subplot? Which lame, clichéd plot element will they get me to fall for anyway?
Update: The answer to my second question: Amnesia.
As I mentioned earlier in the week, my schedule is pretty tight, so my time for writing (and just about everything else) has been drastically reduced. So I’m just going to introduce my 2007 fantasy football team, the Star Wars Kids. I know most of my readers aren’t big sports fans, but I can probably dash this off in a half hour, which is about all the time I have. I did very well last year, but my team peaked early and lost in the first round of the playoffs.
I was a little worried about this year. First, I had almost no time to prepare for the draft, which isn’t usually a good sign. Second, the team I drafted seemed to be relying on a lot of “comeback” seasons (players who had a bad season or two due to injury or their team’s performance, but who could bounce back this year). Third, I ended up with a lackluster defense and a somewhat weak bench. This is due to my position in the draft. I was last, but the draft is a snake, so I had the 12th and 13th picks, then had to wait another two rounds for my next pick (36th overall). This position has its advantages, but it also meant that when a run on Defense/Special Teams happened, I ended up with scraps. Fourth, as an Eagles fan, I was frustrated to end up with Terrell Owens. He’s a great performer, but on a personal level, I hate him. And he plays for the mortal enemy of the Eagles. I also have the Cowboys defense & special teams. Put simply, when the Eagles play the Cowboys, I’m going to be pretty conflicted.
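A quick aside for the curious: those pick numbers fall straight out of the snake format. Here’s a minimal sketch – just illustrative Python, assuming a 12-team league with standard snake ordering – of where a given draft slot picks overall:

    def snake_picks(slot, teams=12, rounds=4):
        # Overall pick numbers for one draft slot in a snake draft.
        # Assumes standard snake ordering; league size is an assumption.
        picks = []
        for rnd in range(1, rounds + 1):
            if rnd % 2 == 1:  # odd rounds count up: slots 1..teams
                picks.append((rnd - 1) * teams + slot)
            else:             # even rounds reverse: slots teams..1
                picks.append(rnd * teams - slot + 1)
        return picks

    print(snake_picks(12))  # [12, 13, 36, 37]

Drafting last gets you back-to-back picks at 12 and 13, and then nothing until pick 36 – exactly the long wait I’m complaining about.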
Anyway, after one and a half weeks, it seems that the team I drafted is doing quite well for itself. Many of my gambles are paying off, and I may have underestimated some of my “sure things.” So here’s my team:
Just know that he’ll never be on my team. I can’t root for him. It’s not in me. When TO does something good, I don’t want to feel happy.
I don’t like rooting for him either. Makes me feel dirty. But he was a steal when I picked him up in the draft, and he’s paid off in spades. *sigh*
So there you have it, the 2007 Star Wars Kids. So far, they’ve performed far beyond expectations, putting up a league high (tied for first, actually) 107 fantasy points last week. This week, they look even better, putting up 117 points so far, and Brady still has a half game left and Akers plays tomorrow night. There are still lots of things that could go wrong, and I could peak early like I did last season, but I’m still happy with my team’s performance. I took a lot of gambles and picked several sleepers, and it looks like they’re all paying off… so far.
Update: Greg’s draft didn’t go as well as mine, but I think he’ll make do.
A few weeks ago, I wrote about how context matters when consuming art. As sometimes happens when writing an entry, that one got away from me and I never got around to the point I originally started with (that entry was originally entitled “Referential” but I changed it when I realized that I wasn’t going to write anything about references), which was how much of our entertainment these days references its predecessors. This takes many forms, some overt (homages, parody), some a little more subtle.
I originally started thinking about this while watching an episode of Family Guy. The show is infamous for its random cutaway gags – little vignettes that have no connection to the story, but which often make some obscure reference to pop culture. For some reason, I started thinking about what it would be like to watch an episode of Family Guy with someone from, let’s say, the 17th century. Let’s further speculate that this person isn’t a blithering idiot, but perhaps a member of the Royal Society or something (i.e. a bright fellow).
This would naturally be something of a challenge. Some technical explanations would be necessary. For example, we’d have to explain electricity, cable networks, signal processing, and how the television works (which at least involves discussions of light and color). The concept of an animated show, at least, would probably be easy to explain (though it would involve a discussion of how the human eye works, to a degree).
There’s more to it, of course, but moving past all that, once we start watching the show, we’re going to have to explain why we’re laughing at pretty much all of the jokes. Again, most of the jokes are simply references to and parodies of other pieces of pop culture. Watching an episode of Family Guy with Isaac Newton (to pick a prominent Royal Society member) would necessitate a pause just about every minute to explain what each reference was from and why Family Guy’s take on it made me laugh. Then there’s the fact that Family Guy rarely has any sort of redeeming lesson and often deliberately skews towards actively encouraging evil (something along the lines of “I think the important thing to remember is that it’s ok to lie, so long as you don’t get caught.” I don’t think that exact line is in an episode, but it could be.) This works fine for us, as we’re so steeped in popular culture that we get that Family Guy is lampooning the notion that we could learn important life lessons via a half-hour sitcom. But I’m sure Isaac Newton would be appalled.
For some reason, I find this fascinating, and try to imagine how I would explain various jokes. For instance, the episode I was watching featured a joke about the “cool side of the pillow.” They cut to a scene in bed where Peter flips over the pillow and sees Billy Dee Williams’ face, which proceeds to give a speech about how cool this side of the pillow is, ending with “Works every time.” This joke alone would require a whole digression into Star Wars and how most of the stars of that series struggled to overcome their typecasting and couldn’t find a lot of good work, so people like Billy Dee Williams ended up doing really cheesy commercials for a malt liquor called Colt 45, in which Billy Dee talked exactly like that. And so on. It could probably take an hour before my guest would even come close to understanding the context of the joke (and I’m barely scratching the surface with this post).
And the irony of this whole thing is that an explained joke simply isn’t funny. To be honest, I’m not even sure why I find these simple gags funny (that, of course, is the joy of humor – you don’t usually have to understand it or think about it; you just laugh). Seriously, why is it funny when Family Guy blatantly references some classic movie or show? Again, I’m not sure, but that sort of humor has been steadily growing over the past 30 years or so.
Not all comedies are that blatant about their referential humor though (indeed, Family Guy itself doesn’t rely solely upon such references). A recent example of a good referential film is Shaun of the Dead, which somehow manages to be both a parody of and an example of a good zombie movie. It pays homage to all the classic zombie films and it also makes fun of other genres (notably the romantic comedy), but in doing so, the filmmakers have also made a good zombie movie in itself. The filmmakers have recently released a new film called Hot Fuzz, which attempts the same trick for action movies and buddy comedies. It is, perhaps, not as successful as Shaun, but the sheer number of references in the film is astounding. There are the obvious and explicit ones like Point Break and Bad Boys II, but there are also tons of subtle homages that I’d wager most people wouldn’t get. For instance, when Simon Pegg yells in the movie, he’s doing a pitch-perfect impersonation of Arnold Schwarzenegger in Predator. And when he chases after a criminal, he imitates the way Robert Patrick’s T-1000 runs in Terminator 2.
References don’t need to be part of a comedy either (though comedies make for the easiest examples). Hop on IMDB, go to just about any recent movie, and click on the “Movie Connections” link in the left navigation. For instance, did you know that the aforementioned T2 references The Wizard of Oz and The Killing, amongst dozens of other films? Most of the time, these references are really difficult to pick out, especially when you’re viewing a foreign film or show that’s pulling from a different cultural background. References don’t have to be story or character based – they can be the way a scene is composed or the way the lighting is set (e.g. the Venetian blinds in noir films).
Now, this doesn’t just apply to art either. A lot of common knowledge in today’s world is referential. Most formal writing includes references and bibliographies, for instance, and a non-fiction book will often assume basic familiarity with its subject. When I was in school, I was always annoyed at the amount of rote memorization they made us do. Why memorize it if I could just look it up? Shouldn’t you be focusing on my critical thinking skills instead of making me memorize arbitrary lists of facts? Sometimes this complaining was probably warranted, but most of it wasn’t. So much of what we do in today’s world requires a well-rounded familiarity with a large number of subjects (history, science, and culture, amongst many other things). There simply isn’t any substitute for actual knowledge. Though it was a pain at the time, I’m glad emphasis was put on memorization during my education. A while back, David Foster noted that schools are actually moving away from this, and he makes several important distinctions. He uses a song as an example:
Jakob Dylan has a song that includes the following lines:
Cupid, don’t draw back your bow
Sam Cooke didn’t know what I know
Think of how much you need to know in order to understand these two simple lines:
1) You need to know that, in mythology, Cupid symbolizes love
2) And that Cupid’s chosen instrument is the bow and arrow
3) Also that there was a singer/songwriter named Sam Cooke
4) And that he had a song called “Cupid,” which included the line “Cupid, draw back your bow.”
… “Progressive” educators, loudly and in large numbers, insist that students should be taught “thinking skills” as opposed to memorization. But consider: If it’s not possible to understand a couple of lines from a popular song without knowing by heart the references to which it alludes–without memorizing them–what chance is there for understanding medieval history, or modern physics, without having a ready grasp of the topics which these disciplines reference?
And also consider: in the Dylan case, it’s not just what you need to know to appreciate the song. It’s what Dylan needed to know to create it in the first place. Had he not already had the reference points–Cupid, the bow and arrow, the Sam Cooke song–in his head, there’s no way he would have been able to create his own lines. The idea that he could have just “looked them up,” which educators often suggest is the way to deal with factual knowledge, would be ludicrous in this context. And it would also be ludicrous in the context of creating new ideas about history or physics.
As Foster notes, this doesn’t mean that “thinking skills” are unimportant, just that knowledge is important too. You need to have a quality data set in order to use those “thinking skills” effectively.
Human beings tend to leverage existing knowledge to create new knowledge. This has a lot of implications, one of which concerns intellectual property law. Keeping copyright limited is important, because the material in a protected work eventually becomes available for all to build upon. It’s ironic that educators are considering less of a focus on memorization, as this requirement of referential knowledge has been increasing for some time. Students need a base of knowledge to both understand and compose new works. References help you avoid reinventing the wheel every time you need to create something, which leads to my next point.
I think part of the reason references are becoming more and more common these days is that they make entertainment a little less passive. Watching TV or a movie is, of course, a passive activity, but if you make lots of references and homages, the viewer is required to think through those references. If the viewer has the appropriate knowledge, such a TV show or movie becomes a little more cognitively engaging. It makes you think, it calls to mind previous work, and it forces you to contextualize what you’re watching based on what you know about other works. References are part of the complexity of modern television and film, and Steven Johnson spends a significant amount of time on this subject in his book Everything Bad is Good for You (from page 85 of my edition):
Nearly every extended sequence in Seinfeld or The Simpsons, however, will contain a joke that makes sense only if the viewer fills in the proper supplementary information — information that is deliberately withheld from the viewer. If you haven’t seen the “Mulva” episode, or if the name “Art Vandelay” means nothing to you, then the subsequent references — many of them arriving years after their original appearance — will pass on by unappreciated.
At first glance, this looks like the soap opera tradition of plotlines extending past the frame of individual episodes, but in practice the device has a different effect. Knowing that George uses the alias Art Vandelay in awkward social situations doesn’t help you understand the plot of the current episode; you don’t draw on past narratives to understand the events in the present one. In the 180 Seinfeld episodes that aired, seven contain references to Art Vandelay: in George’s actually referring to himself with that alias or invoking the name as part of some elaborate lie. He tells a potential employer at a publishing house that he likes to read the fiction of Art Vandelay, author of Venetian Blinds; in another, he tells an unemployment insurance caseworker that he’s applied for a latex salesman job at Vandelay Industries. For storytelling purposes, the only thing that you need to know here is that George is lying in a formal interview; any fictitious author or latex manufacturer would suffice. But the joke arrives through the echo of all those earlier Vandelay references; it’s funny because it’s making a subtle nod to past events held offscreen. It’s what we’d call in a real-world context an “in-joke” — a joke that’s funny only to people who get the reference.
I know some people who hate Family Guy and Seinfeld, and I realized a while ago that they don’t hate those shows because of their content or because they were offended (though some people certainly are), but rather because they simply don’t get the references. They didn’t grow up watching TV in the 80s and 90s, so many of the references are simply lost on them. Family Guy would be particularly vexing if you didn’t share the writers’ pop culture knowledge. These reference-heavy shows are also a lot easier to watch and rewatch, over and over again. Why? Because each episode is not self-contained, you often find yourself noticing something new every time you watch. This also sometimes works in reverse. I remember the first time I saw Bill Shatner’s campy rendition of Rocket Man: I suddenly understood a bit on Family Guy that I had thought was just random (but was really a reference).
Again, I seem to be focusing on comedy, but this isn’t necessarily limited to that genre. Eric S. Raymond has written a lot about how science fiction jargon has evolved into a sophisticated code that implicitly references various ideas, conventions and tropes of the genre:
In looking at an SF-jargon term like, say, “groundcar”, or “warp drive” there is a spectrum of increasingly sophisticated possible decodings. The most naive is to see a meaningless, uninterpretable wordlike noise and stop there.
The next level up is to recognize that uttering the word “groundcar” or “warp drive” actually signifies something that’s important for the story, but to lack the experience to know what that is. The motivated beginning reader of SF is in this position; he must, accordingly, consciously puzzle out the meaning of the term from the context provided by the individual work in which it appears.
The third level is to recognize that “ground car” and “warp drive” are signifiers shared, with a consistent and known meaning, by many works of SF — but to treat them as isolated stereotypical signs, devoid of meaning save inasmuch as they permit the writer to ratchet forward the plot without requiring imaginative effort from the reader.
Viewed this way, these signs emphasize those respects in which the work in which they appear is merely derivative from previous works in the genre. Many critics (whether through laziness or malice) stop here. As a result they write off all SF, for all its pretensions to imaginative vigor, as a tired jumble of shopworn cliches.
The fourth level, typical of a moderately experienced SF reader, is to recognize that these signifiers function by permitting the writer to quickly establish shared imaginative territory with the reader, so that both parties can concentrate on what is unique about their communication without having to generate or process huge expository lumps. Thus these “stereotypes” actually operate in an anti-stereotypical way — they permit both writer and reader to focus on novelty.
At this level the reader begins to develop quite analytical habits of reading; to become accustomed to searching the writer’s terminology for what is implied (by reference to previous works using the same signifiers) and what kinds of exceptions and novelties convey information about the world and the likely plot twists.
It is at this level, for example, that the reader learns to rely on “groundcar” as a tip-off that the normal transport mode in the writer’s world is by personal flyer. At this level, also, the reader begins to analytically compare the author’s description of his world with other SFnal worlds featuring personal flyers, and to recognize that different kinds of flyers have very different implications for the rest of the world.
For example, the moderately experienced reader will know that worlds in which the personal fliers use wings or helicopter-like rotors are probably slightly less advanced in other technological ways than worlds in which they use ducted fans — and way behind any world in which the flyers use antigravity! Once he sees “groundcar” he will be watching for these clues.
The very experienced SF reader, at the fifth level, can see entire worlds in a grain of jargon. When he sees “groundcar” he associates to not only technical questions about flyer propulsion but socio-symbolic ones about why the culture still uses groundcars at all (and he has a repertoire of possible answers ready to check against the author’s reporting). He is automatically aware of a huge range of consequences in areas as apparently far afield as (to name two at random) the architectural style of private buildings, and the ecological consequences of accelerated exploitation of wilderness areas not readily accessible by ground transport.
While comedy makes for convenient examples, I think this better illustrates the cognitive demands of referential art. References require you to be grounded in various subjects, and they’ll often require you to think through the implications of those subjects in a new context. References allow writers to pack incredible amounts of information into even the smallest space. This, of course, requires the consumer to decode that information (using available knowledge and critical thinking skills), making the experience less passive and more engaging. The use of references will continue to flourish and accelerate in both art and scholarship, and new forms will emerge. One could even argue that the aggregation that goes on in various weblogs is simply an exercise in referential work. Just look at this post, in which I reference several books and movies, in many cases assuming familiarity. Indeed, the whole structure of the internet is based on the concept of links – essentially a way to reference other documents. Perhaps this is part of the cause of the rising complexity and information density of modern entertainment. We can cope with it now because we have such systems to help us out.
Last week, I hastily threw together a post on Coke, including some thoughts on Coke vs. Pepsi, the advertising of both brands, and Passover Coke. I’ve run across several people commenting on my post or similar issues over the past week.
Costco has conformed to CA and U.S. rules, such as CRV (the sort-of deposit you pay for the bottle) and “nutrition” labeling, so everything appears to be nice and legal. Of course you could always get your sugar water fix at some smaller grocers or taquerias by buying surprisingly expensive “bootlegged” bottles one at a time, but Costco will let Cokeheads stock up by the case at a relatively low price.
The Mexican Coke adds another wrinkle into the mix: it comes in glass bottles, which supposedly make the Coke taste better. I’m going to need to stock up on some regular Coke, Passover Coke, and Mexican Coke, and sure, let’s throw some Pepsi into the mix, and do a double-blind test to see which cola tastes the best. Alas, this will have to wait for next year… [link via Kottke]
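If that test ever happens, the tricky part will be keeping it honest. Here’s a minimal sketch of the blind setup – the cup-labeling scheme is just something I made up for illustration:

    import random

    colas = ["regular Coke", "Passover Coke", "Mexican Coke", "Pepsi"]

    def blind_assignments(colas):
        # Shuffle the colas and map them to anonymous cup labels.
        # The pourer keeps this mapping secret from both the taster and
        # whoever records the scores - that's what makes it double-blind.
        shuffled = random.sample(colas, len(colas))
        return {"cup %d" % (i + 1): cola for i, cola in enumerate(shuffled)}

    print(blind_assignments(colas))

Nothing fancy, but it makes the point: neither the taster nor the scorekeeper should know which cup holds which cola until the scores are in.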
And speaking of beer, I spent the previous weekend in Cooperstown. Sure, we visited the Baseball Hall of Fame Mvsevm, but the highlight of the trip for me was a visit to the Brewery Ommegang. It’s a surprisingly small operation, but that makes sense when you realize that it’s an expensive Belgian-style microbrew. I’m not a beer expert, but I think I’ve tried more varieties than your average person, and these are my absolute favorite beers of all time. Ommegang only makes 5 varieties, but they are all fantastic. Alas, you have to pay for that quality, but it’s worth it. In any case, the tour ends with a beer tasting and you can buy some beer at a slight discount, which I did, giving me this:
Awesome. Ok, I cheated a little. I already had the normal size bottles on the left, but still, that’s an impressive array of beer. Looks like I’ve got some work to do!
I love Coca-Cola. I hate Pepsi. I probably wouldn’t feel like that if it weren’t for my parents. My brother prefers Pepsi. For reasons beyond my understanding, my parents nurtured this conflict. This is strange, since they generally just bought whatever was on sale (and we grew up during the whole cola wars episode, so there were lots of sales). This manifested in various ways throughout the years, but the end result is that our preferences polarized. When I go to a restaurant and ask for a “Coke” and they ask if Pepsi is ok, I generally change my order to something else (root beer, water, etc.). Now, I’m not rude or even very confrontational about it, but this guy sure is:
“I’d like a Coca-Cola, please,” I told the waiter.
“Will Pepsi be OK?” he replied.
“No, I’d like a Coke,” I said.
“We serve only Pepsi products,” he stammered.
“Does anyone ever ask for a Coke?” I asked.
“All the time,” he said, “but we serve Pepsi.”
“Could you run down to the 7-11 and get me a Coke — they have plenty over there?” I asked with a smile.
Now, I’ve seen people say “No, Pepsi is not ok,” but asking the waiter to run down to the 7-11 is pure, diabolical genius. Still, most of us Coke fiends aren’t rude about our preferences. Take John Scalzi, who wrote a great essay on Coca-Cola a while ago, delving into the advertising of Coke and Pepsi:
I think there really is something to how Coke positions itself. One hates to admit that one is influenced by corporate branding — it means that those damned advertisers actually managed to do their job — but what can you say. It works. Since Coke is the market leader, it doesn’t spend any time as far as I can see banging on Pepsi or other brands; its ads stick to their knitting, which is making sure that people feel that Coke is part of everyday life — and at some point during your day, you’re probably going to have a Coke. It’s inevitable. And hey — that’s okay. That’s as it should be, in fact. I don’t know that I would call Coke’s ads soft sells (after all, they brand the product literally up the wazoo), but I don’t find the advertising utterly annoying.
Which brings us back to Pepsi. Pepsi is eternally positioning itself as the outsider — “Pepsi Generation,” “Generation Next,” so on and so forth. Always young, always fun, always mildly rebellious, yadda yadda yadda. Since one goes in knowing that Pepsi is a multibillion-dollar corporation, I’ve always found the rebellion angle amusing (and not just in Pepsi’s case — if you’re a company that’s big enough to advertise your wares every single day on national networks, you’ve gotten just a bit beyond being the rebel’s choice, now, haven’t you?). Being a rebel doesn’t really work for me — most of what is positioned as being a rebel is actually not rebellion, merely sullenness and inarticulateness. And really, I’m just too bourgeois for that at this point in my life. … Besides, Pepsi can’t seem to advertise itself without bringing up the point that Coke exists, and is the better-selling brand.
And it goes on for a bit too. Great article.
This year, I learned about the existence of Passover Coke. The current Coke formula uses corn syrup as a sweetener because it’s cheaper than pure cane sugar, but since corn isn’t kosher for Passover, Coke makes some special batches of cola using pure cane sugar. It’s only available in limited quantities for a few weeks a year (you can tell it apart because it’s got a yellow cap with Hebrew writing on it). I didn’t get a chance to do a taste test this year, but Widge did, and he says that people prefer Passover Coke to regular Coke. This, of course, leads him to make the obvious suggestion:
Look. I know it’s easier to work with and cheaper and all that good stuff. But let’s face it: consumers are trying to get away from the high fructose stuff. I don’t pretend to even understand all the health controversy that’s going on, I tried to read up on the Wikipedia article before writing this and it mentioned “plasma triacylglycerol” and my eyes sort of glazed over (mmmm, glaze). It sounds like something the crew of Star Trek Voyager would seek out while being chased by cauliflower-headed aliens. But forget all that: it just freaking tastes better. That’s all I care about, because if I was really concerned about my health, why would I be drinking Coke?
No offense.
Anyway, it’s obvious you can make the stuff. It’s obvious there’s a market. I know just what to do: make a huge deal about how you believe in consumer choice and the market deciding things and release it as Coca-Cola Prime. Hell, if it’s more expensive, charge more for it. Think about it: GET PRIMED WITH COKE. See? I’m giving you a campaign for free!
I’d buy it. Good stuff.
Yes, time is still short these days, so just a few links featuring lots and lots of pictures:
That’s all for now. Sorry for the lameness of recent bloggery, but again, time is short.
Time is short this week (I know, what else is new, but it’s especially short this week) so I’ll just point to some of the funniest photoshopped 300 parodies I’ve seen [via NeedCoffee]. I’ve included one of my favorites below, but I think the best one is in an animated gif about halfway down the page.
Some other good ones: King Leonidas, Zidane Style, Wile E. Coyote, and This is Ping Pong. Oh, and of course, the PG Version.
As I waded through dozens of recommendations for Anime series (thanks again to everyone who contributed), I began to wonder about a few things. Anime seems to be a pretty vast subject, and while I had scratched the surface in the past, I really didn’t have a good feel for what was available. So I asked for recommendations, and now I’m on my way. But it’s not like I just realized that I wanted to watch more Anime. I’ve wanted to do that for a while, but I’ve only recently acted on it. What took so long? Why is it so hard to get started?
This isn’t something that’s limited to deciding what to watch either. I find that just getting started is often the most difficult part of a task (or, at least, the part I seem to get stuck on the most). Sometimes it’s difficult to deal with the novelty of a thing, other times a project seems completely overwhelming. But after I’ve begun, things don’t seem so novel or overwhelming anymore. I occasionally find myself hesitant to start a new book or load up a new video game, but once I do, things flow pretty easily (unless the book or game is a really bad one). I have a bunch of ideas for blog posts that I never get around to attacking, but usually once I start writing, ideas flow much more readily. At work, I’ll sometimes find myself struggling to get started on a task, but once I get past that initial push, I’m fine. Sure, there are excuses for all of these (interruptions, email, and meetings, for instance), but while they are sometimes true obstacles, they often strike me as rationalizations. Just getting started is the problem, but once I get into the flow, it’s easy to keep going.
Joel Spolsky wrote an excellent essay on the subject called Fire and Motion:
Many of my days go like this: (1) get into work (2) check email, read the web, etc. (3) decide that I might as well have lunch before getting to work (4) get back from lunch (5) check email, read the web, etc. (6) finally decide that I’ve got to get started (7) check email, read the web, etc. (8) decide again that I really have to get started (9) launch the damn editor and (10) write code nonstop until I don’t realize that it’s already 7:30 pm.
Somewhere between step 8 and step 9 there seems to be a bug, because I can’t always make it across that chasm. For me, just getting started is the only hard thing. An object at rest tends to remain at rest. There’s something incredible heavy in my brain that is extremely hard to get up to speed, but once it’s rolling at full speed, it takes no effort to keep it going.
It’s an excellent point, and there does seem to be some sort of mental inertia at work here. But why? Why is it so difficult to get started?
When I think about this, I realize that this is a relatively new phenomenon for me. I don’t remember having this sort of difficulty ten years ago. What’s different? Well, I’m ten years older. The conventional wisdom is that it becomes more difficult to learn new things (i.e. to start something new) as you get older. There is some supporting evidence having to do with how the human brain becomes less malleable with time, but I’m not sure that paints the full picture. I think a big part of the problem is that as I got older, my standards rose.
Let me back up for a moment. A few years ago, a friend attempted to teach me how to drive a stick. I’d driven an automatic transmission my whole life up until that point, so the process of learning a manual transmission proved to be a challenging one. The actual mechanics of it are pretty straightforward and easily internalized. Sitting down and actually doing it, though, was another story. Intellectually, I knew what was going on, but it can be difficult to overcome muscle memory. I had a lot of trouble at first (and since I haven’t driven a stick since then, I’d probably still have a lot of trouble today) and got extremely frustrated. My friend (who had gone through the same thing herself) laughed at this, making my lack of success even more infuriating. Eventually she explained that it wasn’t that I was doing a bad job. It was that I was so used to being able to pick up something new and run with it that when I had to do something extra challenging that took a little longer to pick up, I became frustrated. In short, I had higher standards for myself than I should have.
I think, perhaps, that’s why it’s difficult to start something new. It’s not that learning has become harder, it’s that I’ve become less tolerant of failure. My standards are higher, and that will sometimes make it hard to start something. This post, for example, has been brewing in my head for a while, but I had trouble getting started. This happens all the time, and I’ve actually got a bunch of ideas for posts stashed away somewhere. I’ve even written about this before, though only in a tangential way:
This weblog has come a long way over the three and a half years since I started it, and at this point, it barely resembles what it used to be. I started out somewhat slowly, just to get an understanding of what this blogging thing was and how to work it (remember, this was almost four years ago and blogs weren’t nearly as common as they are now), but I eventually worked up to posting about once a day, on average. At that time, a post consisted mainly of a link and maybe a summary or some short commentary. Then a funny thing happened: I noticed that my blog was identical to any number of other blogs, and thus wasn’t very compelling. So I got serious about it, and started really seeking out new and unusual things. I tried to shift focus away from the beaten path and started to make more substantial contributions. I think I did well at this, but it couldn’t really last. It was difficult to find the offbeat stuff, even as I pored through massive quantities of blogs, articles and other information (which caused problems of its own). I slowed down, eventually falling into an extremely irregular posting schedule on the order of once a month, which I have since attempted to correct, with, I hope, some success. I recently noticed that I have been slumping somewhat, though I’m still technically keeping to my schedule.
Part of the reason I was slumping back then was that my standards were rising again. The problem is that I want what I write to turn out well, and my standards are high (relatively speaking – this is only a blog, after all). So when I sit down to write, I wonder if I’ll actually be able to do the subject justice. At a certain point, though, you just have to pull the trigger and get started. The rest comes naturally. Is this post better than I had imagined? Probably not, but if I waited until it was perfect, I’d never post anything (and besides, that sorta defeats the purpose of blogging).
One of the things I’ve noticed since changing my schedule to post at least twice a week is that it forces me to lower my standards a bit, just so that I can get something out on time. Back when I started the one-post-a-week schedule, I found that those posts were getting pretty long. I thought they were pretty good too, but as time went on, I wasn’t able to keep up with my rising expectations. There’s nothing inherently wrong with high expectations, but I’ve found it’s good every now and again to adjust course. Even a well-made clock drifts and must be calibrated from time to time, and so we must calibrate ourselves from time to time as well.
Update 3.15.07: It occurs to me that this post is overly serious and may give you the wrong idea. In the comments, Pete notes that watching Anime is supposed to be fun. I agree wholeheartedly, and I didn’t mean to imply otherwise. The same goes for blogging – I wrote a decent amount in this post about how blogging is difficult for me, but that’s not really the right way to put it. I enjoy blogging too; that’s why I do it. Sometimes I overthink things, and that’s probably what I was doing in this post, but I think the main point holds: learning can be impaired by high standards.
Roy over at 79Soul has started a series of posts dealing with Intellectual Property. His first post sets the stage with an overview of the situation, and he begins to explore some of the issues, starting with the definition of theft. I’m going to cover some of the same ground in this post, and then some other things which I assume Roy will cover in his later posts.
I think most people have an intuitive understanding of what intellectual property is, but it might be useful to start with a brief definition. Perhaps a good place to start would be Article 1, Section 8 of the U.S. Constitution:
To promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries;
I started with this for a number of reasons. First, because I live in the U.S., and most of what follows deals with U.S. IP law. Second, because it’s actually a somewhat controversial stance. The fact that IP is only secured for “limited Times” is the key. In England, for example, an author does not merely hold a copyright on their work; they have a Moral Right.
The moral right of the author is considered to be — according to the Berne convention — an inalienable human right. This is the same serious meaning of “inalienable” the Declaration of Independence uses: not only can’t these rights be forcibly stripped from you, you can’t even give them away. You can’t sell yourself into slavery; and neither can you (in Britain) give the right to be called the author of your writings to someone else.
The U.S. is different. It doesn’t grant an inalienable moral right of ownership; instead, it grants copyright. In other words, in the U.S., such works are considered property (i.e. they can be sold, traded, bartered, or given away). This represents a fundamental distinction: some systems treat authorship as an inalienable personal right, while the U.S. grants a limited, transferable property right. When put that way, the U.S. system sounds pretty awful, except that it was designed for something different: our system was built to advance science and the “useful arts.” The U.S. system still rewards creators, but only as a means to an end. Copyright is granted so that there is an incentive to create. However, such protections are only granted for “limited Times.” This is because when a copyright is eternal, the system stagnates, as protected parties stifle competition (this need not be malicious). Copyright is thus limited so that when a work is no longer protected, it becomes freely available for everyone to use and to build upon. This is known as the public domain.
The end goal here is the advancement of society, and both protection and expiration are necessary parts of the mix. The balance between the two is important, and as Roy notes, one of the things that appears to have upset the balance is technology. This, of course, extends as far back as the printing press, records, cassettes, VHS, and other similar technologies, but more recently, the convergence of new compression techniques and increasing internet bandwidth created an issue. Most new recording technologies were greeted with concern, but physical limitations and costs generally put a cap on the amount of damage that could be done. With computers and large networks like the internet, such limitations became almost negligible. Digital copies of protected works became easy to duplicate and distribute on a very large scale.
The first major issue came up as a result of Napster, a peer-to-peer music sharing service that essentially promoted widespread copyright infringement. Lawsuits followed, and the original Napster service was shut down, only to be replaced by numerous decentralized peer-to-peer systems and darknets. This meant that no single entity could be sued for the copyright infringement that occurred on the network, but it resulted in a number of (probably ill-advised) lawsuits against regular folks (the anonymity of internet technology and the state of recordkeeping being what they are, this sometimes leads to hilarious cases, like when the RIAA sued a 79-year-old man who doesn’t even own a computer or know how to operate one).
Roy discusses the various arguments for and against this sort of file sharing, noting that the essential difference of opinion is over the definition of the word “theft.” For my part, I think it’s pretty obvious that downloading something for free that you’d normally have to pay for is morally wrong. However, I can see some grey area. A few months ago, I pre-ordered Tool’s most recent album, 10,000 Days, from Amazon. A friend who already had the album sent me a copy over the internet before I had actually received my copy of the CD. Does this count as theft? I would say no.
The concept of borrowing a book, CD, or DVD also seems pretty harmless to me, and I don’t have a moral problem with borrowing an electronic copy, then deleting it afterwards (or purchasing it, if I liked it enough), though I can see how such a practice represents a bit of a slippery slope and wouldn’t hold up in an honest debate (nor should it). It’s too easy to abuse such an argument, or to apply it in retrospect. I suppose there are arguments to be made with respect to making distinctions between benefits and harms, but I generally find those arguments unpersuasive (though perhaps interesting to consider).
There are some other issues that need to be discussed as well. The concept of Fair Use allows limited use of copyrighted material without requiring permission from the rights holders – for example, including a screenshot of a film in a movie review. You’re also allowed to parody copyrighted works, and in some instances make complete copies of a copyrighted work. There are rules pertaining to how much of a copyrighted work can be used and in what circumstances, but this is not the venue for such details. The point is that copyright is not absolute, and consumers have rights as well.
Another topic that must be addressed is Digital Rights Management (DRM). This refers to a range of technologies used to combat digital copying of protected material. The goal of DRM is to use technology to automatically limit the abilities of a consumer who has purchased digital media. In some cases, this means that you won’t be able to play an optical disc on a certain device, in others it means you can only use the media a certain number of times (among other restrictions).
To be blunt, DRM sucks. For the most part, it benefits no one. It’s confusing, it basically amounts to treating legitimate customers like criminals while only barely (if that much) slowing down the piracy it purports to be thwarting, and it’s led to numerous disasters and unintended consequences. Essential reading on this subject is this talk given to Microsoft by Cory Doctorow. It’s a long but well-written and straightforward read that I can’t summarize briefly (please read the whole thing). Some details of his argument may be debatable, but as a whole, I find it quite compelling. Put simply, DRM doesn’t work, and it’s bad for artists, businesses, and society as a whole.
Now, the IP industries that are pushing DRM are not that stupid. They know DRM is a fundamentally absurd proposition: the whole point of selling IP media is so that people can consume it. You can’t make a system that will prevent people from doing so, as the whole point of having the media in the first place is so that people can use it. The only way to perfectly secure a piece of digital media is to make it unusable (i.e. the only perfectly secure system is a perfectly useless one). That’s why DRM systems are broken so quickly. It’s not that the programmers are necessarily bad, it’s that the entire concept is fundamentally flawed. Again, the IP industries know this, which is why they pushed the Digital Millennium Copyright Act (DMCA). As with most laws, the DMCA is a complex beast, but what it boils down to is that no one is allowed to circumvent measures taken to protect copyright. Thus, even though the copy protection on DVDs is obscenely easy to bypass, it is illegal to do so. In theory, this might be fine. In practice, this law has extended far beyond what I’d consider reasonable and has also been heavily abused. For instance, some software companies have attempted to use the DMCA to prevent security researchers from exposing bugs in their software. The law is sometimes used to silence critics by threatening them with a lawsuit, even though no copyright infringement was committed. The Chilling Effects project seems to be a good source for information regarding the DMCA and its various effects.
DRM combined with the DMCA can be stifling. A good example of how awful DRM is, and how the DMCA can affect the situation, is the Sony Rootkit Debacle. Boing Boing has a ridiculously comprehensive timeline of the entire fiasco. In short, Sony put DRM on certain CDs. The general idea was to prevent people from putting the CDs in their computers and ripping them to MP3s. To accomplish this, Sony surreptitiously installed software on customers’ computers (without their knowledge). A security researcher happened to notice this, and in researching the matter found that the Sony DRM had installed a rootkit that made the computer vulnerable to various attacks. Rootkits are black-hat cracker tools used to disguise the workings of malicious software. Attempting to remove the rootkit broke the Windows installation. Sony reacted slowly and poorly, releasing a service pack that supposedly removed the rootkit but actually opened up new security vulnerabilities. And it didn’t end there. Reading through the timeline is astounding (as a result, I tend to shy away from Sony these days). Though I don’t believe he was called on it, the security researcher who discovered these vulnerabilities was technically breaking the law, because the rootkit was intended to protect copyright.
A few months ago, my Windows computer died and I decided to give Linux a try. I wanted to see if I could get Linux to do everything I needed it to do. As it turns out, I could, but not legally. Watching DVDs on Linux is technically illegal, because I’m circumventing the copy protection on DVDs. Similar issues exist for other media formats. The details are complex, but in the end, it turns out that I’m not legally able to watch my legitimately purchased DVDs on my computer (I have since purchased a new computer that has an approved player installed). Similarly, if I were to purchase a song from the iTunes Music Store, it comes in a DRMed format. If I wanted to use that song on a portable device that doesn’t support Apple’s DRM format (my phone, say), I’d have to convert it to a format that device could understand, which would be illegal.
Which brings me to my next point: DRM isn’t really about protecting copyright. I’ve already established that it doesn’t really accomplish that goal (and indeed, it even works against many of the reasons copyright was put into place), so why is it still being pushed? One can only speculate, but I’ll bet that part of the issue has to do with IP owners wanting to “undercut fair use and then create new revenue streams where there were previously none.” To continue an earlier example, if I buy a song from the iTunes Music Store and want to put it on my non-Apple phone (not that I don’t want one of those), the music industry would just love it if I were forced to buy the song again, in a format that is readable by my phone. Of course, that format would be incompatible with other devices, so I’d have to purchase the song yet again if I wanted to listen to it on those devices. When put in those terms, it’s pretty easy to see why IP owners like DRM, and given the average person’s reaction to such a scheme, it’s also easy to see why IP owners are always careful to couch the debate in terms of piracy. This won’t last forever, but it could be a bumpy ride.
Interestingly enough, distributors of digital media like Apple and Yahoo have recently come out against DRM. For the most part, these are just symbolic gestures. Cynics will look at Steve Jobs’ Thoughts on Music and say that he’s just passing the buck. He knows customers don’t like or understand DRM, so he’s making a calculated PR move by blaming it on the music industry. Personally, I can see that, but I also think it’s a very good thing. I find it encouraging that other distributors are following suit, and I also hope and believe this will lead to better things. Apple has proven that there is a large market for legally purchased music files on the internet, and other companies have even shown that selling DRM-free files yields higher sales. Indeed, the emusic service sells high quality, variable bit rate MP3 files without DRM, and that has established emusic as the #2 retailer of downloadable music behind the iTunes Music Store. Incidentally, this was not done for pure ideological reasons – it just made business sense. As yet, these pronouncements are only symbolic, but now that online media distributors have established themselves as legitimate businesses, they have ammunition with which to challenge the IP holders. This won’t happen overnight, but I think the process has begun.
Last year, I purchased a computer game called Galactic Civilizations II (and posted about it several times). This game was notable to me (in addition to the fact that it’s a great game) in that it was the only game I’d purchased in years that featured no CD copy protection (i.e. DRM). As a result, when I bought a new computer, I experienced none of the usual fumbling for 16-digit CD keys that I normally experience when trying to reinstall a game. Brad Wardell, the owner of the company that made the game, explained his thoughts on copy protection on his blog a while back:
I don’t want to make it out that I’m some sort of kumbaya guy. Piracy is a problem and it does cost sales. I just don’t think it’s as big of a problem as the game industry thinks it is. I also don’t think inconveniencing customers is the solution.
For him, it’s not that piracy isn’t an issue, it’s that it’s not worth imposing draconian copy protection measures that infuriate customers. The game sold much better than expected. I doubt this was because it didn’t use DRM, but I can guarantee one thing: people don’t buy games because they want DRM. At the very least, this shows that you don’t need DRM to make a successful game.
The future isn’t all bright, though. Peter Gutmann’s excellent Cost Analysis of Windows Vista Content Protection provides a good example of how things could get considerably worse:
Windows Vista includes an extensive reworking of core OS elements in order to provide content protection for so-called “premium content”, typically HD data from Blu-Ray and HD-DVD sources. Providing this protection incurs considerable costs in terms of system performance, system stability, technical support overhead, and hardware and software cost. These issues affect not only users of Vista but the entire PC industry, since the effects of the protection measures extend to cover all hardware and software that will ever come into contact with Vista, even if it’s not used directly with Vista (for example hardware in a Macintosh computer or on a Linux server).
This is infuriating. In case you can’t tell, I’ve never liked DRM, but at least it could be avoided. I generally take articles like this with a grain of salt, but if true, it means that the DRM in Vista is so oppressive that it will raise the price of hardware… And since Microsoft commands such a huge share of the market, hardware manufacturers have to comply, even though some people (Linux users, Mac users) don’t need the draconian hardware requirements. This is absurd. Microsoft should have enough clout to stand up to the media giants; there’s no reason the DRM in Vista has to be so invasive (or even exist at all). As Gutmann speculates in his cost analysis, some of the potential effects are particularly egregious, to the point where I can’t see consumers standing for it.
My previous post dealt with Web 2.0, and in it I posted a YouTube video summarizing how changing technology is going to force us to rethink a few things: copyright, authorship, identity, ethics, aesthetics, rhetorics, governance, privacy, commerce, love, family, ourselves. All of these are true. Earlier, I wrote that the purpose of copyright is to benefit society, and that protection and expiration are both essential. The balance between protection and expiration has been upset by technology. We need to rethink that balance. Indeed, many people smarter than I already have. The internet is replete with examples of people who have profited by giving things away for free. Creative Commons allows you to share your content so that others can reuse and remix it, but I don’t think it has been adopted to the extent that it should be.
To some people, reusing or remixing music, for example, is not a good thing. This is certainly worthy of a debate, and it is a discussion that needs to happen. Personally, I don’t mind it. For an example of why, watch this video detailing the history of the Amen Break. There are amazing things that can happen as a result of sharing, reusing and remixing, and that’s only a single example. The current copyright environment seems to stifle such creativity, not least because copyright lasts so long (currently the life of the author plus 70 years – an author who publishes something at 30 and lives to 80 locks that work up for 120 years). In a world where technology has enabled an entire generation to accelerate the creation and consumption of media, it seems foolish to lock up so much material for what could easily be over a century. Despite all that I’ve written, I have to admit that I don’t have a definitive answer. I’m sure I can come up with something that would work for me, but this is larger than me. We all need to rethink this, and many other things. Maybe that Web 2.0 thing can help.
Update: This post has mutated into a monster. Not only is it extremely long, but I reference several other long, detailed documents and even somewhere around 20-25 minutes of video. It’s a large subject, and I’m certainly no expert. Also, I generally like to take a little more time when posting something this large, but I figured getting a draft out there would be better than nothing. Updates may be made…
Update 2.15.07: Made some minor copy edits, and added a link to an Ars Technica article that I forgot to add yesterday.