
Archive for the ‘game development’ Category

Video games as art

April 6th, 2011

One of the issues I have a fairly big stake in is the intellectual argument about video games as art. I seriously doubt that in this short blog post, I’ll even scratch the surface of my thoughts on this subject, but I feel like I should get something down.

I follow Roger Ebert’s twitter feed, and I really appreciate his commentary on film and his opinions, but one place we absolutely differ is when it comes to video games as art. Mr. Ebert recently tweeted about a writer from his site, Michael Mirasol, and the discussion that his defense of video games had generated (read it here). Given that I’m usually surrounded daily by like-minded opinions at work, it hadn’t really occurred to me how the “mainstream” feels about the video game/art debate. Mirasol reminded me that Ebert himself has said that video games will never be considered art. I don’t think I can, in the space of this particular post (and while I’m on my iPad), explain why he’s wrong, but what I can do is provide a bit of framework for the debate.

I graduated from UC Santa Barbara with a degree in film and media studies. The reason I chose to pursue a film major is not because I love film or because I wanted an easy major (prior to majoring in film, I’d finished 3/4ths of a pure mathematics degree), but because the educational establishment has not yet recognized video games as a legitimate field of study. Yet film and video games are essentially one and the same beast.

Even a cursory analysis of the history of film shows that it grew from idle children’s distractions (like the zoopraxiscope, a “flip book” on a wheel) to inventions that combined the best technological advancements (and minds: Thomas Edison had a large hand in the birth and popularization of the medium) of a generation to provide… what content? Five-second salacious clips of two relatively old people kissing? A 30-second staged film of a train arriving? No sound or color? Hammy overacting? People rail against violent video games, but film can claim such minor shames as reigniting the Ku Klux Klan within 20 years of its inception. Video games are approximately 40 years old and have yet to inspire an entire generation to hate again.

Any scholar looking at this upstart medium in the early 20th century would have seen the lurer of children; the domain of Jews (when being non-Christian was anathema; and yes, I’m familiar with research showing that Jewish ownership of cinemas was not actually disproportionate compared to other groups, but we’re talking about perception here); the purveyor of moral turpitude; the vehicle of pornography and cheap thrills. That scholar would have been hard pressed to predict that by the end of that century, it would be cemented in the minds of the upper echelons of the educational establishment that film is, of course, an art.

Of course, it would be unfair to judge cinema by comparing what it was at its start to what it became. But the visionaries of the day certainly saw its potential. And scholars saw the way human subjectivity was expressed through the medium: how a story told by this director or acted out by that actress was different from the story expressed by another director or actress, and how meaning was made by that transaction. Scholars saw how the audience surrendered themselves to the screen and imagined themselves as the camera, their eyes becoming the objective lens that swept through scenes. Scholars, as they are wont to do, saw penises where there were only spires, great communist struggles where there were only bored office workers, and patriarchal oppression in Rudolph Valentino in a bathing suit (newsflash: Rudy was a dude)… The point is, they saw themselves in it. They saw the mirror that is art.

To make a one-to-one comparison between film and video games would be illogical, though. Each is a medium with different potentials and different means of achieving them. Film purists (whom I will define as people who believe film to be an art to the exclusion of video games) will mention that video games embody a competitive aspect that forces them away from a representation of reality, for instance. Yet they do not point out that the narrative structure of (nearly all) film is in no way reflective of reality, which is not always parceled out into neat heaps of acts. Film purists also tend to be largely ignorant of the massive body of theory in game design, which sets up a relationship between the audience and the game designer, who communicates not just his or her own vision but the vision of hundreds of creative professionals who work incredibly hard to craft an experience. Whether that experience is the thrill of completing a goal or the melancholy of loss (both typical abstract goals in film and video games alike), it is a focused experience intended to invoke a real reaction in its consumer. If there were another point to art, someone’d better inform me.

Categories: game design, game development

[REDACTED] News

April 1st, 2011

So I haven’t been able to write for a while. February was a tremendously busy month for me. At Appiction, we both closed and finished off the biggest single-app deal that we’ve ever done in a project that… I can’t talk about yet. (But I was happy to be the lead designer of it!) Another one of our Appiction apps that I designed has been getting some major ink, but I don’t think I’m officially allowed to say anything about it yet either. And earlier this week I was happy to do the dev handoff for another project I designed that I’m not allowed to talk about (but honestly, I should be able to… it’s the coolest little application for its niche ever).

And my friends at OAK9 have been putting in some of the best work that I’ve ever seen on an iPhone video game, for an upcoming action title that… I can’t talk about yet. (But I was happy to be the lead designer of it, too!)

And meanwhile, production has kicked up on my desk as I’m working on the iPad (read: HD/enhanced) version of a game that has been bumping around at Appiction since I’ve been here (back in September). I was able to join the design team for the iPhone version, but the iPad version? I’m developing it myself. So believe me, I’m super excited about it and would love to talk about it, but I can’t yet.

So, that’s what I’ve been up to lately. Informative, no? 🙂

What the iPad’s Rumored 260 dpi Display Might Mean for Developers

January 27th, 2011

According to MacRumors, the iPad 2, which should be coming along later this quarter, is scheduled to have a 260 pixels-per-inch “Retina” display, though there is some dispute over whether the display, which has a lower pixel density than the iPhone 4’s, really counts as a Retina display.

As a guy who (happily) used a netbook for about a year and a half as his primary computer, I think people are missing the forest for the trees. A 260 dpi display at roughly 9″ yields a screen resolution of 2048×1536! My netbook, like just about every netbook of its generation, is (frustratingly) only 1024×600. Doubling that resolution (and nearly tripling it on the vertical axis) will make for a no-doubt better, crisper viewing experience. I didn’t dive into the first iPad because of its myriad shortcomings (and a lack of initial content), but having played with them at work and enjoyed the now flourishing application support, I will probably (possibly literally) be in line for the next-gen iPad as soon as it’s available.
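As a sanity check on that figure, pixel density follows directly from resolution and diagonal screen size. This little sketch is mine, not from the article; the 9.7″ and 10.1″ diagonals are assumptions based on the iPad’s actual screen size and a typical netbook of the era:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density: diagonal length in pixels divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2048, 1536, 9.7)))   # rumored iPad 2 panel: prints 264
print(round(ppi(1024, 600, 10.1)))   # typical 10.1" netbook: prints 118
```

At 2048×1536 the diagonal works out to exactly 2560 pixels, which is where the roughly-260 figure comes from.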

When I started at Appiction, they had me slicing images for development, and a lot of the graphic design projects came in a mad flurry, where I was receiving and slicing up 2 or 3 projects per week. For those who don’t know, slicing involves taking mocked-up final art pieces and making images out of them that can be used in development. Since Appiction rarely designs things with the standard Cocoa toolkit available in Apple’s Interface Builder (instead relying on flashier designs even for mundane things like navbars), we would have to export fancy bars for the developers. It took a bit longer (the Interface Builder method literally involves dragging and dropping the navbar you want; our method involved creating an entirely new object from the image of a navbar), but the results are typically more visually interesting, especially compared to Apple’s standard apps, which can look a bit mundane in their familiarity.

In slicing applications up, it was a joy to learn that the iPhone 4’s resolution is precisely double that of the iPhone 3GS and below: 640 x 960 versus 320 x 480. This allowed a very simple scheme for developing applications, where we would export both sizes for development. If you’ve ever wondered why some un-updated app icons look fuzzy on your iPhone 4’s retina display, that’s why. There is a lot more detail that can be shown on the iPhone 4’s screen, since each dimension is doubled.
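The pixel-doubling scheme amounts to trivial arithmetic; the asset names and sizes below are illustrative, not from a real Appiction project:

```python
# Each dimension doubles from the iPhone 3GS to the iPhone 4.
BASE_SCREEN = (320, 480)   # iPhone 3GS and earlier, in pixels
RETINA_SCALE = 2           # iPhone 4 scale factor

def retina_size(size, scale=RETINA_SCALE):
    """Pixel size of the 2x export of an asset sliced at 1x."""
    w, h = size
    return (w * scale, h * scale)

print(retina_size(BASE_SCREEN))   # full-screen art: prints (640, 960)
print(retina_size((320, 44)))     # a full-width navbar: prints (640, 88)
```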

What this means for an iPad at 260 dpi is an extremely high-resolution interface, but for developers, a relatively minor headache. Apple supports the doubled resolution with the “@2x” suffix on file names, allowing devs to label the standard-resolution navbar (for the original iPad) “navbar.png” and its 260 dpi “Retina” counterpart “navbar@2x.png”. The OS handles the rest. Simple!
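The naming convention boils down to a filename lookup at load time. This sketch mimics the idea in plain Python; it is not UIKit’s actual implementation, and the filenames are made up:

```python
import os

def asset_filename(name, scale, available):
    """Pick the file to load for `name` at a given screen scale.

    Falls back to the 1x asset (which gets scaled up and looks
    fuzzy) when no higher-resolution version was shipped.
    """
    base, ext = os.path.splitext(name)
    if scale > 1:
        candidate = f"{base}@{scale}x{ext}"
        if candidate in available:
            return candidate
    return name

shipped = {"navbar.png", "navbar@2x.png", "icon.png"}
print(asset_filename("navbar.png", 2, shipped))  # prints navbar@2x.png
print(asset_filename("icon.png", 2, shipped))    # prints icon.png (fuzzy fallback)
```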

My friends have been asking me to talk a bit more about Android, but the thing with Android is that this ease simply doesn’t exist. Android does have a somewhat more robust feature set for slicing images and buttons that is integrated throughout an application (for instance, you can make a start-up screen compatible with every Android device, past, present and future, by instituting some smarter design standards that I’ve tried to help start here at Appiction). But it’s not so simple in a fully immersive application where you are redesigning the entire interface, such as a game.

Android resolutions don’t neatly scale up; rather, Android supports resolutions like 480×800 and 480×854 simultaneously. Some of this is philosophically consistent, but in terms of providing the same robust application space (especially in gaming) that you see on the iOS platforms, I think the jury is still out. I think it’s a bit unwise of Google not to impose some controls on the types of products their OS runs on, or alternatively, to provide a space in which users recognize that their device is not a “universal” device: that it will have the horsepower and specs to run some applications, but not others. Kind of like what existed with Windows gaming in the ’90s and, to some extent, today.
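For contrast, Android’s answer to this zoo of resolutions is density buckets and the density-independent pixel (dp), where one dp equals one physical pixel at 160 dpi. The bucket densities below are Android’s documented values; the 48 dp button size is just an example of mine:

```python
# Android's generalized density buckets (dots per inch).
DENSITY_DPI = {"ldpi": 120, "mdpi": 160, "hdpi": 240, "xhdpi": 320}

def dp_to_px(dp, bucket):
    """Convert density-independent pixels to physical pixels."""
    return round(dp * DENSITY_DPI[bucket] / 160)

# The same 48 dp touch target on three classes of device:
for bucket in ("ldpi", "mdpi", "hdpi"):
    print(bucket, dp_to_px(48, bucket))  # 36, 48, 72 px respectively
```

So instead of a single 2x doubling, an Android asset may need to render sensibly at three or four scale factors at once, which is exactly the headache a fully custom game interface runs into.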

What I do know is that as an iOS developer and designer, I’m heartened to hear that the iPad 2 might simply double the original iPad resolution. Doing so would demonstrate the sort of forward-thinking that will allow development on iOS to thrive for years to come.

Sim Me: Game Dev Story on iPhone

January 4th, 2011

There are times when life gets very meta, like when you’re watching yourself watch TV on TV, for instance, or when you’re playing a game about developing games while you’re at work developing games. I think Lewis Black said it best: “And I believe that the brain is so smart that when it watches you watch yourself watch you watch yourself do something you’re not supposed to be fucking doing, it says, ‘You are so stupid, I will kill you.’”

But where Mr. Black is the eternal pessimist, I continue to be a beacon of positivity. Cue Game Dev Story (iTunes) by Kairosoft: another in a line of repeated-task games, similar to EA’s fantastic Lemonade Tycoon (also for iPhone). In this one, instead of building a lemonade empire, you’re building a game studio from the ground up.

Categories: game design, game development, review