
Archive for April, 2011

Designing Scaling Projects

April 7th, 2011

A lot of times, people have ideas for apps or games that ultimately rely on a lot of content. This is generally user-generated content, which is a smart model: it has worked for tons of Web 2.0 companies such as Blogger, MySpace, and Facebook. These sites work by offering potential users rich content they can access just by joining, and by giving existing users the ability to add to a content base they can claim ownership of (this is my profile, these are my blog posts, these are my friends).

The problem, of course, is that these potential apps don’t (yet) have any users, and thus, they have no content to draw new users in with, and nothing for anyone to yet take ownership of.

I have some ideas for how new applications can overcome this hurdle, but to protect client ideas, I’ll use a mythical application. Let’s give it a really Web 2.0-y name: “CHECKR”. CHECKR is an application that lets you scan product bar codes and write/share quick reviews of the product with people within the CHECKR universe or with your friends on social media outlets.

Now obviously, when CHECKR first launches, it has 0 users and 0 reviews unique to CHECKR. How would you, as a user, feel if you downloaded this promising new application, only to find a vast wasteland with no content? This dearth of content must be hidden at all costs!

One way to get started is to link your application up with an existing data set. Many services have APIs that let you pull location data or other information from them, within their guidelines. For instance, a database similar to the one CHECKR is attempting to build can be found on Amazon.com whenever a product is pulled up. Amazon has done a fantastic job of cultivating a culture of reviewers at the ready, and all of that data is sitting there on Amazon’s servers, hopefully waiting to be tapped through an API. Even if Amazon doesn’t allow that data to be mined, it’s almost assured they have competitors who do. Services like Yahoo! BOSS allow you to aggregate and shape your own custom search engine from data already indexed. Parsing and presenting this data directly (initially) or exposing it as a “Reviews from Web” button (after version 2) would help ensure your users always have something to use.
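The seeding strategy above can be sketched as a simple fallback rule: serve native CHECKR reviews when they exist, and fill the gap with licensed third-party data otherwise. Everything here is a hypothetical stand-in; `fetch_external_reviews` and the data shapes are assumptions, not any real partner API.

```python
# Sketch: hide the empty-content case by falling back to external review data.
# fetch_external_reviews is a placeholder for whatever partner API (Amazon,
# Yahoo! BOSS, etc.) actually permits under its guidelines.

def reviews_for_product(barcode, native_store, fetch_external_reviews):
    """Return (source, reviews) for a product, hiding the empty-content case."""
    native = native_store.get(barcode, [])
    if native:
        return ("checkr", native)
    # No native content yet: fall back to third-party data, labeled so it
    # can later become a "Reviews from Web" button instead of inline content.
    return ("web", fetch_external_reviews(barcode))

# Example with stand-in data:
native_store = {"0012345678905": [{"user": "ana", "text": "Great snack."}]}

def fake_external(barcode):
    return [{"source": "partner-api", "text": "4.5 stars, 812 reviews"}]

print(reviews_for_product("0012345678905", native_store, fake_external))
print(reviews_for_product("9999999999999", native_store, fake_external))
```

Tagging the source of each result is what makes the later transition to a separate “Reviews from Web” button cheap: the app already knows which content is borrowed.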

Second, sites like Amazon, and even more so applications of this generation, generally do a very good job of incorporating game elements into their review processes. Giving users virtual currency, badges, differently colored usernames, special titles, and other minor rewards helps integrate them into the community you’re building.

Finally, a lot of applications (a LOT of applications) are ultimately drawing marketing data from their users. The demographic data that would be required is hard to get. If an app you’d never heard of suddenly asked for your first name, last name, a photo, your gender, your home address, and so on, what would your reaction be? For most people, it would be to run, yet I bet many of you have shared that much and perhaps more with Facebook. The difference between your idea and Facebook is that Facebook has already demonstrated value. It’s like a pool party you’re invited to where you can see everyone has already jumped in and is having fun. It’s rather natural to want to join in.

Once CHECKR has demonstrated a use, and if I design it in such a way that people naturally want to give it good data instead of garbage data, people will populate it with rich analytics data. When people put their high school yearbook photos on Facebook, they are thinking that they’re sharing them with friends, not with Facebook. No one wants to tell Foursquare where they are, but they don’t mind telling their friends where to find them. If your users need to input their real names so their friends can find them, or want to enter their address so they can find the hottest local reviews, then you’ve succeeded in rolling the analytics into the application’s design naturally.

However, one thing to watch as you move through this process is rolling out features that might “expose” your application’s lack of early content. While it may be cool a year down the line to have a badge for people who have done something in the application 25 times, making that a search filter criterion in the application’s first iteration both ships a feature that’s impossible to use and gives away your application’s newness.
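One way to act on this advice is to gate such features on actual content volume, so a filter never appears until it can return results. This is a sketch under my own assumptions; the filter names and the threshold are illustrative, not part of any real product.

```python
# Sketch: only surface the "reviewed 25+ times" search filter once enough
# products actually qualify, so early users never see a filter that can't
# possibly return anything. MIN_QUALIFYING_PRODUCTS is an arbitrary choice.

MIN_QUALIFYING_PRODUCTS = 50  # illustrative threshold

def enabled_search_filters(review_counts):
    """review_counts: dict mapping product barcode -> number of reviews."""
    filters = ["newest", "nearby"]  # always-safe filters
    qualifying = sum(1 for n in review_counts.values() if n >= 25)
    if qualifying >= MIN_QUALIFYING_PRODUCTS:
        filters.append("reviewed_25_plus")
    return filters

# Day 1: almost nothing qualifies, so the filter stays hidden.
print(enabled_search_filters({"a": 30, "b": 2}))
# Later: plenty of well-reviewed products, so the filter appears.
print(enabled_search_filters({str(i): 25 for i in range(60)}))
```

The same gate works for badges, leaderboards, or any feature whose emptiness would betray the app’s age.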

Another way to avoid this is to build a scaling function directly into your application from the start: when you have 0-10,000 users, the application is more forgiving about, say, how long it leaves a posting active or a spot on a map marked. When it has 10,000-100,000 users, it can be less forgiving, and when it has more than 100,000, the application can treat the info as essentially real time. In CHECKR, for instance, maybe I put a little pin on a map where someone has submitted a review “recently”. When I have only a few users, “recently” could mean within the last week. When CHECKR is a household name and a verb, I can let reviews stay on my map for only a few hours. My goal, either way, is to ensure that the application looks the same to the guy who found it on day 1 as to the guy who found it 3 years after launch.
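The scaling rule above fits in a few lines: the freshness window for map pins shrinks as the user base grows. The tier boundaries match the post’s example numbers; the exact durations per tier are my own illustrative assumptions.

```python
# Sketch of the user-count-based "recency" rule for CHECKR's map pins.
from datetime import timedelta

def recency_window(user_count):
    """How long a review pin stays on the map, given total users."""
    if user_count < 10_000:
        return timedelta(weeks=1)   # forgiving: content is sparse, keep pins a week
    if user_count < 100_000:
        return timedelta(days=1)    # stricter as content density grows
    return timedelta(hours=3)       # near real-time at household-name scale

def pin_is_visible(review_age, user_count):
    """A pin shows as long as the review is younger than the current window."""
    return review_age <= recency_window(user_count)

print(recency_window(500))        # early days: one-week window
print(recency_window(1_000_000))  # at scale: hours-long window
```

Because the thresholds live in one function, the map looks equally alive on day 1 and in year 3 without any per-feature tuning.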

And ultimately, that’s my goal: I want the application to appear to be a tightly formed system that users will trust and populate. While I have moral qualms about harvesting analytical data from users, most app makers don’t. Yet if they force users to give up that data, they will end up with apps that are unsuccessful from at least one standpoint: they get garbage data, no users, or both.

Video games as art

April 6th, 2011

One of the issues I have a fairly big stake in is the intellectual argument about video games as art. I seriously doubt that in this short blog post, I’ll even scratch the surface of my thoughts on this subject, but I feel like I should get something down.

I follow Roger Ebert’s Twitter feed, and I really appreciate his commentary on film and his opinions, but one place we absolutely differ is on video games as art. Mr. Ebert recently tweeted about a writer from his site, Michael Mirasol, and the discussion his defense of video games had generated (read it here). Given that I’m usually surrounded daily by like-minded opinions at work, it hadn’t really occurred to me how the “mainstream” feels about the video game/art debate. Mirasol reminded me that Ebert himself has said that video games will never be considered art. I don’t think I can, in the space of this particular post (and while I’m on my iPad), explain why he’s wrong, but what I can do is provide a bit of framework for the debate.

I graduated from UC Santa Barbara with a degree in film and media studies. The reason I chose to pursue a film major is not because I love film or because I wanted an easy major (prior to majoring in film, I’d finished 3/4ths of a pure mathematics degree), but because the educational establishment has not yet recognized video games as a legitimate field of study. Yet film and video games are essentially one and the same beast.

Even a cursory analysis of the history of film shows that it grew from idle children’s distractions (like the zoopraxiscope, a “flip book” on a wheel) into inventions that combined the best technological advancements, and minds, of a generation (Thomas Edison had a large hand in the birth and popularization of the medium) to provide… what content? Five-second salacious clips of two relatively old people kissing? A 30-second staged film of a train arriving? No sound or color? Hammy overacting? People rail against violent video games, but film can claim such minor shames as reigniting the Ku Klux Klan within 20 years of its inception. Video games are approximately 40 years old and have yet to inspire an entire generation to hate again.

Any scholar looking at this upstart medium in the early 20th century (the lurer of children; the domain of Jews, when being non-Christian was anathema, and yes, I’m familiar with research showing there was no real disproportionate Jewish ownership of cinemas compared to other groups, but we’re talking about perception here; the purveyor of moral turpitude; the vehicle of pornography and cheap thrills) would be hard pressed to say that by the end of that century it would be cemented in the minds of the upper echelons of the educational establishment that film is, of course, an art.

Of course, it would be unfair to judge cinema by what it was at its start against what it became. But the visionaries of the day certainly saw its potential. And scholars saw the way human subjectivity was expressed through the medium: how a story told by this director or acted out by that actress differed from the story expressed by another director or actress, and how meaning was made in that transaction. Scholars saw how the audience surrendered themselves to the screen and imagined themselves as the camera, their eyes becoming the objective lens that swept through scenes. Scholars, as they are wont to do, saw penises where there were only spires, great communist struggles where there were only bored office workers, and patriarchal oppression in Rudolph Valentino in a bathing suit (newsflash: Rudy was a dude). The point is, they saw themselves in it. They saw the mirror that is art.

To make a one-to-one comparison between film and video games would be illogical, though. Each is a medium with different potentials and different means of achieving them. Film purists (whom I will define as people who believe film to be an art to the exclusion of video games) will mention that video games embody a competitive aspect that forces them away from a representation of reality, for instance. Yet they do not point out that the narrative structure of (nearly all) film is in no way reflective of reality, which is not always parceled out into neat heaps of acts. Film purists also tend to be largely ignorant of the massive body of theory in game design, which sets up a relationship between the audience and the game designer, who communicates the vision not just of his or her self but of the hundreds of creative professionals who work incredibly hard to craft an experience. Whether that experience is the thrill of completing a goal or the melancholy of loss (both typical abstract goals in both film and video games), it is a focused experience intended to invoke a real reaction in its consumer. If there were another point to art, someone had better inform me.

Categories: game design, game development

[REDACTED] News

April 1st, 2011

So I haven’t been able to write for a while. February was a tremendously busy month for me. At Appiction, we both closed and finished off the biggest single-app deal that we’ve ever done in a project that… I can’t talk about yet. (But I was happy to be the lead designer of it!) Another one of our Appiction apps that I designed has been getting some major ink, but I don’t think I’m officially allowed to say anything about it yet either. And earlier this week I was happy to do the dev handoff for another project I designed that I’m not allowed to talk about (but honestly, I should be able to… it’s the coolest little application for its niche ever).

And my friends at OAK9 have been putting in some of the best work that I’ve ever seen on an iPhone video game for an upcoming action title that… I can’t talk about yet. (But I was happy to be the lead designer of it, too!)

And meanwhile, production has kicked up on my desk as I’m working on the iPad (read: HD/enhanced) version of a game that has been bumping around at Appiction since I’ve been here (back in September). I was able to join the design team for the iPhone version, but the iPad version? I’m developing it myself. So believe me, I’m super excited about it and would love to talk about it, but I can’t yet.

So, that’s what I’ve been up to lately. Informative, no? 🙂