By Charles Lam
By Joel Beers
By LP Hastings
By Dave Barton
“It was the first thing that felt solid,” Lee says.
* * *
Standing next to a sliding whiteboard, professor Dan Frost asks a question of his 155 students.
“What aspects of society led people to start playing video games?” His voice is loud and spirited.
A few students raise their hands. “Rich, white kids in suburbia who had nothing to do,” one young man replies.
Frost jots down “idle children” with a dry-erase marker, and then looks to the class for more thoughts.
“There was a greater need to achieve,” another student hypothesizes.
Throughout the bright lecture hall in the Parkview Classroom Building near the center of Aldrich Park, more hands rise.
“People needed an escape.”
“Businesses wanted to make money.”
“Real life is boring.”
The course is Computer Games and Society. Before the students leave the day’s lecture, they watch a commercial for the new war game Call of Duty: Black Ops (the controversial television ad stars Kobe Bryant and Jimmy Kimmel storming a city with machine guns while the Rolling Stones’ “Gimme Shelter” howls on the soundtrack), and Frost assigns homework: play an educational video game and critique it. The description of the introductory course—“the study and critical analysis of computer games as art objects, cultural artifacts, gateways to virtual worlds, educational aids, and tools for persuasion and social change”—gives pupils an early clue they won’t spend their study hours fighting off dragons and saving the princess.
“It isn’t a video-games-playing major,” Frost says. “It isn’t, ‘Let’s learn how to get to Level 99.’ It’s about giving students skills that can lead to a very good career.”
After understanding the history and culture behind computer-game technology and picking up some programming skills (everyone must learn the coding language Java), those in the major will move on to game-development courses such as world-building and multiplayer game systems, during which they’ll break into teams to develop original games.
“Students are so excited and energized,” says Frost, adding that he grew up playing chess and Monopoly. “This is part of their culture. This is the medium that speaks to them more than anything else.”
When video games emerged in the 1970s and 1980s, self-taught college students developed some of the most prominent titles. With free access to mainframe computers, they would plug away after hours or during summer vacations. In 1975, Don Daglow, then a student at Claremont Graduate University, wrote the first computer role-playing game, Dungeon, based on Dungeons & Dragons. Kelton Flinn, a pioneer in online games, was a student at the University of Virginia when he co-created Air, an air-combat game that foreshadowed the hit Air Warrior.
But formal training was almost unheard of, and as the demand for increasingly sophisticated games grew, not just for personal computers but also for arcades and home consoles such as Atari and Nintendo, companies faced a shortage of workers with the necessary skills. To address the shortage, the Vancouver-based animation firm DigiPen Corporation partnered with Nintendo in 1994 to open a training facility, the DigiPen Institute of Technology. It was the first post-secondary program in video-game programming to gain wide acclaim; DigiPen graduates were soon in high demand in the industry. The school proved so successful it eventually relocated to Redmond, Washington—home of Microsoft’s headquarters—and now offers several bachelor’s-degree programs, all focused on video games.
Still, academia wasn’t particularly quick to catch on. UCI offered its first computer-game-development course in 1999, proposed and taught by Frost, but the university was hesitant to adopt games as a permanent discipline. In 2000, Robert Nideffer, a studio art and informatics professor, proposed an undergraduate concentration in gaming studies, one that would incorporate computer-game development, digital arts, software engineering and artificial intelligence. It would have been the first interdisciplinary academic program of its kind at a top-tier North American research university.
But UCI’s review committee rejected the proposal, and Nideffer was perplexed. He’d worked with different schools within the university to develop the curriculum, garnering widespread support. “This was an interdisciplinary program to take a scholarly look at video games,” he told Wired magazine in 2002. “The skill sets translate into any number of things.”
The concentration was finally approved in 2005. The Weekly was unable to reach Nideffer for comment.
That initial resistance doesn’t surprise Jason Della Rocca, former executive director of the International Game Developers Association (IGDA) and founder of the game-industry consulting agency Perimeter Partners. The IGDA hosts forums where academics and developers can converse and build stronger ties.
“The gatekeepers for validation are not gamers and do not understand the medium,” Della Rocca says. “Take Roger Ebert, who utterly dismisses games, and yet he personally fought many of the same battles for the ascendancy of film as art. Similarly, those in charge of approving new degrees often simply lack an understanding of games.” He adds that many of the old-school academics see game development as a vocational domain, but counters, “People take film studies, and not everyone ends up a film director.”