A Thaumaturgical Compendium – Things that interest me.
https://alex.halavais.net

Undo It Yourself (U.i.Y.)
https://alex.halavais.net/undo-it-yourself-u-i-y/
Sat, 16 Mar 2013 01:41:44 +0000

There is a TV show called (in the US) Junkyard Wars. The premise of the show is simple enough: two teams meet in a junkyard and are assigned to build something: a trebuchet, a crane, or some other device. I think we can assume that the collection of stuff is, let us say, “semi-random.” I don’t know whether they start with a real junkyard and just make sure to seed it with useful bits, or they start with useful bits and cover them in random crap, or what, but I just cannot assume that they do this in a real, random scrapyard. The challenge is to make the most of the stuff at hand, and to create something that will work for the purposes of the challenge.

I was thinking about this during the Digital Media and Learning conference in Chicago this week, and especially during the session titled Make, Do, Engage. The whole conference has a double set of themes. The official theme has to do with civic culture, and my favorite sessions this year have talked about new forms of activism and ways of encouraging social justice. But there is also a focus (including a pre-conference) on making stuff. Panelists spoke about ways students subvert game construction, the idea of jugaad, and thoughts about hacking-based media literacies. There seemed to be an interweaving here between building “stuff” (technology), building government, and learning. This nexus (learning, politics, and making) was very present at the conference, and hits directly on my own intersection of interests, so it has been an especially engaging conference for me this year.

In particular, the question is how to lead people to be more willing to engage in hacking, and how to create environments and ecosystems that encourage hacking of the environment. Rafi Santo talked a bit about the “emergence” of the hashtag as an example of Twitter’s relative hackability when compared with Facebook. (The evolution of Twitter’s features is something I write about in a short chapter in the upcoming volume Twitter and Society.) Chris Hoadley also talked about how the absence of any sort of state support for physical infrastructure led people to engage in their own hacks. This recalled for me a point made by Ethan Zuckerman about Occupy Sandy as an interesting example of collective action that had a very real impact.

At one point Ingrid Erickson mentioned that she had been talking with Rafi about “do it together” technologies–making the hacking process more social. But part of me is much more interested in infrastructure for creativity–forcing people to work together. No one would wish Sandy on any group, but that particular pressure, and the vacuum of institutional support, led to a Temporary Autonomous Government of sorts that stepped in and did stuff because it needed to be done. I also recalled danah boyd mentioning earlier something that anyone who has ever taught in a grad program knows full well: placing a group in a difficult or impossible situation is a good way to quickly build an esprit de corps and bring together those who would otherwise not necessarily choose to collaborate. With all of these ideas mixing around, I wonder if we need a new aesthetic of “undoing it yourself.”

Yes, I suppose that could be what jailbreaking a phone is about, or you might associate this with frame-breaking or other forms of sabotage. But I am thinking of something a bit more pre-constructive.

I went to a lot of schools as a kid; more than one built on one or another piece of the Montessori model. At one, there was a pile of wood, a hammer, and some nails. It wasn’t in a classroom, as I recall; it was down at the end of a hall. If I asked, they would let me go mess with it. It was dangerous: I managed to hammer my thumb with some consistency. And I would be very surprised if they had an outcome in mind, or even if I did. I think I made a model boat. I don’t think anyone would have guessed it was a model boat unless I had told them.

In a more structured setting, piles of Lego bricks might want to look like what is on the cover of the box. And I am sure there are kids who manage–at least once–to achieve the vehicles or castles shown there. But that’s not why you play with Lego. Some part of me really rebels against the new Lego world, with its huge proliferation of specialized pieces. But the truth is that as a kid the specialized pieces were the interesting bits, not the bare blocks. The core 8×2 bricks were there almost as a glue to keep the fun bits together.

Especially in the postmodern world, we celebrate the bricoleur and recognize hybridized work and kludges as interesting and useful, but far less thought is put into where that stuff comes from. Disassembly precedes assembly. I’m interested in what it means to be an effective disassembler, to unmake environments. There is space for scaffolding only once you’ve actually torn down the walls.

I think we need an Undo-it-Yourself movement. People who individually loosen bolts and disconnect wires. Who destroy mindfully. Those who leave junk in your way, knowing that you might see yourself in it. Our world is ripe for decomposition. New ideas about how we shape our built environment and our society are not born out of the ashes of the past, but out of the bits and pieces that are no longer attached the way the Designer intended.

I am not advocating chaos. I’m not suggesting that we should start an evil organization that turns every screw we encounter twice anti-clockwise. Perhaps what I am suggesting is something somewhere between the kit and the junkyard. Something with possibilities we know and we don’t know. Disassemblies of things for playing with.

Do online classes suck?
https://alex.halavais.net/do-online-classes-suck/
Sat, 08 Dec 2012 05:24:46 +0000

Before arriving at my current posting, I would have thought the idea that online classes compared poorly to their offline counterparts was one that was slowly and inevitably fading away. But a recent suggestion by a colleague that we might tell incoming freshmen that real students take traditional meatspace courses and those just interested in a diploma go for the online classes caught me a bit off-guard.

I want to be able to argue that online courses are as good as their offline counterparts, but it’s difficult, because we don’t really know that. And this is for a lot of reasons.

The UoP Effect

First, if traditional and elite universities had been the originators of successful online courses and degrees, or if they had promoted those successes better (since I suspect you can find some pretty substantial successes reaching back at least three decades), we wouldn’t have the stigma of the University of Phoenix and its kin. For many, UoP is synonymous with online education, particularly in these parts (i.e., Phoenix).

Is UoP that bad? I don’t know. All I have to judge them on is people I’ve met with UoP degrees (I was not at all impressed), and what I’ve heard from students. What I do know is that they spend a lot of money on advertising and recruiting, and not very much money on faculty, which to me suggests that it is a bad deal.

Many faculty see what UoP and even worse for-profit start-ups are doing and rightly perceive it as a pretty impoverished model for higher education. They rightly worry that if their own university becomes known for online education, it will carry the same stigma a University of Phoenix degree does.

The Adjuncts

At ASU, as with many other research universities, the online courses are far more likely to be taught by contingent faculty rather than core tenure-track faculty, and as a result the students are more likely to end up with the second-string. I’ll apologize for demeaning adjuncts: I know full well that if you stack up the best teachers in any department there is a good chance that adjuncts will be among them, or even predominate. But on average, I suspect that a class taught by an adjunct instructor is simply not as good as one taught by full-time research faculty. There are a lot of reasons for this, but perhaps the most important one is that they do not have the level of support from the university that regular faculty do.

I’ve been told by a colleague here that they wanted to teach in the online program but were told that they were “too expensive” to be employed in that capacity. And there is a model that is beginning to separate out course design, “delivery” (ugh!) or “facilitation,” and evaluation. But I suspect the main reason more full-time faculty don’t teach online is more complicated.

Online is for training, not complex topics

This used to be “Would you trust a brain surgeon with an online degree?” which is actually a pretty odd question. Brain surgeons in some ways have more in common with auto mechanics than they do with engineers, but the point was to test whether you would put yourself in mortal danger if you were claiming online education was good. Given how much surgery is now done using computer-controlled tools, I think some of that question is moot now, but there remains this idea that you can learn how to use Excel online, but you certainly cannot learn about social theory without the give-and-take of a seminar.

It’s a position that is hard for me to argue against, in large part because it’s how almost all of us in academia learned about these things. I too have been taught in that environment, and for the most part, my teaching is in that environment. As one colleague noted, teaching in a physical classroom is something they have been taught how to do and they have honed their craft; they do it really well. Why are they forced to compete for students with online courses when they know they would not be as effective a teacher in that environment?

But in many ways this is a self-fulfilling prophecy. Few schools require “traditional” faculty to teach online, though they may allow or even encourage it. As a result the best teachers are not necessarily trying to figure out how to make online learning great. We are left with the poor substitute of models coming from industry (modules teaching employees why they should wear a hair net) and the cult of the instructional designer.

Instructional Designers

Since I’ve already insulted adjuncts, I’ll extend the insult to instructional designers. I know a lot of brilliant ones, but their “best practices” turn online education into the spoon-feeding, idiot-proof nonsense that many faculty think it is. It is as if the worst of college education has been simmered until you get it down to a fine paste, and this paste can be flavored with “subject expertise.” Many are Blackboard personified.

When you receive a call–as I recently did–for proposals to change your course so that it can be graded automatically, using multiple guess exams and the like, it makes you wonder what the administration thinks good teaching is.

I am a systematizer. I love the idea of learning objectives aligned with assessments and all that jazz. But in sitting through a seminar on Quality Matters recently, we found ourselves critiquing a course that encouraged participation on a discussion board. How did discussion align with the learning objectives? It didn’t. OK, let’s reverse engineer it. How can you come up with a learning objective, other than “can discuss matters cogently in an online forum,” that encourages the use of discussion-based learning? Frankly, one of the outcomes of discussion is a personalized form of learning, a learning outcome that really comes out as “Please put your own learning outcome here, decided either before or after the class.” Naturally, such a learning outcome won’t sit well with those who follow the traditional mantra of instructional design.

QM has its heart in the right place: it provides a nice guideline for making online courses more usable, and that’s important. But what is vital is making online spaces worthy of big ideas, and not just training exercises.

The Numbers

I like the idea of the MOOC, and frankly, it makes a lot of sense for a lot of courses. It’s funny when people claim their 100-student in-person class is more engaging than a 1,000-student online course. In most cases, this is balderdash. Perhaps it is a different experience for the 10 people who sit up front and talk, but generally, big classes online are better for more students than big classes off.

Now, if you are a good teacher, chances are you do more than lecture-and-test. You get students into small groups, and they work together on meaningful projects, and the like. Guess what: that’s true of the good online instructors as well.

I think you can create courses that scale without reducing them to delivery-and-test. ASU is known for doing large-scale adaptive learning for our basic math courses, for example, and I think there are models for large-scale conversation that can be applied to scalable models for teaching. It requires decentering the instructor–something many of my colleagues are far from comfortable with–but I am convinced highly scalable models for interaction can be developed further. But scalable courses aren’t the only alternative.

I think the Semester Online project, which allows students from a consortium of universities to take specialized small classes online, is a great way to start to break the “online = big” perception. Moreover, you can make small online course materials and interactions open, leading to a kind of TOOC (Tiny Open Online Course) or a Course as a Fishbowl.

Assessment as Essential

I’ll admit, I’m not really a big part of the institutionalized assessment process. But it strikes me as odd that tenure, and our continued employment as professors, is largely based on an assessment of the quality of our research, not just how many papers we put out–though of course, volume isn’t ignored. On the other hand, in almost every department in the US, budgeting and success are based on FTEs: how can you produce more student hours with fewer faculty hours? Yes, there is recognition for effective and innovative teaching. But when the rubber hits the road, it’s the FTEs that count.

Critics of online education could be at least quieted a bit if there were strong structures of course and program assessment. Not just something that gets thrown out there when accreditation comes up, but something that allowed for the ongoing open assessment of what students were learning in each class. This would change the value proposition, and make us rethink a lot of our decisions. It would also provide a much better basis for deciding on teachers’ effectiveness (although the teacher is only one part of what leads to learning in a course) than student evals alone.

This wouldn’t fix everything. It may very well be that people learn better in small, in-person classrooms, but that it costs too much to do that for every student or for every course. The more likely outcome, it seems to me, is that some people learn some things better online than they do offline. If that’s the case, it would take the air out of the idea that large institutions are pursuing online education just because it is better for their bottom line.

In any case, the idea that we are making serious, long-term investments and decisions in the absence of these kinds of data strikes me as careless. Assessment doesn’t come for free, and there will be people who resist the process, but it seems like a far better metric of success than butts in seats.

The Coming Gaming Machine, 1975-1985
https://alex.halavais.net/the-coming-gaming-machine-1975-1985/
Fri, 13 Jul 2012 20:35:28 +0000

Was going through old backups in the hope of finding some stuff I’ve lost and I ran into this, a draft I was working on around the turn of the millennium. It never went anywhere, and was never published. I was just about to press delete when I realized it might actually be of interest to someone. It was a bit of an attempt at a history of the future of gaming, during the heyday of the console. Please excuse any stupidity–I haven’t even looked at it, just copied it over “as is.”

The Coming Gaming Machine, 1975-1985

Abstract

The early 1980s are sometimes referred to as the ‘Golden Age’ of computer games. The explosion of video games–in arcades, as home consoles, and eventually on home computers–led many to question when the fad would end. In fact, rather than an aberration, the decade from 1975 to 1985 shaped our view of what a computer is and could be. In gaming, we saw the convergence of media appliances, the rise of professional software development, and the first ‘killer app’ for networking. During this period, the computer moved from being a ‘giant brain’ to a home appliance, in large part because of the success of computer gaming.

Introduction

Sony’s offering in the game console arena, the Playstation 2, was among the most anticipated new products for the 2000 Christmas season. Although rumors and reviews added to the demand, much of this eagerness was fueled by an expensive international advertising campaign. One of the prominent television spots in the US listed some of the features of a new gaming console, including the ability to ‘tap straight into your adrenal gland’ and play ‘telepathic personal music.’ The product advertised was not the Playstation 2, but the hypothetical Playstation 9, ‘new for 2078.’ The commercial ends with an image of the Playstation 2 and a two-word tag line: ‘The Beginning’ 1.

The beginning, however, came over twenty-five years earlier with the introduction of home gaming consoles. For the first time, the computer became an intimate object within the home, and became the vehicle for collective hopes and fears about the future. In 1975 there were hundreds of thousands of gaming consoles sold, and there were dozens of arcade games to choose from. By 1985, the year the gaming console industry was (prematurely) declared dead, estimates put the number of Atari 2600 consoles alone at over 20 million world-wide2.

The natural assumption would be that gaming consoles paved the way for home computers, that the simple graphics and computing power of the Atari 2600 were an intermediate evolutionary step toward a ‘real’ computer. Such a view would obscure both the changes in home computers that made them more like gaming consoles, and the fact that many bought these home computers almost exclusively for gaming. But during the decade following 1975, the view of what gaming was and could be changed significantly. Since gaming was the greatest point of contact between American society and computing machinery, gaming influenced the way the public viewed and adopted the new technology, and how that technology was shaped to meet these expectations.

The Place of Gaming

When the University of California at Irvine recently announced that it might offer an undergraduate minor in computer gaming, many scoffed at the idea. The lead of an article in the Toronto Star quipped, ‘certainly, it sounds like the punchline to a joke’3. As with any academic study of popular culture, many suggested the material was inappropriate for the university. In fact, despite the relatively brief history of computer gaming, it has had an enormous impact on the development of computing technology, how computers are seen and used by a wide public, and the degree to which society has adapted to the technology. Games help define how society imagines and relates to computers, how it imagines future computers will look, and how they will be used. The shift in the public view of computers from ‘giant brains’ to domestic playthings occurred on a broad scale during the ten years between 1975 and 1985, the period coincident with the most explosive growth of computer gaming.

Games have also played a role in both driving and demonstrating the cutting edge of computing. While they are rarely the sole purpose for advances in computing, they are often the first to exploit new technology and provide a good way for designers and promoters to easily learn and demonstrate the capabilities of new equipment. Programmers have used games as a vehicle for developing more sophisticated machine intelligence4, as well as graphic techniques. Despite being seen as an amusement, and therefore not of import, ‘the future of “serious” computer software—educational products, artistic and reference titles, and even productivity applications—first becomes apparent in the design of computer games’5. Tracing a history of games then provides some indication of where technology and desire meet. Indeed, while Spacewar might not have been the best use of the PDP-1’s capabilities, it (along with adventure games created at Stanford and the early massively multiplayer games available on the PLATO network) foreshadowed the future of computer entertainment surprisingly well. Moreover, while the mainstream prognostications of the future of computing are often notoriously misguided, many had better luck when the future of computing technology was looked at through the lens of computer games.

Computer Gaming to 1975

The groundwork of computer gaming was laid well before computer games were ever implemented. Generally, video games grew out of earlier models for gaming: board and card games, war games, and sports, for example. William Higinbotham’s implementation of a Pong-like game (‘Tennis for Two’) in 1958, using an oscilloscope as a display device, deserves some recognition as being the first prototype of what would come to be a popular arcade game. Generally, though, the first computer game is credited to Steve Russell, who with the help of a group of programmers wrote the first version of the Spacewar game at MIT in 1961. The game quickly spread to other campuses, and was modified by enterprising players. Although Spacewar remained ensconced within the milieu of early hackers, it demonstrated a surprisingly wide range of innovations during the decade following 1961. The earliest versions were quite simple: two ships that could be steered in real time on a CRT and could shoot torpedoes at one another. Over time, elaborations and variations were added: gravity, differing versions of hyperspace, dual monitors, and electric shocks for the losing player, among others. As Alan Kay noted: ‘The game of Spacewar blossoms spontaneously wherever there is a graphics display connected to a computer’6.

In many ways, Spacewar typified the computer game until the early 1970s. It was played on an enormously expensive computer, generally within a research university, often after hours. Certainly, there was little thought to this being the sole, or even a ‘legitimate,’ use of the computer. While time was spent playing the game, equally as important was the process of creating the game. The differentiation between player and game author had yet to be drawn, and though a recreational activity—and not the intended use of the system—this game playing took place in a research environment. There was no clear relationship between computer gaming and the more prosaic pinball machine.

However, after a ten year diffusion, Spacewar marked a new kind of computing: a move from the ‘giant brain’ of the forties to a more popular device in the 1970s. Stewart Brand wrote an article in Rolling Stone in 1972 that clearly hooked the popular diffusion of computing to ‘low-rent’ development in computer gaming. Brand begins his article by claiming that ‘ready or not, computers are coming to the people.’ It was within the realm of gaming that the general public first began to see computers as personal machines.

Perhaps more importantly, by taking games seriously, Brand was able to put a new face on the future of computing. At a time when Douglas Engelbart’s graphical user interfaces were being left aside for more traditional approaches to large-scale scientific computing, Brand offered the following:

… Spacewar, if anyone cared to notice, was a flawless crystal ball of things to come in computer science and computer use:
1. It was intensely interactive in real time with the computer.
2. It encouraged new programming by the user.
3. It bonded human and machine through a responsive broadhand (sic) interface of live graphics display.
4. It served primarily as a communication device between humans.
5. It was a game.
6. It functioned best on, stand-alone equipment (and disrupted multiple-user equipment).
7. It served human interest, not machine. (Spacewar is trivial to a computer.)
8. It was delightful. (p. 58.)

Brand’s focus was on how people could get hold of a computer, or how they could build one for themselves. The article ends with a listing of the code for the Spacewar game, the first and only time computer code appeared in Rolling Stone. He mentions off-handedly that an arcade version of Spacewar was appearing on university campuses. Brand missed the significance of this. Gaming would indeed spread the use of computing technology, but it would do so without the diffusion of programmable computers. Nonetheless, this early view of the future would be echoed in later predictions over the next 15 years.

On the arcade front, Nolan Bushnell (who would later found Atari), made a first foray into the arcade game market with a commercial version of Spacewar entitled Computer Space in 1971. The game was relatively unsuccessful, in large part, according to Bushnell, because of the complicated game play. His next arcade game was much easier to understand: a game called Pong that had its roots both in a popular television gaming console and earlier experimentation in electronic gaming. Pong’s simple game play (with instructions easily comprehended by inebriated customers: ‘Avoid missing ball for high score’) drove its success and encouraged the development of a video gaming industry.

Equally important were the tentative television and portable gaming technologies that began to sprout up during the period. Though Magnavox’s Odyssey system enjoyed some popularity with its introduction in 1972, the expense of the television gaming devices and their relatively primitive game play restricted early diffusion. It would take the combination of microprocessor-controlled gaming with the television gaming platform to drive the enormous success of the Atari 2600 and its successors. At the same time, the miniaturization of electronics generally allowed for a new wave of hand-held toys and games. These portable devices remained at the periphery of gaming technology, though these early hand-held games would be forerunners to the Lynx, Gameboy and PDA-based games that would come later.

By 1975, it was clear that computer gaming, at least in the form of arcade games and home gaming systems, was more than an isolated trend. In the previous year, Pong arcade games and clones numbered over 100,000. In 1975, Sears pre-sold 100,000 units of Atari’s Pong home game, selling out before it had shipped7. It had not yet reached its greatest heights (the introduction of Space Invaders several years later would set off a new boom in arcade games, and drive sales of the Atari 2600), but the success of Pong in arcades and at home had secured a place for gaming.

The personal computer market, on the other hand, was still dominated by hobbyists. These would be hallmark years for personal computing, with the Altair system soon joined by the Commodore PET, Atari’s 400 and 800, and Apple computers. Despite Atari’s presence and the focus on better graphics and sound, the computer hobbyists remained somewhat distinct from the console gaming and arcade gaming worlds. Byte magazine, first published in 1975, made infrequent mention of computer gaming, and focused more heavily on programming issues.

Brand was both the first and among the most pronounced to use gaming as a guide to the future of computing and society. In the decade between 1975 and 1985, a number of predictions were made about the future of gaming, but most of these were off-hand comments of a dismissive nature. It is still possible to draw out a general picture of what was held as the future of gaming—and with it the future of computing—by examining contemporaneous accounts and predictions8.

Many of these elements are already present in Brand’s prescient view from 1972. One that he seemed to have missed is the temporary bifurcation of computer gaming into machines built for gaming specifically, and more general computing devices. (At the end of the article, it is clear that Alan Kay—who was at Xerox PARC at the time and would later become chief scientist for Atari—has suggested that Spacewar can be programmed on a computer or created on a dedicated machine, a distinction that Brand appears to have missed.) That split, and its continuing re-combinations, have driven the identity of the PC as both a computer and a communications device. As a corollary, there are periods in which the future seems to be dominated by eager young programmers creating their own games, followed by a long period in which computer game design is increasingly thought of as an ‘art,’ dominated by a new class of pop stars. Finally, over time there evolves an understanding of the future as a vast network, and how this will affect gaming and computer use generally.

Convergence

1975 marks an interesting starting point, because it is in this year that the microprocessor emerges as a unifying element between personal computers and video games. Although early visions of the home gaming console suggested the ability to play a variety of games, most of the early examples, like their arcade counterparts, were limited to a single sort of game, and tended to be multi-player rather than relying upon complex computer-controlled opponents. Moreover, until this time console games were more closely related to television, and arcade video games to earlier forms of arcade games. Early gaming systems, even those that made extensive use of microprocessors, were not, at least initially, computers ‘in the true sense’9. They lacked the basic structure that allowed them to be flexible, programmable machines. The emerging popularity of home computers, meanwhile, was generally limited to those with an electronics and programming background, as well as a significant disposable income.

As consoles, arcade games, and personal computers became increasingly similar in design, their futures also appeared to be more closely enmeshed. At the high point of this convergence, home computers were increasingly able to emulate gaming systems—an adaptor for the Vic-20 home computer allowed it to play Atari 2600 console game cartridges, for example. On the other side, gaming consoles were increasingly capable of doing more ‘computer-like’ operations. As an advertisement in Electronic Gaming for Spectravideo’s ‘Compumate’ add-on to the Atari 2600 asks, ‘Why just play video games? … For less than $80, you can have your own personal computer.’ The suggestion is that rather than ‘just play games,’ you can use your gaming console to learn to program and ‘break into the exciting world of computing.’ Many early computer enthusiasts were gamers who tinkered with the hardware in order to create better gaming systems10. This led some to reason that video game consoles might be a ‘possible ancestor of tomorrow’s PC’11. As early as 1979, one commentator noted that the distinction between home computers and gaming consoles seemed to have ‘disappeared’12. An important part of this world was learning to program and using the system to create images and compose music. Just before console sales began to lose momentum in the early 1980s, and home computer sales began to take off, it became increasingly difficult to differentiate the two platforms.

Those who had gaming consoles often saw personal computers as ultimate gaming machines, and ‘graduated’ to these more complex machines. Despite being termed ‘home computers,’ most were installed in offices and schools13. Just as now, there were attempts to define the home computer and the gaming console in terms of previous and future technologies, particularly those that had a firm domestic footing. While electronic games (and eventually computer games) looked initially like automated versions of traditional games, eventually they came to be more closely identified with television and broadcasting. With this association came a wedding of their futures. It seemed natural that games would be delivered by cable companies and that videodisks with ‘live’ content would replace the blocky graphics of the current systems. This shift influenced not only the gaming console but the home computer itself. Now associated with this familiar technology, it seemed clear that the future of gaming lay in the elaborations of Hollywood productions. This similarity played itself out in the authoring of games and in attempts to network them, but also in the hardware and software available for the machines.

Many argued that the use of cartridges (‘carts’) for the Atari 2600, along with the use of new microprocessors and the availability of popular arcade games like Space Invaders, catapulted the product to success. Indeed, the lack of permanent storage for early home computers severely limited their flexibility. A program (often in the BASIC programming language) would have to be painstakingly typed into the computer, then lost when the computer was turned off. As a result, this was only appealing to the hard-core hobbyist, and kept less expert users away14. Early on, these computers began using audio cassette recorders to record programs, but the process of loading a program into memory was a painstaking one. More importantly, perhaps, this process of loading a program into the computer made copy-protection very difficult. By the end of the period, floppy disk drives were in wide use. This remained an expensive technology in the early days, and could easily exceed the cost of the computer itself. Taking a cue from the gaming consoles, many of these new home computers accepted cartridges, and most of these cartridges were games.

The effort to unite the computer with entertainment occurred on an organizational level as well. Bushnell’s ‘Pizza Time Theaters’ drew together food and arcade gaming and were phenomenally successful, at one point opening a new location every five days. Not surprisingly, the traditional entertainment industry saw electronic gaming as an opportunity for growth. Since the earliest days of gaming, the film industry served as an effective ‘back story’ for many of the games. It was no coincidence that 1975’s Shark Jaws (with the word ‘shark’ in very small type), for example, was released very soon after Jaws hit the theaters. The link eventually went the other direction as well, from video games and home computer gaming back into motion pictures, with such films as Tron (1982), WarGames (1983) and The Last Starfighter (1984).

In the early 1980s the tie between films and gaming was well established, with a partnership between Atari and Lucasfilm yielding a popular series of Star Wars based games, and the creation of the E.T. game (often considered the worst mass-marketed game ever produced for the 2600). Warner Communications acquired Atari—the most successful of the home gaming producers, and eventually a significant player in home computing—in 1976. By 1982, after some significant work in other areas (including the ultimately unsuccessful Qube project, which was abandoned in 1984), Atari accounted for 70% of the group’s total profits. Despite these clear precedents, it is impossible to find any predictions that future ties between popular film and gaming would continue to grow as they have over the intervening fifteen years.

This new association did lead to one of the most wide-spread misjudgments about the future of gaming: the rise of the laserdisc and interactive video. Dragon’s Lair was the first popular game to make use of this technology. Many predicted that this (or furtive attempts at holography15) would save arcade and home games from the dive in sales suffered after 1983, and that just as the video game market rapidly introduced computers to the home, they would also bring expensive laserdisc players into the home. The use of animated or live action video, combined with decision-based narrative games or shooting games, provided a limited number of possible outcomes. Despite the increased attractiveness of the graphics, the lack of interactivity made the playability of these games fairly limited, and it was not long before the Dragon’s Lair machines were collecting dust. Because each machine required (at the time) very expensive laserdisc technology, and because the production costs of games for the system rivaled that of film and television, it eventually became clear that arcade games based on laserdisc video were not profitable, and that home-based laserdisc systems were impractical.

The prediction that laserdiscs would make up a significant part of the future of gaming is not as misguided as it at first seems. The diffusion of writable CD-ROM drives, DVD drives, and MP3 as domestic technologies owes a great deal to gaming—both computer and console-based. At present, few applications make extensive use of the storage capacities of CD-ROMs in the way that games do, and without the large new computer games, there would be little or no market for DVD-RAM and other new storage technologies in the home. Unfortunately, neither the software nor the hardware of the mid-1980s could make good use of the video capability of laserdiscs, and the technology remained too costly to be effective for gaming. A few saw the ultimate potential of optical storage. Arnie Katz, in his column in Electronic Games in 1984, for example, suggests that new raster graphics techniques would continue to be important, and that ‘ultimately, many machines will blend laserdisc and computer input to take advantage of the strengths of both systems’ 16 (this despite the fact that eight months earlier he had predicted that laserdisc gaming would reach the home market by the end of 1983). Douglas Carlston, the president of Broderbund, saw a near future in which Aldous Huxley’s ‘feelies’ were achieved and a user ‘not only sees and hears what the characters in the films might have seen and heard, but also feels what they touch and smells what they smell’17. Overall, it is instructive to note the degree to which television, gaming systems, and home computers each heavily influenced the design of the other. The process continues today, with newer gaming consoles like the Playstation 2 and Microsoft’s new Xbox being internally virtually indistinguishable from the PC. Yet where, in the forecasting of industry analysts and work of social scientists, is the video game?

A Whole New Game

Throughout the 1970s and 1980s, arcade games and console games were heavily linked. New games were released first as dedicated arcade games, and later as console games. The constraints of designing games for the arcade—those which would encourage continual interest and payment—often guided the design of games that also appeared on console systems. In large part because of this commercial constraint, many saw video games (as opposed to computer games) as a relatively limited genre. Even the more flexible PC-based games, though, were rarely seen as anything but an extension of traditional games in a new modality. Guides throughout the period suggested choosing games using the same criteria that they would apply to choosing traditional games. Just as importantly, it was not yet clear how wide the appeal of computerized versions of games would be in the long run. As one board game designer suggested, while video games would continue to become more strategic and sophisticated, they would never capture the same kind of audience enjoyed by the traditional games18.

Throughout the rapid rise and fall of gaming during the early 1980s, two changes came about in the way people began to think about the future of gaming. On the one hand, there emerged a new view of games not merely as direct translations of traditional models (board games, etc.), but as an artistic pursuit. The media and meta-discourse surrounding the gaming world gave rise to a cult of personality. At the same time, it became increasingly difficult for a single gaming author to create a game in its entirety. The demand cycle for new games, and for increasingly complex and intricate games, not only excluded the novice programmer; it made the creation of a game a team effort by necessity. As such, the industrial scale of gaming increased, leaving smaller companies and individuals unable to compete in the maturing market.

This revolution began with home computers that were capable of more involved and long-term gaming. As one sardonic newspaper column in 1981 noted:

The last barriers are crumbling between television and life. On the Apple II you can get a game called Soft Porn Adventure. The Atari 400 and 800 home computers already can bring you games on the order of Energy Czar or SCRAM, which is a nuclear power plant simulation. This is fun? These are games? 19

The capabilities of new home computers were rapidly exploited by the new superstars of game design. An article in Popular Computing in 1982 noted that game reviewers had gone so far overboard in praising Chris Crawford’s Eastern Front that they recommended buying an Atari home computer, if you didn’t have one, just to be able to play the game20. Crawford was among the most visible group of programmers who were pushing game design beyond the limits of traditional games:

Crawford hopes games like Eastern Front and Camelot will usher in a renaissance in personal computer games, producing games designed for adults rather than teenagers. He looks forward to elaborate games that require thought and stimulate the mind and even multiplayer games that will be played cross-country by many players at the same time, with each player’s computer displaying only a part of the game and using networks linked by telephone lines, satellites, and cable TV.

Crawford extended his views in a book entitled, naturally, The Art of Computer Game Design (1984), in which he provided a taxonomy of computer games and discussed the process of creating a video game. He also devotes a chapter to discussing the future of the computer game. Crawford notes that changes in technology are unlikely to define the world of gaming. Instead, he hoped for new diversity in gaming genres:

I see a future in which computer games are a major recreational activity. I see a mass market of computer games not too different from what we now have, complete with blockbuster games, spin-off games, remake games, and tired complaints that computer games constitute a vast wasteland. I even have a term for such games—cyberschlock. I also see a much more exciting literature of computer games, reaching into almost all spheres of human fantasy. Collectively, these baby market games will probably be more important as a social force than the homogenized clones of the mass market, but individual games in this arena will never have the economic success of the big time games.21

In an interview fifteen years later, Crawford laments that such hopes were well off base. Though such hopes were modest—that in addition to the ‘shoot the monsters!’ formula, as he called it, there would be a ‘flowering of heterogeneity’ that would allow for ‘country-western games, gothic romance games, soap-opera games, comedy games, X-rated games, wargames, accountant games, and snob games’ and eventually games would be recognized as ‘a serious art form’—he suggests that over fifteen years they proved to be misguided22. In fact, there were some interesting developments in the interim years: everything from Sim City and Lemmings to Myst and Alice. A new taxonomy would have to include the wide range of ‘god games’ in addition to the more familiar first-person shooters. In suggesting the diversification of what games could be, Crawford was marking out a new territory, and reflecting the new-found respectability of an industry that was at the peak of its influence. The view that ‘programmer/artists are moving toward creating an art form ranging from slapstick to profundity,’ appeared throughout the next few years23.

During the same period, there was a short window during which the future of gaming was all about the computer owner programming games rather than purchasing them. Indeed, it seemed that the ability to create your own arcade-quality games would make home computers irresistible24. Listings in the BASIC programming language could be found in magazines and books into the early 1980s. It seemed clear that in the future, everyone would know how to program. Ralph Baer noted in an interview in the same year that students ‘should be able to speak one or two computer languages by the age of 18, those who are interested. We’re developing a whole new generation of kids who won’t be afraid to generate software’25. By the time computers began to gain a foothold in the home, they increasingly came with a slot for gaming cartridges, much like the consoles that were available. In part, this was dictated by economic concerns—many of the new manufacturers of home computers recognized that software was both a selling point for the hardware and a long-term source of income26—but part of it came with a new view of the computer as an appliance, and not the sole purview of the enthusiast. Computer games during the 1980s outgrew the ability of any single programmer to create them, and it became clear that, in the future, games would be designed more often by teams27.

Connected Gaming

By the 1980s, there was little question that networking would be a part of the future of gaming. The forerunners of current networked games were already in place. The question, instead, was what form these games would take and how important they would be. The predictions regarding networking tended to shift from the highly interactive experiments in networked computing to the experiments in cable-television and telephone distribution of games in the 1980s. A view from 1981 typifies the importance given to communications and interfaces for the future of gaming. It suggests that in five years’ time:

Players will be able to engage in intergalactic warfare against opponents in other cities, using computers connected by telephone lines. With two-way cable television, viewers on one side of town might compete against viewers on the other side. And parents who think their children are already too attached to the video games might ponder this: Children in the future might be physically attached to the games by wires, as in a lie detector28.

A 1977 article suggests the creation of persistent on-line worlds that ‘could go on forever,’ and that your place in the game might even be something you list in a will29. Others saw these multi-player simulations as clearly a more ‘adult’ form of gaming, one that began to erase the ‘educational/entertainment dichotomy’30. The short-term reality of large-scale on-line gaming remained in many ways a dream during this period, at least for the general public. But the ability to collect a subscription fee led many to believe that multiplayer games were ‘too lucrative for companies to ignore’31. Indeed, multiplayer games like Mega Wars could cost up to $100 a week to play, and provided a significant base of subscribers for Compuserve32.

The software industry had far less ambitious plans in mind, including a number of abortive attempts to use cable and telephone networks to distribute gaming software for specialized consoles. Despite failures in cable and modem delivery, this was still seen as a viable future into the middle-1980s. Even with early successes in large-scale on-line gaming, it would be nearly a decade before the mainstream gaming industry would become involved in a significant way.

Retelling the Future

The above discussion suggests that when predictions are made about the future of gaming, they are often not only good predictors of the future of computing technology, but also indicators of general contemporaneous attitudes toward the technology. Given this, it would seem to make sense that we should turn to current games to achieve some kind of grasp on the future of the technology. It is not uncommon to end a small piece of history with a view to the future, but here I will call for just the opposite: we should look more closely at the evolution of gaming and its social consequences at present.

Despite a recognition that games have been important in the past, we seem eager to move ‘beyond’ games to something more serious. Games seem, by definition, to be trivial. Ken Uston, in a 1983 article in Creative Computing on the future of video games, expressed the feeling:

Home computers, in many areas, are still a solution in search of a problem. It is still basically games, games, games. How can they seriously expect us to process words on the low-end computers? The educational stuff will find a niche soon enough. But home finance and the filing of recipes and cataloguing of our stamp collections has a long way to go.

A similar contempt of gaming was suggested by a New York Times article two years later: ‘The first generation of video games swept into American homes, if ever so briefly. And that was about as far as the home-computer revolution appeared ever destined to go’33. More succinctly, in an issue in which Time named the personal computer its ‘Man’ of the Year, it notes that the ‘most visible aspect of the computer revolution, the video game, is its least significant’34. Though later the article goes on to suggest that entertainment and gaming will continue to be driving forces over the next decade, the idea of games (at least in their primitive state) is treated disdainfully.

This contempt of gaming, of the audience, and of popular computing, neglects what has been an extremely influential means by which society and culture have come to terms with the new technology. Increasingly, much of the work with computers is seen from the perspective of game-playing35. Games are also central to our social life. Certainly, such a view is central to many of the post-modern theorists that have become closely tied to new technologies, who view all discourse as gaming36. Within the more traditional sociological and anthropological literature, games have been seen as a way of acculturating our young and ourselves. We dismiss this valuable window on society at our own peril.

A recognition of gaming’s central role in computer technology, as a driving force and early vanguard, should also turn our attention to today’s gamers. Recent advances in gaming make the games of today hard to ignore: from involved social simulations like The Sims, to ‘first-person shooters’ like Quake that have evolved new communal forms around them, to what have come to be called ‘massively multiplayer on-line role playing games’ (MMORPGs) like Everquest and Ultima Online. They have the potential not only to tell us about our relation to technology in the future, but about the values of our society today. Researchers lost out on this opportunity in the early days of popular computing; we should not make the same mistake.

Notes

1. A copy of this advertisement is available at ‘AdCritic.com’: http://www.adcritic.com/content/sony-playstation2-the-beginning.html (accessed 1 April 2001).
2. Donald A. Thomas, Jr., ‘I.C. When,’ http://www.icwhen.com (accessed 1 April 2001).
3. David Kronke, ‘Program Promises Video Fun N’ Games’, Toronto Star, Entertainment section, 19 March 2000.
4. Ivars Peterson, ‘Silicon Champions of the Game,’ Science News Online, 2 August 1997, http://www.sciencenews.org/sn_arc97/8_2_97/bob1.htm (accessed 1 April 2000).
5. Ralph Lombreglia, ‘In Games Begin Responsibilities,’ The Atlantic Unbound, 21 December 1996, http://www.theatlantic.com/unbound/digicult/dc9612/dc9612.htm (accessed 1 April 2001).
6. Stewart Brand, ‘Spacewar: Fanatic Life and Symbolic Death Among the Computer Bums,’ Rolling Stone, 7 December 1972, p 58.
7. Thomas.
8. While there is easy access to many of the popular magazines of the period, it remains difficult to obtain some of the gaming magazines and books, and much of the ephemera. The reasons are two-fold: First, academic and public libraries often did not subscribe to the gaming monthlies. Often these were strong advertising vehicles for the gaming industry, and as already suggested, the subject matter is not ‘serious,’ and is often very time-sensitive. More importantly, there has been a strong resurgence of nostalgia for gaming during the period, and this has led to the theft of many periodical collections from libraries. It is now far easier to find early copies of Electronic Games magazine on Ebay than it is to locate them in libraries.
9. Martin Campbell-Kelly and William Aspray, Computer: A History of the Information Machine (New York: BasicBooks, 1996), p. 228.
10. Jake Roamer, ‘Toys or Tools,’ Personal Computing, Nov/Dec, 1977, pp. 83-84.
11. Jack M. Nilles, Exploring the World of the Personal Computer (Englewood Cliffs, NJ: Prentice-Hall, 1982), p. 21.
12. Peter Schuyten, ‘Worry Mars Electronics Show,’ New York Times, 7 June 1979, sec. 4, p2, col. 1.
13. Richard Schaffer, ‘Business Bulletin: A Special Background Report,’ Wall Street Journal, 14 September 1978, p.1, col. 5.
14. Mitchell C. Lynch, ‘Coming Home,’ Wall Street Journal, 14 May 1979, p. 1, col. 4.
15. Stephen Rudosh, Personal Computing, July 1981, pp.42-51, 128.
16. Arnie Katz, ‘Switch On! The Future of Coin-Op Video Games,’ Electronic Games, September 1984. Also available on-line at http://cvmm.vintagegaming.com/egsep84.htm (accessed 1 April 2001).
17. Douglas G. Carlston, Software People: An Insider’s Look at the Personal Computer Industry (New York: Simon & Schuster, 1985), p. 269.
18. William Smart, ‘Games: The Scramble to Get On Board,’ Washington Post, 8 December 1982, pg. C5.
19. Henry Allen, ‘Blip! The Light Fantastic,’ Washington Post, 23 December 1981, C1.
20. A. Richard Immel, ‘Chris Crawford: Artist as a Game Designer,’ Popular Computing 1(8), June 1982, pp. 56-64.
21. Chris Crawford, The Art of Computer Game Design (New York: Osborne/McGraw-Hill, 1984). Also available at http://www.vancouver.wsu.edu/fac/peabody/game-book/ and at http://members.nbci.com/kalid/art/art.html (accessed 1 April 2001).
22. Sue Peabody, ‘Interview With Chris Crawford: Fifteen Years After Excalibur and the Art of Computer Game Design,’ 1997, http://www.vancouver.wsu.edu/fac/peabody/game-book/Chris-talk.html (accessed 1 April 2001).
23. Lee The, ‘Giving Games? Go with the Classics,’ Personal Computing, Dec. 1984, pp. 84-93.
24. ‘Do it yourself,’ Personal Computing, Nov/Dec 1977, p. 87.
25. Ralph Baer, ‘Getting Into Games’ (Interview), Personal Computing, Nov/Dec 1977.
26. Carlston, p. 30.
27. Ken Uston, ‘Whither the Video Games Industry?’ Creative Computing 9(9), September 1983, pp. 232-246.
28. Andrew Pollack, ‘Game Playing: A Big Future,’ New York Times, 31 December 1981, sec. 4, pg. 2, col. 1.
29. Rick Loomis, ‘Future Computing Games,’ Personal Computing, May/June 1977, pp. 104-106.
30. H. D. Lechner, The Computer Chronicles (Belmont, CA: Wadsworth Publishing, 1984).
31. Richard Wrege, ‘Across Space & Time: Multiplayer Games are the Wave of the Future,’ Popular Computing 2(9), July 1983, pp. 83-86.
32. Jim Bartimo, ‘Games Executives Play,’ Personal Computing, July, 1985, pp. 95-99.
33. Erik Sandberg, ‘A Future for Home Computers,’ New York Times, 22 September 1985, sec. 6, part 2, pg. 77, col. 5.
34. Otto Friedrich, ‘Machine of the Year: The Computer Moves In,’ Time, 3 January 1983.
35. Richard Thieme, ‘Games Engineers Play,’ CMC Magazine 3(12), 1 December 1996, http://www.december.com/cmc/mag/ (accessed 1 April 2001).
36. For overview, see Ronald E. Day, ‘The Virtual Game: Objects, Groups, and Games in the Works of Pierre Levy,’ Information Society 15(4), 1999, pp. 265-271.

Mind the MOOC?
https://alex.halavais.net/mind-the-mooc/
Fri, 06 Jul 2012 22:00:34 +0000

Siva Vaidhyanathan has a new post up on the Chronicle blog that takes on the hype cycle around MOOCs. Which is a good thing. Experimenting with new ways of learning online and off, particularly in higher ed, is more than a worthwhile venture. I think it probably does have a lot to do with the future of the university.

But maybe not in the way University of Virginia Rector Helen Dragas and others seem to think. For those not playing at home, UVa recently went through a very public and destructive firing and rehiring of its president. The reason, it turned out, was that its Board of Visitors seemed to think the university should be engaging in creative destruction more quickly. Or something similar to that. They wanted more motion, faster. And MOOCs seem to be the current darling of what elite institutions can do to… well, to forestall the inevitable.

To be clear, I agree with the economic doom-casters. I think we are in for a cataclysmic and rapid change in what universities do in the US. I think it will feel a bit like an echo of the newspaper collapse, and in particular, we will see a large number of universities and colleges not make it through the process. Part of that is that there will be challengers outside of traditional universities, and part of it will be that traditional universities will find ways of reaching new students. A big part will be rapid changes in how universities–particularly private universities–are funded.

But I think Siva has MOOCs wrong, in part by assuming that there is a thing called a MOOC and that it is a stable sort of a thing. In particular:

He notes:

Let me pause to say that I enjoy MOOCs. I watch course videos and online instruction like those from the Khan Academy … well, obsessively. I have learned a lot about a lot of things beyond my expertise from them. My life is richer because of them. MOOCs inform me. But they do not educate me. There is a difference.

So, there is a question of terminology. Are Khan courses MOOCs? Let’s assume they hold together into courses and curricula; even then, are they MOOCs? Are MIT’s Open Courses MOOCs? I think calling these MOOCs makes about as much sense as calling a BOOK a MOOC. These are the open resources that make up an important part of a scalable online open course (a SOOC! I can wordify too!).

The main issue here is, I think, his insistence on this idea of “education.” I don’t think I believe in education any more. I’m not sure I believe teaching is much more than setting the stage for the important bit: learning. But he is suggesting that there is more here. That education consists of more than just learning.

But I also think it is way too early to guess at what “MOOCs” do well, when they are a moving target. The idea that calculus or chemistry instruction scales well but history or philosophy does not has, I think, a lot more to do with institutional structures and university politics than with the nature of learning these things.

I think one of the major problems for universities–both the elite institutions Siva is talking about and the “less elite” universities and colleges–is that they are the wrong tool for the problem they face. Students come to college not well prepared by high schools. The first two years are remedial work, often outsourced to adjunct labor. And since the university wants to put its resources into the “meat” of education–the cool stuff students don’t get to until senior year–it is screwing up what happens before that point.

The result is Bio 101 and English 101: courses that best reflect the worst in college education. They are either 30-student courses taught by first-year grad students and/or adjuncts, or 1,200-student courses that involve showing up to class, memorizing key terms, and regurgitating them into the appropriate bubble on a Scantron form. It’s not the 20-person senior seminar on Kierkegaard’s lesser-known knitting patterns that is the target of MOOCs; it is the Bio 101s.

Now, part of the problem is that many large state schools (and small private colleges) only have Bio 101s. I regularly had students at the senior level at SUNY Buffalo who had never written a term paper. At Quinnipiac (which boasts very few giant lecture courses), I heard something similar. As bad as Bio 101 is, it’s a cash cow for the university. If you are able to can that cash cow, all the better.

But here’s the trick, if you are able to can it, and make it available to all for free, it’s not a cash cow, it’s an open service to society. It is not the best solution to the problem (reminder: the problem is failing public secondary and primary education in the US), but it is a stop-gap that doesn’t soak the student.

At present, scaled courses follow the trajectory of scaled courses in giant lecture halls over the last two decades: lecture and multiple choice. The real innovation in MOOCs is the potential for creating networked learning communities within these massive courses. I think it’s possible we can do that. I also think it’s going to take a lot of work, and a lot of time. Which means money.

So, if administrators are excited about MOOCs, I say: good. If they don’t understand the monetization of open education resources, I say: join the crowd.

On teaching at Quinnipiac https://alex.halavais.net/on-teaching-at-quinnipiac/ https://alex.halavais.net/on-teaching-at-quinnipiac/#comments Thu, 07 Jun 2012 21:21:50 +0000 http://alex.halavais.net/?p=3210 This draws to a close my second teaching appointment, having taught in the School of Communications at Quinnipiac University from 2006 to 2012. I recently sat next to someone on a plane who was about to receive her Ph.D. in Communications, and she noted that it no longer seems like you take an academic job for life. That certainly seems to be the case for me, at least so far in my career. I suspect it’s true for more faculty members today than it was two decades ago, and that (particularly with post-tenure review) it will continue to be.

As I did with Buffalo, I feel moved to provide something of a post-mortem, a review of the university without feeling like I need to pull any punches. As I look over what comes below, I realize that it might be seen by some as a bit more bridge-burning than intended, but it’s nothing I didn’t say privately as a member of the community. I still hold the faculty in high esteem, and I still think there is great potential in Quinnipiac. Perhaps what is reflected below is my belief that such potential is not being effectively realized.

Ultimately, Quinnipiac was not the best fit for me. I am not an impartial observer, and what worked poorly for me might work very well for others. QU has a surprisingly large number of dedicated, bright teachers, and that it is a good fit for them speaks volumes about the university as a whole.

What’s Great About Quinnipiac

1. The Campus

Some of the architecture is a bit love-it-or-hate-it, though most (all?) of the buildings were built by the same firm, so you get fairly consistent design cues. The natural setting of the main campus at the foot of Sleeping Giant, and of the York Hill campus with its view over the foothills, is breathtaking. Especially in the autumn, walking northward on campus can be awe-inspiring.

The grounds are kept neat and well cared for. Many parents get the feeling of a country club, which no doubt is by design. It can feel a bit corporate, and perhaps because I am more accustomed to the scale of larger universities, when I first arrived it felt a lot like a private high school. The library is comfortable and attractive. The new Rocky Top student center feels like a cozy lodge resort.

It falls a bit short when it comes to classrooms, which, for the most part, are also reminiscent of high school classrooms. On the new graduate campus, the similarity to a corporate campus is much more extreme: that’s what it was (Blue Cross) until just a few years ago, and the office suites and meeting rooms are much more comfortable and conducive to seminars. But on the main campus, the inside is rarely as pretty as the outside.

2. Student-Centered / Class Size

Although this is changing, I think, it was great to come from an impacted public university and undergraduate course sizes in the hundreds to a department with an average undergraduate course size of 16. It appears that isn’t sustainable, and there are pushes to change the way teaching load is calculated, but the largest room on campus couldn’t hold the smallest freshman lecture from a large state school. On the other hand, the graduate courses, particularly online, are too large.

There is also a real focus on teaching and improving teaching among most of the faculty. There are the star teachers you would get on any university campus, but the median teacher is also excited about teaching and supported in many ways by the administration in their teaching role. Likewise, I think that QU serves the average student better than most schools do, and provides not nearly as much for the exceptional student. I suspect just the opposite is true for many large state schools and elite private universities.

3. Collegiality

There is still the feeling that it is a small school, and faculty know one another and are genuinely friendly. I feel like I probably missed out on some of this, since I lived so far away. But the truth is a lot of faculty live far from the campus (if not as far as I do). Actually, it may be that lack of proximity that promotes collegiality. It may also be that the School of Communications was friendlier than some of the other schools (I get the feeling there was some strife in one college in particular), but I think, on the whole, the faculty got along well with one another, and there was less plotting, scheming, and arguing than there is on many campuses. This extended also–for the most part–to administrators, though many faculty seemed to have a conflicted view of the president.

4. Resources

This is a hard one, but generally speaking, there was money to do things you wanted to do. Or, at the very least, you didn’t feel like you were working under the sword of Damocles the way you might at a school reliant on state funds. If you had an interesting project that appealed to the president, you didn’t have to jump through tons of hoops to make it happen.

5. Students

They were the best of students, they were the worst of students. I can’t comment too much on the undergrads, but we attracted some amazingly bright, articulate, and dedicated graduate students during my time at QU. I said it more than once–I would put the top 50% of our classes up against any grad program in the US–and maybe even up against any of their top 50%. In many cases, proximity or subject matter drew them to QU, but they could have thrived in any strong program.

What Isn’t

In the end, the things that are wrong outweighed the above advantages for me.

1. Mission Shift / Administrative Caprice

If you don’t like what the university is doing, wait a few years. In some ways, it feels like the president likes retail therapy. You know what we need? A medical school! How about a school of engineering! These are the most recent ventures, but they come at the expense of the core existing programs at the university. Better, apparently, to be large and mediocre than small and excellent. No doubt, this has something to do with the need to collect tuition from a larger student base. It’s frustrating, of course, when the gaze and resources of the president’s office wander, but more frustrating that you don’t know which way to look next. By the time I left, I had mission fatigue, and I suspect I’m not the only one.

2. Teaching Load

Very simply, the teaching load is unreasonable compared with that at peer institutions, and it’s beginning to show. When I joined, it was lighter, and while loads have shrunk on many competing campuses, at QU the teaching expectation seems to face no downward pressure. It doesn’t help that there isn’t a stated teaching load any longer–you are assigned some kind of teaching by your department. There isn’t a clear expectation of the number of courses or FTEs you are expected to teach. Moreover, by devolving teaching-load decisions to the department chairs, the university has created a recipe for evenly distributed loads that crowd out any time or incentive to do research.

3. Library

The library is a great space, but not useful for research. Every serious researcher on the campus had finagled access to a real research library somehow–many by buying a Yale card. I mentioned at a publishing conference that QU didn’t have a subscription to ACM’s Digital Library, and someone from ACM noted that they would price things so that everyone could get access. But even after putting him in touch with our library, nothing. I recognize that underfunded libraries are a problem everywhere, and as I said, there are good things that the library does, but it isn’t a beacon on campus. While it may serve some of the undergraduate mission, it isn’t big enough to support researchers.

4. Publicity / Tuition Dollars

This may be true of any private university, but there is always a tension between selling yourself and focusing on doing great work. A lot of time and effort is spent on recruiting and making the university look good to the outside, sometimes to the exclusion of improving the core educational experience. At least this is what I heard from students, who felt the campus tour (for example) was deliberately misleading. Efforts to “manage” some of the PR crises on campus (racist incidents, etc.) resulted in an administration willing to stifle both student and faculty comments in public. Sometimes, again, this feels like presidential hubris, as in the case of kicking the Society of Professional Journalists off campus for its critical remarks or taking a Title IX case to court rather than settling it.

Folks on the West Coast of the US generally have not heard of QU, and as you move east, in many cases they know us as a polling institute first, and college second.

As the relevance of universities is increasingly questioned, it’s also hard to establish the value of an undergraduate education at QU. That’s not to say it’s a poor education: I think the faculty serves students reasonably well. The question is whether it’s worth north of $200K. I suspect our tuition is slightly more than that of most private universities, though certainly not in the NYU/Sarah Lawrence range. (On the other hand, QU’s president is one of the 36 in the nation to receive a seven-figure compensation package–the only place where QU ranks in the top 36, I believe.) In many cases, parents are well able to pay the costs of QU, and perhaps because of location or some other determinant they feel the relative value of that money makes the tuition tenable. But I’ve talked to many students who leave our program with wholly unrealistic views of what they will be earning, and student loan debt that–without parental support–will be crippling.

In Sum

I guess what it really comes down to is that QU doesn’t allocate resources the way I would: money or institutional will. On the money side, I think they could do a lot more to support faculty, especially research. I suspect many faculty at many institutions feel this way, but if you look at things like office space, teaching loads, and general support for research on some of QU’s peer campuses, it becomes clear that this is not a priority for QU.

And then there is the culture bit and the lack of a shared, consistent mission. It’s not about messaging, it’s about a real sense of purpose. I think many among the faculty and staff at QU are pretty happy with the way the university already is. And as I’ve said, I think there are many reasons for them to be happy with it. But that also breeds a bit of complacency, and little real reason for change.

I still consider myself a friend of QU, and I see a great deal of potential, especially in the School of Communications. I suspect that in the long run it will continue to improve and will find its way toward a future that many at the university can get behind. In the shorter run, I’m off somewhere new, somewhere I get the feeling is already moving quickly. It’s a bit more risky in some ways, and moving is always hard, but I am eager to work in an institution that seems to share my interests and values more closely.

Buffet Evals https://alex.halavais.net/buffet-evals/ https://alex.halavais.net/buffet-evals/#respond Thu, 03 May 2012 03:16:06 +0000 http://alex.halavais.net/?p=3173 “Leon Rothberg, Ph.D., a 58-year-old professor of English Literature at Ohio State University, was shocked and saddened Monday after receiving a sub-par mid-semester evaluation from freshman student Chad Berner. The circles labeled 4 and 5 on the Scan-Tron form were predominantly filled in, placing Rothberg’s teaching skill in the ‘below average’ to ‘poor’ range.”

So begins an article in what has become one of the truthiest sources of news on the web. But it is no longer time for mid-semester evals. In most of the US, classes are wrapping up, and professors are chest-deep in grading. And the students–the students are also grading.

Few faculty are great fans of student evaluations, and I think with good reason. Even the best designed instruments–and few are well designed–treat the course like a marketing survey. How did you feel about the textbook that was chosen? Were the tests too hard? And tell us, were you entertained?

Were the student evals used for marketing, that would probably be OK. At a couple of the universities where I taught, evals were made publicly available, allowing students a glimpse of what to expect from a course or a professor. While that has its own problems, it’s not a bad use of the practice. It can also be helpful for a professor who is student-centered (and that should be all of us) and wants to consider this response when redesigning the course. I certainly have benefited from evaluations in that way.

Their primary importance on the university campus, however, is as a measure of teaching effectiveness. Often they are used as the main measure of such effectiveness–especially for tenure, and, as many universities incorporate more rigorous post-tenure evaluation, there as well.

Teaching to the Test

A former colleague, who shall remain nameless, noted that priming the student evals was actually pretty easily done, and that it started with the syllabus. You note why your text choice is appropriate, how you are making sure grading is fair, indicate the methods you use to be well organized and speak clearly, etc. Throughout the semester, you keep using the terms used on the evals to make clear how outstanding a professor you really are. While not all the students may fall for this, a good proportion would, he surmised.

(Yes, this faculty member had ridiculously good teaching evaluations. But from what I knew, he was also an outstanding teacher.)

Or you could just change your wardrobe. Or do one of a dozen other things the literature suggests improve student evaluations.

Or you could do what my car dealership does and prominently note that you are going to be surveyed, and that if you can’t answer “Excellent” to any item, you should please bring it to their attention so they can get to excellent. This verges on slimy, and I can imagine, in the final third of the semester, that if I said this it might even cross over into unethical. Of course, if I do the same for students–give them an opportunity to get to the A–it is called mastery learning, and can actually be a pretty effective use of formative assessment.

Or you could do what an Amazon seller recently did for me, and offer students $10 to remove any negative evaluations. But I think that clearly crosses the line, both in Amazon’s case and in the classroom. (That said, I have on one occasion had students fill out evals in a bar after buying them a pitcher of beer.)

It is perhaps a testament to the general character of the professoriate that in an environment where student evaluations have come to be disproportionately influential on our careers, such manipulation–if it occurs at all–is extremely rare.

It’s the nature of the beast, though: we focus on what is measured. If what is being measured is student attitudes toward the course and the professor, we will naturally focus on those attitudes. While such attitudes are related to the ability to learn new material, they are not equivalent.

Doctor Feelgood

Imagine a hospital that promoted doctors (or dismissed them) based largely on patient reviews. Some of you may be saying “that would be awesome.” Given the way many doctors relate to patients, I am right there with you. My current doctor, Ernest Young, actually takes time to talk to me, listens to me, and seems to care about my health, which makes me want to care about my health too. So, good. And frankly, I do think that student (and patient) evaluation serves an important role.

But–and mind you I really have no idea how hospitals evaluate their staff–I suspect there are other metrics involved. Probably some metrics we would prefer were not (how many patients the doctor sees in an hour) and some that we are happy about (how many patients manage to stay alive). As I type this, I strongly suspect that hospitals are not making use of these outcome measures, but I would be pleased to hear otherwise.

A hospital that promoted only doctors who made patients think they were doing better, and who made important medical decisions for them, and who fed them drugs on demand would be a not-so-great place to go to get well. Likewise, a university that promotes faculty who inflate grades, reduce workload to nil, and focus on entertainment to the exclusion of learning would also be a pretty bad place to spend four years.

If we are talking about teaching effectiveness, we should measure outcomes: do students walk out of the classroom knowing much more than they did when they walked in? And we may also want to measure performance: are professors following practices that we know promote learning? The worst people to determine these things: the legislature. The second worst: the students. The third worst: fellow faculty.
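One standard way to make “knowing much more than they did when they walked in” concrete is a pre-test/post-test comparison; the normalized gain popularized by Richard Hake in physics education research is a common version. Here is a minimal sketch in Python–the scores are invented for illustration, not data from any actual course:

# Hake's normalized gain: the fraction of the possible improvement a student
# actually realized between a pre-test and a post-test on the same material.
def normalized_gain(pre, post, max_score=100.0):
    if pre >= max_score:
        return 0.0  # already at the ceiling; no room left to improve
    return (post - pre) / (max_score - pre)

# Hypothetical (pre, post) scores for a handful of students
scores = [(40, 70), (55, 80), (62, 75)]
gains = [normalized_gain(pre, post) for pre, post in scores]
print(f"mean normalized gain: {sum(gains) / len(gains):.2f}")  # about 0.47

None of this settles who writes or administers the tests, but it is the kind of outcome measure that end-of-semester attitude surveys simply do not capture.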

Faculty should have their students evaluated by someone else. They should have their teaching performance peer reviewed–and not just by their departmental colleagues. And yes, well-designed student evaluations could remain a part of this picture, but they shouldn’t be the whole thing.

Buffet Evals

I would guess that 95% of my courses are in the top half on average evals, and that a slightly smaller percentage are in the top quarter. (At SUNY Buffalo, our means were reported against department, school, and university means, as well as weighted against our average grade in the course. Not the case at Quinnipiac.) So, my student evals tend not to suck, but there are also faculty who much more consistently get top marks. In some cases, this is because they are young, charming, and cool–three things I emphatically am not. But in many cases it is because they really care about teaching.

These are the people who need to lead reform of how teaching evaluations are used in tenure and promotion. It’s true, a lot of them probably like reading their own reviews, and probably agree with their students that they do, indeed, rock. But a fair number I’ve talked to recognize that these evals are given far more weight than they deserve. Right now, the most vocal opponents of student evaluations are those who are–both fairly and unfairly–consistently savaged by their students at the end of the semester.

We need those who have heart-stoppingly perfect evaluations to stand up and say that we need to not pay so much attention to evaluations. I’m not going to hold my breath on that one.

Short of this, we need to create systems of evaluating teaching that are at least reasonably easy and can begin to crowd out the student eval as the sole quantitative measure of teaching effectiveness.

Super PACs hurt the economy https://alex.halavais.net/super-pacs-hurt-the-economy/ https://alex.halavais.net/super-pacs-hurt-the-economy/#respond Wed, 14 Mar 2012 14:49:41 +0000 http://alex.halavais.net/?p=3129 There have been any number of criticisms of the Citizens United case and the Super PACs that have emerged as a result: they allow corporations and the rich to shape public debate and they provide no accountability, allowing for influence peddling and potential foreign influence. But I wonder if anyone has looked closely at their effect on the economy.

First, there is a reason PACs buy TV ads. It’s the same reason retailers do: they work. On Super Tuesday alone, GOP candidates spent just shy of $100 million, much of it on TV ads. It’s hard to know to what degree spending will accelerate during the general, but let’s say the total comes out to just short of $4 per household in the US. (That’s lower, by almost an order of magnitude, than some are predicting.) That’s not, in the whole scheme of things, that much money. It’s, say, a dozen B-1 bombers. It’s probably not much more than our daily burn in Afghanistan.
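Just to make that back-of-the-envelope arithmetic explicit, here is a quick sketch in Python. The household count is my own round figure (roughly 115 million US households), not something from the reporting, and the $4 figure is simply the guess above:

# Rough total TV ad spend implied by the $4-per-household guess
households = 115_000_000   # assumed: approximate number of US households
per_household = 4          # dollars per household, per the guess above
total = households * per_household
print(f"guessed total: ${total / 1e9:.2f} billion")        # about $0.46 billion

# "Almost an order of magnitude" more, as some forecasts would have it
print(f"high-end forecasts: roughly ${total * 10 / 1e9:.1f} billion")

The point of the sketch is only scale: even the low guess is a few hundred million dollars, concentrated in a handful of contested media markets.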

What I wonder, though, is how all this TV ad spending affects the cost of advertising. If we take a guess at the total spend on TV ads during the campaign, it will almost certainly outstrip the annual spending on television advertising for soda, for example. As a result, local TV advertising–particularly in contested markets–becomes more scarce, and prices go up.

Leaving aside whether we can spend our way out of the recession as consumers, it does seem like retail sales have an effect on the health of our economy. So it’s a double whammy. Some consumers are clearly donating to these super PACs–although it will be interesting to see how much of Obama’s ad buys are also being paid for by large donors this time around. And businesses are presumably donating millions of dollars to these funds. They are then faced with increased costs for TV ad buys–and probably mitigate this by buying fewer ads and spending more for the ones they do buy. This works its way into their product pricing structures. So the consumer donates to these PACs, then pays for the political TV ads they see through higher prices, on top of already paying for the retailers’ own ads. Although most Super PAC money is coming from Wall Street and various parts of the service industry rather than manufacturers or retailers, those donations also end up ultimately coming out of the consumer’s pocket.

Leaving aside the pernicious effect of election spending on public discourse, it’s a great way to put the brakes on economic recovery.

Does Mitt Romney Hate Noodles? https://alex.halavais.net/does-mitt-romney-hate-noodles/ https://alex.halavais.net/does-mitt-romney-hate-noodles/#comments Wed, 19 Dec 2007 21:30:31 +0000 http://alex.halavais.net/does-mitt-romney-hate-noodles/ I was watching my Sunday comedy program, which included an interview with Mitt Romney, who discussed his recent speech on “Freedom and Religion.” Romney said that America needed “morality and religion” though that religion was “of course, not a particular denomination.” Tim Russert questioned him on this, asking whether atheists had a place in American government, and Romney admitted that it was possible, on a person-by-person basis, for an atheist to have a moral code (unlike the blanket morality of those in organized religion, one supposes). He went on to say that

the, the founders of the nation, coming from different faiths and different persuasions, nonetheless all believed that the, the creator was an instrumental part of the founding of this nation. And I believe that that part of history should be taught, I believe that we should recognize the divine with everything from celebrations in the town square, with menorahs and nativity scenes, as well as in our history books, talking about the fact that the creators did believe in a fundamental sense of, of the divine. And, and recognizing that that gives us a moral code, a suggestion of what is right and wrong, that is–that is, in many respects, unique in the world.

I have to assume that he would be open to other things showing up in the town square, including seasonal icons from latter day religious tradition.

And here, of course, I am speaking of Pastafarianism. While I am not devout, I was proud to be one of the founding members of the First Buffalo Church of the Flying Spaghetti Monster, and yet, was chagrined to admit that I did not immediately know what would be appropriate to put alongside the menorah and nativity in the town square. Of course, in September I always celebrate the doctrinal Talk Like a Pirate Day, bringing a bit of diversity into the classroom in subtle ways, but can’t we follow in the footsteps of the popes, and bull ourselves a holiday to piggyback on Yule? I did some preliminary research, finding some information on Jólasveinarnir, The Yuletide Lads, and other alternative Santa Clauses, but no Pater Pasta. In the heartland, however, this battle is already in full force.

It’s bad enough that schools are banning our religious garb; it seems that universities are turning out to be equally bigoted when it comes to Pastafarian celebrations this time of year. Some students at Missouri State University attempted in a small way to celebrate the FSM:


The administration was not content, however, and has steadfastly refused reasonable requests by the active MSU Pastafarian community (including a large number of students and faculty) to celebrate the holidays with displays alongside other religious paraphernalia.

Meanwhile, at Michigan Tech, the MTU Pastafarians student group had threats posted on its door, resulting in a disciplinary hearing.

While I find little evidence of a “War on Christmas,” this seems to be a hard season to remain a devout follower of Its Noodly Appendage. A very happy holiday to you, however you choose to celebrate it, ramen.

Ask Alex: Getting a Communication Ph.D. https://alex.halavais.net/ask-alex-communication-graduate-school/ https://alex.halavais.net/ask-alex-communication-graduate-school/#comments Tue, 15 May 2007 04:03:57 +0000 http://alex.halavais.net/ask-alex-communication-graduate-school/ So, it’s that time of the year again, and the inevitable question comes from a few graduate students: Where’s a good place to get a communication Ph.D.?

Well, first of all, that’s probably the wrong question. The right question is: “Should I pursue a Ph.D.?” and the answer I will always give is “no.”

Should I go for a Ph.D.?

No. There are lots of good reasons not to pursue the doctoral degree:

1. People really won’t respect you more. Some folks actually do pursue a Ph.D. with the thought that they can then be called “Dr. X” (OK, maybe not Dr. X. Heck, it would be worth it if you could be called “Dr. X.” I mean they want “Dr.” in front of their own name.) I’ve talked to these people, and don’t understand it. There’s no special power a Ph.D. grants–it doesn’t certify you for much of anything, with the below exception. In other words, if you are doing the Ph.D. because you want the prestige, it’s really not worth the effort. Besides, this is America! No titles, remember? If you want the Dr., just use it; or, as a co-worker did, Senator.

2. You won’t make more money. At least not with a communication degree–it may be different with an engineering degree, for example. Someone is now sure to come up with a statistic that says that you make an extra million dollars in your lifetime with a Ph.D., but (a) it’s false causation and (b) you’ll spend that on therapists and paying off debts.

3. You’re really good at coursework, and so you think it’s the natural next step. Generally, it’s not. Particularly if you are in a program that is designed as a “terminal degree,” like the Informatics program at UB, or our MS program at Quinnipiac, you probably are not very well prepared to pursue the Ph.D. People have successfully moved on, but it isn’t a smooth transition. If you gain admittance, you’ll probably be scrambling to catch up with students who have been on the research path during their master’s programs. Moreover, although there is generally coursework at the doctoral level in US institutions, it isn’t the major part of the work of the degree. The Ph.D. is always a research degree–you are expected to come in and be an apprentice researcher fairly quickly, on top of your required coursework.

4. You want to be a college instructor, and you think this is where you learn to do it. I was actually lucky in that my program did talk a little bit about teaching, but that is certainly not the focus of a Ph.D. program anywhere; except, of course, in education programs. If you aren’t ready to teach after finishing your master’s degree, that isn’t going to change by the end of the Ph.D. You should already be a master of your field when you have the master’s degree in hand; the doctorate means that you have made a significant contribution to that field. Many doctoral programs graduate excellent researchers who would be horrible if unleashed on an undergraduate class.

Now, it’s true: it is increasingly the case that colleges and universities will only consider Ph.D.s for their teaching positions. But the problem is two-fold. First, if you are really primarily interested in teaching, you are going to be very frustrated spending 18 hours a day doing research for several years. As a result, you probably won’t be very good at it. Second, as noted below, you probably won’t be able to get a teaching job after all that anyway.

Dirty Ph.D. Program Secrets!

Still not convinced? OK, the two dirty secrets of doctoral education:

1. Many people don’t finish. It’s bad enough that you are going to be alienating your family and going into debt (and this assumes you aren’t paying tuition, just covering living expenses, etc.); on top of that, you may end up not finishing. The lucky people drop out in the first year. Many get through the coursework, only to be unable to complete general exams. A much larger number get through any required coursework and exams, but find themselves unable to complete the dissertation. If you don’t think you can write a 300-page book now, don’t expect that to magically change by the end of your program. There is a reason my university sent out “Ph.C.” (candidate) diplomas. A lot of people end up stuck indefinitely on the dissertation, and in at least some cases, this isn’t even their fault. Sometimes departmental politics or shifts in the field make completing a dissertation in your area impossible.

2. Of those who get the degree, only a small fraction actually get a job teaching in a college or university. An even smaller number end up teaching at an institution as good as the one they attended. Now, you may not want to do this and may have another target, which is fine. If you do want to teach, you should definitely have a strong “plan B.” Oh, and when I say teach, I mean anywhere. I have colleagues who are brighter and more accomplished than I am who are either unemployed or who are teaching under conditions they hate. A large number of doctorate-holding individuals are stuck in the perpetual hell of adjunct work, hoping one day to “make their break.” Just read through the archives of Invisible Adjunct to get a feel.

You have self-confidence, or you wouldn’t even be considering this. But be realistic about that self-confidence–it takes a lot to make even a minor splash. I know that the JD and MBA people will eat me alive for saying this, but there is usually some clear path out of the top programs for law, business, and medicine. Unless you are at the bottom of your class, you’re likely to get some job in your profession. The truth that schools won’t tell you is that even among the most elite programs, a tenure-track position is far from guaranteed. The majority of graduates go into something else. You would be surprised how many movers and baristas hold doctorates from top universities.

Not Dead Yet!

So, still here? Is there a good reason to pursue a doctorate? Yes, I think–and this is just my own opinion–that there are two good reasons. First, you love to do research. You aren’t just a curious person–everyone says they are a curious person–you live on curiosity and Top Ramen. You do not care particularly about being rich, but you want to be challenged every day. You are passionate about learning and helping others to learn. You will need that passion to sustain yourself through the idiocy, politics, and bureaucracy of the typical doctoral program. Doctoral programs virtually guarantee stress beyond what you have experienced before, which accounts for the strange bestiary that is the typical university faculty.

Second, you like spending most of your life around people who are smarter and more driven than you are. If you are used to being the smartest person in the room, get over it. (Contrariwise, if you think everyone who pursues a Ph.D. is brilliant, be prepared to be disabused of that notion. Many of the brightest people said “screw this” several paragraphs ago and are signing up for the GMAT/LSAT/MCAT as you are reading.) That was really important for me, because I am naturally both lazy and competitive. If there aren’t people around me doing really interesting stuff, I am less likely to be doing so. There was something really exciting to me about being in a room with people who were likely to change the world, and hoping that I could too.

Finding a program

So, now that you are sold on the idea of a doctorate, where’s the best place to go for one in communication? There isn’t a single answer to that question. As you will find, if you haven’t already in your coursework, there isn’t really a field of communication. Really, it’s more of a family of topical areas and approaches that gets bundled together under that name. Because it is one of the younger fields of study, what you find in one communication department is unlikely to be identical to what you will find in another. There are certain affinities among some programs, but there isn’t any clear leader.

The best way to find a program you would like to study in is to identify the dozen or so living researchers you would most like to be a research assistant for. Whose thinking really excites you? Now, it may be that their work on paper is a really poor representation of what they are like in person, but this will at least get you going down the right path. Honestly, if you can’t think of anyone you would get excited about working with, you have a lot more homework to do before you consider going on to a doctoral program.

You probably shouldn’t choose a program based on just that one person. Once you find where these folks are working, you should take a look at the rest of the faculty, and see whether there are other people you would like to work with there. This is pretty important, since you are likely to be taking classes with them, and one of them may end up being your advisor, depending on how the department assigns students to committees. Finally, if you can figure out who the students are, see if you like the kinds of research they are doing. Email some of them and ask about the department: current students are often the best resource for deciding whether this is the kind of place you want to go.

Set up a time to talk with the chair of the department and the faculty members you are most interested in. Yes, even (especially!) if the campus is in another part of the world. There is a good chance you will be relocating for graduate school, so you better find out if you like the city and the campus as well as the people. Equally importantly, although I don’t know of doctoral programs that explicitly interview candidates, by becoming a real person to the faculty, you are more likely to be in mind when they consider admissions and tuition awards.

I will reiterate: don’t go unless it is paid for. There are a handful of programs that do not award assistantships to new students, but most use the assistantships mainly, or even exclusively, as a recruitment tool. Don’t expect, in those cases, that you are going to show up, pay tuition for a year, and wow them into supporting you. Too many students do, and then find themselves in impossible financial binds and heartbreak.

But, you ask, isn’t there a ranking of doctoral programs? I would like to say “no,” but there is such a ranking. The National Communication Association does a reputational ranking of doctoral programs in a number of subfields. There are a couple of caveats to bear in mind. First, “reputation” doesn’t necessarily mean quality. If Princeton decided to offer a Communication Ph.D., it would quickly rise to the top of these lists, largely because of the name. That’s not to say that a Princeton Department of Communication would suck, just that the reputational measures might outstrip the reality of the program itself. The other piece of this is that the NCA does not represent all of communication. In fact, a lot of scholars in the field may choose the ICA as their primary affiliation, or IAMCR, for example. So the ICA people might have a slightly different take on the best schools.

Making the application

Once you have picked out five or seven schools that you think are worth applying to, spend some time working on the applications. It’s really hard to gauge what admissions committees will do with your application. A letter of recommendation from a colleague who is well known in the field might go a long way. Stellar GREs might attract attention. While good grades are expected, they are more likely to look at the courses you took to decide whether you have the appropriate preparation for a doctorate. But most important, for many schools, is a statement of purpose that shows that you have a clear expectation for your future as a researcher, and that you know what their program can offer you. It is pretty common that students receive admission and an assistantship from one of their most desired schools only to be rejected by one of their less interesting picks. Admission to doctoral programs tends to be very idiosyncratic.

I would strongly recommend against limiting yourself geographically. I have to admit that the city of Seattle was a major part of the reason I ended up at the University of Washington, and that worked out well for me. Had I stayed in San Diego, I would have done fine with UCSD. Both programs are of very high quality, and also happen to be in great cities. But if you are limiting yourself to a local university, and that university is not among the top in the US, consider seriously whether it is worth your time and effort to commit to a Ph.D. there. Without naming names, there are Ph.D. programs that really are sub-par. There is an unfortunate amount of snobbery and nose-turning as it is, often at cross purposes. Put someone from Columbia, Wisconsin, and Austin in the same room, and there is a chance all three will consider themselves to be at the top of the food chain. If you are completing a Ph.D. at Podunk U., you may be limiting your possibilities. Since only fools do the Ph.D. more than once, do you really want to put that effort into a university that has an undistinguished program?

Please don’t take this the wrong way. I loved graduate school. I’ve talked to many successful researchers who hated it, but I wouldn’t have had it any other way. I would have loved it even if it turned out that I didn’t get the chance to work in academia, and I’m really happy that I do. But doctoral programs often share their Kool-Aid widely, and are lost in a haze of self-appreciation. Don’t be afraid to ask the tough questions: What percentage of people finish? What percentage of those get tenure-track jobs? What do the others end up doing? Are the students happy? Are the faculty happy? Is it a supportive environment? This will be your entire life for a good number of years; you should go in with your eyes wide open.

Update: Also, don’t even think about a Ph.D. in physics :).

Sequel, then original, then book https://alex.halavais.net/sequel-then-original-then-book/ https://alex.halavais.net/sequel-then-original-then-book/#respond Sun, 16 Jun 2002 18:45:28 +0000 /?p=8 This is the review I submitted to IMDB for the Bourne Identity. The short version: rental, unless you’ve never seen the original film or read the book and are desperate :).

Sequel, then original, then book. That is the recommended order of exposure for any film, and this is no exception. I’ve never read the book, but did see the original movie, which was decent but not spectacular. When I heard about the remake, and then saw previews, I was pretty excited about seeing it. It looked as though they had significantly improved the action sequences, something that I felt the original film was lacking.

I was right about the action sequences. The fights are well choreographed, and the chase scene–despite the obvious tie-in with the new Mini campaign in the U.S.–was very well done. Had they just made these changes, I think the film would have been truly excellent.

But for reasons unclear (perhaps to draw a wider audience, or for easier international distribution), they dumbed down the plot significantly. The characters are made unidimensional (or non-dimensional, as in the absolute waste they made of Julia Stiles’s involvement), and the changes to the plot move it from Ronin to Mission Impossible: both required suspension of disbelief, but the latter also required suspension of thinking. Why is it that Hollywood refuses to make intelligent action flicks?

This one is worth a rental, if only for the action sequences. If you never read the book or saw the original, take a chance on it, I guess… 5/10.

Life Imitating Art? https://alex.halavais.net/life-immitating-art/ https://alex.halavais.net/life-immitating-art/#respond Tue, 11 Jun 2002 10:27:44 +0000 /?p=7 What would you do if you were accused of a murder, you had not committed… yet?
(tagline for the new Fox film Minority Report)

“Jose Padilla, the American citizen accused of plotting to detonate a radioactive ‘dirty bomb’ in the U.S., is just one of many ‘would-be killers’ the United States has arrested, President Bush said Tuesday—and he promised there will be many more to come.”
(Fox News Website, today)

Creating Buffalo https://alex.halavais.net/creating-buffalo/ https://alex.halavais.net/creating-buffalo/#respond Sat, 08 Jun 2002 14:07:52 +0000 /?p=6 An article in the Buffalo News today discusses how Richard Florida’s ideas about the creative class might affect Buffalo. It’s a little funny that this has been so embraced here, given that Buffalo (and it seems Florida was once a prof here) ends up the fourth from the bottom of the list of creative cities.

The idea, however, is sound. What does Buffalo have to offer? Its culture! And given that a lot of cyberculture and technoculture has the neo-industrial feel (e.g., industrial/goth/tech confluence), one would think the architecture and the kind of pastiche of the area would appeal–in the same way that it does in Seattle and in San Fran. With a bit of work, the fact that it is a decaying city in many respects–and I think that it is blind not to recognize that there is atrophy–could be leveraged as an advantage.

The article goes on to quote Giambra, who thinks the idea of drawing creative people is fine, but that we have to have jobs to do that. The truth is that this is, frankly, wrong. Creative people go to a city not because of its economic outlook, but because of its cultural currency. Why else would unemployed artists move to New York City, where costs are higher than anywhere else in the US, or busk in the Boston subway? A small seed of creative entrepreneurs would provide a basis for such jobs. You cannot create jobs by fiat.

When I interviewed for the position here at Buffalo, someone mentioned that the city had an amazingly high “lifestyle-per-dollar,” a phrase that I found absurd on the face of it. I had a very difficult time keeping a straight face. Yet others seem to find nothing strange about such a calculation. It’s true that you can buy a huge Victorian here for far less than a two-bedroom condo in San Fran or Seattle. But lifestyle and dollars need not have a very close relationship. You can’t measure a city’s cultural capital in dollars.

I went to Hank Bromley’s house for dinner last night. It was really nice to see people who cared about the city and wanted to do something about it. But I think that what needs to happen is something more akin to a branding campaign. We need to be telling ourselves and the world what Buffalo is. And some form of vision of what it can be should come out of that.

Buffalo does not need “jobs”–or at least not only that. Florida’s observation that tolerance of diversity is necessary is also important. No, not tolerance, but celebration. Elmwood Avenue could start to do that. But we need rock stars. We need people with vision and with the character to strive for that vision.

I really mean that. Neither Kurt Cobain nor Bill Gates “made” Seattle, but they introduced the city to a new generation. It needn’t be a rock star or a computer geek, but they wouldn’t be a bad first step.

I am not sure a casino would kill the city. I am sure that it is a stupid way to make use of our financial and organizational resources. A failed casino effort–and this seems to be the destiny of a project that entails significant negative externalities and allows a large part of the profits to leave the city–will only decrease citizens’ trust in government.

Now is the time for a cultural movement of the streets and of the office buildings. What concrete :) steps can we take to make that happen?
