Not Everyone Lives Like Me

What a revelation.

When I first started this blog, back when it was on blogspot and it was pseudonymous, no one had rules about what you should and shouldn’t put online. It was still early days. We experimented, we reflected, we discussed. I remember being told that I shouldn’t mention that my doctor had put me on an anti-depressant when I found myself unable to get excited about the PhD program I was in. You can talk about a broken leg, but not a set of broken synapses. At the time I thought: why shouldn’t I write about things I don’t mind others knowing? Each person needs to determine their own comfort level.

Time went on. Everyone was still talking about it (and, I suppose, they still are, aren’t they). Yes, anything you publish online can be seen by in-laws, employers, potential employers, potential dates, etc. But what if you take that into account and think: yes, well, I struggled, I survived; why not talk about it? Isn’t it okay, if you accept that someone might take issue with you one day? Or if you know that anyone who WOULD take issue with you over it isn’t someone you’d want to date/spend time with/work for?

I have deliberately held things back from this blog, many times, with those things in mind. Anything I wasn’t sure I really wanted my real name associated with, I didn’t put here. And when I was having biopsies and was scared out of my skull about my health, I shut up on here. That was purely out of fear and denial.

I’ve been blogging for 9 years now, and I’m fairly comfortable with what I’m willing to put on my blog. When I started working, I wondered about what was appropriate, but nearly four years in, I think I’ve mostly got a grip on that as well.

I’m not used to people being uncomfortable with it.

Most of the people I’m close to have had blogs for years and think nothing of it. When I meet up with people, they are often “my kind”, and are hip to the blog thing. I mean, so hip it’s square. Blogs are dead. Jason and I finally agree: yes, blogs are dead, because blogs are everywhere. Everyone has one, so yeah, their novelty is gone.

But not everyone is in the same place as me. Not everyone is comfortable looking at people’s lives online. Once in a while someone would tell me that they felt like a voyeur when reading blogs, but I’ve never understood that. Anyone with a blog knows someone might read it. There’s no reason to feel secretive about it.

But that’s my realization today: not everyone has become accustomed to the fact that, with the internet, anyone can create content at the drop of a hat. Inner dialogues now have a platform on places like twitter and facebook. Our insides are coming out.

I’m used to it. I love it. I’m comfortable with it. I like to engage with the world around me on a deep level. I don’t particularly do well having casual friends; I have intense friendships, or nothing. So this user-generated web is absolutely up my alley. Why only know the surface when you can dig deeper?

But that’s not everyone’s perspective. I know, not a revelation to you. It’s just a reminder to me. My way is not the only way, nor is it the default, or probably even that common.

So I shouldn’t be surprised if my web presence makes people uncomfortable. No one needs to consume my productions if they don’t want to. I’m so used to being half online all the time that I think of my web presence as being half my identity. It feels completely natural to me.



Based on the previous post, I am seriously considering a day of lifecasting with Jason and Alex. Not sure about the logistics at all let alone a date (Jason prefers summer), but I think it would be an interesting challenge. In sum: we record as much as possible of our lives throughout a single day, in as many media as possible.

Current thoughts: photographs documenting where we are, what we look like; video documenting us interacting with our environments, pets, spouses, children, and possibly some video updates of us describing what we’re doing and what we’re thinking about; uploaded documents that we’re working on, email we’re sending (where feasible); playlists of what we’re listening to, lists of any movies/tv we watch; IM conversations; snippets of audio of things like our alarms going off, breakfast being cooked, etc.; descriptions and photos of any food we eat or drinks we drink; descriptions and data of basic things like maps of the area and weather reports. If we really want to get serious, we could add in things like body temperature and whatnot too. Full documentation.

At the moment I think we should set up some separate place for all this information to be stored. The first thing that comes to mind is that we set up a blog with a lot of bells and whistles, and everyone who’s participating gets their own category. So you could see it all at once, or by person. I’d want to use twitter, but I’d want tweets to show up on the blog as well, in between the blog posts, ideally in a different colour. Marked off, so to speak. Also, I wouldn’t want to use my normal twitter account for all this. I bet that would just annoy the hell out of people. Not sure if a blog will work as the basic platform, though. We still need to think that through. Jason may have a point about waiting a bit.
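Interleaving tweets between blog posts is really just a merge-by-timestamp problem. A minimal sketch in Python of that merge logic, with everything (the `Item` type, the field names) entirely hypothetical rather than any real blog or twitter API:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Item:
    when: datetime
    source: str  # "blog" or "twitter", so a template could colour tweets differently
    author: str  # lets readers view the stream all at once, or by person
    text: str

def merged_stream(posts, tweets):
    """Interleave blog posts and tweets into one chronological stream."""
    return sorted(posts + tweets, key=lambda item: item.when)
```

The actual platform question (whether a blog can pull tweets in at all) is still open; this only shows that the presentation idea is simple once both feeds are available.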

The general point of this exercise, as I currently understand it, is to demonstrate how much “information” we can create on a regular basis, turn it into digital, archivable material, and to force the question about how useful it really is. I’d also like to see for myself just what is and is not comfortable to reveal. Some obvious questions immediately spring to mind: can I ethically copy my email to the project? (As long as someone else’s email doesn’t show up as well? Can I ethically, or legally, make someone else’s email, addressed to me, publicly available? I suspect that would fall outside the scope of the project.) Will I modulate my behaviour because of how I want to be seen? Will I alter my behaviour because I know everything is being recorded? Is the concept of perpetual web archiving an influencing factor in what I’m prepared to share online? Does it stifle my communication? Does it inherently alter the nature of the information online? Traditional media certainly is shaped by its storage medium; I can’t imagine this would be any different. More than anything I’d worry that I’m being boring; will I spend all my time trying to be as witty and entertaining as possible? How does the act of archiving become part of the material? I’m sure there are many more questions; these are just the ones at the top of my mind.

I think before we really get started I’ll have a look at lifecasting as it currently exists and see what I can learn from it. I don’t really want to do a life stream of video for archive, because the sheer size of the file a full day of video would produce makes me queasy. We could do ephemeral live streaming (I have no problem with that), but that sort of defeats the purpose. More investigation on this matter is required.

Anyone else interested in participating in this warped little experiment? It’s just one day. I think the reflection on the experience will be worthwhile. We might even have to write it up. We have lots of time to prepare. I think we have a lot of sorting out to do before we can really go forward. We can get together and develop some basic policy around how we’ll manage it. Jason’s probably right about the summer. It will probably take that long to sort out the details.

You in? Come on, it will be fun.

The Plight of Future Historians

Today, the Guardian warns:

“Too many of us suffer from a condition that is going to leave our grandchildren bereft,” Brindley states. “I call it personal digital disorder. Think of those thousands of digital photographs that lie hidden on our computers. Few store them, so those who come after us will not be able to look at them. It’s tragic.”

She believes similar gaps could appear in the national memory, pointing out that, contrary to popular assumption, internet companies such as Google are not collecting and archiving material of this type. It is left instead to the libraries and archives which have been gathering books, periodicals, newspapers and recordings for centuries. With an interim report from communications minister Lord Carter on the future of digital Britain imminent, Brindley makes the case for the British Library as the repository that will ensure emails and websites are preserved as reliably as manuscripts and books.

I don’t have a lot of sympathy for this imaginary plight of future historians, in spite of being a librarian. And it’s not because I don’t see the value in content that’s on the web. There are two sides to this question that I take issue with.

First: “everything should be archived”. This is simply impossible, and it misunderstands what the internet is. If you understand the internet as a vast publication domain, where things are published every day that just don’t happen to be books, then this desire to archive it all makes sense. But is the stuff of the internet really published? Well, what does “published” really mean?

To be honest, I think the term has no meaning anymore. At one point, “published” meant that a whole team of people thought what you wrote was worth producing, selling, and storing. It comes with a sense of authority, a kind of title. It’s a way we divide the masses into those we want to listen to and those we don’t, in many different arenas. It connotes a sense of value (to someone, at least). Many people object to the idea that there’s value of any kind on the wild open internet, because just anyone can “publish”. I learned in my reference class at library school that one should always check the author of a book to see who they are and what institution they’re associated with before taking them seriously; if you fall outside our institutions, why, surely you have nothing of value to say, and you’re probably lying! Wikipedia: case in point. We have our ways to determine whether we ought to consider what you’re saying not based on the content, but on who and what you are. Apparently this protects us from ever having to have critical reading skills. We are afraid of being duped, so we cling to our social structures.

So many people just turn that “publish” definition on its head and say everything on the internet is “published”, everyone has a pulpit, everyone can be heard in the same way. I object to this as well. Turning an ineffective idea upside down doesn’t get us any closer to a useful definition of a term, or a practice.

Currently, this is how I define “publication”: blocks of text that a company has vetted and determined to be sellable to whatever audience the company serves. This holds for fiction, for academic work, etc.

Is content on the web “published”? What does that even mean? I think we start shifting to turn that meaning into “available”. If I write something and post it online, it’s available to anyone who wants to see it, but it’s not “published” in any traditional sense. If I take it down, does it become unpublished? Can I only unpublish if I get to it before it gets cached by anyone’s browsers, before Google gets to it? What if I post something online, but no search engine ever finds it and no one ever visits the page? Was it published then? If I put something online but lock it up and let no one see it, is it published?

I think we need a more sophisticated conception of publication to fully incorporate the way we use and interact with the web. I don’t think the traditional notion is helpful, and I think it presumes a kind of static life for web content that just isn’t there. Web content is read/write. It’s editable, it’s alterable. Rather than dislike that about the content, we should encourage and celebrate that. That’s what’s great about it.

There has always been ephemera. Most of it has been lost. Is that sad? I suppose so. As a (former) historian-in-training, I would have loved to get my hands on the ephemera of early modern women’s lives. I would love to know more about them, more about what drove them, what their lives were like. But I don’t feel like I’m owed that information. Ephemera is what fills our lives; when that ephemera becomes digital, we need to come to terms with our own privacy. Just because you can record and store things doesn’t mean you should.

And this comes to the heart of the matter, the second element of the desire to archive everything that irks me. The common statement is that we are producing more information now than ever before, and this information needs archiving. The reality is this: we are not producing “more information” per capita. We simply are not; I refuse to believe that. Medieval people swam in seas of information much as we do; it’s just that the vast majority of it was oral, or otherwise unstorable (for them). These are people who believed that reading itself was a group event; they couldn’t read without speaking aloud. (Don’t be shy if you move your lips while reading; it’s a noble tradition!) Reading and listening were a pair. In our history we just stored more of that information in our brains and less of it in portable media. If you think surviving in a medieval village required no information, consider how many things you’d need to know how to do, how many separate “trades” a medieval woman would need to be an expert in just to feed, clothe, and sustain her family. Did she have “less” information? She certainly knew her neighbours better. She knew the details of other people’s lives, from start to finish. She knew her bible without ever having looked at one. Her wikipedia was inside her own head.

Today we have stopped using our brains for storage and started using them for processing power instead. Not better or worse, just different. We use media to store our knowledge and information rather than remembering it. So of course there appears to be more information. Because we keep dumping it outside ourselves, and everyone’s doing it.

Not to say that a complete archive of everyone’s ephemera, every thought, detail, bit of reference material ever produced by a person throughout their life wouldn’t make interesting history. I think it would, but that’s not what we think libraries are really for. We do generally respect a certain level of privacy. It would be a neat project for someone out there to decide to archive absolutely everything about themselves for a year of their lives and submit that to an archive. Temperature, diet, thoughts, recordings of conversations, television programs watched, books read, everything. If you want to harvest everything on the web, then you might as well use all those security cameras out there to literally record everything that goes on, forever, and store that in the library for future historians. Set up microphones on the street corners, in homes, in classrooms, submit recordings to the library. A complete record of food bought and consumed. Everything. That’s not what we consider “published”, no matter how public any of it is. We draw the line. Somehow if it’s in writing it’s fair game.

But that’s not what people are generally talking about when they talk about “archiving information”. I know this is true because the article ends with this:

“On the other hand, we’re producing much more information these days than we used to, and not all of it is necessary. Do we want to keep the Twitter account of Stephen Fry or some of the marginalia around the edges of the Sydney Olympics? I don’t think we necessarily do.”

There’s “good” information and then this other, random ephemera. I will bet you that Stephen Fry’s twitter feed will be of more interest to these future historians than a record of the official Sydney Olympics webpage. And that’s the other side of this argument.

This isn’t about preserving information for those sacred future historians. This is about making sure the future sees us the way we want to be seen; not mired in debates about Survivor, or writing stacks and stacks of Harry Potter slash fanfiction, or coming up with captions for LOLcats. Not twitter, because that is too silly, but serious websites, like the whitehouse’s. We’re trying to shape the way the future sees us, and we want to be seen in a particular light.

I object to that process.

Why I like my Netbook

I’ve been puzzling over this, because I love my netbook (eeePC) more than I expected to. Now, I do like some gadgets, but only when I can see an application for them in my life. My first gadget was a second generation ipod. The moment the ipod came out I knew that was the gadget for me. Until then I would open up SoundJam on my clamshell ibook on the bus between Toronto and Guelph, stick in my earphones, and then close my ibook with my thumb inside it to keep it from turning off, and listen to my tunes. Ipods had moved into their second generation by the time I found the cash to get myself one. Now I have a cellphone as well, but I didn’t really feel that connected to it until I discovered text messaging. Now text is what I mostly use it for (other than calling my mom). It’s basically my mobile AIM client. I don’t have an iphone (data is way too expensive in Canada to make that worth it for me). I have a PDA, given to me by my employer, but I don’t use it anymore.

So why do I love the netbook so much?

It really turned my head around. Using it made me see one possible future for computing, and I’m intrigued. It’s a fairly powerful little creature, with a gig of RAM (could be better, true, but not bad), but without much storage capacity. It has 16 gig of storage space, which is more than twice what my first ibook had, to be honest. But in 2009, 16 gig is smaller than my current ipod’s capacity. So I can’t keep my tunes on it, I can’t put movies on it. I can’t put tons of software on it, either. It’s not exactly a digital “home”.

But then, what if things turn around and we use less and less software client-side? I can use google docs right now for all that Word does. (My eee, running Ubuntu, has Open Office on it, however.) I can use splashup or others to edit images without a client. I can use meebo for online IM. I love twitter, and I can get a client for that on my netbook, but why bother? The online interface is pretty simple and easy. I don’t really need a mail client, since both of my main email accounts (personal and work) have decent web clients. What software on my computer do I really need?

As for storage: both my dad and Jeremy taught me important lessons in the last couple of months. I gave my dad a digital photo frame, and it accepts SD cards as well as USB flash drives. Why would he even put his pictures on his ibook? My netbook allows me to upload pictures to flickr directly off an SD card. Given how cheap SD cards are getting, my dad could buy new ones for each trip he goes on. What used to be a transient storage method (that cost serious dollars) is now so cheap you could just leave the data on them. He could have 16 gigs per trip (twice a year) and just store the cards. That kind of storage capacity is just never going to be feasible inside a computer. Talk about extendable.

Jeremy bought a 64 gig flash drive to store media on. It cost him $100. I paid more than that two years ago for my 1 gig SD card. My netbook has three USB ports. If I bought three 64 gig flash drives and plugged them in, my netbook would have more accessible storage capacity than my current macbook. If I bought as many 64 gig flash drives as I needed to partition my data, my netbook would have effectively unlimited storage capacity. I could keep all my tunes on one drive, movies and TV on another, work docs and software on another, etc. I don’t want to have to open up my computer to add more storage. I’d like to be able to just plug it in. The capacity of those flash drives is only going to go up; I bet my netbook will have more storage capacity than my work and home computers put together pretty soon.

My netbook makes me think about a world where my computer is just a portal to other things, not a location in itself. Any computer can do that, sure; the netbook is just more upfront about it.

Also: my netbook fits in my purse. Its low profile makes it perfect for use while sitting in cramped economy seats on overnight flights (Jeremy and I watched a lot of British television while on our overnight flight). I wouldn’t have to turn sideways to use it on a greyhound. It’s perfect for taking minutes, mostly because it’s so small that typing on it doesn’t hide you behind a screen. It’s great for the bar, which provides free wireless to patrons. I can use it and still have room for my meal. I can sit in a crowded auditorium and tweet about the keynote I’m hearing. I can connect with other conference goers without having to carry a whole computer with me. And if something terrible happens and it breaks? I’m out 300 bucks, not 1400. And because I don’t store anything on it directly, I wouldn’t lose any data. It’s a sturdy thing too, since it’s all flash memory and no moving parts.

It’s a form of casual computing that I really like. It’s the kind of gadget I would take with me while wandering around town, in case I wanted to stop and look something up, or blog something, or get in touch with someone via email or IM. It’s perfect for conferences. Why carry your whole life with you when you can just bring a relatively cheap little portal instead?

The small screen: completely not a problem. I thought it would be, but I adjusted to it really fast. I think Jeremy did too. When I returned to my macbook, it felt bloody HUGE. We’re getting spoiled by huge screens. There’s a time and a place for them, sure, but is that all the time?

The small keyboard: takes some getting used to, but I like it. I don’t have huge hands, though. (I don’t have small hands either.) Jeremy, I think, struggles with it more, but I can type pretty well on the reduced QWERTY. It’s just a matter of getting used to a new keyboard. But I don’t think I would suggest that it’s a keyboard to do all your writing on. It’s more a casual keyboard. This presumes that people have the cash to have more than one computer (something with a bigger screen and a big keyboard, plus this little guy), but to me, the netbook is a nice addition to my computing family. I suspect I won’t be traveling with my macbook as much as I used to.

Anyone in the market for a beautiful black leather computer bag? I don’t think I’ll be needing it anymore.

Use of Video

I was launched awake at 4:30am this morning thinking about something I probably won’t be able to approach in the next six months, possibly not even within the next year. Or ever. And yet.

I’m on a small team at work set on getting a brand new website. No facelift; something totally new. We’ve opened up the floodgates and are interested in anything we could do with a good web presence. Mostly I’m dreaming up more interactive things, which is a bit of a pipe dream. Library websites are usually not interactive on the level that I’m thinking about, but I’m still dreaming about it.

So this morning, out of nowhere, a very simple idea pings me and throws me out of bed. Virtual building. Videos. Navigating services and resources.

I’ve been talking about building a replica of our library in Second Life for some time. I want to do it less to get people interested in virtual worlds or in Second Life in particular, and more as an interesting way to think and talk about the building and its purpose. I don’t want to lure people into Second Life (one of my pet peeves: people judging projects in Second Life by how many people who experience it stick around in-world). I’d rather they glean what they need from the experience and move forward in whatever way makes the most sense to them.

I want to have it more like an exhibit, a thing you look at and interact with in public places.

But then I was thinking: it would be dead easy, once you have such a replica built, to create short videos about how to get places. For instance: on our new website, we will probably have a section on reference services. Well, why not have a short video that shows you how to find the reference staff? It’s not exactly crystal clear in the building itself. Or how to find the technology centre, or the smart classroom, or the finance learning centre. Or places for quiet study. Or places for group work. Bathrooms. Gosh, anything! Not a map: we already have those. But actually watching a person do it. Going down the stairs, turning right after the elevators, choosing the door on the left rather than the one on the right. Going up the stairs, turning left at the water fountain. The details that non-map people use to navigate their world.

Well, that’s not the idea that woke me up. The idea that woke me up was about videos that create themselves. I don’t know much about video, but I presume that’s not impossible; a video that is generated from pieces of existing video and strung together based on the circumstances of a particular set of variables. Does this take forever for a system to accomplish? What woke me up was this: wouldn’t it be awesome if you did a search for a book or journal, and the system showed you a video of an avatar walking up to the stacks and moving toward exactly where that book should be? If we had RFID on all the books this would be even more precise, but we should be able to roughly guess where a book (that isn’t checked out) ought to be. To the shelf, at least. And I got thinking about it because I was thinking about mobile devices, and having such a video play on a mobile device to help you navigate the stacks. A little bandwidth-heavy, but it was just a half-awake sort of thought.
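The stitching idea doesn’t need anything fancy to prototype: a rule could map a shelf location to a sequence of pre-recorded clips, played back to back. A minimal sketch of that rule in Python, where all the clip names are hypothetical and a real system would look the floor and call-number range up in the catalogue:

```python
def clips_for_shelf(floor, range_label):
    """Choose a sequence of pre-recorded clips that, played in order,
    walk a viewer from the entrance to a shelf. Clip names here are
    invented placeholders; floor and range_label stand in for values
    a catalogue search would supply."""
    sequence = ["entrance.mp4"]
    if floor > 1:
        # Only include the stairs segment when the shelf isn't on the ground floor.
        sequence.append(f"stairs-to-floor-{floor}.mp4")
    sequence.append(f"floor-{floor}-stacks.mp4")
    sequence.append(f"range-{range_label}.mp4")
    return sequence
```

Assembling the chosen clips into a single playable video (or just playing them in sequence on a mobile device) is a separate problem, but selecting the pieces is cheap.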

Gift Economies and Librarian Blogs

I’ve been turning over the idea of gift economies and the internet for some time now. For me it started with Henry Jenkins’ keynote at Internet Research 8 in Vancouver, when he suggested that fans who produce popular product should be paid by the company that owns the copyright. My gut turned sideways and I nearly shouted it: NO. NO NO NO. It registered at the top of the horribly wrong meter.

The more I thought about it, and examined my violent gut reaction, I started to think that adding money to the equation goes against the natural economy of fandom cultures. I’m pretty firmly convinced that fandoms revolve around gift economies, where fans create product that other fans consume, and the consumers are required to pay back the gift by providing feedback, linking others to the product, engaging in commentary about the product, or other fandom behaviours. I hesitate to say it, but another payback activity is deference. I shouldn’t shy away from it. It’s true. There are some fans who are seen to give more to the community than any individual can properly pay back, and thus resentments and frustrations are born. This is exactly gift economy theory, so I’m fairly certain it fits.

So my own reaction at the idea of adding money to the mix is justified; it’s the wrong kind of economy. It would swing the balance. It would increase resentment a million fold, because the people who get paid for their fandom production would become completely unpayable by fandom standards, and would be seen as stooges of the original producer. Sellouts. No longer fully part of the community. Untrustable. No spreading the wealth; any fandom creation is a product of the community, with inspiration and ideas from the community, built on the scaffold of commentary and conversation, beta readers, donations of art, video, songs, fandom trends and ideas, and communal construction of character interpretation. How can one person gain reward from something that is, at its heart, entirely dependent on the community?

So that said, I think I’m seeing the same thing happening in the librarian blogosphere, and I find it interesting. The Annoyed Librarian kept an anonymous blog ranting about librarianship. It was funny and wry and I don’t remember it being too terribly controversial in its blogspot form. People might have disagreed with her approach, but it was just one anonymous blog. There are many more named blogs to read.

But then Library Journal moved the Annoyed Librarian over to their website, and paid her to write her rants. Now she’s official, she’s part of the machine, and getting paid to do it. Perhaps I wasn’t paying enough attention to the blogspot blog and its comments, but I think there’s a marked difference in the kind of comments she gets.

A Selection:
Since I am an Annoyed Librarian too, do I get a cut of the profits?
Rehashing old posts is the best you can do? Couldn’t you have just said this in a comment on the original post? How about some original material? I guess the AL cheerleaders are happy so that’s all that matters.
If you like light and fluffy posts, you’re in the right place. Not much substance here so far.

Generally speaking, librarians don’t comment like this on non-profit blogs. Now that the Annoyed Librarian is being paid for her trouble, that changes things. And some of her responses won’t help: when her attempt at humour is criticized, the Annoyed Librarian says this:

I don’t need Comedy Central, I’ve got LJ paying me to write this stuff.

And, the post that prompted me to write this post:

Set a date, tell your overlordier, plan a big finale, whatever you like, but give it up. Soon. Because the joke’s been played, we’ve all been had, you’ve picked up a few pennies, and now the joke’s just going to get old. Fast. And you know I know you know that.

I want you to hit it and quit. Can you hit it and quit?

In a world where librarians get book deals and we actually do get paid to do the work we write about, I was a bit surprised to see what I’m used to seeing in fandoms happening in the librarian blog world. But maybe it’s not fandom that generates a gift economy; maybe it’s something inherent in online communities generally. (Could that be so?) Apparently, we librarian bloggers also understand our blogs to be gifts to the community rather than something that ought to be remunerated financially. People are feeling skimmed off for cash. The understanding seems to be: you wouldn’t exist without us. If you get paid for what you do, you’re using us for your own profit. And you will pay our price for that.

I wanted to think about it in terms of fandoms and fandom culture, but maybe it’s much broader than that.

R.I.P., Lively

Dear Google,

I’m sorry to hear about Lively. I guess you didn’t get the response you expected. But you know, it was a good idea. I love the idea of the same avatar turning up on many different pages, a representation of me that moves with me from web location to web location. It’s like an ID, but with features and motion. You were on to something there. It’s not your fault that people can’t figure out how to use it. This is always the way with things. When cool new apps appear on the horizon, everyone says: “Well, what’s it for?” People as a rule aren’t terribly imaginative.

Anyway, I’m sorry it didn’t get the reception you expected. I hope you’re not giving up on virtual worlds entirely. So many people are right now. Every time I turn around someone tells me how the concept is dead and no one wants to go near it. Why is this? We haven’t even BEGUN to scratch the surface of what we could do with virtual worlds. Every once in a while you see something amazing blossom out and people are stunned. I guess we just need a few more blooms to get people’s imaginations stirred.

I mean really; how many times have blogs been dead? And how many blogs are there now? Sheesh.

I hope you have a little party/wake in honour of Lively’s passing. I would go.

Love always,


What I learned (and thought about) in Copenhagen

I tried to think of a way to present what I learned via Internet Research 9 in Copenhagen, but I’m still heavily jet lagged. So I’m going to present it in discrete chunks.

Work and Play
There are certain ideas and words that trigger a very serious gut reaction in me. I really appreciate these conferences so I can sit with those feelings and talk with others about them to see if I need to fight with my gut or against it. One of those triggers went off during the pre-conference workshop when we talked about the terms “game”, “play”, “recreation”, and “leisure”.

First, game: this came up in the perennial (and yawn-inducing) question about whether or not Second Life is a game. In my opinion, Second Life is a game engine with the game pulled out, just like the MOOs before it. But the term came up again. My answer is the same: no. It’s not a game.

“What’s wrong with play?”

No no no, “game” does not mean “play”, and “play” does not mean “game”. I have no objection to games, but turning all play into a game is a dangerous slide in terminology. I’ve read Julian Dibbell’s fantastic book Play Money and I already know that there’s a difference between play and games. You may “play a game”, but play is much more than a game. A game has rules and outcomes; play can be just about anything that’s fun. Julian Dibbell notes that there are always elements of play in work, and those are the most productive times across the board. He also notes that there is a lot of work in games, so the classic allocation of “play” behaviour to games alone is a mistake. No. Just because it’s play doesn’t make it a game. And the same goes for “leisure” and “recreation”. Limiting our days to “work” and “non-work/fun” portions makes my skin crawl. The only distinction there is that one is rewarded financially and (presumably) the other is not, and I’m not sure I’m ready to let capitalism dictate the basic terminology of my life. There are so many areas where I want to break down the false dichotomy between work and play for the sake of my own sanity that I just can’t get into a worldview where fun is a thing that happens only when not working. I must back away slowly from that entire set of terminology.

But the conversation is important. Play has value in education, and needs to be understood that way. Working with social networks and technology on a full time (more than full time) basis, I run into a lot of people who have problems with people “playing” or having fun at work or in school. So one of my goals, added to all the others I already have, is to help people understand and accept the amazing value that play brings to our work and to academic success.

This isn’t about fighting work-life balance; I’m all for that. But I’m also all for letting your “play” life bleed into your work life, and not deliberately holding back the most productive and creative parts of yourself for only one or the other. In a way, this is like the old blogging argument: a good blog, according to some, has one topic and sticks to it. This is “work”. Then there’s the rest of us, who blog about whatever’s going on and catches our interest, and thus let it all blend together in a big creative pile. My current case in point: I “played” in Second Life for many hours to build Cancerland, which is ultimately an expression of something so personal I was assigned an agent at work to help me manage the communication of it. But now it’s very much linked to my work life as well: as an experience, as an idea, and as an example, and because it turned my brain around to the idea that you can create experiential learning spaces that express information in amazingly effective ways. That’s valuable, in spite of the false distinctions of work and play.

Ubiquitous Computing
One of the conference’s keynotes involved a fascinating look at what a fully integrated city would look like, where the internet is a part of everything. I like this idea, and I need to spend more time considering it. Unshackle the world from computers themselves, but hook it into the internet in a million new ways. Walk through the world and stay online at the same time; overlay a Google map on the world as you see it with your own eyes. One of the ideas that tweaked my imagination was the idea of using the city as your interface. I’m kind of already down that road in my thoughts about replica builds in Second Life and how the replica element allows us to build layers of meaning into the interactivity and affordances; the idea of your city as interface takes that idea and turns it around. Being out in the world and playing an online game with your city. (Probably not Grand Theft Auto.) My first thought was this: how do we turn the library into a location for a ubiquitous computer game? How do we take students offline but keep them online? It’s an expensive proposition (maybe), but it’s something I’d like to keep thinking about. There are lo-tech ways to do it, and I want to try them.

In/Formal Learning
I realized during this conference that my true interest in education goes beyond just technology. My interest, at its heart, is in examining the many (many) means and methods of informal learning, and bringing them to formal learning. When people make the distinction between formal and informal learning there’s a big part of me that wants to shout: “Why are you making those two things so distinct?” The passion that’s so often present in spades in informal learning is what I want to see more often in formal settings.

Interesting things about Twitter

I’m heading out to Internet Research 9, the annual conference of the Association of Internet Researchers, in Copenhagen today. My flight leaves tonight, so I’m still in my pjs, going over my packing, counting pairs of underpants, looking for socks, and filling up my toiletry bag. Since my list of friends on Twitter is largely people like me, interested in the internet from a professional as well as personal perspective, many of them are also attending the conference. With each update, I see more and more of them heading to the airport, reporting on line ups and airport staff, giving their final hurrahs as the plane door shuts. Seeing people already on their way makes me question whether I got the time right for my flight, and I’ve already checked my ticket twice.

While many people can’t work out what the point of Twitter is, and I might not be the right person to explain it to them, I can’t really think of another medium that gives you that kind of glimpse into other people’s lives as pieces of a puzzle that occasionally all fit together. For me, right now, it’s a look at the zeitgeist, a sense of the shift from one place to the next that we, as a group of professionals, are in the process of making.

Jeremy is already there; he left yesterday and is stumbling around Copenhagen right now trying to stay awake and enjoy the sunshine. I wish we were traveling together; I don’t much like overnight flights and I’m anxious about getting there and going through all the minutiae to get myself to the hotel while feeling groggy, exhausted and uncomfortable. I find it strangely comforting to see so many other people, just like me, so unlike me, going through the exact same process.

How do you quantify the feeling of comfort?

The Future of Questions

I was asked recently to fill out a survey about the situation, goals, and ideas of “future library leaders”. One of the very first questions the survey asked was a true or false type thing; there was a statement and I was asked my opinion about it. The statement said something like this: “In the future, 100% of questions will be directed first at Google.” It was worded better than that, though.

I disagreed. I explained why, but now that I’ve answered this question, I want to elaborate on my answer, and why I’m positive that I’m right.

I don’t mean to imply that Google will become less important. If anything, it will probably become more important. It works. But I don’t think all questions will start there. I think we’re missing something really key.

While everyone loves Google and uses it, most people would prefer to ask their questions of real people, in digital form. In every online community of which I’m a part, there is this constant problem; new users “abusing” the group by picking its collective brain. On Feminist, the erudite community on livejournal, there were so many questions looking for help writing women’s studies papers that schoolwork-related questions were actually banned from the community. Similarly, on Academics Anon, another livejournal community, many, many questions are posted that are answered thus: “Google is your friend.” There is a near-constant conversation going on about how people don’t read and can’t they just google that citation question, and why does everyone expect us to answer all these silly questions that we’ve answered already 15000 times? The crankiness about it is one thing (and I understand it, in spite of being a librarian). The fact that anyone would rather face that kind of hostility and ask their question of a community of jaded academics (the basic premise of the community) rather than simply type the keywords of their question into google (how to cite a website, etc.) is telling.

In the last two days, as I’ve been preparing for Burning Life, the same thing has been happening again. In order to get into the land set aside for Burning Life, you have to join a group. The chat related to that group is almost 99% basic questions that are all answered on Burning Life’s webpages, and the natives are getting very restless. Those webpages are actually very clear and well constructed, but when redirected to them, the question-askers get mightily upset, as if being asked to read a webpage is some kind of insult. I find this fascinating. They don’t want to read the webpage, even though they are told repeatedly that the answer to their question is there. They want to be told. They want their hands held. They want the personal touch. All digitally, of course.

So why is it that reference as a service is dying while this desire for personal communication is so prominent in online communities?

I think the key to it is trust. And it’s not that these new Burning Life folks trust the rest of us in the group as individuals. They trust that we went through this process already and know how to do it. They trust that we have expertise, and an unwillingness to share it with them offends them. The same is true in the feminist and academics community; they don’t come to us because they like us as people, or find us approachable. They come to us because they trust that we know what we’re talking about.

What makes this all the more confusing is that there’s that constant refrain out there about how you never know who you’re dealing with on the internet, but no one takes that too seriously in these cases. They don’t care if you’re really a dog. They only care that you know something about this very specific category of knowledge, and your participation in this forum provides that degree of trustworthiness.

How can libraries get themselves into that kind of category? I’m not sure. But I think clearly defining and expressing our particular expertise is part of it. The rest is an open question.

Best Underused Technologies in Education

I’ve been searching and watching and experimenting, and in the last few months I’ve come to realize that a handful of technologies, some very well known, some less so, have a lot to offer teaching and learning but are less widely used for those purposes than they perhaps ought to be. Here’s my rationale:

Wikis
Some of the real leaders in instructional technology have been using wikis with students for some time, but they’re not as widely used as they should be, not by a long shot. Wikis can be extraordinarily challenging for both the instructor and the students, or they can fit rather seamlessly into a traditional classroom. This flexibility makes them almost universally useful. Students can use wikis to keep collaborative notes in small groups or with the entire class, collectively annotate a poem or other text, create a collective bibliography, collaborate on assignments, write documents as a group, create a document (an encyclopedia, a book of chapters, a picture book, anything) that ends up supporting, say, a classroom in an underprivileged area, or collectively rewrite the syllabus of the class; the options are almost limitless. Wikis, in short, are cool. They take very little training to use, they revolve around the traditional skills and activities of writing and citing, and there are few classes that can’t benefit from their use. In time I believe they will be a standard tool inside courseware, simply because they are so incredibly flexible.

Flickr
Pretty much anyone with a camera and an internet connection uses flickr already, so flickr alone isn’t the revelation. The reason I include it here is primarily because I’ve pretty much given up trying to discourage people from using powerpoint. Most people in my circle of influence are married to powerpoint and won’t give it up regardless. So I’ve decided instead to encourage people to be a bit more conceptual and interesting with their powerpoint. Why not use Creative Commons licensed images from flickr to make that powerpoint more punchy? Under advanced search, flickr allows you to search only for content that you can borrow and reuse. Why not take advantage of a free source of amazing-looking images? My favourite powerpoint presentations are the ones that use no text at all, but represent points and ideas through Creative Commons images only. It fulfills the instructor’s desire to have a prompt for the next point, and it at least gives the class an opportunity to try and guess the point based on the picture alone. Hey, at least it’s something beyond endless bullet points on slides, right? We are turning into a society driven by user-generated content, but so far we have focused more on creating it than on using it. There’s a time for both!
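If you’d rather script that license filter than click through the advanced search page, flickr exposes the same thing through its public API. Here’s a minimal sketch, assuming flickr’s REST endpoint and its flickr.photos.search method with the license parameter; the API key is a placeholder you’d get from flickr yourself, and the license IDs shown are the ones commonly listed for the Creative Commons licenses (worth double-checking against flickr’s own license list before relying on them):

```python
from urllib.parse import urlencode

FLICKR_REST = "https://api.flickr.com/services/rest/"

def cc_search_url(api_key, query, licenses="4,5,6"):
    """Build a flickr.photos.search request URL limited to reusable images.

    licenses is a comma-separated list of flickr license IDs
    (commonly 4 = CC BY, 5 = CC BY-SA, 6 = CC BY-ND -- verify
    against flickr's published license list).
    """
    params = {
        "method": "flickr.photos.search",
        "api_key": api_key,       # placeholder: register for a real key
        "text": query,
        "license": licenses,
        "format": "json",
        "nojsoncallback": 1,
    }
    return FLICKR_REST + "?" + urlencode(params)

# Fetching the resulting URL returns JSON you can mine for slide images.
print(cc_search_url("YOUR_KEY_HERE", "library"))
```

The point isn’t the code itself so much as the idea: the “only show me what I can reuse” filter is a first-class parameter, not a buried option.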

Odeo and Seesmic
I’m linking these two particular technologies, but really it could be any of the significant number of online audio and video recorders. Odeo and Seesmic are, I think, only the simplest of them. (Though the video recorder in facebook is stellar.) Here’s why I put them here: we waste a lot of time in education being talking heads. Often, people don’t even want to be interrupted; they want to read their paper, or process their way through their detailed notes of the lecture, and take questions afterwards. Why are we wasting valuable in-class time for this? Why not read the paper at home into a microphone and/or webcam a couple of weeks beforehand and post the audio/video stream as a reading? Then you can use the time you have in class to actually build upon that lecture, build on the ideas and communicate with the students. I know lots of faculty feel students won’t watch it or listen to it or pay attention to it, but I think our fear around that helps make it happen. Listening to or watching the lecture is required; put them on the spot when you’re face to face. Tell them they need to come up with 3 questions and 3 comments based on the lecture and the readings, and post them before class starts. Expect them to do it. Students might be just lazy, but I think in fact we train them that they don’t need to do the required reading, because most of the time they sit in a lecture hall bored out of their skins and they don’t see the point of all that preparation. They’ll do the reading when it will matter, i.e., before an exam. In graduate seminars you are expected to talk, and everyone feels the pressure to get the reading done and have something to say. Put the same pressure on undergrads! I see audio as a way to off-load the easiest use of a 2-hour lecture slot and do something that actually requires everyone’s presence and attention during that time. Life is short. Every face-to-face minute should have value.
The hardest part is figuring out what to do with 2 hours when everyone’s already heard your excellent lecture. What a great problem to have, I’d say.

Second Life
As much as some technologies are almost universally useful, Second Life is not. I know there are cohorts of educators who believe all courses should be at least partially in Second Life, but I don’t understand their reasons. Second Life is an amazing tool, but only where its particular kind of tool is needed. I think most people are excited by the togetherness factor; unlike message boards or email, when you’re all logged into Second Life, you’re all in there together. You can see each other moving around, and lately, you can hear each other’s voices. That’s very cool for distance ed courses that require face-to-face time. It’s also pretty cool for language learning. However, I’d say for the most part that Second Life’s greatest use is in building. I believe that the tools inside Second Life are excellent for in-depth research projects, where students work either alone or in groups, as an alternative to assignments where it is too easy to plagiarize or buy a paper rather than learn anything. If you’re ready to throw the traditional essay assignment out the window, a Second Life building project (say, a particular historical moment, a biography, an idea or concept like postmodernism or the nature of the hijab) might be just the thing. Students need to do a lot of research to get the details right and build it, and then they can make a movie out of it that fits into another class, or on a website, or as part of a larger project. It’s interesting, it’s different, it’s engaging and unique, and it’s a lot of fun. With the right concept and the right support, I think this could be one of the most rewarding projects for instructors and students alike.
This is brand new. When people first look at it, they don’t get why I think it has any relationship to education. I did a bad thing in that I grabbed an article from First Monday to try and explain it: check it out here. Look for the little bar at the bottom with the button “start chatting”. Get it? Basically it puts cursors and chat over top of a document, anchored to the document. So if I start typing while I’m reading the introduction, anyone else reading the introduction will see it. If I start typing while I’m in paragraph 5, others at that point will see it. If I get confused partway through, I can hover around the confusing part with others who are also confused. Essentially, we can now book a time and read collaboratively with students. Students can meet together and go through a document. You don’t have to wait until you get to the end anymore. I think this is way more valuable than you’d think, because one of the first skills students need to pick up when they start university is a new kind of reading. Reading an academic article is not like reading a book; it’s more like sitting in someone’s office and hearing a personal lecture. You need to learn to respond as you hear it; you need to become part of a conversation with the article. If we encourage that early on, we end up with more vocal students. Another advantage: with a tricky article, the class can read together and the TA or instructor can scroll through and see where the clusters of cursors are to see where students get stuck. And work it through right there!

Those are my current top 5. More next week or so, I’m sure.

Tell me a Story

A message I sent to the Second Life in Education Mailing list today:

I was just listening to the latest Radio Lab episode, which summed up a great deal of what I’d argue Second Life has to offer academic communication: the tools to create interactive, powerful, immersive and engaging narrative out of scholarly ideas and works. In this podcast, Robert Krulwich talks about the long conflict between “popular” means of communication and the sciences, and how that stand-off between them has resulted in the dramatic gulf between the ivory tower and everyone else. He links it directly to the power of the anti-evolution front springing forth from the US and spreading out over the world, because the anti-evolution front has an excellent *story* to tell, while science has agreed that story is not useful, is “play”, and science must be “work” and “fact” rather than metaphor and play.

At the same time I’m currently reading Julian Dibbell’s excellent book Play Money, which underscores the odd divide western culture places between work and play, even when it becomes startlingly clear that productive work and play are by no means separate entities.

So this podcast brings together these ideas; metaphor, story, and “play” have a valid place even in academic/scientific communication. Play and metaphor don’t cheapen or simplify ideas; they merely make their implications strike us at deeper levels than mere facts do. They are the driving mechanism for facts, perhaps. The means to deliver information.

And really, since language is just a derivative of song, how is metaphor any less frivolous a means than singing?

Second Life, and any other constructivist worlds that have appeared before or will appear in future, provide the tools to communicate concepts and ideas in a different, more immersive way. In a way more like play, more like story, with a strong metaphor. I think this is crucially valuable.

The Speed of Adoption

Last night I was waiting for Stargate SG1 to come on and watched an entertainment news program (or parts of it). On it, they described the internet phenomenon that is the YouTube Divorce, wherein some famous person’s wife recorded a video of complaints about her soon-to-be ex-husband and uploaded it to youtube. The entertainment news host indicated that everyone had seen it. I had not seen it, and had not ever heard of either of the two players involved. What struck me about it was this: there was a time when I heard about internet memes on the internet exclusively, and now I learn about them on tv.


I didn’t catch this broadcast (October 8th, 1993), but didn’t hear much else about “internet” in popular media until much later. I remember the first time I heard someone talk about the internet on tv on a local news program. It wasn’t cable, just basic free tv from a pair of bunny ears, and the year was 1999. It was a mention of some business’s web site, which you could visit for more information, with a url given. I had been spending lots of quality time on the internets in the 90s, and hearing the web talked about so openly on tv made me think: well, that was a fun ride. Now everyone and their dog is about to appear. This is it: it’s over. It’s not ours anymore.

The lag time between hearing about something online, seeing it/using it/adopting it, and then hearing about it in the mainstream media seems to be getting shorter and shorter. I used to be able to count on hearing about something via friends or Metafilter or some other random web browsing months if not years before mainstream media would “discover” it. Watching blogging come into the mainstream has been fascinating; I thought it hit the mainstream in 2001 when I finally decided to get on board, but it seems that every year thereafter blogging was a new revelation and just got bigger and bigger. Now, no one even blinks in the mainstream media when they reference someone’s blog. They all have blogs of their own, and no one needs to define the term anymore. Everyone’s heard of wikipedia. The basic level of internet literacy is going up.

So that’s why yesterday seemed to be such a turning point for me; the internet now has a few killer apps that have finally taken such hold of mainstream North America that I am hearing about internet memes on tv before they reach me on the internet. In fact, perhaps there’s only one killer app that brings the rest of the culture online: YouTube. As with any growing community, there are no more universal “big internet memes”; what’s big in your world isn’t so big in mine, and I may never hear about your internet celebrities until something strange or radical puts them in the media spotlight. The internet has long been fractured into interest groups, and as the user group grows there will be fewer and fewer “all your base” moments, things that everyone on the internet hears about at one time or another. In my internet universe, the youtube divorce didn’t merit a mention; it was clearly huge for the mainstream entertainment media’s audience.

Next up: Entertainment Tonight finds cool social software/web 2.0 apps and reports on them before I hear about them from my networks. Scary!

CBC Search Engine

I heard a great episode of Search Engine on CBC radio this morning; it starts with the idea of an internet bill of rights (to be continued), then goes into the details of an outstanding story about how an internet community caught a car thief (using bulletin boards, photo sharing, youtube, and even ebay). While people often see the virtual world and the real world as starkly separate (one being for losers and one being legitimate), this story shows the intersections of the two, both in healthy, positive ways and in quite disturbing ways. The show then goes on to an equally fascinating story about the main editor and protector of Hillary Clinton’s wikipedia page. Wikipedia is often demonized among librarians, educators, and the general public, and this story, the story of one editor with one particular interest, is timely and interesting. He explains how he does and doesn’t control the fate of what’s on that page, how people constantly try to vandalize it, how he and his fellow committed editors keep the page accurate, and how their edits are vetted by others. Anyone can edit wikipedia pages, but it also follows that anyone can keep them fair and accurate, too. The show ends with a short bit about net neutrality by Cory Doctorow.

The CBC has gone all interwebs lately!

You can hear this episode of Search Engine here: Click to listen.
To subscribe to the podcast using iTunes click here.



I heard a bit on the radio about the internet and microcelebrity, but I only caught the tail end of it. I found an article about the idea here, written by Clive Thompson of Wired, and found that it really resonated with me as a tip-of-the-iceberg kind of idea. I wish this concept were more widespread in online discussion, and its implications more carefully considered. Even for those who know about it, few really take it seriously. I mean, Tay Zonday doesn’t really need serious deconstruction, does he? We watch him, we talk about him. So what?

I’m disturbed by our tendency to create and worship at the altar of alternative authority figures in online communities, and then to scoff about the whole thing because it doesn’t matter.

This is primarily why I hesitate over studies like Walt’s, which seek to quantify popularity in the world of librarian blogs; I fear the creation of a hierarchy within this online community. Creating a list of popular bloggers creates a more visible, more defined, and more authoritative list of our community’s microcelebrities, encouraging others to vie for the top spot and pay closer attention to these community leaders. In reality this happens anyway, regardless of whether you quantify it, so I suppose I shouldn’t be so skittish about lists. But I feel like we don’t consider the implications of this microcelebrity enough, that we don’t stop to deconstruct the process enough and see what kinds of behaviours we unthinkingly adopt in its presence.

I’m interested in what it means to be a microcelebrity in any community, because I’ve seen it turn destructive and counterproductive so many times online. Why does this happen? Most people start doing what they do, putting themselves online, for a set of self-defined and often unique purposes: they enjoy writing out loud, they enjoy participating in a community of like-minded people with similar interests, they enjoy the challenge of alternative perspectives, they want a place to react and respond to the things that go on in their daily lives. They like to record their own growth and be urged on in that growth by people they do and don’t know. They want to get some feedback on something they’re doing, get some reaction and attention, perhaps. They want to create an online presence. Most people (I imagine) don’t enter into an online community with the goal of becoming one of that community’s celebrities; most people don’t realize that all online communities have their own homegrown celebrities. We don’t conceive of celebrity that way, and we don’t, as a rule, know the internet and its communities well enough to know that this is what happens. But I have never seen an online community that didn’t have them. It’s rarely a positive experience for anyone, even though “it’s not real” and “it doesn’t matter” and “who is it really hurting”. It hurts us. It reflects the way we build our communities, and being conscious of it will hopefully create a richer, more diverse environment.

What does it mean to be a microcelebrity, known in other circles as a BNF (Big Name Fan)? It means that everything the microcelebrity writes about or focuses on gains more attention than it would otherwise; microcelebrities set the topics for discussion within the community, because everyone is reading what they say and wants in on the conversation. If the microcelebrity develops an interest in something relatively ignored to that point, that interest becomes a new fad. The microcelebrity coins terms that have currency in the community. The ideas, rough drafts, or work of the microcelebrity gets lots of feedback and response in the form of comments, forum posts, tweets, or blog posts; the work of the microcelebrity is more often cited and built upon than that of others. The ideas or work of microcelebrities become goalposts of the community, and everyone else is often compared against them. It’s a powerful position, but that power is often invisible to the microcelebrity, who is usually just trying to do what everyone else is doing without recognizing the influence they’re having on the community at large. This definition of celebrity is so absurd to people that the power that comes with it is difficult for them to comprehend. It often feels like microcelebrities “run” the community, when in reality they do not and cannot. Their interests and activities just consistently receive more attention than those of others in the community.

It all sounds pretty positive, but there are downsides, and I think those downsides are dangerous for a healthy online community. Being constantly under the microscope of one’s own community of peers means that the microcelebrity is required to be increasingly careful about what kinds of ideas they espouse, lest they inadvertently quash someone else’s project or cause drama. Clive Thompson writes: “Some pundits fret that microcelebrity will soon force everyone to write blog posts and even talk in the bland, focus-grouped cadences of Hillary Clinton (minus the cackle).” He doesn’t believe this is likely, but I’ve never been involved in a community where I haven’t seen it happen. As soon as everyone is staring at you all the time, and the slightest negative opinion sends some part of your community into a tailspin and fills your inbox with hate mail, things do get pretty bland. We talk about celebrities (micro or otherwise) as if they are not flesh and blood people; we can talk about them negatively without imagining that they would ever find and read our words about them. We curtail the people we read the most, in the end. The microcelebrity’s views and interests become more mainstream because mainstream is what we want from them; we want them to pet our egos, support our projects, and not stomp on any emerging subcultures or fledgling ideas, and we want to be able to eviscerate them for everything they say and do, as well. Why do we do this to each other? Why is this necessary? (Ask Jessamyn if she gets any hatemail. I bet she does. Do you?)

People approach microcelebrities to pimp their project or their posts, because the approval of a microcelebrity has such great weight; people post comments on these people’s posts just to get their names out there and visible within the community. People put microcelebrities in their feedreaders just to keep track of what they’re paying attention to, either to repost and respond to it, or possibly just to mock it. People get scornful of microcelebrities and everything they say and do, just because there is always a group of people who want to define themselves against what’s popular and shaping public discussion. Microcelebrities will always be judged as not as smart, interesting, or up-to-date as whoever is trying to build themselves up in their shadows. (“Why does she get all that attention? She doesn’t deserve it.”) They become heroes and anti-heroes at the same time. It’s junior high all over again, and what disturbs me the most is that we don’t ruminate on the nature of our interaction with microcelebrity at all. We don’t get metacritical about the way we build people up and use them as community signposts. We don’t question the way we adopt authority even when such authority is entirely fictional. We naturally shape our online communities that way and then chafe under them without investigating what underpins the construction of a community.

Being careful about what you post online is no great tragedy, but deliberately creating a hierarchy as a collective, where a small subset of the community is expected to control topics and opinions, set trends, and give blessing to emerging subcultures, is self-limiting on all sides.

And this is why I object to creating “top 10 lists” of librarian bloggers; I know what ends up happening. People troll these lists for the ones to watch rather than exclusively following the people they would naturally gravitate toward or find interesting. We create a canon. Without the top 10 list, at least the people getting attention at any one time would shift and change a bit more; as soon as we publicly acknowledge those who get most of our attention, we’re starting to build up those hierarchies and cement them.

Microcelebrity is a real thing, and it can have a negative impact on an online community. I’d love to see a community structured to allow everyone to get the feedback and attention they want without any small subset becoming the de facto class presidents. Maybe we’re just not wired that way.

Edit: Seems I’m not the only one feeling uncomfortable with blogs and their communities today.

What Facebook Hath Wrought

So, like many librarians, I have a facebook profile. It doesn’t tell you all that much about me on the public face, and all that it does tell you is quite deliberately public information. However, we know that there are always issues with social networking sites because of third parties swooping in and abusing that database for alternative purposes. Check this out: a site I’d never heard of scooped my information and created a webpage for me. It snagged my facebook profile picture to pretty it up, too. They’re not just hotlinking it from facebook, which would at least only be the sin of stealing bandwidth; they actually copied the image and resaved it (creating a new instance).

Another example of why it’s important to be extraordinarily careful what information you add to social networking sites, and how public you make it.

Henry Jenkins and Fan Culture

The keynote at AoIR‘s Internet Research conference yesterday was given by Henry Jenkins. I’ve read Henry Jenkins, I’m very familiar with his work and his impact, but I’d never seen him speak before. I suppose it’s true of any outsider/insider community politics that it feels incredibly weird and empowering to hear an academic speaking, in a public and respectable forum, about something so secretive and taboo as online slash fandoms. His primary gift to fandom, I’d say, is that he legitimizes fandom by talking about fandom activities in a non-judgemental way. He doesn’t shroud it in shame. He talks about it like it’s a cool thing to do. That’s a wonderful, wonderful thing to do for fandom.

What he said: fandom is good PR. Media owners give mixed signals. Fandom as collective intelligence, etc. It felt good to hear it in that forum. Definitely.

But when I walked away something else hit me. Sour grapes, probably, or a natural response to the insider/outsider online ethnography: Henry Jenkins didn’t tell us anything fandom didn’t already know. He didn’t say anything that fandom hasn’t already analysed ad nauseam. I’m glad he gets it, I’m glad he understands how things work, someone has to be the one to tell the rest of the world how fandom conceives of itself and gets meta on itself, but I didn’t hear him say anything that wasn’t just direct reportage of what’s happening in fandom. I didn’t hear him express something that wasn’t just as well expressed, if not better, within fandom itself. And that made me a bit sad. He objects to media companies taking advantage of fans, but how is this really all that different?

I’m not sure what I really expect from him. But that was the sad feeling I had walking away from that keynote. It’s great that he gets it, it’s great that he talks about it in such positive terms, it’s great that he doesn’t shy away from talking about slash (and in fact, he seems way more interested in slash fandom than gen or het fandom, which is interesting). I wonder if all ethnographized people feel this way; why do you get to be famous for just looking at me and describing what I say to someone else? I suppose that’s the way it works.

He mentioned that he would like fans to get some kickback (money) from the media companies, and someone in the audience objected, saying that their fandom research subjects weren’t in favour of that. He said there was a minority that were in favour, and that it would be a good thing. I immediately disagreed, quietly; getting money for fandom activities is anathema, my gut said no, and my immediate reaction at the time was to say, “since you don’t own the characters to start with, getting money for it is considered inappropriate and dangerous, and would bring negative attention from the media companies that would have a terrible impact on the rest of fandom.” It’s also considered insanely wanky to ask for money for fannish activities. But that only makes so much sense as a criticism; what if it’s the media company handing out the money, and no bad press is involved?

In retrospect, there’s another reason why taking money is problematic, and I think, deep down, it’s this that fandom is reacting to when the money question rears its head. Fandom is not a money economy, but it is an economy nevertheless. It’s a complex gift economy, where creative production, feedback, and critical reflection are the products and name recognition, attention and feedback are the currency.

Fandoms are extremely hierarchical, in spite of some people’s attempts to deny that hierarchy and others’ attempts to subvert it; it’s a hierarchy based on subtle differences in reception, feedback, and attention. A person with high value in a fandom economy (a Big Name Fan, or BNF) writes a blog post about how she doesn’t like a particular kind of narrative, or particular characters, and her opinion echoes out throughout fandom, marginalizing some elements and making it more difficult to get positive attention and organization around those particular characters or narratives. People with high economic value in fandom dictate the nature of fandom, even without intending to. Fandom itself decides who those people are. Each individual within fandom is a part of creating that economy and providing the currency that enables that power. If the media companies start entering into the fray and give money to some fans, that utterly changes the way the fandom economy works. Now people who get money will be immediately elevated in the fandom economy, but they will probably be seen as corporate shills. They will be seen, I would guess, as fakes; the company arbitrarily blessed them, in spite of them not understanding X or Y, or because they wrote a bad X character, etc. etc. Company blessing is like a parent choosing a favourite child and making a big fuss about it; it’s unhealthy and builds resentment.

If media companies really wanted to do something nice for fans, there are a variety of creative ways to do it. First: hire them. Hire fans to work with them, as fandom liaisons and the like. Create insider/outsiders who can still at least peripherally participate in fandom while communicating to both sides at the same time. This will still cause ripples in the fandom economy, but the media company would be providing one person with information that they can share with the rest of fandom; that works within the natural fandom economy. Don’t pay them for fannish activity in the past; pay them to communicate in the future. The best thing media companies, particularly tv and film, can do is provide more raw material to fans: they could release additional footage via their websites, without audio would be fine, for fans to use in their vids (video mashups that create alternative visual narratives). It could be outtakes, but if they really wanted to do fans a favour they could stage footage specifically for use by fans, pairing unusual characters in a scene or creating short scenes that would spur on particular kinds of stories and vids. Teaser video, essentially. That would have the added bonus of fueling the creation of free ads for shows that will, inevitably, pull in more viewers. Another idea, for book fandoms, is to arrange to publish an edited collection of stories written in a particular year; have fans submit work, be transparent about the criteria, and celebrate the writers with the publication. Allow them to maintain their anonymity should they wish to.

Money is not always the best way to reward fandom. Best to look at the internal structure of fandoms and reward them in ways that don’t destroy the economy and culture that already exists.

Ritual and Virtual Worlds

Ritual and Virtual Worlds

I went to an interesting set of presentations this morning about ritual as performed in virtual worlds. The first thing that stuck out for me is that everyone has a different working definition of the word ‘ritual’. For some, everything is a ritual, everything we do, from sitting in a room listening to an instructor or presenter to accepting the eucharist. Because of that wide definition, all kinds of things got crammed into a session about “ritual” and I’m not sure I’m completely in favour of that. For instance, one of the “rituals” presented was online gaming activities, like trying to kill a dragon in online Dungeons and Dragons. Gathering together and attempting to complete a task communally is ritualistic (hiding behind a rock, everyone with their task to accomplish, the order in which people stand in the virtual world, etc.). I can understand how there are traditions and customary activities in that context, but I seriously hesitate to call them rituals. That’s like calling everything that has any impact an icon, which I’m also not delighted about.

But the key piece that I’m going to take away from the presentations and the discussion is the sense I got that in moving ritual and religious experience online, we’ve in a sense brought it back to an earlier form. Religious authorities are not the only ones with rituals to preside over and religious knowledge to impart, and much as christian leaders needed to fight with local ritual and knowledge to be heard in a medieval and early modern European context, so modern religious leaders need to cope with the influx of religious information and authority that’s available online. And one of the points of discussion was this: can you have a legitimate ritual without a body? Apparently there is some debate around this. Can you? My goodness, how can you not? If Julian of Norwich or Teresa of Avila were given the opportunity to worship God in a ritual that did not include their physical bodies in any way, they would have jumped at it, I’m sure. Christianity has traditionally had such disdain for the physical body, I don’t understand how anyone could suggest that there’s even a question about whether the virtual ritual is possible. The virtual ritual has been the most desirable kind since medieval christians climbed up on pillars and stood on one foot for 10 years. Remove the impact of the body, remove it from your consciousness, and then you are free to approach the divine with your lustful, sinful flesh tamed and left behind. In many ways, Second Life ritual could be seen as the consummate religious experience.

Can the same be said for jewish ritual, however? Jewish tradition isn’t nearly as flesh-hating, and jewish ritual respects the physicality of performing the ritual itself, often above the intellectual understanding of it. Perhaps jewish ritual cannot move into a virtual context, but I’d suggest that christian ritual absolutely can.

An interesting morning! I never thought my Master of Theological Studies degree would come in handy at an internet researchers’ conference, fancy that!

Lessons in How to Live: Randy Pausch’s “Last Lecture”

It’s an hour and 44 minutes of the distilled wisdom of Randy Pausch: make time for it today. Seriously. Check out the teaser first if you’re not ready yet to commit the time; that will ensure that you want to. I can’t think of anything better to watch on Thanksgiving. In his own words:

Almost all of us have childhood dreams: for example, being an astronaut, or making movies or video games for a living. Sadly, most people don’t achieve theirs, and I think that’s a shame. I had several specific childhood dreams, and I’ve actually achieved most of them. More importantly, I have found ways, in particular the creation (with Don Marinelli) of CMU’s Entertainment Technology Center, of helping many young people actually *achieve* their childhood dreams. This talk will discuss how I achieved my childhood dreams (being in zero gravity, designing theme park rides for Disney, and a few others), and will contain realistic advice on how *you* can live your life so that you can make your childhood dreams come true, too.

Thanks to Stephanie Booth for the link.

Quechup Quandary: Today, we’re all Spammers

I like to think that the blogosphere in general has a certain amount of power. One blog out of millions may not, but when something happens that runs counter to expectations and a good chunk of bloggers complain publicly, the results seem to be pretty dramatic.

Case in point: social networking site Quechup created a splash in the last few days by asking users to enter their email addresses and passwords so that the system could check to see if any of their friends were already members. (Personally, I don’t see why anyone would do this in the first place. While I guess there’s some email from my closest friends and family in my inbox somewhere, I don’t really use email to communicate with the people I see on, say, Facebook, or in Second Life or on IRC. Email is too formal for that, and I’d rather search for my friends some other way. I’d rather look up one and see who they have friended already, etc. I found a lot of people I know on Facebook through the groups. Anyway.) So the system does what one would expect; it looks up the email addresses in your contacts lists and checks to see if those people have accounts yet. Then it shows you the ones it found. Do you want to add these people as friends? Sure! Who wouldn’t push that button? Why not. Add them. What Quechup does next: it emails out an invitation, from you, to everyone you’ve ever corresponded with, personally inviting them to join this rockin’ new site you found. Without warning you that it was going to do it.

This is really not what email is for, and it’s a real abuse to use it that way.

In what universe is this okay? In what universe, seriously, is it okay for a system to prompt you to send out mass email to people who have not signed up for it? It’s one thing to send a message to people in a facebook group; they’re there on purpose. People in your contacts in email? Didn’t sign up for squat.

It’s a trust issue, certainly. You see an invite from someone you know (heck, there are a handful of people whose mere name in an email would get me to click a button somewhere, sure!), follow through, and suddenly…you’re in the same boat! You’ve just spammed every living soul you know! So Quechup was certainly taking advantage of that trust, but in return it is eroding people’s trust not only in social networking systems, but also in us. (Will people think twice when you ask them to have a look at something? Sure they will.)

So now if you run a google search on “quechup” (like this one) you see a zillion posts by angry bloggers who are incredibly sorry to those they accidentally emailed, and incredibly angry at Quechup. I’d love to see what happens. Will that mistake, and the widespread reaction to it, destroy Quechup? Or is any publicity good publicity, and will this be the making of them?

Moral of the story: don’t give anyone or any website your email password. Ever!

Wikipedia as Community Service

If I were “You”: How Academics Can Stop Worrying and Learn to Love “the Encyclopedia that Anyone Can Edit”. I’ve spouted off about this a million times before, and I’m glad to see someone else finally saying it too:

This recognition of the extent to which the Wikipedia has engaged the imagination of the general public and turned it to the amateur practice of scholarship suggests what I think may prove to be the best way of incorporating it into the lives of professional academics: since the Wikipedia appears unable to serve as a route to professional advancement for intrinsic reasons, perhaps we should begin to see contributions to it by professional scholars as a different type of activity altogether—as a form of community service to be performed by academics in much the same way lawyers are often expected to give back to the public through their pro bono work. A glance at almost any discussion page on the Wikipedia will show that the Wikipedians themselves are aware of the dangers posed to the enterprise by the inclusion of fringe theories, poor research, and contributions by people with insufficient disciplinary expertise. As certified experts who work daily with the secondary and primary research required to construct good Wikipedia entries, we are in a position to contribute to the construction of individual articles in a uniquely positive way by taking the time to help clean up and provide balance to entries in our professional areas of interest. In doing so, we can both materially improve the quality of the Wikipedia and demonstrate the importance of professional scholars to a public whose hobby touches very closely on the work we are paid to do—and whose taxes, by and large, support us.

I’d like to insert a little more concern about access to information by the general public, and perhaps add just a glimmer of the serials crisis into this article, but I guess that’s for librarians to do, not academics. It will never cease to amaze me that academics don’t seem to realize that they routinely give away their intellectual labour to support a third-party distribution system that takes money away from the universities, making academics the number one threat to library budgets and the number one reason why those with access to the internet but no access to university libraries can’t get hold of scholarly works. But hey. It’s Monday morning, and this article is a start.

Via Jason.

Remembering Suzanne

One of the morbid conversations we have online, those of us who have enough of a life that’s lived digitally to wonder these things: what would happen if we died suddenly? Who would make sure that our friends online find out about it? Would they imagine that we just stopped logging on, or blocked them on AIM, or just got too busy to check in? Would they ever know that we didn’t mean to just disappear?

Well, tonight I found out how it goes, or got a taste of it, at least. A wonderful woman I never met in person but felt close to all the same died last week, and I found out about it today through a strange collusion of real life and digital networks. The world really isn’t that big after all; even if you don’t set it up that way on purpose, it seems that news can still travel pretty fast.

Suzanne Klerks was one of the warmest, most intelligent and witty women I ever knew. She was a fascinating, multi-faceted academic, exactly the type I love most. I’m just devastated that someone so vibrant with so much to give the world is gone. And in the horrible unfairness of it, I’m also glad that I got the time that I had with her. She was an amazing person, I was challenged and touched by her writing over the last few years, and I’m going to miss her so much.

Godspeed, Suzanne.

(Virtual) Comfort is Key

Can I get one of these for my office? I just can’t think of a situation that wouldn’t be improved by having a great big basket full of pillows and blankets in it.

laundry basket cat nap

Or, possibly, one of these.
cat nap

I’m really happy to see these kinds of very comfortable spaces and poses in Second Life. I’m the sort of person who doesn’t like to see her avatar standing for too long, or sitting in a way that appears uncomfortable. I’m also the kind who always offers a seat to people who drop by, because, while technically we’re just as comfortable standing as sitting, standing around just makes it feel like a less committed conversation. Someone somewhere is probably writing a master’s dissertation on the qualitative and quantitative differences in conversations between avatars who are standing versus sitting.

Now, here’s something you’re unlikely to ever be able to do in real life, unless you have a much wackier real life than mine:
bathing in milk
Bathing in a bowl of milk. My avatar’s skin is flawless, I tell you.

Diving into the Metaverse

For the last week or so I’ve been spending some quality time getting to know Second Life. It seems to be all the rage right now in librarianship; in fact, what pushed me in that direction just now was a combination of collegial enthusiasm (from Jason) and a variety of presentations and teasers from libraries all over the place that made me feel like, gosh, I’ve been missing something crucial while I’m running around implementing a learning management system; I should look up and see what’s going on.

Full disclosure: immersive virtual environments are how I came by my honest love of things interactive, so this has been something I’ve been meaning to do for a while. Some of my personal interests have been relegated to the lowest tier of my attention lately, because a lot of my thought and time is dominated by what those around me want and need. Synchronous environments are my first true love, so it was just a matter of time before I dove into Second Life to see what was going on.

What I knew about Second Life before I logged in was that it was based on the Metaverse, a fictional vision of the internet found in Neal Stephenson’s 1992 book, Snow Crash. The Metaverse built on the sense of place fostered by MUDs, IRC, and BBSs, and took it a thousand steps farther; while characters in Snow Crash are logged on to the internet, they are in a shadow world, where their status is different from their real-life status, their connections are in the room with them while logged in from across the planet, and the digital streets are filled with billboards and strip joints. That book inspired a lot of people from the moment it appeared, and I’ve been seeing echoes of it in text-based environments for years, but Linden Labs took the concept extremely literally when they designed Second Life. I always felt that the Metaverse’s commerciality was a wry commentary on the inevitable pollution of capitalist encroachment into the internet, a tool which was originally the result of a scientific and academic gift economy. But Linden Labs took Stephenson’s description dead seriously and fostered a real economy inside Second Life, based very much on that original vision. While I lean toward the far left of the political spectrum and tend to turn my nose up at capitalist ventures inching into interactive online spaces, the Second Life economy appears to work pretty well, and benefits a large portion of its population.

Watching the Sunset in Second Life

I’d like to say I’m ready to pontificate about the pros and cons of Second Life, but I’m really not. I’m going to be sitting with it for a while, because there’s a lot to get to know about it, and I don’t think this kind of experience can be rushed. So I’ll keep my comments fairly general, and I must give you the caveat that my ideas are subject to change at any moment.

First, it’s very, very familiar to me. Second Life appears to be not only the digital child of Snow Crash, it’s also a sexier descendant of the MUDs of the 1980s, and, to be more specific, very much a close cousin to the MOOs of the 1990s (sorry Jeremy, but I think it’s true). To clarify: MUDs are games; they are game spaces with a select group of wizards doing all the building and crafting of the game elements, and a much larger group of users getting into character and role playing through that world. I haven’t played it myself, but from what I understand from friends, World of Warcraft is a natural descendant of MUDverses. Of course they look extremely different; MUDs are text-based, and WoW is, well, not. MOOspaces were the same kind of environment as MUDs, but they had no required game elements. You could use one to game, but the general point of a MOO is that all users can build, not just the wizards. It’s a co-constructed space, still with a set of wizards, but the rules of the space are different. It’s not necessarily all about staying in character; the point is whatever you decide the point is. MOOs (and, yes I know, IRC) were the mass chat rooms before the web-based ones appeared. The difference between IRC and MOO was that a MOO was a place rather than a channel. A MOO was a place where you could build yourself a room and put furniture inside it, though everything you did in MOO was text. So your creation was based entirely on descriptions. But still; in MOO, people would enter a room and have a seat, because otherwise they would be standing in a room with friends chatting, and their virtual legs would get tired. Second Life, it seems to me, is the pretty young sibling in the MOO/MUD/MUSH/MUSE world. Finally, someone replaced all those text descriptions with three-dimensional images.

So, Second Life is a place, or rather, a series of places, built by users. Unlike the text-based spaces I’m familiar with, Second Life is fully multi-media. And when I say fully, I really mean it; there are videos on screens in there, there’s streaming music you can listen to with your friends if you’re all sitting in the same room. There are auditoriums where masses of people can listen to a speaker speak live. There’s voice chat in beta. Not only is your 3-D self in the space, you’re experiencing it, hearing it, running into it and (ouch) whacking your head against it at times. Second Life is so fully immersive that it’s very hard (in my experience) not to get emotionally involved. I mean, you’re right there. How can you not?

The first afternoon I spent in Second Life, sitting in one room together, we got up after 4 hours and realized we hadn’t moved in all that time. We’d just sat there on the couch. But we all felt like we’d been running all over the place. Because part of us had.

Sitting in the water in Second life

I’ve been very lucky in my introduction to this space; I have some friends to help me along the way. I have a few ideas so far, but I’m going to wait and see how much I can learn. It’s too easy to come to a quick conclusion about how to manage in Second Life as a librarian, and there’s lots of evidence of that in there. It’s fairly easy to rebuild the real world in there, but is that the best thing we can do? I don’t think it is. More when I’ve digested it a bit more.

Instructional Technology: Public, private, personal, or institutional?

I’m a bit behind on my blog reading I’ll admit (it’s amazing how easy it is to take on way too much at once, isn’t it?), but I ran into a blog post this morning that threw me. It’s from George Siemens’ Connectivism blog. He says:

I’ve decided that we are taking the wrong approach to technology adoption in schools and universities. We shouldn’t own the space of learning. The students should. We shouldn’t ask them to create a new account, or learn a new tool every time they switch to a different institution or a different job. They should have their own tools…and we should “expose” our content so they can bring it into their space (pick any tool – drupal, blogger, myspace, facebook, elgg). And the conversation that ensues should be controlled (from a public internet or private ownership stance) by the learner. When the learner graduates, the content and conversations remain his/hers.

I agree with him in principle; just not in practice. Yes, students should feel some ownership over their own learning space, or at least some part of the learning space. I think we see this in the most traditional classrooms in the form of personal notebooks; the student doesn’t own the classroom, but they own their own way of making sense of what happens there, what words they note on a page, etc. I’ve always felt a particularly strong attachment to my own notes, which I was loath to lend. I would tend to write down things like whether or not I was tired, what the instructor was wearing that day, and shopping lists in the margins. Because it’s my space, I felt I should be able to write down whatever I wanted to. Some bit of ownership is, I think, critical to the process, and granting students more ownership is not, I would say, a bad thing.


I really don’t like the idea of bowing down to the habits of our students to such a degree that their platforms become our platforms. I have always resisted this. When we have discussions about things like facebook or myspace and people say, hey, that’s where the students are, that’s where we should be! my general reaction is, yeah? Well, the kids are down at the pub, maybe we should move our offices down there too, eh? Come on. There are places where students are, and they don’t want us there with them. There is a danger of becoming the telemarketers of the academic world, the spam of the institution. It’s good to be accessible, but we don’t really want to be sitting on the students’ laps on a Friday night when they’re out to see a movie, right? Give them their space. We don’t need to be in their faces all the time. So part of my objection to George’s suggestion above is that we need to let students have some communities and technologies that they use just for fun.

But my primary objection is actually grounded in the basic presumption here. The presumption I see glaring out at me from that paragraph is that students know best. I mean, when it was Father knows best or Librarian knows best we weren’t really better off either, lest it be said that I have a bias against students, but why on earth are we looking to students to work out the best platform for learning? There’s a bit of the noble savage about this. Just because today’s undergraduate students are supposedly “digital natives” doesn’t mean that they know which platform and which interactive software is best for a classroom, or best for learning (best for learning linguistics, or best for learning microbiology, because there isn’t one be-all-end-all piece of instructional technology either). It drives me batty when I see professionals with lots to offer twisting themselves into pretzels because the mode of the moment is myspace or facebook or cellphones. We can learn lessons from how people interact with social software and mobile technology, definitely, but we don’t need to migrate everything we do into the web 2.0 fad du jour. Students are not technology savants. We need a mixture of experimentation with software and research on trends and what kinds of interactions fit best into which platforms, not a wild free-for-all. Have we nothing to teach here? Don’t we have anything to offer as an institution? Do we not have a responsibility to choose our tools based on the learning outcomes we’ve developed?

Additionally, there are a whole host of problems that come along with allowing students to syndicate institutional content into, say, myspace. If we just provide the feeds, does this mean the instructor is giving up their intellectual property rights? Are instructors meant to just trust facebook’s internal privacy controls to keep their ideas to a limited group? Library content is never going to sit on livejournal, not as long as we sign off on licenses and pay our regular fee to Access Copyright. George’s suggestion above would require all faculty to distribute their work across any platform students feel like using. This is remarkably unwieldy and would be wildly unpopular among certain sectors. (Though I know many faculty who would be more than happy to have entirely public course documents, I can’t imagine they would particularly love having them distributed far and wide across the internet.)

This taps into another argument I seem to get into on a regular basis: should student work be public? Should students be required to put their coursework on the wild open internet while they’re still forming their ideas? Or should we be providing a sheltered space for them to grow and change their minds and reconsider? There are definite benefits to being wide open, but there are downsides as well. The wayback machine can be an unforgiving mistress if you’ve ever done/said/posted something you regretted years later. Whose responsibility is it to understand that, the students’, or ours?

One final problem: how do you build community if you have a class of 30, and 9 of them are syndicating course content to myspace, 12 to livejournal, and the rest to facebook, except for one student on Vox? If your teaching method consists of merely distributing course content digitally and never getting feedback or collaborating in any way, this might have no drawbacks (beyond the ones I mentioned above). But what if you’re trying to get students to respond and react to each other’s work? What if you’re trying to have students co-construct knowledge? Haven’t you just effectively split the course into 4 parts? Are students now going to have to learn four different interfaces just to connect with the whole class? How is the instructor supposed to manage that? How does this help build community? Haven’t we just isolated the students who chose a less popular system? I know George hates institutional course management systems, but I don’t think this syndication system is in anyone’s best interests. It would be easier on the students if we introduced them to a centrally-supported system and let them all learn one interface. The key thing with any course management system is to constantly update it, rethink it, build new tools for it, revise and revisit. It can’t be a static thing. It needs to grow and change based on the needs of faculty and students.

And don’t we owe it to students (and faculty) to provide them with the tools of the trade?

Google wants you to stop Googling

It’s a trademark issue, and I’m surprised it hasn’t come up until now. Google wants people to stop using the term “googling” to mean searching for something on the internet.

“We think it’s important to make the distinction between using the word ‘Google’ to describe using Google to search the Internet and using the word ‘google’ to generally describe searching the Internet. It has some serious trademark issues,” a representative for the search company said.

I understand where they’re coming from; if the word becomes too distinct from the product, they lose their ability to control their own name. And that could get annoying.

But, in all honesty: does anyone “google” something without actually using Google? And if so, what’s wrong with those people? You can’t “google” someone on Yahoo or Alta Vista. (Does Alta Vista even still exist?) That would just be wrong. I personally would never ever consider using the term “googling” to imply using just any search engine, but then, I am a librarian, aren’t I. But still. I “google” all the time. Using Google. I even sent one of my favourite people to Google as a human sacrifice. That’s what you call love. I wouldn’t betray them by “googling” through a Yahoo interface. Never! Anathema! Perish the very thought!

Digital Natives vs. Digital Immigrants

From the Times Online: The next step in brain evolution. Let me summarize: young people, who have lived with the internet all their lives, are digital natives. If you’re over 30 and didn’t grow up with text messaging, MSN, and Google, you’re a digital immigrant.

This particular bit of rhetoric really gets to me, and I’ll tell you why. It’s a broad-swath excuse, apparently designed to make those over 30 feel safer about their own current knowledge base. As long as new communication technologies are something your brain is or is not hardwired to comprehend based on your experiences as a preteen/teenager, the rest of us, who don’t understand this new-fangled email thing (or whatever it is people don’t want to understand), can relax and not feel behind the times or like we’re missing out. We’re just different, that’s all. This line of reasoning has the added effect of underscoring what we already feel to be true: each generation is a radically new product, and history is a set of processes, each built upon the last, leading to greater and greater progress. Standing on the shoulders of giants, and all that. We can happily let the kids do their internet stuff, knowing that our own smug little land of postal service and telephones is the giant they’re standing on.

Because we all stop learning at age 20, right? And there should be no more pressure to learn after that. Is that really the world we want to live in? That’s like asking us to stop reading after age 20. All the greatest books have already been written by then anyway, right?

I object strenuously to the suggestion that those 20 and under are somehow more “digitally native” than those of us who came to the internet/computers later in life. The difference is not in this early experience; the difference is in whether or not you’re prepared to let something new change your life. It’s about a willingness to learn and an openness to new ideas. The only relationship between that willingness and age is that we expect people under 20 to be open to learning. That’s not a “new generation” or strange new brain chemistry. That’s a decision we’re making about how we want to live our lives, and where (and when) we opt to limit ourselves.

From the article:

Emily Feld is a native of a new planet. While the 20-year-old university student may appear to live in London, she actually spends much of her time in another galaxy — out there, in the digital universe of websites, e-mails, text messages and mobile phone calls. The behaviour of Feld and her generation, say experts, is being shaped by digital technology as never before, taking her boldly where no generation has gone before. It may even be the next step in evolution, transforming brains and the way we think.

As long as it’s chemical, we don’t need to feel threatened by this personally. It’s not a choice (doesn’t this sound familiar?); it’s biology. Further:

“First thing every morning I wake up, check my mobile for messages, have a cup of tea and then check my e-mails,” says Feld. “I may have a look at, a website connecting university students, to see if someone has written anything on my ‘wall’. I’m connected to about 80 people on that. It’s really addictive. I’ll then browse around the internet, and if a news article on Yahoo catches my eye, I’ll read it. And I may upload my iTunes page to see if any of my subscribed podcasts have come in.

“Upload” is most definitely the wrong word to use here. I presume she means “load” the podcast category inside itunes, or perhaps “download” the latest podcasts through itunes (which doesn’t use the term “download” at all, but rather the more logical “get”), and perhaps she wants to sync her ipod so that the newly downloaded podcasts are transferred to it. But she’s not “uploading” anything.

Sure, Emily listens to podcasts, but is she a digital native? Does she speak the language, know how stuff works, can she easily move between one digital landscape and another? With that language, I’m going to have to say no. Using something doesn’t mean you understand how it works, and it doesn’t mean you can take that use to the next level and apply the knowledge gained from the use to another circumstance.

That’s what makes Emily a “digital native”, one who has never known a world without instant communication. Her mother, Christine, on the other hand, is a “digital immigrant”, still coming to terms with a culture ruled by the ring of a mobile and the zip of e-mails.

Okay, so that’s Emily, age 20. Enter Rochelle, age 31 (32 in a couple of weeks, might I add). Unlike Emily, I didn’t touch my first computer until I was 17. I stumbled on the internet when I was 20 and figured it was a toy. I’d say, according to this article, I would be a “digital immigrant”, a person who grew up without the internet, without cell phones, without text messaging and emails and IM.

First thing in the morning when I wake up, I open up my computer, which is constantly connected to the internet because I bought myself a wireless router. I check my personal email, which collects any new comments on my blog, and then I let my widgets check my gmail, which collects comments from my various other journals (at livejournal, Vox, etc.). I check my livejournal friends page, leave a few comments, engage in a few conversations about this and that. I see new pictures posted by an American friend of mine who is currently on a cruise to Alaska, I see what’s new with my friend in the Peace Corps in Jamaica. I check my RSS reader to read my friends’ blogs. I check to see if my friend with the very hot new job in San Francisco has broken any new bones lately. I take the pulse of the blogs by those around the world who share my profession. I say hello via IM to my friend in Australia, who is just settling down for the night. We complain about the weather (always the direct opposite from each other). I wave hello to my friends in the UK. I have my breakfast with my buddy Jason, who is sitting down to his own breakfast in his condo in downtown Toronto with his lovely wife. We trade links we think are interesting and complain about the ones we think are wrong-headed.

Our conversation this morning:

Jason: hypsterism
Rochelle: yeah, I think so too
Rochelle: am writing an annoyed post about it
Rochelle: we should write a joint post or something
Rochelle: because
Rochelle: how old are you now?
Jason: 44
Jason: and I started using computers at 27
Rochelle: yeah, they’re suggesting that only 20 year olds are “digital natives”
Rochelle: and I dare any of these kids to be more digital native than you and I
Rochelle: I’m posting about that
Jason: 20 yr olds are digital naives, not natives
Rochelle: EXACTLY
Jason: The digital naives have lost all sense of critical distance
Jason: and are peons to the marketed moment
Rochelle: I think this kind of thinking is an excuse
Rochelle: for people over 30 to not bother with this stuff
Rochelle: because it’s a generational thing
Jason: unable to function outside the user manual
Jason: and the marketing campaign
Rochelle: letting adults off the hook

On the way to work, I may text my friends for the entertainment value. If I’m on the train on the way into Toronto, I definitely text for the entertainment value. Since I don’t want to spend the money to browse the internet from my cell, I text a friend near a computer to google something for me if I need to know something. (For instance: this past weekend I texted Jason to ask him to find out why the trains weren’t moving out of Union station in Toronto; the bus driver wasn’t sure, but I told him it was a freight train derailment, since that’s what Jason found out through Google News.) I was on Toronto island during the final world cup game; I texted a friend in Syracuse to ask her who won (since I know she’s a fan). I got an answer in about a minute and a half.

Being a “digital native” is not about your early experiences. It’s not about an aptitude or a particular brain chemistry. It’s about being willing to explore, to be changed by technology, and to be connected in this way, beyond the physical space we inhabit. It’s a choice; no one has to do it. But we’re not limited just because we’re not 20.