
The Death of Newspapers

My old friend Michael drew my attention to an article by Michael Nielsen about changes in publishing and how paradigm shifts catch companies by surprise. In short:

Each industry has (or had) a standard organizational architecture. That organizational architecture is close to optimal, in the sense that small changes mostly make things worse, not better. Everyone in the industry uses some close variant of that architecture. Then a new technology emerges and creates the possibility for a radically different organizational architecture, using an entirely different combination of skills and relationships. The only way to get from one organizational architecture to the other is to make drastic, painful changes. The money and power that come from commitment to an existing organizational architecture actually place incumbents at a disadvantage, locking them in. It’s easier and more effective to start over, from scratch.

It’s not that they’re malevolent; they’re just stuck in an institutional structure that is too difficult to change. His first example is newspapers; the New York Times (in decline) versus TechCrunch (in the black).

That got me thinking: what would it take for me to go back to supporting a newspaper? Because, in truth, I love newspapers. I haven’t subscribed to one in about two years now, but I do love newspapers. I just don’t like getting one every day. First: they’re messy. The ink stained my carpet at the point where it met the front door, because the newspaper deliverer would drop it just so. It stained my fingers. They pile up and have to be transported somewhere and be disposed of. Their net worth isn’t sufficient for all the work I have to do to maintain their presence in my daily life. However: I love sitting outside on the patio of our favourite breakfast place with Jeremy, trading parts of the paper, skimming the stuff that is vaguely interesting, digging down on the stuff that’s very interesting, ignoring the sports section…I suppose we use the newspaper as our internet when we’re not online, or when being online would be too costly, too disruptive, or too awkward. Clearly it’s simply a matter of time before we have devices that will fill this desire handily: a roll of thin plastic, perhaps, tucked under an arm, an easy part of the breakfast scene, online for cheap no matter where we are, showing us only the articles that are at least vaguely interesting if not very interesting to us, with no sports section to ignore; our device would have the upsides of the newspaper (no computers cluttering up the table, getting between the food, the people), along with the cleanliness, customizability and immediacy of the internet. The future newspaper is a gadget.

Michael Nielsen says: “My claim is that in ten to twenty years, scientific publishers will be technology companies.” Could that be true of newspapers as well? Is the medium more valuable to us than the content? If newspapers managed to produce the device, instead of the content, or perhaps in conjunction with some content funded by the popularity of the device, could that be their future?

Beth Jefferson makes the case that librarians should carefully watch the decline of the newspaper industry, because our descent is similar and may come soon afterward. We, also, are less about our content than about the medium in which we present it. Our devices are buildings; while “the library without walls” meme has been going around for a while, the reality is that people still need space, and our spaces are popular as places to work, think, be and be seen. At the very least. When we move into things like ubiquitous interfaces, maybe our space becomes the medium, the device.

A recent report on libraries and mobile technologies suggested that we wait on developing mobile tech versions of our collections and services, a conclusion with which some disagreed. While I’m all for being cutting edge (bleeding edge, even), I agree with the report. We have no idea where this mobile thing is going. If we had gone all mobile three years ago (when we easily could have gone to town with it), the iphone would have appeared soon after anyway, with its alternate internet of apps, and changed the game on us. Mobile devices don’t tend to do the web well; rather than get better at it, we’re creating a new web for them, designed with their location-awareness, mobility, and lack of keyboards in mind. What if our big future isn’t in making our content web/mobile friendly, but in building ourselves into the e-newspaper or the e-book, letting you do “more like this” searches, hooking up bibliographies, keyword searches within (digital, mobile) text? Maybe the future of libraries is an app inside an app? What about blackberries and other smartphones? Are they going to get in on this app revolution? Are we going to have competing app universes to contend with? The data plan revolution (at least in Canada) is clearly coming, but when? And what will it bring with it? What restrictions will we be under?

The legacy of “waiting” that newspapers have demonstrated has not served them particularly well. But on the flip side, jumping in without getting the full lay of the land doesn’t have a good track record either. Maybe we’re all about to become technology companies, in some way or other.

Me in Six Panels

Next year I will be challenging library staff to use a comic strip application (bitstrips) to explain who they are and what they do in the library to a student audience. It’s still months away, but here’s my shot at it:

Memed Digital

Since the start, I’ve taken issue with the “digital immigrants/digital natives” divide. From one angle, that division puts me and everyone I share my digital life with on the digital immigrants side, in spite of our very rich online lives. From another, it suggests that the undergraduate students I spend my days assisting are somehow “wired differently” than me, and are way more adept at technology than me. This just isn’t my experience in any way. I think it denigrates the amazing work of older net citizens and puts teens in a box with which they do not identify in any way, shape or form. The generational argument just falls flat for me.

Listening to Don Tapscott’s recent Big Ideas lecture the other day gave me a new insight on the matter. Like all who advocate the idea of a digital generational shift, Tapscott was inspired by watching his kids. They’re geniuses! No wait, all their friends are geniuses too! This is the beginning of the problem; anecdotes are great, but they bias you in a particular way. In Tapscott’s world, it’s the kids who are living the digital life, not his peers. Therefore, it must be generational. There is nothing in his evidence that proves this; in fact, even the brain chemistry evidence he cites doesn’t prove it. Different behaviours, different activities can change brain chemistry; that’s not news. That’s the real story, not generations.

Different behaviours and activities can be more popular with certain age groups than others, which makes this “digital native” thing an issue of correlation, not causation. However: do we have evidence that more teenagers are interested in the digital life than any other generation? Gen X is small compared to the “millennials”, correct? In 1994 Wired predicted that by the year 2000 the average age of internet users would be 15. So why, in 2008, was the average age of internet users in the UK 37.9? As of right now, NiteCo lists the average age of internet users as 28.3421. I’m not suggesting that teens aren’t interested in the internet and in digital life; it’s just that it’s not primarily or only them. It’s not a factor of their age. This isn’t even like Elvis, when the kids loved the rock’n’roll and the adults hated it; it’s nowhere near that clear cut.

I think it’s more like a cultural meme. It’s a series of metaphors, of truths we accept. In the digital culture meme, there can be something called “digital culture”. An online community is a real community. You can have online friends, and they’re real friends. You can “talk” online using only text, and have it mean as much to you as a face to face conversation. You’re intrigued by new internet apps, not scared. You have a tendency to play with things digital and see how they fit into, or alter, your digital life. The idea of wanting to be connected pretty much all the time is not that strange or dangerous; “thinking with the internet” is a concept that makes sense to you. These ideas, among many others, make up the digital culture meme, and the people who subscribe to it are the digital natives. It has nothing to do with when and where you were born.

Maybe it’s like Stravinsky. When they first performed The Rite of Spring, people rioted. It was so foreign, no one knew how to respond to it. But eventually, the meme of radical music spread; eventually, the piece made it into Disney’s Fantasia. It wasn’t worthy of a riot anymore; it wasn’t different anymore. It wasn’t going to destroy society. It was just a new way of thinking. Did that start with a generation? Or just a group of classical music lovers? We didn’t consider that a generational shift, but perhaps it was. New ways of thinking, new ways to interpret culture.

Or are we trapped by old ideas about genetics? Old ideas, the ideas that filter through into society as truths. You can’t teach an old dog new tricks; real change comes from the youth. Is that so? For people like Don Tapscott, is thinking of the digital culture meme as a generational change a way to excuse himself, and his peers, and others who fear the meme, from participating? Is it reassuring to think of digital culture as something akin to built-into-your-genes and unfixable? They are just built differently, their brains are different; don’t feel challenged by these new ways of thinking and communicating. Don’t feel threatened. It’s not your fault that you don’t understand or won’t participate. That’s what’s right given your brain wiring. This is only a game for the young. This is the way THEY think, because they were born in this world. But no, it’s not like genetics in that sense; it’s more like epigenetics. Your brain is flexible, your genes are flexible depending on the choices you make, the options you have, and the circumstances you’re in. Accepting the meme and living digital can change your brain. It has nothing to do with your age.

Librarians and Elsevier

Not news: an Australian unit of Elsevier contracted with drugmakers to publish what appeared to be medical journals that didn’t disclose who had paid for them. In other words, Merck supposedly created a fake peer-reviewed journal to publish data that made its drugs look good. It also got Elsevier to publish the journal to make it look legit (Elsevier being one of the bigger publishers of — of course — proprietary medical journals). This news has been filtered through the internet for a couple of weeks now.

It wasn’t a librarian who discovered the problem, though. Which makes me wonder: is it the role of librarians to examine journals that present themselves as peer reviewed to ensure that they really are? The Progressive Librarians’ Guild thinks we should. Others think it’s not feasible for us to do so. But given that libraries give authority to journals by subscribing to them, don’t we have an ethical responsibility to try to find the fakes?

As my friend Jennifer is wont to say, it’s time we work out what business we’re in and clearly articulate it. I’m not sure I even get it anymore.

Editing Documents in SL

[vimeo 2541800 w=1239 h=755]
Interesting. But it might be faster and easier to just do this through your browser alone. Is there a circumstance where you need to be both in-world and collaborating on a document?

This might actually have some interesting implications as a display tool, where you can get people watching a collaboration in which they aren’t participating directly.

I’d love to see some real applications of the HUD.

Search Strings: The Return

I haven’t done this in a long while, mostly because I did something to my site that prevented me from being able to access them anymore, and I only recently thought of adding Google Analytics. Now I can see them again, so here we go:

“a diary is an example can you til me is primary or secondary
This is interesting; homework question? It’s clearly not a copy/paste, or a typed in copy from a question sheet. It looks like it was typed in on the fly; is this an example of someone using a computer/device while in class? If so, do you think that’s a good or a bad thing? It’s research, right? Is this an example of someone getting the internet to do their thinking for them?

ban a friend (email with comma) subject
Ban a friend…from where? IM? Facebook? email with comma, does that tell us why this person wants to ban a friend? It’s a mystery!

cheapest sd cards
I suspect no one needs me to add a data point to the research indicating that people use the internet to buy things. And to find deals. But this does indicate that people look pretty broadly to find general advice before buying technology and its associated bits.

confessions of an ugly stepsister chapter summaries
There’s always someone looking for ways to avoid reading the book. It’s a good book; just read it! It won’t take that long!

dream and meaning and running home across a field
This one is an interesting combination of boolean and free text. Not “dream interpretation”, but dream AND meaning AND “running home across a field”.

dreaming of making out with someone but don’t see there face
I must post too much about dreams, apparently. This one is on the verge of being a full-blown question, interestingly; if you added an “I’m” to the front and the obvious question at the end (“What do you think that means?”), it would be one. While the first dream-related question shows evidence of some thought in terms of search construction, this one is more free-flowing, containing mostly words that won’t bring up a useful result.

how do you find if someone had been running a search for your name on the internet
An entire question, minus the question mark. Now: conceivably this might work; if someone created an FAQ with this as a question on it, you’d get a good result. But given the lack of quotation marks, it reads more as if the user is asking google a question rather than searching it. I love how conversational it is. We really do think of google as an extension of our brains in a way, don’t we? Our searches are so stream-of-consciousness.

how to do that google search thing where your name comes up and it says “did you mean”
Speaking of conversational! Yeah, it’s as if, instead of the Google logo, the words above the search box said “I would like to know…” and the user merely finishes the sentence. I wonder how many hits you get when you search for “that google search thing”.

primary source subject heading strings capitalization
Someone’s cataloguing homework?

swallow lymph nude on back of my neck and can’t fell on that side
This is a strange combination of search terms and conversationality. Since you can’t very well swallow your lymph nodes, I presume those are separate constructions: swallow, plus “lymph node on back of my neck” and “can’t feel on that side”. A pretty ingenious way to search for a series of symptoms, really, if it weren’t for the spelling errors. It’s always easier to type symptoms into google than it is to go see your doctor. But rule number one when you have a serious illness: don’t google it. What you’ll find will only depress you.

the emerging tools to access oa content.
With a period, no less!

what could me to have a rough feeling red ring around my neck
More stream-of-conscious medical questions. We talk about how users don’t need training in how to use Google, and we know they don’t usually go beyond the first page of search results, but looking at strings like this makes it clear that they don’t really know how to use the tool. There’s just so much in it, and we appear to have so much patience with google searching (we like the browse aspect, I guess?) that we will keep hammering at it until we get somewhere that interests us.

whining and complaining examples
You came to the right place!

will having the radioactive iodine treatmenat to kill my thyroid also get rid of the puffyness around my eyes?
Of course I’m going to attract the radioactive iodine and thyroid cancer crowd. Now this one interests me for a whole other reason. No matter how sick we are, vanity is always there, isn’t it. For me, I knew how big my scar was going to be, but I didn’t really care very much about that part; I didn’t care about how it would make me look. Once I had it I realized that it marked me as damaged, made me sort of Frankenstein-like. Pulled apart. Never the same again. I never once considered whether radioactive iodine would have an effect on my face, except that I worried about whether it would block up my salivary glands. However, it’s pretty clear that this person doesn’t have thyroid cancer, s/he has hyperthyroidism. But I don’t think the radiation would change puffiness. It only gets rid of the bug-eyed look that comes with Graves Disease. Sadly, there’s no pill that will magically turn us into Scarlett Johansson.

you don’t have to be afraid of cancer anymore
I hope that’s an accurate prediction.

Twitter Follow Fail/Win

In response to Mashable’s Twitter Follow Fail, my own 10 Reasons why I won’t follow you on Twitter:

1. You’re trying to sell me something. This goes for all entrepreneurs of all varieties, particularly the “social media” ones. Now, if you’re a social media entrepreneur but not directly using Twitter to market yourself and your company, but instead using Twitter like everyone else, that’s cool.

2. You follow a zillion people. By a zillion I mean something near or over a thousand, because it’s unlikely that you’re even able to follow all those people. So why are you following me? It’s not like you’re really going to read what I’m saying right? Now, as an exception: if your tweets are awesome and I want to follow you for the content, I don’t care how many people you follow, and if you follow me I will follow you back.

3. You follow a zillion people, hardly anyone follows you, and you have no posts. It’s work to follow a zillion people, so I’m suspicious. Are you using twitter as a feed reader? I sometimes post links, but that’s not really what I use twitter for. Are you just trying to gather followers?

4. You post pretty much nothing but RTs and memes. I’d rather follow people with original ideas than rerouters.

5. You post about your follower count. “Three more followers and I’ll be at X00!” “Yay, just hit 500 followers!” Anything like that. Even if I know you, this calls for an immediate unfollow. Sorry. I don’t want to be a notch in anyone’s belt. Clarification: posting about wondering why a bunch of people recently unfollowed you, and wondering if you’ve been offensive, doesn’t make me unfollow. It’s only if you’re demonstrating that you’re using twitter for little other than gathering enough followers to feel good about yourself.

6. Your archives consist largely of @replies. Some people say this is a display of engaging with your community, but I have my twitter set to not show me any @replies to people I don’t follow. So: if all you do is use twitter as a public chatroom, I’m not going to see your updates anyway. And I don’t think that’s a very effective use of the medium.

7. You post about specific topics that don’t interest me. I sometimes get followed by people who post mostly about life with kids, or entertaining kids. I don’t dislike kids, but I have no interest in reading about them on twitter. Sorry. edit: unless I know you and/or your kids. I want to hear about @halavais‘s baby, of course. Just not generic stuff for kids. Well, unless it’s YA fiction, which is a whole other topic. Maybe I should think this one through some more.

8. You’re a “life coach”. Just…no.

9. All your posts appear to be automated. I don’t really understand the blip.fm-to-twitter phenomenon, and I already use a greasemonkey script to remove those posts from my feed (a rough sketch of that kind of filter follows this list). If all your updates are just blip.fm links, I’m unlikely to follow you.

10. You are arguing against gay marriage, posting about your love of the Republicans or of Stephen Harper. So not interested.
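Since I mentioned the greasemonkey script back at number 9, here’s a minimal sketch of the idea, written in TypeScript rather than as an actual userscript, and with selectors that are assumptions rather than twitter’s real markup. My actual script differs, but the logic is just “hide anything that links to blip.fm”:

```typescript
// Minimal sketch of a "hide blip.fm posts" filter. The selectors are
// placeholders/assumptions, not twitter's actual markup; a real Greasemonkey
// script would wrap this same logic in a userscript header.
function hideBlipPosts(): void {
  const blipLinks = document.querySelectorAll<HTMLAnchorElement>('a[href*="blip.fm"]');
  blipLinks.forEach((link) => {
    // Walk up to the nearest list item (one tweet) and hide the whole entry.
    const tweet = link.closest('li');
    if (tweet instanceof HTMLElement) {
      tweet.style.display = 'none';
    }
  });
}

// Run once on load, then periodically in case new tweets get loaded in.
hideBlipPosts();
setInterval(hideBlipPosts, 5000);
```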

Now: 10 reasons I WILL follow you on Twitter:

1. You’re a librarian. I love following librarians. All kinds of librarians. I like to use Twitter as part of my work, so I love seeing what librarians are thinking about.

2. You work in a library. I love hearing from everyone in the library world.

3. You’re in library school. I miss being in school, so I’d be very very happy to read updates about your classes and things that interest you. I think of it as a way to listen in on classes.

4. You’re interested in social media/emerging technologies from an educational/community perspective. I’m not interested in the “social media for profit” crowd, but am very interested in the “social media for fun and learning” crowd.

5. You make me laugh. Hello, @StephenFry.

6. I know you, or I should know you. You live in Toronto, you work at the same school as me, we move in the same circles, you’re my husband, my best friend, or my dad. We’ve had dinner together. We hang out on the same IRC channel or other online community. Something like that.

7. You go to the same conferences I do. I will definitely follow you if I see you tweeting about the same conferences I’m at. I love to hear the thoughts of other conference attendees.

8. You’re at a conference I wish I were at. It’s great to hear what’s going on at a conference I can’t attend. If you’re there, I might want to keep following you after the conference too.

9. I admire your work. Academics, start-up owners, Googlers, etc.

10. You respond to me in an interesting way. I might not have noticed you before, but you responded to something I said in a way that piqued my interest. I’m a sucker for intelligence and thoughtfulness.

I bet this says a lot about what I use twitter for.

Lauren and her Laptop

[youtube http://www.youtube.com/watch?v=EIS6G-HvnkU&hl=en&fs=1]

For the most part I’m not that interested in the ad war between mac and PC. I think the mac ads are cute, mostly because John Hodgman is adorable. There’s lots of talk online right now about this ad, saying that “Lauren” is an actor, she never went into the mac store as she said she did, and the PC she got is a piece of crap, etc. Dishonest marketing? Of course! What marketing isn’t dishonest?

When I first saw the ad I went to see what computer she got, and I saw that it was 8lbs and laughed.

I personally don’t care about the mac/pc war because in general I think mac will continue to produce good products regardless: they’re making plenty of money to keep them in business, and they’re still producing macbooks, which will be my computer of choice for the foreseeable future. I like to love my laptops, and I love using macs. I generally think that mac is good as a niche; they aren’t going to produce crap computers for the cheap audience, because they don’t cater to the cheap audience. I don’t really want to see them change that priority just to get the greater market share. So as a mac user, I like them having a healthy share of the niche market. Seems perfect to me. So if PC wants to create a persona who “isn’t cool enough to be a mac person”, that’s cool. I mean, if “Lauren” wants to spend 25K on her car but won’t spend more than 1K on a computer, well, maybe she’s really not a mac person.

But in musing about it, and about the “regular person” technique, a few things jump out at me. She wants a cheap, 17-inch laptop. Why 17-inch? Clearly not for professional reasons; the 17-inch computer she got doesn’t have the juice to do any video editing or whatnot. For watching movies? It’s funny, because things are getting smaller these days. Most of the students at my campus have laptops, but the ones who got the bigger ones generally don’t want to lug them around. (And Lauren’s laptop is 8lbs…she might as well have gotten a desktop, really, for the amount she’ll be willing to drag it around.) The smaller laptops are getting more popular because of their sheer usability as portable machines. Netbooks are all the rage because of their incredible portability; we’re entering an era where we’re finally savvy enough about our needs to not always get the biggest and best “just in case”.

Maybe that’s why this ad makes me laugh. Lauren wasn’t trying to get the biggest and best, like we used to, trying to make the most of her investment. She just wanted the biggest, for the least amount of money. Why? This request just doesn’t resonate, particularly not in our current computing climate. Big laptops are increasingly a pain in the ass for everyone who owns one. Currently, the only people who appear to really want a big laptop are professionals who have particular kinds of work to do that requires a big screen and a modicum of portability for presentations. I’m a professional who wants lots of screen real estate; I have an external monitor at work on which I extend my desktop. I wouldn’t want a 17-inch laptop. It’s just not practical.

The only laptop I regularly move around these days is my beloved netbook, which gets online and plays all my favourite tv programs for me while I’m on planes, trains and automobiles. I can sit at the bar and check my email on my netbook, and still have room for my dinner and my beer. I get more comments on that netbook than I’ve ever gotten on all of my macs put together. People love the idea of a usable, small, cheap laptop. If you’re a coolhunter, you’re probably looking at small, fast and cheap. You can buy gigs of space on USB drives for peanuts these days; why spend hundreds for a big internal hard drive? Small hard drive, small physical computer, big RAM, bloody great OS (Ubuntu, anyone?) No one’s that excited about a big laptop running Vista, no matter how cheap it is.

Apple is often a bit ahead of its time, sometimes painfully. They got rid of floppy drives well before it was a good idea (even I had to buy an external in the 90s). They took out the phone jack in the last few years too; that’s what pushed me to give my dad my old wireless router so I could still get online when I was visiting. They’re usually on the right track, but they pull the plug on things a tad too early. They keep you slightly uncomfortable with the things they declare as dead. But why is it that microsoft always seems to be, just as painfully, a step behind? Everyone else is talking about cheap, fast and small, and they give us an ad about cheap, slow and huge?

Emerging

So: new job title (“Emerging Technologies Librarian”). Definitely something that I wanted to see happen. I feel like it reflects what I actually do a lot better. I do have pangs of regret when I think about instructional technology, but the lines are still blurry. Now I deliberately look at emerging technologies in teaching and learning, or maybe ones that haven’t quite emerged yet. Also emerging technologies as they apply to libraries in general, and our library in particular.

It’s exciting to have a job title that reflects what I’m already doing anyway, but it’s also kind of intimidating. I mean, keeping up with the trends was something I did as a bonus. Suddenly it’s in my job title.

So I was thinking about what trends I’m currently tracking, and I wonder how they fit into the whole “emerging” thing.

Second Life/Virtual Worlds. I’ve been on this one for a while, but I still think it’s emerging. Mostly because I think no one’s popularized the one true way to use virtual worlds in teaching and learning yet. In fact, there are so many wrong ways in practice currently that many people are getting turned off using Second Life in teaching. I’m still interested in it. I’m a builder; I’m interested in how you could use the environment to build things and have students build things. A giant collaborative place filled with student-created expression of course content would be awesome. So I’m holding on to this one.

Twitter. I can’t believe I’m putting it on the list, but I am. Mostly because I’ve been talking for some time now about how great it is at conferences, and I’m starting to see the argument come back to me from much larger places. People complain about what people twitter during events (“Too critical! Too snarky! The audience is the new keynote!”), but that’s pretty much exactly what would make things interesting in a classroom. I want to install the open source version and try it out with a willing instructor. I’m also interested in it for easy website updates, but most people would tell me that that’s a total misuse of the application. (Too bad!)

Ubiquitous Computing. I’ll say that instead of mobile devices. The hardware will come and go, but the concept of ubiquity for computing is fascinating. It’s coming in fits and starts; I want to see how I can push this one in small ways in the library. Computing without the computer. Ideally without a cell phone either. This is something I’m going to track for a good long while. I have this ubiquitous future in my head that seems like a perfect setting for a cyberpunk novel. (I might get around to writing it one of these days.)

Cheap Storage. As a rule hardware isn’t my area, but I’m interested to see what it means that storage capacity is getting so crazily cheap. If I can carry 120 gb in my pocket without even noticing it, what does that mean for computing in general?

Cloud Computing. This goes along with the cheap storage. Jeremy tells me we will never be affected by the cloud because we are a locked down environment for the most part, but I think he might be wrong. Even if we can’t fully employ the cloud because of security and legal limitations, I think the concept of cloud computing will sink into the consciousnesses of our users. We will need to be prepared to offer services as easily as the cloud can.

Netbooks. This fits in with cloud computing and cheap storage; if we can have tiny little computers with us at all times, massive amounts of physical storage and powerful applications coming down from the cloud, what does the world end up looking like?

Social Networks. Embracing the networks you have, on facebook, on IRC, on Twitter, on IM, wherever. Accepting that we are no longer a culture that uses its brain for information storage; we are processors, connectors. We store our knowledge in machines and in our networks. While social software may look like too much fun to be productive, those social networks are what’s going to scaffold us through most of the rest of our lives. Learning how to respectfully and usefully employ our networks as part of our learning (and teaching, for that matter) is an important skill.

There are some other pieces that are just never going to go away: blogging (for librarians!), wikis for everyone, IM: I think we’ve finally reached a point where we can intelligently choose the best tool for the task at hand from an incredible range of options. So I think part of the emerging trend is to use what’s best, not necessarily what’s most powerful, most expensive, or most popular. Things like twitter and netbooks are evidence of that: sometimes you don’t need all the bells and whistles.

So that’s my emerging update of the moment.

Best. Era. Ever.

I was thinking, while reading various articles about twitter, and interactive learning, and participatory culture, and fandoms, that I’m so glad I live when I do. I’m glad I was able to be around to see the birth of things like blogs and virtual worlds and all kinds of interactive applications of the internet. So much is still unformed, undefined; the blessing and curse of the early days of the social internet is that we get to do the defining. We don’t have to buck a trend; we get to try out the new stuff and give it meaning for the wider culture. We get to be as imaginative as we can.

That’s so cool.

Wireless in the Classroom

My campus is planning the construction of a building dedicated to instruction: state-of-the-art classroom technology, lots of computers, a space where a large class can take a monitored online test. There is, I’m told, a debate about whether or not to put wireless access into the building. Many instructors dislike the idea of students being online while they teach; “being online” means “not paying attention”, after all. The internet is fun and games, and learning is meant to be work.

No, that’s harsh, isn’t it.

Being online means chatting with your friends and goofing off. You shouldn’t be chatting with your friends and goofing off while you’re sitting in a lecture. It’s not respectful.

Except: what about people like me, who get so tied up in knots about the subject at hand that I need to spill my ideas out to SOMEone, SOMEwhere, and often use IM to channel my over-enthusiasm? (I think Jason took all my library school classes with me, virtually, through my constant stream of IMs.) What if that “chatting with friends” prevents someone like me from interrupting and turning your lecture into a one-on-one discussion? Or what if the “chatting with friends” helps a student refine her critique? Or keeps her engaged, because otherwise her mind wanders, and reporting what she’s hearing in the classroom to a trusted and interested friend helps her retain the knowledge better?

What if that trip to wikipedia, or google, helps clarify something? What if that internet activity is related to the process of learning the material?

Why does the instructor get to make the decisions about how students are going to learn?

Why are we more interested in optics than in allowing students to be adults and choose their own learning methods?

Why don’t we trust students?

Why do we not make use of the amazing resources available online while we’re teaching? Why not allow students to use virtual reference desks worldwide to get questions answered for the class, or check UN stats, or otherwise contribute meaningfully to the lecture? Why not harness the fact that students like to do something other than sit still in a room for three hours and ask students to go forage for elements that can enrich everyone’s learning experience? Why not be more interactive? Why not share not just expertise but a true love of seeking out information and turning it into knowledge? Why not expect the learning part to happen not just after class, but in class as well?

Why not allow students to take notes collaboratively, on a wiki, or with Google notebook, or other, multi-cursor collaborative software?

Why not allow students to twitter their questions and ideas (you can easily monitor that)?

Why not give students a chance to react?

I’d like to throw together a video about why wifi in the classroom is a good thing. If you’ve got an opinion, go below and record me a little video describing your ideas, experience, anything. It doesn’t need to be long. I’ll mash them together into a video and upload them to YouTube. Please help!

Twitter and the Library

My latest all-consuming project is working to redesign/rework/completely renew our library’s website. It’s still early days, but there are certain lessons I’ve learned from my last all-consuming project (introducing courseware to the campus): you can never communicate too much. Even when you think you’re communicating enough, you probably aren’t.

From the worst days to the best days of rolling out software to faculty and students, no one ever accused me of giving them too much information. While the internet is a very social medium, it can also be a very isolating one at the same time. When people are trying to get from point A to point B using some software that you answer for (even if you don’t control it), there’s really no way you can get too far into their personal space. They want to know that you’re there, that you’re anticipating their questions, that you’re aware of the problems they’re encountering. I never, ever want to go into downtime or unexpected system backfires without the ability to send out a message saying, “I feel your pain; here’s what I’m doing to help solve the problem. I’ll keep you in the loop.” It’s a lot easier to cope with problems online when you know someone somewhere is working on it.

And this is primarily where I have a problem with the static library website. The first page always stays the same; it’s generally got all the same information on it. This is good when you’re trying to teach people where to find stuff, if you think of your website as a static structure that should be learned. But it’s terrible if you consider your website your library’s (non-expressive) face.

I think there are two ways to think about a library website: it’s either a published document (heavily planned and edited before it’s published, published, then referred to), or it’s your communication tool. As a communication tool, it’s not published in the same way that books are published. It’s available, it’s public, it’s indexable, but it’s not static, it’s not finished. I kind of wonder if we should get rid of the term “publish” from these kinds of online tools. Sure, you put stuff online and it’s in wet cement (as Larry put it best), ie, likely to be around forever, but our concept of publishing suggests a kind of frozen quality, a finished quality. To me one of the best things about the web is our ability to leave nothing untouched. A communication tool, rather than a published document, should never look the same twice. It should always be telling you something new, informing you, reflecting the real people behind it.

So as we start laying down the foundations for a new library website, I keep thinking of ways to pierce it through with holes through which the real workings of the library, the real voices of the people who work there, can come through. I want students to get a sense that the library isn’t a solid object; it’s a place filled with people, people who work very hard to make things better for them, at that. People working to make sure the collections match the needs of their instructors and their course expectations, helping them with assignments, helping them find the resources they need, helping them use the software they need to use to succeed. I’d like to see if we can use social software to help make that work more transparent to students and faculty alike. Librarians do good work; everyone should see that work.

The first and most obvious way I thought of to make this transparency and easy communication possible was blogs. In my dreamworld, these long thought-pieces about technology and libraries would go on a library blog, not my personal one. But I’m not the only one thinking about things like collections blogs with discipline-specific categories, or reference blogs. Once this information is shared and online in an RSS-able format, we can shoot it in all kinds of useful directions. And then I started thinking about the things students would like to know right now: busted printers, software problems, unavailable computer labs, courseware downtime. How busy the library is. (Ours is more often packed to the gills than not.) The obvious things. We know about them before the students do: isn’t there some quick way we can tell them?

So then I got to thinking about twitter. Twitter for immediate messages. It doesn’t take up that much space, embedded on a page. And it keeps everyone to 140 characters. Like facebook status messages, but about the systems you’re trying to use. You can find out if they’re having a bad day or not before even trying to wrestle with them. I like it. Transparency, a little personality, a little humanness, and lots of communication.
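To make that concrete for myself, here’s a rough, purely illustrative sketch of what I mean by embedding it: pull the library account’s last few updates into a box on the page. The account name, the element id, and the feed URL are all assumptions here (I have the unauthenticated JSON timeline twitter currently offers in mind, but whatever feed we end up with would slot in the same way).

```typescript
// Illustrative sketch only: fetch a library twitter account's recent updates
// and drop them into a status box on the library homepage. The URL, account
// name, and element id are assumptions, not a finished design.
interface Status {
  text: string;       // the 140-character update
  created_at: string; // timestamp string from the feed
}

async function showLibraryStatus(account: string, elementId: string): Promise<void> {
  // Assumed feed URL; swap in whatever JSON feed we actually end up using.
  const url = `http://twitter.com/statuses/user_timeline/${account}.json?count=3`;
  const response = await fetch(url);
  const statuses: Status[] = await response.json();

  const box = document.getElementById(elementId);
  if (!box) return;

  // Render each update as a short, dated line -- like a little status board.
  box.innerHTML = statuses
    .map((s) => `<p class="library-status">${s.text} <em>(${s.created_at})</em></p>`)
    .join('');
}

showLibraryStatus('ourlibrary', 'library-status-box');
```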

We’ll see how it goes.

The Value of Networks

To say that networks are important is to state the blindingly obvious. Networks have always been important, from the medieval European confraternity to 20th century golf courses. Now, most people I know go to conferences partly because of the conference program, but mostly because of the extra-conference networking. The conversation that you have between speakers is often more valuable than whatever the speaker is saying. The best thing the speaker can do for you, as a conference attendee, is to provide the common ground, topic, and language to allow a networking conversation to open up around you; even a terrible speaker, one who says things with which everyone in the room vehemently disagrees, can have this effect.

Is this a radical statement? Not to say that there isn’t value in hearing about the status of someone’s research, but that’s what journal articles and even blog posts are for. I don’t go to a conference specifically to hear about that sort of content; there are cheaper means to do so. I go to meet you, to engage with you, and to hear about what others think about what you’re doing while you talk about it. I’m there to meet with others over the common ground of our interest in what you have to say. A gathering of like minds: I’m there to get the whole collection of ideas. This may be why unconferences and camps are gaining popularity; they, at least, are upfront about where the value in a conference lies. Sure, the speakers are important, but so are the conference-goers. Everyone has something to contribute, and there are many, many means of doing so.

I feel like we acknowledge the importance of networking, but do our best to pretend it’s not true at the same time. A very dichotomous relationship to the concept of the social network: networking is everything, but to speak its name is anathema. A colleague of mine at the library tells me that as a Computer Science undergraduate student, the word they used for cheating was “collaborating”. There’s lots being said about networked intelligence, but if someone is looking at facebook or using MSN or AIM at work, they’re being unproductive. (Not to say they’re de facto being productive just because they’re using a social networking tool; that’s unclear without more information.) Networking is supposed to be a quiet activity that you do on your own time. It’s too fun to be work.

I got to thinking about networks a lot lately when I quizzed a bunch of friends on a favourite IRC channel about the differences between various CMS platforms, and then arranged to bring in an old friend to consult with us via AIM. While many people feel they need to hide their networking efforts professionally, I’ve always opted to embrace mine, and I intend to do so even more in future. My network, constructed out of the people that I know whose knowledge and experience I trust, is smarter than I am. As with all information, I must evaluate what I get from my network, but I have the context available to do so; my friend at Google knows lots about web search, but not so much about Google docs, and while one of my friends in California is always on to the new thing, his quick dismissal of popular applications means his predictions aren’t necessarily on target. This is a new angle on an old concept. Social networking applications give us the ability to dig for context on our contacts when using our networks to help us form opinions and make decisions; how we know these people, what we know about them, where their experience lies, and who they know: these things can have an impact on the way we interpret information gleaned from them. You can actually be on facebook and not be wasting your employer’s time! (Who knew!)

It’s a give and take, of course. I’m not just talking about quizzing my networks when I have a question (though I mean that as well). My networks give me things to think about all the time; they shape my thinking, point me in new directions, give me a sense of where things are moving. They show me where the trends are, what I should be paying attention to. The network imparts knowledge not only in the direct sense, but also through ongoing participation. We are a participatory culture online: web 2.0 is pretty well ingrained into us at this point. We talk back. It’s the talkback that turns around and alters my brain chemistry.

I’ve been cultivating my networks for years. Because my personal life and my professional life cross at so many points, it’s serendipity that my social network can be so valuable to me in a professional capacity. One of the most exciting things to discover is that an old connection from another community is bringing a new vision and new interpretation to my wider network.

In short: I’m starting to think seriously that it’s part of my professional responsibility to read twitter, my feed reader, Facebook, etc. in order to be shaped by my network.

Meanwhile, the professional speaker circuit doesn’t like this. Attending a talk, as I’ve outlined, has a two-fold impact: the obvious one, gaining insight from the speaker, and the hidden one, where I am further inspired, provoked, and shaped by my network in light of what the speaker is saying, and my interpretation of it. That’s my true professional development, in the crossroads of all those things.

In probably 80-90% of most business and conference settings speakers have a message to give – at keynote speeches and large company events – the large audience venues. It is not a groupthink or collaboration.

If this is what the speakers of the world think, that most of the time we are there only to absorb their message without interpreting it and reinterpreting it on our own, that the bigger the event the more we should shut up and absorb, I’m afraid they’re talking out of both sides of their mouths and supporting an educational system that just doesn’t work.

In this post, Bert Decker suggests that it’s rude to IM during a conference, but presumably it’s not rude to jot down notes. In fact, isn’t jotting down notes the best sign in our culture that you’re paying attention? If you walk into a meeting with no paper to jot down notes on, people tend to presume that you don’t think anything of value is going on. This is considered unprofessional, and you will be pulled aside and given a talking to about it. Always come prepared; always carry something to write down notes on. That’s how you demonstrate respect! Rudeness is in the eye of the beholder on this one; if people are tweeting during your talk, perhaps you should take it as a compliment. They feel there is something in the talk to record and share with their network.

The way we create and feed our networks is to participate in them. We share our thoughts on the ideas that come to us; we build systems of thought and method based on the interplay between primary, secondary, and tertiary information. The tweets of the guy next to me during a big keynote are my secondary source, the thing that provides more voices and opinions to the information I’m gleaning.

Constant networking is impossible, and it’s important to know when it will help you and when it will distract you. But while most traditional folks like to take notes when they attend keynotes, I like my notes to talk back to me at the same time. If you’re not ready for the rich dialogue that it allows me to enter into with you based not only on my own experience and ideas, but also those of my network, I’m not sure you’re the right person to be giving that keynote in the first place.

My network is valuable; I bet yours is too.

Thick Tweets

Another follow-up to a tweet, posted in response to David Silver’s attempt to apply Geertzian theory to twitter:

http://tinyurl.com/bwxrac bizarre categorization of tweets. With a link, this is “thick”
2:45 PM Feb 25th

I appreciate someone trying to apply thick description to tweets, but I’m not certain David Silver hasn’t missed the mark a bit here.

First: isn’t it frustrating that every time we experiment with web applications, there’s someone somewhere trying to tell us how to do it right? Case in point, back from 2005: “I just spent fifteen minutes clicking through about 20 Xanga sites and I CAN’T FIND ANY BLOGGING GOING ON! Is it me?” (my response). We like these applications to fulfill a pedagogical role, often to improve the profile of the use of the application to other academics and departmental chairs. Current case in point: some researchers/educators using Second Life don’t want to be associated with the “debauchery” of the Second Life Community Conference, and want to break out on their own in order to set the “right” tone.

So now we get to the “right” and “wrong” kinds of tweets. This is a challenging thing, since a tweet is only 140 characters long. Silver encourages students to “pack multiple layers of information within 140 characters or less,” and those layers are defined by links, primarily. Also by shout outs. And mentioning names.

I don’t think thick description is a good way to evaluate a tweet. A tweet’s value isn’t in how much information it’s conveying, it’s in the basic value of the information itself. Personally I quite like funny tweets, regardless of whether they’ve got links in them or not. The context of tweets doesn’t come from the tweet itself; it comes from the environment around the tweet, the news of the day, the location of the user, the user’s other tweets, the user’s blog, flickr stream, employment history, and the live events the user is attending. Tweets are ultimately snippets that don’t necessarily make sense in isolation. I’d suggest that to evaluate them individually is to miss a great deal of their “thickness”.

Some of my favourite tweets:

“Great design comes from an artistic or cultural impulse, not a focus group.”
11:06 PM Jan 24th from web cloudforest

Is there anything more newbish than using Internet Explorer? Question still valid after 12+ years of asking it.
2:31 PM Feb 27th from TweetDeck, BeCircle

Overheard this week: “Lent? That’s the musical with all the AIDS, right?”
3:58 PM Feb 27th from TweetDeck, RJToronto

Still ignoring Facebook 25 things requests, but copying my wife’s idea: I’ll gladly go for coffee/beer and answer them in person instead.
4:03 AM Feb 27th from web, oninformation

These tweets don’t really fulfill Silver’s “thick” requirements, but I find them interesting and valuable. They give me things to think about. How do you quantify the pithiness of 140 characters?

Hidden Reward

“Block me, and I will go around you. Build a wall, and I will build a door. Lock the door and I will break a window. And if I don’t have a leader to inspire me, I will lead. If I don’t have a team that will support me, I will recruit a team from beyond the organizational boundaries – every policy has a loophole, every system has a hidden reward.”
The Participatory Librarianship Starter Kit, via The Shifted Librarian

Audience

I wanted to follow up on and extend a recent tweet:

At what point does online sharing become performance? Is it always performance from the start, or does it morph as people start to watch?
11:21 PM Feb 21st from web

I was thinking about the fact that I’m flying out to Drupal4Lib unconference/camp at the Darien Public Library in Connecticut today, and each time I go to a conference where lots of ideas are flying around me, I try to capture the ones that really resonate with me on Twitter. I also use Twitter to respond to speakers when I can’t interrupt them. I use it particularly when I think my opinions will be unpopular or not particularly well accepted. Now that there are a few more people following me on twitter, many of whom I respect a great deal, I’m a bit hesitant to tweet as freely as I want to. As often as I want to. And that hesitation bothers me.

Sure, perhaps I need a little hesitation before I go publishing my ideas and responses and thoughts to the world, right? But I don’t like it. I like sharing, but I’m ambivalent about the general concept of an audience.

I guess deep down I don’t think about online sharing as sharing with an audience until I’m sharing with X number of people. That number isn’t something I’m aware of, I just sense that there is a tipping point in there somewhere.

I have permanent status now (i.e., tenure), so I’m happier to share this fact: back during the process of dropping out of a phd program in history, I got deeply involved in a fandom community. I wrote a lot. I wrote somewhere around 400K words of fanfiction in the space of about 9 months. It was escapist, particularly since it was a world where the characters were all generated by someone else, and thus had nothing to do with the devastating and identity-altering reality of my existence. It was nice to inhabit a space where I didn’t exist. Call it a coping mechanism, but I learned more about social networks and technology in aid of collaboration and creativity in that space than I did anywhere else. I have a deep affection for fandom communities and I still try to follow their meanderings. One of the things I learned as part of a fandom community was the power of an audience.

When I started writing in fandom, I did so in total obscurity. I threw myself into writing, something I hadn’t done in years and really enjoyed. It was like coming out of the darkness into the sunshine. It was incredibly therapeutic. I had been through some difficult times: a terrible break-up, heartbreak, depression, hatred of my program, loneliness, loss of identity. A lot of old feelings resurfaced. Writing was excellent therapy. I had a blog in my own name at the time, but I started a new one with my fandom identity on Livejournal, which was (and still is) the place where fandom congregated. I loved my livejournal. I loved talking about writing process, about ideas, scenes, character motivations; I loved writing about writing. It was profoundly internal, profoundly navel-gazing, and so much fun. I needed to be inside and outside at the same time; I needed to sort out so much that I didn’t want to face in myself. I can’t express how useful this process was; not just writing the fanfiction, but processing the whys and hows and sharing ideas. I had no idea how much of myself I was processing with it. (Easier to see in hindsight.)

My lengthy and frequent blog musings were okay at first. Not at all abnormal in a fandom community. But then I started to attract an audience. I was writing slash (gay romance) fiction revolving around a very popular pairing of characters, so there was a wide audience of readers for what I was so feverishly producing. Fanfiction writers tend to attract an audience, and they generally want to. It’s great to get feedback on what you’re writing. And that feedback is instantaneous. When I finished and posted a story, I would have responses to it within 10 minutes, and 60 or 70 responses within a half hour. (This is not a record: people writing more mainstream fanfiction with heterosexual pairings got far, far more responses than I would.) Many people in fandom have no interest in writing, but write to be a part of the community. Sharing writing is, I would argue, a form of gift exchange. Those of us who wrote a lot were presumably owed a lot in return; the return is feedback, recommendations, reviews, and attention in general. For people like me, nose stuck firmly in my own navel and there just for the sheer therapy/fun of it, this economy completely evaded my notice. I was getting more and more attention for my writing, albeit only from a segment of the fandom itself. I wasn’t at the top of the food chain when it came to attention-getters, but the attention I received was certainly nothing to sneeze at. By this I mean a registered audience of a few thousand, and an unregistered audience of many more thousands. Not the millions people get with a viral youtube video in 2009, but a few thousand (8 or 9) is quite a bit for any normal individual, particularly back in 2001.

With a fairly large audience, the nature of my livejournal changed. While I still wanted to talk about process and ideas and all this internality that brought me to the community in the first place, somehow it wasn’t okay to do so anymore. With the podium I had, it was understood as incredibly selfish of me to only talk about myself and my own ideas. Suddenly it became important for me to talk about other people’s work at least as often as my own (ideally more often). Now that I think of it, maybe I’ve got this gift economy thing all backwards; what if the economy has nothing to do with the writing and everything to do with the attention? Increasingly I felt pressure to give back: more comments, more reviews, more shout-outs and recommendations; my livejournal couldn’t be my private writing space anymore. It now had to be more outward-looking. I had to give back to my audience, I had to give them the attention they were giving me. I didn’t have the space to just have fun with it anymore. Fun had to benefit others now; I had already got my share. Others, who didn’t have the attention I had, could do what I used to do, writing down their thoughts and sharing ideas with their friends. It was silencing and sad.

A friend of mine had many times the amount of attention that I got, and I saw how it crippled her public posting. Her livejournal went from being, like mine, a place to natter on about whatever she was thinking about, to being a means through which to inform her audience of something (updates, teasers for her next chapter, etc.), to discuss other people’s work and the larger themes of the community, and to weigh in on the “right” side of any debate. It became public property.

Perhaps fandom is a unique entity when it comes to relationships with online audiences, but I don’t think it is. This is why I objected to ranking librarian blogs when Walt proposed it. My reaction was over-heated, but this is where I’m coming from. I’m not a high-profile librarian blogger, and I’m planning to keep it that way. I like to be able to muse about whatever I feel like musing about, be that Second Life, or cancer, or the book I’m currently reading, or random conversations with my friends. I want to be able to use twitter in the way that fits best with my personality, too.

So in response to my own question posted above: I think there is a difference between sharing online and having an audience. Sharing online is fun and productive; I love using twitter to record my reactions to things and my epiphanies, because I like to share them with friends and family, and I like to get feedback from people with similar or radically different opinions. I like their perspectives to shape my epiphanies as they’re being formed. I find that brings my thinking to a higher level. But somehow there’s a line in the sand there, and I’m not sure where it is, between sharing with a group and having an audience. I find the audience gratifying, but oppressive after a certain point. I don’t have the wherewithal to rise above the expectations of a full, demanding audience. Good thing I can twitter and blog in gentle near-obscurity. That’s just how I like it.

Edited to add: Hmmm. This is a pretty good example of what I’m talking about.

Real World Virtuality

Real World Virtuality

I started reading Spook Country last night before bed, the first chapter of which ends with a virtual world/real-world mashup that has the main character standing in front of the Viper Room in LA looking down at a dead River Phoenix on the sidewalk in front of her. Leaving aside a whole other post I could write about the significance of that particular moment to people born around when I was, it made me think about gaming and ubiquitous computing.

I suspect most of what I’m about to say is passé to people who think about gaming and the internet, but it was a fun revelation for me, at least.

When I first started talking out loud about ubiquitous computing in the library after the Copenhagen keynote about sentient cities, our chief librarian wilted a little. “We just built this place!” she said. But I think ubiquitous computing is not going to come from the walls at all; I think it’s just going to use the walls to interface with mobile computing.

Okay, imagine it: you have VR goggles. You put on your goggles and you see the world around you, but also the game space. You have already entered the usernames of your friends, who are also playing this game with you. You are synced up to GPS, so your goggles know where you are in relation to your environment. You have chosen a genre or theme, but the game is constructed on the fly by the system based on the environment you’ve chosen, the number of civilians in your view, weather information, and variables drawn from the user profiles of you and your friends.

So say you pick a large field by a river for your game space. Maybe you do a walkthrough of it first with your goggles on so that the system can add more detail to the GPS and map data; that data would go into a central repository for geographical information. The system can then generate characters that wander past, hide behind bushes, sit in trees, etc. You and your friends can all see the generated characters because of the goggles, so you can all interact with them simultaneously. The characters might be generated by the game designers, or they might be created by users, like the Spore creature creator, with backstories and voices all supplied by fans, vetted by the designers. You and your friends can be costumed by the system; when you look down at your own (bare) hands, they might be wearing chain mail gloves and carrying a sword.

Or say you pick a city block as your game space; the system connects to google map data, and then also takes in information about all the people around you, and uses them as part of the game. It could turn the city into a futuristic place, with flying cars and impossibly tall buildings. Running around the city, chasing aliens, avoiding civilians, being a big ole’ gaming geek in full view of the public. Awesome.
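If I were to rough out how the game-generation piece might work, it could look something like the little Python sketch below. Everything in it is invented for illustration (the names, the character budget, the way weather nudges behaviour); the point is just the shape of the thing: real-world inputs go in, a playable scene comes out.

```python
# A toy sketch of on-the-fly game generation; every name and rule here is
# hypothetical. Real-world inputs go in, a playable scene comes out.
import random
from dataclasses import dataclass

@dataclass
class Player:
    username: str
    lat: float
    lon: float

def build_game_space(players, theme, weather, civilian_count):
    """Assemble a scene from player positions and real-world conditions."""
    # More civilians nearby means fewer, stealthier generated characters.
    character_budget = max(3, 20 - civilian_count // 5)
    centre_lat = sum(p.lat for p in players) / len(players)
    centre_lon = sum(p.lon for p in players) / len(players)
    characters = []
    for i in range(character_budget):
        characters.append({
            "id": i,
            "theme": theme,
            # Scatter the generated characters around the group of players.
            "lat": centre_lat + random.uniform(-0.001, 0.001),
            "lon": centre_lon + random.uniform(-0.001, 0.001),
            "behaviour": "hide" if weather == "rain" else "wander",
        })
    return {"theme": theme, "characters": characters,
            "players": [p.username for p in players]}

# Two friends in a field by the river, clear day, a dozen passersby in view.
friends = [Player("me", 43.6532, -79.3832), Player("jeremy", 43.6535, -79.3830)]
scene = build_game_space(friends, theme="medieval", weather="clear", civilian_count=12)
print(len(scene["characters"]), "generated characters in view")
```

The interesting design question, to my mind, is how much of that generation happens on the goggles themselves versus on some central service holding the shared map data.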

So now: the virtual library could come with a pair of goggles and a good series of fast databases.

That would be pretty cool. Just sayin’.

Not Everyone Lives Like Me

Not Everyone Lives Like Me

What a revelation.

When I first started this blog, back when it was on blogspot and it was pseudonymous, no one had settled the rules about what you should and shouldn’t put online. It was still early days. We experimented, we reflected, we discussed. I remember being told that I shouldn’t mention that my doctor had put me on an anti-depressant when I found myself unable to get excited about the phd program I was in. You can talk about a broken leg, but not a set of broken synapses. At the time I thought: why shouldn’t I write about things I don’t mind others knowing? Each person needs to determine their comfort level.

Time went on. Everyone was still talking about it (and, I suppose, they still are, aren’t they). Yes, anything you publish online can be seen by in-laws, employers, potential employers, potential dates, etc. But if you take that into account and think, yes, well, I struggled, I survived; why not talk about it? Isn’t it okay, if you accept that someone might take issue with you one day? Or if you know that, if anyone WERE to take issue with you because of it, they aren’t someone you’d want to date/spend time with/work for?

I have deliberately held things back from this blog, many times, with those things in mind. Anything I wasn’t sure I really wanted my real name associated with, I didn’t put here. And when I was having biopsies and was scared out of my skull about my health, I shut up on here. That was purely out of fear and denial.

I’ve been blogging for 9 years now, and I’m fairly comfortable with what I’m willing to put on my blog. When I started working, I wondered about what was appropriate, but nearly four years in, I think I’ve mostly got a grip on that as well.

I’m not used to people being uncomfortable with it.

Most of the people I’m close to have had blogs for years and think nothing of it. When I meet up with people, they are often “my kind”, and are hip to the blog thing. I mean, so hip it’s square. Blogs are dead. Me and Jason finally agree: yes, blogs are dead, because blogs are everywhere. Everyone has one, so yeah, their novelty is gone.

But not everyone is in the same place as me. Not everyone is comfortable looking at people’s lives online. Once in a while someone would tell me that they felt like a voyeur when reading blogs, but I’ve never understood that. Anyone with a blog knows someone might read it. There’s no reason to feel secretive about it.

But that’s my realization today: not everyone has gotten used to the fact that anyone can create content at the drop of a hat on the internet. Inner dialogues now have a platform on places like twitter and facebook. Our insides are coming out.

I’m used to it. I love it. I’m comfortable with it. I like to engage with the world around me on a deep level. I don’t particularly do well having casual friends; I have intense friendships, or nothing. So this user-generated web is absolutely up my alley. Why only know the surface when you can dig deeper?

But that’s not everyone’s perspective. I know, not a revelation to you. It’s just a reminder to me. My way is not the only way, nor is it the default, or probably even that common.

So I shouldn’t be surprised if my web presence makes people uncomfortable. No one needs to consume my productions if they don’t want to. I’m so used to being half online all the time that I think of my web presence as being half my identity. It feels completely natural to me.

Lifecasting

Lifecasting

Based on the previous post, I am seriously considering a day of lifecasting with Jason and Alex. Not sure about the logistics at all, let alone a date (Jason prefers summer), but I think it would be an interesting challenge. In sum: we record as much as possible of our lives throughout a single day, in as many media as possible.

Current thoughts: photographs documenting where we are, what we look like; video documenting us interacting with our environments, pets, spouses, children, and possibly some video updates of us describing what we’re doing and what we’re thinking about; uploaded documents that we’re working on, email we’re sending (where feasible); playlists of what we’re listening to, lists of any movies/tv we watch; IM conversations; snippets of audio of things like our alarms going off, breakfast being cooked, etc.; descriptions and photos of any food we eat or drinks we drink; descriptions and data of basic things like maps of the area and weather reports. If we really want to get serious, we could add in things like body temperature and whatnot too. Full documentation.

At the moment I think we should set up some separate place for all this information to be stored. The first thing that comes to mind is that we set up a blog with a lot of bells and whistles, and everyone who’s participating gets their own category. So you could see it all at once, or by person. I’d want to use twitter, but I’d want tweets to show up on the blog as well, in between the blog posts, ideally in a different colour. Marked off, so to speak. Also, I wouldn’t want to use my normal twitter account for all this. I bet that would just annoy the hell out of people. Not sure if a blog will work as the basic platform, though. We still need to think that through. Jason may have a point about waiting a bit.
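To make the tweets-between-posts idea a bit more concrete, here’s roughly what I mean, sketched in Python. The field names and sample entries are entirely made up; the point is just that posts and tweets get merged into a single timeline by timestamp, with the tweets marked off so they could be styled differently on the blog.

```python
# A minimal sketch of the merged lifecast timeline: blog posts and tweets
# interleaved by timestamp. All field names and sample data are invented.
from datetime import datetime

blog_posts = [
    {"author": "rochelle", "time": datetime(2009, 7, 15, 8, 30),
     "text": "Breakfast documented: toast, coffee, photo uploaded."},
]
tweets = [
    {"author": "rochelle", "time": datetime(2009, 7, 15, 8, 45),
     "text": "Second alarm. The cat is unimpressed."},
]

def merged_timeline(posts, tweets):
    """Combine both feeds and sort everything by time."""
    items = [dict(p, kind="post") for p in posts] + [dict(t, kind="tweet") for t in tweets]
    return sorted(items, key=lambda item: item["time"])

for item in merged_timeline(blog_posts, tweets):
    marker = "[tweet]" if item["kind"] == "tweet" else "[post] "
    print(marker, item["time"].strftime("%H:%M"), item["author"], "-", item["text"])
```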

The general point of this exercise, as I currently understand it, is to demonstrate how much “information” we can create on a regular basis, to turn it into digital, archivable material, and to force the question of how useful it really is. I’d also like to see for myself just what is and is not comfortable to reveal. Some obvious questions immediately spring to mind: can I ethically copy my email to the project? (As long as someone else’s email doesn’t show up as well? Can I ethically, or legally, make someone else’s email, addressed to me, publicly available? I suspect that would fall outside the scope of the project.) Will I modulate my behaviour because of how I want to be seen? Will I alter my behaviour because I know everything is being recorded? Is the concept of perpetual web archiving an influencing factor in what I’m prepared to share online? Does it stifle my communication? Does it inherently alter the nature of the information online? Traditional media certainly is shaped by its storage medium; I can’t imagine this would be any different. More than anything I’d worry that I’m being boring; will I spend all my time trying to be as witty and entertaining as possible? How does archiving actually become the material? I’m sure there are many more questions, these are just top of mind for me.

I think before we really get started I’ll have a look at lifecasting as it currently exists and see what I can learn from it. I don’t really want to do a life stream of video for archive, because the sheer size of a video file covering the whole day makes me queasy. We could do ephemeral live streaming (I have no problem with that), but that sort of defeats the purpose. More investigation on this matter is required.

Anyone else interested in participating in this warped little experiment? It’s just one day. I think the reflection on the experience will be worthwhile. We might even have to write it up. We have lots of time to prepare. I think we have a lot of sorting out to do before we can really go forward. We can get together and develop some basic policy around how we’ll manage it. Jason’s probably right about the summer. It will probably take that long to sort out the details.

You in? Come on, it will be fun.

The Plight of Future Historians

The Plight of Future Historians

Today, the Guardian warns:

“Too many of us suffer from a condition that is going to leave our grandchildren bereft,” Brindley states. “I call it personal digital disorder. Think of those thousands of digital photographs that lie hidden on our computers. Few store them, so those who come after us will not be able to look at them. It’s tragic.”

She believes similar gaps could appear in the national memory, pointing out that, contrary to popular assumption, internet companies such as Google are not collecting and archiving material of this type. It is left instead to the libraries and archives which have been gathering books, periodicals, newspapers and recordings for centuries. With an interim report from communications minister Lord Carter on the future of digital Britain imminent, Brindley makes the case for the British Library as the repository that will ensure emails and websites are preserved as reliably as manuscripts and books.

I don’t have a lot of sympathy for this imaginary plight of future historians, in spite of being a librarian. And it’s not because I don’t see the value in content that’s on the web. There are two sides of the question that I take issue with.

First: “everything should be archived”. This is simply impossible, and is actually misunderstanding what the internet is. If you understand it as a vast publication domain, where things are published every day that just don’t happen to be books, then this desire to archive it all makes sense. But is the stuff of the internet really published? Well, what does “published” really mean?

To be honest, I think the term has no meaning anymore. At one point, “published” meant that a whole team of people thought what you wrote was worth producing, selling, and storing. It comes with a sense of authority, a kind of title. It’s a way we divide the masses into those we want to listen to and those we don’t, in many different arenas. It connotes a sense of value (to someone, at least). Many people object to the idea that there’s value of any kind on the wild open internet, because just anyone can “publish”. I learned in my reference class at library school that one should always check the author of a book to see who they are and what institution they’re associated with before taking them seriously; if you fall outside our institutions, why, surely you have nothing of value to say, and you’re probably lying! Wikipedia: case in point. We have our ways to determine whether we ought to consider what you’re saying not based on the content, but on who and what you are. Apparently this protects us from ever having to have critical reading skills. We are afraid of being duped, so we cling to our social structures.

So many people just turn that “publish” definition on its head and say everything on the internet is “published”, everyone has a pulpit, everyone can be heard in the same way. I object to this as well. Turning an ineffective idea upside down doesn’t get us any closer to a useful definition of a term, or a practice.

Currently, this is how I define “publication”: blocks of text that are published by a company have been vetted and determined to be sellable to whatever audience the company serves. This holds for fiction, for academic work, etc.

Is content on the web “published”? What does that even mean? I think we start shifting to turn that meaning into “available”. If I write something and post it online, it’s available to anyone who wants to see it, but it’s not “published” in any traditional sense. If I take it down, does it become unpublished? Can I only unpublish if I get to it before it gets cached by anyone’s browsers, before Google gets to it? What if I post something online, but no search engine ever finds it and no one ever visits the page? Was it published then? If I put something online but lock it up and let no one see it, is it published?

I think we need a more sophisticated conception of publication to fully incorporate the way we use and interact with the web. I don’t think the traditional notion is helpful, and I think it presumes a kind of static life for web content that just isn’t there. Web content is read/write. It’s editable, it’s alterable. Rather than dislike that about the content, we should encourage and celebrate that. That’s what’s great about it.

There has always been ephemera. Most of it has been lost. Is that sad? I suppose so. As a (former) historian-in-training, I would have loved to get my hands on the ephemera of early modern women’s lives. I would love to know more about them, more about what drove them, what their lives were like. But I don’t feel like I’m owed that information. Ephemera is what fills our lives; when that ephemera becomes digital, we need to come to terms with our own privacy. Just because you can record and store things doesn’t mean you should.

And this comes to the heart of the matter, the second element of the desire to archive everything that irks me. The common statement is that we are producing more information now than ever before, and this information needs archiving. The reality is this: we are not producing “more information” per capita. We simply are not, I refuse to believe that. Medieval people swam in seas of information much as we do, it’s just that the vast majority of it was oral, or otherwise unstorable (for them). These are people who believed that reading itself was a group event, they couldn’t read without speaking aloud. (Don’t be so shy if you move your lips while reading; it’s a noble tradition!) Reading and listening were a pair. In our history we just stored more of that information in our brains and less of it in portable media. If you think surviving in a medieval village required no information, consider how many things you’d need to know how to do, how many separate “trades” a medieval woman would need to be an expert in just to feed, clothe, and sustain her family. Did she have “less” information? She certainly knew her neighbours better. She knew the details of other people’s lives, from start to finish. She knew her bible without ever having looked at one. Her wikipedia was inside her own head.

Today we have stopped using our brains for storage and started using them for processing power instead. Not better or worse, just different. We use media to store our knowledge and information rather than remembering it. So of course there appears to be more information. Because we keep dumping it outside ourselves, and everyone’s doing it.

Not to say that a complete archive of everyone’s ephemera, every thought, detail, bit of reference material ever produced by a person throughout their life wouldn’t make interesting history. I think it would, but that’s not what we think libraries are really for. We do generally respect a certain level of privacy. It would be a neat project for someone out there to decide to archive absolutely everything about themselves for a year of their lives and submit that to an archive. Temperature, diet, thoughts, recordings of conversations, television programs watched, books read, everything. If you want to harvest everything on the web, then you might as well use all those security cameras out there to literally record everything that goes on, forever, and store that in the library for future historians. Set up microphones on the street corners, in homes, in classrooms, submit recordings to the library. A complete record of food bought and consumed. Everything. That’s not what we consider “published”, no matter how public any of it is. We draw the line. Somehow if it’s in writing it’s fair game.

But that’s not what people are generally talking about when they talk about “archiving information”. I know this is true because the article ends with this:

“On the other hand, we’re producing much more information these days than we used to, and not all of it is necessary. Do we want to keep the Twitter account of Stephen Fry or some of the marginalia around the edges of the Sydney Olympics? I don’t think we necessarily do.”

There’s “good” information and then this other, random ephemera. I will bet you that Stephen Fry’s twitter feed will be of more interest to these future historians than a record of the official Sydney Olympics webpage. And that’s the other side of this argument.

This isn’t about preserving information for those sacred future historians. This is about making sure the future sees us the way we want to be seen; not mired in debates about Survivor, or writing stacks and stacks of Harry Potter slash fanfiction, or coming up with captions for LOLcats. Not twitter, because that is too silly, but serious websites, like the White House’s. We’re trying to shape the way the future sees us, and we want to be seen in a particular light.

I object to that process.

Cancerland Video, version two

Cancerland Video, version two

[youtube http://www.youtube.com/watch?v=r2HQGxbNMNY&hl=en&fs=1]

Everyone I know has already seen the first video, but after watching it myself a few times, I realized what pieces were missing from the build itself. To start: why didn’t I put labels on the spaces? I had names for the pieces, like the hall of terror and the scar display room, so why didn’t I put proper labels on things? I also stopped making good use of audio after a certain point in the build. I didn’t want to be a one-trick pony, but I think the audio is very effective. So I added some more. I added some more interactive pieces into my office recreation too.

It’s all a big learning process, that’s for sure. Building something like this isn’t exactly instinctual, even though I think it’s hitting on some very basic communication methods.

On a tangential note: I love youtube’s high res options. You can actually read the narrative text with them. Awesome.

Why I like my Netbook

Why I like my Netbook

I’ve been puzzling over this, because I love my netbook (eeePC) more than I expected to. Now, I do like some gadgets, but only when I can see an application for them in my life. My first gadget was a second generation ipod. The moment the ipod came out I knew that was the gadget for me. Until then I would open up SoundJam on my clamshell ibook on the bus between Toronto and Guelph, stick in my earphones, and then close my ibook with my thumb inside it to keep it from turning off, and listen to my tunes. Ipods had moved into their second generation by the time I found the cash to get myself one. Now I have a cellphone as well, but I didn’t really feel that connected to it until I discovered text messaging. Now text is what I mostly use it for (other than calling my mom). It’s basically my mobile AIM client. I don’t have an iphone (data is way too expensive in Canada to make that worth it for me). I have a PDA, given to me by my employer, but I don’t use it anymore.

So why do I love the netbook so much?

It really turned my head around. Using it made me see one possible future for computing, and I’m intrigued. It’s a fairly powerful little creature, with a gig of RAM (could be better, true, but not bad), but without much storage capacity. It has 16 gig of storage space, which is more than twice what my first ibook had, to be honest. But in 2009, 16 gig is smaller than my current ipod’s capacity. So I can’t keep my tunes on it, I can’t put movies on it. I can’t put tons of software on it, either. It’s not exactly a digital “home”.

But then, what if things turn around and we use less and less software client-side? I can use google docs right now for all that Word does. (My eee, running Ubuntu, has Open Office on it, however.) I can use splashup or others to edit images without a client. I can use meebo for online IM. I love twitter, and I can get a client for that on my netbook, but why bother? The online interface is pretty simple and easy. I don’t really need a mail client, since both of my main email accounts (personal and work) have decent web clients. What software on my computer do I really need?

As for storage: both my dad and Jeremy taught me important lessons in the last couple of months. I gave my dad a digital photo frame, and it accepts SD cards as well as USB flash drives. Why would he even put his pictures on his ibook? My netbook allows me to upload pictures to flickr directly off an SD card. Given how cheap SD cards are getting, my dad could buy new ones for each trip he goes on. What used to be a transient storage method (that cost serious dollars) is now so cheap you could just leave the data on them. He could have 16 gigs per trip (twice a year) and just store the cards. That kind of storage capacity is just never going to be feasible inside a computer. Talk about extendable.

Jeremy bought a 64 gig flash drive to store media on. It cost him $100. I paid more than that two years ago for my 1 gig SD card. My netbook has three USB ports. If I bought three 64 gig flash drives and plugged them in, my netbook would have more accessible storage capacity than my current macbook. If I bought as many 64 gig flash drives as I needed to partition my data, my netbook would have effectively unlimited storage capacity. I could keep all my tunes on one drive, movies and TV on another, work docs and software on another, etc. I don’t want to have to open up my computer to add more storage. I’d like to be able to just plug it in. The size of those flash drives is only going to go up; I bet my netbook will have more storage capacity than my work and home computers put together pretty soon.
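Just to show my back-of-the-envelope math (the macbook capacity below is a guess at a typical drive of the era, not my actual number):

```python
# Rough storage arithmetic; the macbook figure is an assumed typical capacity.
usb_ports = 3
flash_drive_gb = 64
netbook_internal_gb = 16
macbook_internal_gb = 160  # assumption, for comparison only

external_gb = usb_ports * flash_drive_gb                  # 192
total_accessible_gb = netbook_internal_gb + external_gb   # 208
print(external_gb, total_accessible_gb, total_accessible_gb > macbook_internal_gb)
```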

My netbook makes me think about a world where my computer is just a portal to other things, not a location in itself. Any computer can do that, sure; the netbook is just more upfront about it.

Also: my netbook fits in my purse. Its low profile makes it perfect for use while sitting in cramped economy seats on overnight flights (Jeremy and I watched a lot of British television on our overnight flight). I wouldn’t have to turn sideways to use it on a greyhound. It’s perfect for taking minutes, mostly because it’s so small that typing on it doesn’t hide you behind a screen. It’s great for the bar, which provides free wireless to patrons. I can use it and still have room for my meal. I can sit in a crowded auditorium and tweet about the keynote I’m hearing. I can connect with other conference goers without having to carry a whole computer with me. And if something terrible happens and it breaks? I’m out 300 bucks, not 1400. And because I don’t store anything on it directly, I wouldn’t lose any data. It’s a sturdy thing too, since it’s all flash memory and no moving parts.

It’s a form of casual computing that I really like. It’s the kind of gadget I would take with me while wandering around town, in case I wanted to stop and look something up, or blog something, or get in touch with someone via email or IM. It’s perfect for conferences. Why carry your whole life with you when you can just bring a relatively cheap little portal instead?

The small screen: completely not a problem. I thought it would be, but I adjusted to it really fast. I think Jeremy did too. When I returned to my macbook, it felt bloody HUGE. We’re getting spoiled by huge screens. There’s a time and a place for them, sure, but is that all the time?

The small keyboard: takes some getting used to, but I like it. I don’t have huge hands, though. (I don’t have small hands either.) Jeremy I think struggles with it more, but I can type pretty well on the reduced QWERTY. It’s just a matter of getting used to a new keyboard. But I don’t think I would suggest that it’s a keyboard to do all your writing on. It’s more a casual keyboard. This presumes that people have the cash to have more than one computer (something with a bigger screen and a big keyboard, and this little guy), but to me, the netbook is a nice addition to my computing family. I suspect I won’t be traveling with my macbook as much as I used to.

Anyone in the market for a beautiful black leather computer bag? I don’t think I’ll be needing it anymore.

Use of Video

Use of Video

I was launched awake at 4:30am this morning thinking about something I probably won’t be able to approach in the next six months, possibly not even within the next year. Or ever. And yet.

I’m on a small team at work set on getting a brand new website. No facelift; something totally new. We’ve opened up the floodgates and are interested in anything we could do with a good web presence. Mostly I’m dreaming up more interactive things, which is a bit of a pipe dream. Library websites are usually not interactive on the level that I’m thinking about, but I’m still dreaming about it.

So this morning, out of nowhere, a very simple idea pings me and throws me out of bed. Virtual building. Videos. Navigating services and resources.

I’ve been talking about building a replica of our library in Second Life for some time. I want to do it less to get people interested in virtual worlds or Second Life in particular, and more as an interesting way to think and talk about the building and its purpose. I don’t want to lure people into Second Life (one of my pet peeves: people judging projects in Second Life by how many people who experience it stick around in-world). I’d rather they glean what they need from the experience and move forward in whatever way makes the most sense to them.

I want to have it more like an exhibit, a thing you look at and interact with in public places.

But then I was thinking: it would be dead easy, once you have such a replica built, to create short videos about how to get places. For instance: on our new website, we will probably have a section on reference services. Well, why not have a short video that shows you how to find the reference staff? It’s not exactly crystal clear in the building itself. Or how to find the technology centre, or the smart classroom, or the finance learning centre. Or places for quiet study. Or places for group work. Bathrooms. Gosh, anything! Not a map: we already have those. But actually watching a person do it. Going down the stairs, turning right after the elevators, choosing the door on the left rather than on the right. Going up the stairs, turning left at the water fountain. The details that non-map people use to navigate their world.

Well, that’s not the idea that woke me up. The idea that woke me up was about videos that create themselves. I don’t know much about video, but I presume that’s not impossible; a video that is generated from pieces of existing video and strung together based on the circumstances of a particular set of variables. Does this take forever for a system to accomplish? What woke me up was this: wouldn’t it be awesome if you did a search for a book or journal, and the system showed you a video of an avatar walking up to the stacks and moving toward exactly where that book should be? If we had RFID on all the books this would be even more precise, but we should be able to roughly guess where a book (that isn’t checked out) ought to be. To the shelf, at least. And I got thinking about it because I was thinking about mobile devices, and having such a video play on a mobile device to help you navigate the stacks. A little bandwidth-heavy, but it was just a half-awake sort of thought.
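If I were to rough out how the system might pick its clips, it might be as simple as the sketch below. The shelf ranges, clip filenames, and call-number handling are all invented here; the idea is just stringing pre-recorded pieces together based on where a book ought to live.

```python
# A minimal sketch of stitching a wayfinding video from pre-recorded clips,
# based on where a call number ought to live. Ranges and filenames are invented.

# (call number prefix range start, range end, clips walking you to that range)
shelf_clips = [
    ("A", "HZ", ["lobby_to_stairs.mp4", "stairs_to_floor2.mp4", "floor2_rows_1_to_10.mp4"]),
    ("J", "PZ", ["lobby_to_stairs.mp4", "stairs_to_floor3.mp4", "floor3_rows_1_to_8.mp4"]),
    ("Q", "ZZ", ["lobby_to_elevator.mp4", "elevator_to_floor4.mp4", "floor4_rows_1_to_12.mp4"]),
]

def clips_for(call_number):
    """Pick the sequence of clips that walks a user to the right shelf range."""
    prefix = call_number.split()[0].upper()
    for start, end, clips in shelf_clips:
        if start <= prefix <= end:
            return clips
    return ["ask_at_reference_desk.mp4"]

# Example: a catalogue result with the call number "QA 76.76"
print(clips_for("QA 76.76"))
```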

Hacking Say and Reviving ELIZA Webcast

Hacking Say and Reviving ELIZA Webcast

Jason and I are doing a webcast on Wednesday, January 14th as the discussion arm of our article, Hacking Say and Reviving ELIZA. The article is our first attempt to consider our prior work in virtual worlds (text-based MOOs) in light of developments like Second Life. We still have a lot more thinking to do on the subject, as it’s a big one; we learned a lot back in the 90s about using virtual worlds in teaching and learning, and in constructing immersive experiences, and we want to bring our knowledge forward in a thoughtful, considered way.

Please feel free to join us to talk about these things. The article is really just a starting point for us, both professionally and as part of this discussion; we’re interested in a lot of topics re: immersion in virtual worlds, the lessons from MOO/MUD/MUSH, the directions we’d like to see virtual worlds heading, discussion of current projects, etc. Second Life is the darling of the moment, but we’re interested in the tools generally, not so much the company specifically, and even discussing what a future education-based virtual world might look like based on what we sense right now. Would there be one, or several, or would every school maintain their own? What’s the right thing? What about informal learning? How do we find the right blend to ensure the richest possible tools and experience?

Want to join us? The webcast is at 6pm EST, and you can find us here.

Note-taking goes Ubiquitous

Note-taking goes Ubiquitous

[youtube http://www.youtube.com/watch?v=DE-mnEdAf7g&hl=en&fs=1]

Faculty can usually tell if they’re being recorded, what with the need for some sort of obvious recorder. But what if the computer and the recording device are in the pen?

This would be pretty wicked if you could mash up all the notes from the class along with the lecture recording. If we could get the pen to work online instead of just off, you could see the notes being created in real time.
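A rough sketch of what that mash-up might look like underneath, assuming each pen stroke arrives with an offset (in seconds) into the lecture recording; all the names and sample notes below are invented:

```python
# A toy sketch of lining up several students' timestamped notes against one
# lecture recording. The data structures and sample notes are invented.

notes = {
    "student_a": [(12.5, "definition of ubiquitous computing"), (95.0, "example: smart pens")],
    "student_b": [(14.0, "ubiquitous computing = ?"), (300.2, "ask about the recording policy")],
}

def merged_notes(all_notes):
    """Return every annotation ordered by its offset into the recording."""
    events = []
    for student, strokes in all_notes.items():
        for offset, text in strokes:
            events.append((offset, student, text))
    return sorted(events)

for offset, student, text in merged_notes(notes):
    minutes, seconds = divmod(int(offset), 60)
    print(f"{minutes:02d}:{seconds:02d}  {student}: {text}")
```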

Gift Economies and Librarian Blogs

Gift Economies and Librarian Blogs

I’ve been turning over the idea of gift economies and the internet for some time now. For me it started with Henry Jenkins’ keynote at Internet Research 8 in Vancouver, when he suggested that fans who produce popular product should be paid by the company that owns the copyright. My gut turned sideways and I nearly shouted it, NO. NO NO NO. It registered at the top of the horribly wrong meter.

The more I thought about it, and examined my violent gut reaction, I started to think that adding money to the equation goes against the natural economy of fandom cultures. I’m pretty firmly convinced that fandoms revolve around gift economies, where fans create product that other fans consume, and the consumers are required to pay back the gift by providing feedback, linking others to the product, engaging in commentary about the product, or other fandom behaviours. I hesitate to say it, but another payback activity is deference. I shouldn’t shy away from it. It’s true. There are some fans who are seen to give more to the community than any individual can properly pay back, and thus resentments and frustrations are born. This is exactly gift economy theory, so I’m fairly certain it fits.

So my own reaction to the idea of adding money to the mix is justified; it’s the wrong kind of economy. It would swing the balance. It would increase resentment a million fold, because the people who get paid for their fandom production would become completely unpayable by fandom standards, and would be seen as a stooge of the original producer. A sell-out. No longer fully part of the community. Untrustable. No spreading the wealth; any fandom creation is a product of the community, with inspiration and ideas from the community, built on the scaffold of commentary and conversation, beta readers, donations of art, video, songs, fandom trends and ideas, and communal construction of character interpretation. How can one person gain reward from something that is, at its heart, entirely dependent on the community?

So that said, I think I’m seeing the same thing happening in the librarian blogosphere, and I find it interesting. The Annoyed Librarian kept an anonymous blog ranting about librarianship. It was funny and wry and I don’t remember it being too terribly controversial in its blogspot form. People might have disagreed with her approach, but it was just one anonymous blog. There are many more named blogs to read.

But then Library Journal moved the Annoyed Librarian over to their website, and paid her to write her rants. Now she’s official, she’s part of the machine, and getting paid to do it. Perhaps I wasn’t paying enough attention to the blogspot blog and its comments, but I think there’s a marked difference in the kind of comments she gets.

A Selection:
Since I am an Annoyed Librarian too, do I get a cut of the profits?
Rehashing old posts is the best you can do? Couldn’t you have just said this in a comment on the original post? How about some original material? I guess the AL cheerleaders are happy so that’s all that matters.
If you like light and fluffy posts, you’re in the right place. Not much substance here so far.

Generally speaking, librarians don’t comment like this on non-profit blogs. Now that the Annoyed Librarian is being paid for her trouble, that changes things. Comments that won’t help: when her attempt at humour is criticized, the Annoyed Librarian says this:

I don’t need Comedy Central, I’ve got LJ paying me to write this stuff.

And, the post that prompted me to write this post:

Set a date, tell your overlordier, plan a big finale, whatever you like, but give it up. Soon. Because the joke’s been played, we’ve all been had, you’ve picked up a few pennies, and now the joke’s just going to get old. Fast. And you know I know you know that.

I want you to hit it and quit. Can you hit it and quit?

In a world where librarians get book deals and we actually do get paid to do the work we write about, I was a bit surprised to see what I’m used to seeing in fandoms happening in the librarian blog world. But maybe it’s not fandom that generates a gift economy; maybe it’s something inherent in online communities generally. (Could that be so?) Apparently, we librarian bloggers also understand our blogs to be gifts to the community rather than something that ought to be remunerated financially. People are feeling skimmed off for cash. The understanding seems to be: you wouldn’t exist without us. If you get paid for what you do, you’re using us for your own profit. And you will pay our price for that.

I wanted to think about it in terms of fandoms and fandom culture, but maybe it’s much broader than that.

R.I.P., Lively

R.I.P., Lively

Dear Google,

I’m sorry to hear about Lively. I guess you didn’t get the response you expected. But you know, it was a good idea. I love the idea of the same avatar turning up on many different pages, a representation of me that moves with me from web location to web location. It’s like an ID, but with features and motion. You were on to something there. It’s not your fault that people can’t figure out how to use it. This is always the way with things. When cool new apps appear on the horizon, everyone says: “Well, what’s it for?” People as a rule aren’t terribly imaginative.

Anyway, I’m sorry it didn’t get the reception you expected. I hope you’re not giving up on virtual worlds entirely. So many people are right now. Every time I turn around someone tells me how the concept is dead and no one wants to go near it. Why is this? We haven’t even BEGUN to scratch the surface of what we could do with virtual worlds. Every once in a while you see something amazing blossom out and people are stunned. I guess we just need a few more blooms to get people’s imaginations stirred.

I mean really; how many times have blogs been dead? And how many blogs are there now? Sheesh.

I hope you have a little party/wake in honour of Lively’s passing. I would go.

Love always,

Rochelle

Blogging in Education

Blogging in Education

In August, I was invited to come do a quick (about 15 minutes!) talk for new faculty about using blogging as part of teaching. Apparently the feedback was good, so I was invited to come back and do a longer piece on it. There are 40 people signed up, and the talk is today.

Normally talks don’t scare me particularly, because I do love to natter on about topics I’m interested in. (And really, a talk is very much like a blog post…I talk for a while, and then it’s open for others to comment, right?) But for some reason I’m anxious about this talk. Maybe because people signed up for it. They will be expecting things. Can I live up to their expectations? I don’t know.

I have things to say. I think they’re somewhat important things. Somewhat. I even have powerpoint! (Some cited CC flickr images and some power statements, but it’s in ppt!) But still.

The main gist of what I want to get across is something like…well, first off, you have to match your tools to your content, your expectations, and your personality. There is no magic bullet technology that will work for everyone, and there’s no point using blogging if you’re not going to use it in a way that suits the content, the syllabus, and your own style. A given?

I think the other thing I want to get across is the difference between formality and informality. If you want students to do more formal writing, I’m not sure this is the way to do it. Mostly because, in the case of undergrads, formal writing is not a comfortable form. It’s a way of distancing themselves from the material. It’s not honest for them. As they learn to use the tool of formal essay-writing better, it can become more honest, but…for most, not so much. If you want real thinking, real interest and passion and engagement, you have to toss formal essay-writing in blog form out the window. It’s too easy to plagiarize. And writing is good, and you can think of this writing as creating a portfolio of primary sources that can be drawn on later to create formal writing. I’ve been thinking of it in terms of honesty; allow students to be honest. If they don’t understand something and mention it, that will help them later, because they’ll be able to show how they came to understand something in a formal report.

Which leads me to something that bonked me on the head yesterday while reviewing for Learning Inquiry. I read this fantastic article that used some extremely bang-on terminology: productive failure, and unproductive success.

Here’s what I’m currently considering: we tend to reward unproductive success more than anything. If a student walks into a class knowing the subject material, that student will probably do extremely well. If a student spends 3/4ths of the class struggling with the material and getting things wrong, not understanding, struggling with concepts, and then really gets it, that student will probably not do as well. But that student is actually learning, and demonstrating learning. We don’t have an effective way of rewarding real learning.

Which is the key reason why I object to switching out the word “student” with the word “learner”, though I know it’s trying to get at the same idea. We don’t know whether we have “learners” or not, on a grand scale. Often we have a group of already-knowledgeable students who will unproductively get As, and we feel good about the learning experience. How do we measure learning? Real learning? Going from confusion to understanding? How do we even see it when undergrads often don’t even open their mouths in class? Do we really have a “Learning Management System”? Really? How do we really support and reward learning rather than merely unproductive success?

So I think blogging done well, set up with good expectations and with a fostered honesty, can reveal the actual learning going on, and can give students the option of displaying the learning they’re doing. And we can reward it that way. If a student struggles for the first half of the course and demonstrates that struggle, and then suddenly GETS IT, you’ll have evidence of their learning. You can reward that, you can grade them according to how they learned and how articulate they can be about the way in which they learned and why. At the moment we grade them based on whether or not they get it fast enough, for the most part. So you can use these tools to support and encourage productive failure as a means toward productive success. I’m not saying it’s enough to just try. Unproductive failure isn’t the goal either. Failure that builds into understanding is productive.

But the key part, it seems to me, is finding a way to get through to a class about how to use a blog. I’ve been thinking about this. I’m getting better at giving motivational speeches, and this one would be a challenge. I think you have to drop the formality, and encourage honesty. Perhaps a discussion about the wonders of productive failure is important. Or even to explain that formal writing isn’t objective, it’s just a tool for people to channel their confusion and passion in a culturally acceptable way. So let’s screw with what’s culturally expectable. Tell us what you really think. Have you ever heard of these ideas or concepts before? If so, where? Do you think it’s relevant? Why do you think you’re learning this? Do you understand the article? Was it too difficult to understand, the sentences too long and filled with jargon? Say so. Do you find this subject boring? Why? (Do you think political history is boring? Why? Because it seems too distant and filled with names and numbers, and not enough about juicy things like the real details of people’s lives? Valid comment!)

Undergraduate students are doing two things at university (among others): 1) learning content, and 2) learning to speak to faculty in the “right” way through their work, i.e., learning formal scholarly communication methods. The second one is the harder one. Students sort of put on a voice they think faculty want to hear (which is where that dreaded word “utilize” comes in; it makes the student sound more formal, more serious; hahaha no it doesn’t). Students are often avoiding the learning part by trying to put on a show with the formal structure and language. So forget it for a second, for the blog part; let them just be honest about what they think. They can shape that into formal communication later.

As I’ve been writing this, Jeremy sent me this article about how students expect a better grade because they “tried really hard”. That’s not what I’m saying. I’m saying: let productive failure be okay in your class. Trying really hard and getting nowhere doesn’t deserve a better grade. You need to succeed to get a good grade, definitely. You have to end up at point B from point A. But how you get there might be different. I’m just saying: let students have a shot at getting there in their own way.

Using blogging to track productive failure isn’t changing the whole structure, after all. It’s just giving students one assignment, just one, where being confused about the subject is okay. If they can build on their failures and come to understand, to turn it into a productive success, just for one assignment, isn’t that a valid part of a well-rounded education?