Over the past several semesters, I think I've become a better teacher.
What I mean is, I'm slowly attaining the delicate balance of imparting something valuable rather than just giving away answers. It's a fine line between being perceived as useless (or worse, lazy) and being credited with helping a student learn something, and librarians are usually not recognized for their teaching capacities. Though I was never trained to be a teacher, I'm slowly picking it up, checking my initial impulse to 'give' everything away. I'm starting to teach students to fish rather than giving them fish, so to speak.
Interestingly, in lectures and in-person interactions this balance is difficult but not impossible to achieve, but I find it much more challenging when it comes to web design. There, you want to encourage certain behaviors (e.g., using the catalog to find books), but it's so much easier just to provide a list of books. And online there are alternatives -- to persist with the example, if you create a complicated page explaining how to use the catalog to find books, students will turn elsewhere. I think learning how to function online is a repeated process of acting --> reacting --> acting (trial and error), whereas in a classroom or interpersonal experience, the learning process is closer to reading or listening --> absorbing --> acting.
I'm still mulling this over, but it has a lot of implications for how we should be designing our online library services to be truly useful to students.
I'm noticing a pattern at work in the library: I'm confronted with either starting something from scratch and making up standards as I go, or working with old rules that constantly require updates or work-arounds. I've been wondering broadly how to know when it is easier to make up new rules from scratch versus working with old rules.
When I think about it, there are lots of everyday examples where invention is preferred to innovation, or vice versa. Look at Wolfram Alpha (although arguably it relies on Google as a standard), the QWERTY keyboard, or, in a less positive vein, Stalinism, for examples of invention rather than innovation. On the other side, office computer iconography (or for that matter web iconography) relies on pictures of already-familiar objects (file folders, recycling bins). An e-book looks very similar to a physical book, with virtual pages. Robots frequently take animalistic forms. The latter are developments that potentially could have been radical, but instead relied on an old set of ideas to progress toward something new.
Obviously it's harder to be successful when reinventing the wheel, because it's difficult to anticipate not only what will be needed over time but also people's behavior. Objects (tea kettle, automobile, door handle) and habits (checking for messages, tracking expenses, communicating with family) frequently seem to evolve over time, rather than coming into existence fully formed.
Here are areas where I wonder whether libraries should scrap it and start all over, or persist with what we have and modify:
1) Format-bound metadata rules. When the library was only responsible for physical objects, our standards worked, and it's tough to give them up. This is not my area of expertise, but when does it become simpler to abandon standards based in a physical environment in favor of a new set of rules?
2) Lending/borrowing rather than providing access to. Integrated library systems were built on the lending/borrowing model, not the "access to" model. Why not build a new system that has both in mind, rather than having to fight systems so that they include electronic resources?
3) Reference ('how-to-use-the-library') materials. Frequently, as I revise out-of-date guides and web pages, I wonder whether it would be less time-consuming to just start from the beginning. Sometimes I spend so much time editing that I think it would have been faster to start with a blank sheet of paper in front of me.
Despite the appeal of a blank sheet of paper, invention can also be exhausting. I just wish I knew in every case which would be the better tactic. I hope this knowledge comes with experience...
Over the past several weeks (okay, months) I've been reading the book The Gutenberg Elegies while I eat lunch. In case it's not obvious from the title, it's a book about books. Specifically, it's a book of essays on various aspects of reading and books, with implications for the information society and computing, etc. etc. etc. Although the title suggested otherwise, I was half-hoping that the tone would be optimistic. Alas, while the author (Sven Birkerts) does try to stay upbeat, he can't fully suppress his disappointment about the passing of the book.
I'm right with him when he describes how books have positively impacted his life, and how they've done all kinds of wonderful, transformative, mind-expanding things for him. But when he makes it seem like books were solely responsible for his intellectual life, he loses me.
I mean, yes, I like books a lot too, but I just can't think that using a computer instead of a book for the same darn task is the end of the world. Nor can I see where fetishizing a book does any good for a person's brain. Birkerts describes how a book grants an individual uninterrupted, solitary, concentrated space, and how important a dialogue is that involves one voice (the writer's) talking to someone else (the reader) with little or no distraction. But if these qualities are so universally important, and are exclusively found with books, then books will never completely disappear, right?
I don't want to misrepresent The Gutenberg Elegies. The author acknowledges how exciting computers are, and for a book published in 1994 it's remarkably prescient about a lot of what was coming. But I don't understand nostalgia about books. Maybe this would make the author sad too -- that I don't even know what I'm missing, as part of a generation of adults from a hybrid background. Then again, maybe it would be comforting. I still have books in my intellectual landscape, but I don't cling to them unhealthily, and I'm willing to admit when something else would be better. I don't long for the days before computers, but I do approach new technologies with skepticism.
All of us are faced with navigating information in whatever forms it takes. Rather than eulogize, we should celebrate.
One argument I've heard in support of a library maintaining printed books on physical shelves is the ability to browse. I've been thinking about this for a while, because web browsing seems pretty standard (and easy).
So what are the differences between browsing a shelf of books and browsing online? Linking (usually in the form of mouse-clicking) is required online, but interactivity is good, right? You can come to the end of a bookshelf fairly easily, but online you can browse nearly infinitely -- again, isn't that a good thing?
But maybe what people who favor physical book shelves don't like is that when you search online you use specific words, and your corresponding results are limited by your vocabulary. As opposed to, say, finding a book in a library catalog, going to a shelf to retrieve it, and noticing an even better book on the same topic next to the one you thought you wanted.
In the context of higher education, we should be fostering accidental discovery by encouraging students to be curious and interested in pursuing information beyond what they originally look for. It's true that a lot of online tools try to match your search terms as precisely as possible, and it's also true that if your search terms contain your biases, you are all but guaranteed to have those biases confirmed. But what makes a student curious enough to look beyond the results generated from a (sometimes limited, partial, or myopic) query? I think ultimately the ability to find a variety of ideas and opinions depends on curiosity, and not on whether a resource is on a shelf or a web page.