Monday, May 30, 2011

Technology for Pleasure, Technology for Work

These days I'm interacting with a computer during almost all of my working hours. Even in meetings there is usually at least a smartphone at hand, and this comes on top of the computers in my office, at the reference desk, and in the classroom where I teach. Maybe this is why I now tend to avoid computers when I'm not at work. Computers have become associated with my job, and only away from my job do I feel free not to be at one.

There are a few reasons why I bring this up. First, I always try to use my time at work productively, and I tend to scrutinize my behavior to that end. However, I've noticed that simply using a computer -- even in ways not obviously related to my job, such as reading a New York Times article -- frequently has a positive impact on my performance, whether I'm troubleshooting a technical problem or explaining the difference between a Wikipedia article and a Britannica article to a class.

Second, I seem to be losing the element of fun in my relationship to computers. Part of what used to make technology interesting to me -- its potential for teaching and learning in uniquely challenging and entertaining ways -- has morphed into something different: now I think of technology in the limited context of productivity and efficiency.

I find it really easy to forget that not everyone has such ready and continuous access to computers, and that the students I interact with are often still mainly interested in technology for fun activities like communicating with friends. But it's frequently this spirit of play, rather than workplace productivity, that drives widespread adoption of a technology.

For instance, last semester the library's instant messaging account was friended by someone who now regularly writes short stories to us. While some of the librarians find this less charming than I do, I'm glad to cultivate this type of connection. It's hard to put my finger on why (could it be something to do with exploring the possibilities and limitations of the tool?), but in the scheme of technology-centered interactions, this one seems more interesting and meaningful than those involving merely mechanics.

Tuesday, May 17, 2011

Starting the Summer

It's between semesters right now, as Commencement is on Saturday and the summer sessions don't begin until the following Monday. Usually this is a good time to play catch-up, take a breather, and plan for the future.

However, due to a budgeting miscommunication we're suddenly having to make large cuts to our subscription databases; we should have a new Library Director by the end of June; and there are several staff absences on the horizon. Variables like these tend to make planning difficult.  

Begin side note / In one episode of the TV show Parks and Recreation, the main character Leslie Knope, played by Amy Poehler, faces losing her job as a consequence of cuts in local government funding. Despite being thoroughly apathetic about the department, Knope's boss Ron Swanson ends up coming to her defense. His own job is secure because he is seen as someone who keeps costs down, compared to the exuberant Knope, who continually has a new project or improvement in mind. Here's my point (finally): Sometimes I feel like Leslie Knope, in that I'm constantly being told no. Which I understand, I really do. This is a horrible economic climate for any department to be asking for resources. So we keep plugging along as usual, doing our best despite myriad limitations in time and money. (And now serious cuts, which I already mentioned.) On bad days this makes me frustrated and sad; on good days I'm motivated to face the challenges and keep fighting the good fight. / End side note

So rather than focusing on the long-term, I'm working on the short: I'm starting to get the reserves textbook collection in order for the fall, I'm trying to resolve some ongoing computer issues, and I'm putting together the library's faculty newsletter.

Next week the college switches to a 4-day workweek. It's a good time to transition to summer.

Monday, May 9, 2011

Outsourcing our Brains?

As it becomes more socially acceptable to halt a conversation to look something up on a smartphone, I wonder what is happening to our memories. A person who has memorized everyday facts doesn't seem to get much respect these days compared to a person with an iPhone.

Outsourcing our memories to machines is unlikely to stop anytime soon, but it will be interesting to see how other professions besides librarianship change in the coming years. Would you trust a physician who relies on an electronic device to remember standard diagnoses and dosages, for example? What about a researcher who can't spell? Reference librarians hardly do any work that involves looking up routine facts; increasingly we have shifted to assist more with informational processes related to comprehension, analysis, and integration.

Even if computers can beat us in the memory arena, the skill is still a measure of human intelligence. Unfortunately this is only obvious when we are offline and disconnected, which is a decreasing amount of the time. But what makes us smarter than the machines we have created? We tend to change what we mean by intelligence in order to feel smarter than the machines, but there is less and less that machines don't 'know.' Here I'm echoing thoughts from a recent New Yorker article by Adam Gopnik:

"We have been outsourcing our intelligence, and our humanity, to machines for centuries. They have long been faster, bigger, tougher, more deadly. Now they are much quicker at calculation and infinitely more adept at memory than we have ever been. And so now we decide that memory and calculation are not really part of mind. ... We place the communicative element of language above the propositional and argumentative element, not because it matters more but because it's all that's left to us." [my bold]

Or maybe this is all wrong, and it's most accurate to say that our memories have been technologically enhanced in order to compensate for the increased quantity and availability of digital information. Machines may assist us, but we will continue to rely on our analog brains for the type of information -- even dry, fact-based information -- that we use every day at work and home. Recall from memory is still quicker, at least until we embed microchips in our heads. But memorizing the type of information we don't access regularly, just for the sake of it, is less and less necessary. Fair enough?