One of the most striking differences, in my opinion, between Windows 3.1 and Windows 95 was the taskbar. Windows 3.1, if you recall, relied mainly on the Alt+Tab keyboard combination to switch between programs that were running at the same time. Sure, you could use a program's minimized icon on the desktop, but then you had to move your current window out of the way and double-click on the icon. While Windows 3.1 was designed for multitasking, the UI for switching between tasks was never as obvious as the taskbar made it in Windows 95.
I remember the first time I saw my dad dial up to the Internet. I think it was early 1995 or late 1994. He had to dial up the ISP using one program, which I vaguely recall as having the word "Winsock" in the name. Once that program had connected, he had to Alt+Tab back to Program Manager and launch NCSA Mosaic to actually browse the web. I remember thinking at the time that switching between the two programs that way seemed unintuitive. The notion seems quaint now, with the advent of the taskbar and its glorious reminder that we can run as many programs at once as our computer's RAM can handle.
I mention all this multitasking nonsense because of an article I caught on CNET today. I've seen a bunch of these articles appear over the past couple of years that describe frazzled executives and workers who can't cope with "information overload." This CNET article sounds the typical alarm about people who need to "unplug." But I think that if they talked to a younger generation of workers, perhaps those who are just entering the workforce after college, they would find that multitasking is just a fact of life, and perhaps a source of pride.
Although I am old enough to remember the pre-Internet days of communicating, I pretty much grew up using IM, email, and all the multitaskable tools of which the article speaks. I was one of the first people I knew to get broadband access in late 1998 (my mom was tired of picking up the phone and hearing those wonderful modem-screeching noises). Playing lots of video games taught me to pay attention to many things at once. And, although I lament the fact now, watching MTV and other rapid-fire television programming trained my attention span to cope with short barrages of intense information.
This familiarity with multitasking led to my acceptance of the idea as normal. I guess I can't imagine not multitasking or not being available by cell phone/IM/etc. The workplace has changed so much over the past 15 years, but because I only entered the workforce two years ago, I slid easily into a job that required me to wear many different hats at all hours of the day. Even here at NI, being a technical writer definitely tests my ability to multitask, but I find myself almost looking forward to the challenge.
Another pertinent effect of being familiar with multitasking is that I learned very early on how to take time out for myself to unwind or unplug. That's why I can't really relate to the CNET article. Those interviewed stressed the problems they've had adjusting, but I spent my entire pre-work life "adjusting," although I didn't know it at the time. So I'm comfortable with the thought of multitasking 8 or 9 hours a day: I know how to do it, and I also know when to take a break and chill for a bit. I imagine it would be hard to adjust to such a multitasking-oriented workplace if you were not already used to being reached by IM, email, cell phone, etc. anywhere you went.
Maybe as more people raised similarly enter the workforce and become managers and executives, we'll see articles and studies like these disappear, because we'll all have become used to these things called computers and how they keep us in touch with various facets of work at the same time. Of course, technology moves at a rapid clip, and it's all we can do just to keep up, so it's possible that what I now consider multitasking will seem as quaint as the notions I espouse in the first two paragraphs. I'm sure the next generation will have its own set of workplace issues to get acquainted with.
On a completely unrelated note, I find it amusing when software developers identify themselves with the software. Example: "The code I wrote pops up an error when a user tries to do that" becomes "I pop up an error when the user tries to do that." I know it's shorthand (or shortmouth in this case), but I always like to imagine one of our developers knocking on a customer's door and holding up a sign that reads "ERROR!" in big red letters. Of course, now when someone asks me about a help file I've written, I say "I have a paragraph about that" or "I talk about that here." The notion of "owning" help files is in itself something I've thought about recently. Maybe I'll make that the subject of my next post :-)
On an even more completely unrelated note, this is pretty sweet.