July 22, 2005

One of the most striking differences, in my opinion, between Windows 3.1 and Windows 95 was the taskbar. Windows 3.1, if you recall, relied mainly on the Alt+Tab keyboard combination to switch between programs that were running at the same time. Sure, you could use the program's minimized icon on the desktop, but then you had to minimize the window and double-click on the icon. While Windows 3.1 was designed for multitasking, the UI for switching between tasks was nowhere near as obvious as the taskbar Windows 95 introduced.

I remember the first time I saw my dad dial up to the Internet. I think it was early 1995 or late 1994. He had to dial up the ISP using one program, which I vaguely recall as having the word "Winsock" in the name. After that program established the connection, he had to Alt+Tab back to the Program Manager and launch NCSA Mosaic to actually browse the web. I remember thinking at the time that switching between the two programs in that fashion seemed unintuitive. The notion seems quaint now, with the advent of the taskbar and its glorious reminder that we can run as many programs at once as our computer's RAM can handle.

I mention all this multitasking nonsense because of an article I caught on CNET today. I've seen a bunch of these articles appear over the past couple of years that describe frazzled executives and workers who can't cope with "information overload." This CNET article sounds the typical alarm about people who need to "unplug." But I think that if they talked to a younger generation of workers, perhaps those who are just entering the workforce after college, they would find that multitasking is just a fact of life, and perhaps a source of pride.

Although I am old enough to remember the pre-Internet days of communicating, I pretty much grew up using IM, email, and all the multitaskable tools the article speaks of. I was one of the first people I knew to get broadband access in late 1998 (my mom was tired of picking up the phone and hearing those wonderful modem-screeching noises). Playing lots of video games taught me to pay attention to many things at once. And, although I lament the fact now, watching MTV and other rapid-fire television programming trained my attention span to cope with short barrages of intense information.

This familiarity with multitasking led to my acceptance of the idea as normal. I guess I can't imagine not multitasking or not being available by cell phone/IM/etc. The workplace has changed so much over the past 15 years, but because I only entered the workforce two years ago, I slid easily into a job that required me to wear many different hats at all hours of the day. Even here at NI, being a technical writer definitely tests my ability to multitask, but I find myself almost looking forward to the challenge.

Another pertinent effect of being familiar with multitasking is that I learned very early on how to take time out for myself to unwind or unplug. That's why I can't really relate to the CNET article. Those interviewed stressed the problems they've had adjusting. But I spent my entire pre-work life "adjusting," although I didn't know it at the time. So that's why I'm comfortable with the thought of multitasking 8 or 9 hours a day: I know how to do it and I also know when to take a break and chill for a bit. I imagine it would be hard to adjust to such a multitasking-oriented workplace if you were not already used to being reached by IM, email, cell phone, etc. anywhere you went.

Maybe as more people raised similarly enter the workforce and become managers and executives, we'll see articles and studies like these disappear because we'll all have become used to these things called computers and how they keep us in touch with various facets of work at the same time. Of course, technology moves at a rapid clip, and it's all we can do just to keep up - so it's possible that what I now consider multitasking will seem as quaint as the notions I espouse in the first two paragraphs. I'm sure the next generation will have its own set of workplace issues to get acquainted with.

On a completely unrelated note, I find it amusing when software developers identify themselves with the software. Example: "The code I wrote pops up an error when a user tries to do that" becomes "I pop up an error when the user tries to do that." I know it's shorthand (or shortmouth in this case), but I always like to imagine one of our developers knocking on a customer's door and holding up a sign that reads "ERROR!" in big red letters. Of course, now when someone asks me about a help file I've written, I say "I have a paragraph about that" or "I talk about that here." The notion of "owning" help files is in itself something I've thought about recently. Maybe I'll make that the subject of my next post :-)

On an even more completely unrelated note, this is pretty sweet.

1 comment:

  1. Kaylene said...
I WANT ONE OF THOSE KEYBOARDS!!!!!! Seriously, I do. They said that it should be as inexpensive as a mobile phone. I'm a bit of a spendthrift.

    In response to your other musings, I think about this same subject occasionally. My office is filled with men over 40. Most of them are old enough to be my father and have children my age or older. That can be frustrating because they didn't always take me seriously in the beginning. Of course, now they know I'm a capable person and not just some kid. But there is a definite gap between previous generations and our own. We are the same age and it sounds like we had similar pre-Information Age experiences.

My father got our first computer - a Tandy 1000 - in 1981 or 1982. I was a year old. Within a year or two, he was an amateur programmer and I was a computer-literate 4-year-old. I remember playing Jeopardy, dice games, and other monochromatic DOS games. Eventually we had Windows 3.1, WordPerfect, CompuServe (more like Telnet and via Winsock), and Carmen Sandiego. My dad started building computers in the late 80s and I helped him. (I was always more of a software person and my younger brother was a hardware person. We've always joked that we should start a business together.) It wasn't until high school that I realized not everyone has a computer. And then I realized how lucky we are. We have a great base. I am surrounded by technology and I don't feel any amount of stress. I agree that multitasking is a way of life for our generation. Games like Quake and SimCity (yes, SimCity) let us multitask to our hearts' content. I feel most comfortable when I have something to do, and a list of "somethings" is even better. I've observed many of the same things you have and come to similar conclusions. I like science fiction, and this is my picture of the future:

    1. Fuel cell cars that look like the cars in A.I.

    2. Eyes that flash, like Tom Cruise's in Minority Report.

    3. A super-intelligent society, like Gattaca.

4. A multi-tasking world in which we all work for 4 hours a day because that's all that we need, like many Philip K. Dick novels (Do Androids Dream of Electric Sheep?, Minority Report).

    Just some Tuesday morning comments to go with my coffee. :)

I think technology is wonderful and that we live in a great time of innovation and discovery. Everything is still new. We don't yet take it all for granted... much. Yet when we are old and our grandchildren are our age, we will seem like slow dinosaurs compared to their lightning-fast reflexes and minds. My son is 2 and he plays Unreal and The Simpsons: Hit & Run better than I do. My daughter is 5 and she has mastered Unreal, Grand Theft Auto: Vice City, The Sims 1 and 2, as well as Photoshop. Their hand-eye coordination is off the charts. If only we had games like THESE when we were kids; although, Duck Hunt was close. -sarcastic eye-rolling-

    Have a great day. (Sorry for once again leaving you with a book to read in the comments section of your own blog.)
