Wednesday, August 25, 2010

Projects & Relative Problems

This seems very true some days:
(Picture from http://ninapaley.com/mimiandeunice/)

Whether at work or in one's personal life, it seems that by default we prioritize our own problems and view them as more important and difficult than those faced by others. Our needs sit at the center of how we see everything else. We can look back later and put a crisis or breakdown in perspective, but in the moment it's hard to have that kind of insight. In the workplace, though, it can be costly to an organization when an individual takes this stance. For someone like a business analyst or project manager trying to manage projects and mitigate the inevitable conflicts that arise, this mentality is a particularly difficult challenge.

Say, for example, I'm trying to get two different department heads to communicate with each other, and with me, about priorities, problems they need fixed, and their requirements for a certain project. If they can't provide supporting data or information about why their needs matter more, all that happens is a back and forth of "my needs are more important than theirs because..." with reasoning that has no data behind it. Without that information, I can't validate any decisions I make about the project's direction, nor can I make much (if any) progress.

Anyone in an organization who is tasked with coming up with requirements or needs must be able not only to articulate those needs, but to back up the reasoning behind them with validating data. We can't become personally attached to having our way when there's no data to show why our way is better, or why our problem is the most critical. We have to be able to tell someone, or many someones, why we chose to do something and what value that decision adds to the project.

Getting that data can be tricky, full of politics and personal feelings, and sometimes it simply isn't going to happen. Information is never a straightforward topic, and it is really the people who make it a challenge and "interesting," in both good ways and bad. But it is worth the effort when you can stand in front of a group and present a solid project with an information foundation you can rely upon.

Wednesday, August 18, 2010

Norman Strikes Again

You know it's going to be interesting when one of the most well-known names in HCI design writes a blog post called "Design Thinking: A Useful Myth".

Don Norman's post this past June is well worth the read, not only because it is well written but because it raises points that might make for good discussions among "designers". Norman argues that the idea that designers somehow have mystical powers of intellect and perspective can be a useful myth, but a myth nonetheless: "design thinking" isn't a characteristic unique to the profession. Breakthroughs simply occur "when people find fresh insights, new points of view and propagate them". Creative people are all around. But the design community, across all sectors, has a vested interest in perpetuating the myth that designers have a monopoly on design thinking, which according to Norman is really just "a public relations term for good, old-fashioned creative thinking." Still, he argues, the myth is useful in two ways:

1) The myth helps fight against the confusion that "design" equals "making things pretty". Design is so much more than that, and every little piece that might change the popular mind helps.

2) It helps get designers in the door at organizations. "Hire us, they say, and we will bring the magic of design companies to you, working wonders upon your dead, stilted, unproductive company." Ultimately, Norman points out, the "design thinking" pitch is akin to claiming a secret weapon that has the power to solve big problems, and that is a valuable tool.

Norman wraps up the post with an interesting challenge to designers:

"So, long live the phrase 'design thinking.' It will help in the transformation of design from the world of form and style to that of function and structure. It will help spread the word that designers can add value to almost any problem, from healthcare to pollution, business strategy and company organization. When this transformation takes place, the term can be put away to die a natural death. Meanwhile exploit the myth. Act as if you believe it. Just don't actually do so."

Tuesday, August 10, 2010

Why Are We Still So Paper-Based?


It continually amazes me how much paper my last job produced - I was working as a database coordinator for an academic department at the UW. Sounds like a pretty electronic-centric position, right?

Wrong.

I had a brown paper bag I used to collect mixed paper for recycling. I think I emptied it every other week. That doesn't seem too bad if you don't think too much about it. But a supposedly paperless position, filling up a Safeway brown paper bag every two weeks! Reports and presentations, notes, printouts, and event planning materials.

Currently I'm interning with the university's information technology office, and it just baffles me. We print out so many electronic documents and artifacts: everything from emails to reports to meeting handouts. While laptops are dominant, notes are still often taken with pen and paper. Personally, I'm thankful that what I'm doing now produces very little printed work, even though I find myself in the middle of a paper-laden environment.

I think this is partly a holdover of established working practices - it is only relatively recently that we've been able to make almost any document electronic. This is probably the core of the reason: people who have been working for the last 15 to 30 years are not likely to want to change how they've been doing things. In a world where technological innovations have ripped through our societies, radically shifting established assumptions and practices, it is difficult to keep up with the latest anything, let alone quickly adapt.

This is not to say that there haven't been successful moves to reduce paper use - as part of a university-wide initiative, the office is required by state law to use 100% recycled paper, to reduce paper use by 30% beginning July 2010, and to recycle all office paper. There seems to be steady support for the effort, but it certainly takes more than throwing paper in a different box. Moving toward a paperless or paper-scarce environment requires a shift in attitude and behavior, and an acceptance that it won't always be convenient or the same as it was before.




Friday, August 6, 2010

The End of Google Wave

A recent article by Maggie Shiels, a technology reporter for BBC News, reported that "Google drops Wave because of lack of users". Reading through it, the decision seems a little strange to me. But because "Wave has not seen the adoption we would have liked", Google says, it will phase out the site and integrate some of the technology it developed into other Google projects.

Honestly, giving up on Wave seems premature to me. I have used Wave for a few different classes, both for note taking and project management, and it has been a very interesting tool. Yes, I agree with the assessment that for it to succeed, many people need to be signed up and using it, and I think pushing it within corporate and educational environments could have helped it take off. Wave has a great real-time communication and collaboration aspect, and it does it all within a browser: character-by-character live typing, the ability to drag and drop files from the desktop, even a "playback" of the history of changes users have made. It integrated other forms of communication, and I'm really disappointed that Google is giving up on it. Perhaps adoption suffered because users required invitations, but that doesn't seem to be the case - I still have a good number of invitations sitting unused in my account.

I think this is indicative of the core problem with many information technology initiatives. The great technology exists, but people lack either the incentive or the knowledge to use it to its fullest potential. So I don't think the retirement of Wave is a technology failure, but an information one. If Google is going to "create innovations with the potential to advance technology", it's going to have to innovate better user communication and education along the way.

***

On a side note, I've been thinking about something recently. I don't read the news as much as I'd like to these days. During my undergraduate studies, I read the New York Times every morning and carried the copy with me throughout the day, because I would likely reference it at least once, either in a class or in conversation. For the most part I focused on political and international news, and "digested" almost all of the articles through that lens.

Today, the way I get news is quite a bit different. I skim the international BBC headlines online when I get into work, briefly scan my Twitter feed for local and technical news, and during my lunch break I'll dive deeper into a handful of stories that catch my eye. These articles are almost always technology or information related, which makes sense considering my program. That seems to be the story of grad school so far - all this information I want to process and not enough time to do it.