
Collaborative software

On Tuesday I went to the Expo portion of Enterprise 2.0 Boston 2012, essentially a conference on the business equivalent of social networking. This field used to be called Groupware, and its academic counterpart is CSCW (Computer-Supported Cooperative Work). Going to these expos is about my only exposure these days to technology other than what I see on my screen or install on my own computers.

I like to ask the demonstrators how their products support review or inspection of documents, and to relate this to my experience in this domain twenty years ago.

Twenty years ago our Applied Research group at Honeywell-Bull (working with technology from a research group at the University of Illinois) built a working prototype that could be used for the Software Inspection process (a more formal kind of peer review) that some companies like ours were using. The essence of Inspection is that a document (or program code) is given to a small "inspection team" who, in the Preparation phase, look at it and make notes. Then a meeting is called: the leader walks everyone through the document, and the inspectors raise the issues they found. The team classifies these defects and attempts to determine their causes. Afterwards the author corrects the document or code, and the statistics are rolled up so the organization can learn what kinds of things are causing defects and improve its process. Although designed for software, this process is useful for other artifacts such as legal documents, marketing materials, and presentations.
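The defect-classification and roll-up steps above can be sketched in a few lines of code. This is purely illustrative - the names and categories are invented for the example, not taken from any real inspection tool:

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical data model for illustration only.
@dataclass
class Defect:
    location: str   # where in the document the issue was found
    category: str   # e.g. "logic", "interface", "standards"
    severity: str   # e.g. "major", "minor"

def roll_up(defects):
    """Summarize defect counts by category so the organization can
    see what kinds of problems recur across inspections."""
    return Counter(d.category for d in defects)

# Defects recorded during one (imaginary) inspection meeting.
meeting_log = [
    Defect("section 2.1", "interface", "major"),
    Defect("section 2.3", "logic", "major"),
    Defect("section 4.0", "interface", "minor"),
]
print(roll_up(meeting_log))  # Counter({'interface': 2, 'logic': 1})
```

Accumulating these counters across many inspections is what lets an organization spot systematic causes of defects rather than just fixing them one at a time.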

In our system, called Scrutiny, inspectors look at the document on their computers and make annotations. Then at an appointed time they all sit at their screens; the moderator moves the mouse over a portion of the document, which gets highlighted on everyone's screen along with the annotations made to that portion, and discussion ensues via the text-messaging component to identify and classify the defect. We first demoed at the CSCW Conference in the fall of 1992. We wrote a few papers - I was usually the lead author - and among other places got this one, Scrutiny: A Collaborative Inspection and Review System, accepted to the European Software Engineering Conference in Garmisch-Partenkirchen.
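The core of that synchronous meeting - the moderator highlights a span, and every participant's screen shows the same span plus the annotations made on it during preparation - can be sketched as a tiny in-memory broadcast. All names here are hypothetical; this is not how Scrutiny was actually implemented:

```python
# Minimal sketch of a shared-highlight meeting session. Participant
# "screens" are stand-in callbacks; a real system would push these
# events over the network to each inspector's display.

class MeetingSession:
    def __init__(self):
        self.annotations = {}    # span -> list of (inspector, note)
        self.participants = []   # callbacks standing in for remote screens

    def annotate(self, span, inspector, note):
        """Preparation phase: an inspector records a note on a span."""
        self.annotations.setdefault(span, []).append((inspector, note))

    def join(self, on_highlight):
        """A participant's screen subscribes to highlight events."""
        self.participants.append(on_highlight)

    def moderator_highlight(self, span):
        """Meeting phase: broadcast the span and its annotations
        so everyone sees the same thing at the same time."""
        notes = self.annotations.get(span, [])
        for screen in self.participants:
            screen(span, notes)

session = MeetingSession()
session.annotate("para 3", "alice", "ambiguous requirement")
session.join(lambda span, notes: print(f"highlight {span}: {notes}"))
session.moderator_highlight("para 3")
```

The point of the design is that annotations are shared automatically as the moderator walks through the document - which, as noted below, is exactly the part today's vendors seem not to do.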

We continued to develop it, got some people in the company to use it, got a DARPA grant, and made several efforts to productize it, including a possible spinoff - but large-company politics made that impossible, and the idea was dead by 1995. I do point out that network technology was pretty crude in those days (Mosaic, the first widely used graphical Web browser, appeared in 1993), so this was a pretty ambitious project.

None of the vendors that I talked to can easily support the synchronous meeting portion. Some use screen sharing, which is a partial solution, but no one had the capability of letting everyone easily see other people's annotations (in our system this was automatic). Too bad - I still think this would be a useful function. There are some products available today, such as CodeCollaborator by SmartBear Software, that look like they do a pretty good job at this, but I've never seriously explored them.

(I used the Multics logo for this post - we started doing peer review of code in 1968, and had an electronic meeting tool, Forum, which we used in the early 1980s, occasionally for peer review.)

Ed Weller collected some data about the cost of finding defects - in the field after products are released, in the test cells via the standard test process, and by inspection. Inspection was a clear winner, and he used this data to help convince management that inspection was worth doing.

All the studies that I'm aware of on this subject show that finding defects through inspection is the most cost-efficient way of proceeding. It pisses the customers off much less, and produces better code and better designs. The difficulty is that it front-loads the project with time "delays" for doing inspections and dealing with issues, and with costs that are tangible, while defects found later on or in the field have their own budgets for correction and thus don't impinge on the project or its manager, who by then will have gone on to a greater and more glorious existence on the next project.
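The economics behind this can be made concrete with a toy calculation. The figures below are invented for illustration - they are not Ed Weller's actual data - but they reflect the commonly reported pattern that a defect grows roughly an order of magnitude more expensive at each later stage:

```python
# Illustrative arithmetic only: relative cost units per defect,
# by the stage at which the defect is caught.
cost_per_defect = {"inspection": 1, "test": 10, "field": 100}

# A hypothetical project's defect distribution.
defects_found = {"inspection": 50, "test": 20, "field": 5}

total = sum(cost_per_defect[stage] * defects_found[stage]
            for stage in cost_per_defect)
inspection_only = cost_per_defect["inspection"] * sum(defects_found.values())

print(total)            # 750
print(inspection_only)  # 75
```

Even with only a handful of defects escaping to the field, they dominate the total cost - which is the argument the data makes to management, and also why the tangible up-front cost of inspection is so easy to balk at.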

Tom Gilb is a Cassandra on this one. Every organisation he's assisted in design and production has improved afterwards. He bangs on about how inspection and strict methodology (I forget what he calls it) of his own invention will improve the end product. But organisations just cannot see that spending time and money before line 1 of the code is written brings vast dividends.

(Deleted comment)
With regard to your Agile comment earlier, my thoughts are that small, tight iterations can be a good thing, but the methodology as a whole tends toward entropy, not an ordered system. Everything gets pushed to the next cycle, and yes, you may have a product, but it's not a very good one. Funny thing, most people have been so beaten and abused by their software, they sit down, and take it, actually believing it's *good*.

It's always jam yesterday, and jam tomorrow, but never jam today. I agree with your comment to a certain extent: a lot of the crapware out there has not been planned except in a vague, Judy-Garland-Mickey-Rooneyish way ("Let's put on a show!"). I believe that Agile is just the current software-development fad. It's the computing equivalent of "It's not about the destination; it's about the journey," which is great for the developers - they get paid to go on the journey, so the longer the better. End users, though, want to get to the destination quickly and have working software.

Since I was a tester, software doesn't beat me up: I knock it to the ground, stomp all over it, turn it over and kick it in the crotch, and make sure that it doesn't get up again. Bugs follow me like children followed the Pied Piper. It often gets in the way of using software, since I am constantly stumbling over bugs while working.
