Okay, I'm thinking about definitions 0, 1, and 4 from the What Is Analysis page:
Group 0, study of something preexisting, to answer questions
Group 1, refined requirements
(Group 2, initial design)
(Group 3, implementation-independent design)
Group 4, determining what factors lurk around and in between the Requirements
When you are trying to understand a software problem, what are typical errors and mistakes that are made?
The following is not an answer to the above question, but instead is offered in the same spirit as Code Smells.
Common AnalysisSmells:
Gloss For The Boss:
the analysis deliverables are too pretentious, too elaborately typeset, the diagrams too carefully shaded or coloured. (Although sloppy presentation can be a sign of sloppy thinking.)
Words For The Nerds:
the terminology used in the analysis deliverables is more familiar to the analyst than to those working in the problem domain. (Although uncritical adoption of the jargon of the Problem Domain, without explanation or definition, doesn't help to foster a common language for the project.)
Easy Peasy Lemon Squeezy:
the analysis deliverables contain lots of impressive descriptions of superficial (easily obtained) user requirements, but not enough about other important requirements (authorization, reversal of errors, integration with other systems, logging and audit, time factors, etc.). (Although many of the requirements in these areas will depend on the solution adopted rather than on the fundamental Business Requirements; that is, they are Second Order Requirements.)
Paving the Cowpath:
A user or sponsor (who usually doesn't know any better) requests something that basically mimics a paper workaround, including redundant data entry in multiple systems, and the analyst (who also doesn't, but who SHOULD, know better) diligently writes it up as a requirement to be built. (Since I have done this myself more than once, I was able, at long last, to recognize it as a smell and possibly an anti-pattern. So I am putting it here in the hope that younger/smarter analysts will not stumble over it as I have.)
A problem I've seen on a number of occasions is that an analyst has a specific implementation technology in mind when analysing the requirements. It then becomes easy to think: "feature X is probably difficult to implement -- let's hope users don't want it". The analyst, explicitly or implicitly, neglects to consider that feature. When the requirements spec is finished, the chosen technology looks like an obviously good fit, because none of the recorded requirements are show-stoppers. Six months down the line, the neglected feature comes to light, but because it's a show-stopper for the implementation technology, it is put off until the next version.
It is important to record all the requirements, especially those that are thought to be difficult. This is a matter of professionalism and integrity as well as good sense.
One example I've seen more than once is a save/restore feature for a simulation. Users realise that a simulation takes several hours (or days) to run, so it would be really nice if they could save the simulation state periodically in case the computer crashes. But the super-optimized C (or C++) code that implements the simulation model has no concept of the explicit state of the model. It's the type of thing that's easy to add early but, especially with traditional development methods, very difficult to add later. The reason for not putting it in early is that 'surely that'll slow things down'.
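As a minimal sketch of the point (hypothetical code, not from any real simulator): if the mutable model state is kept in one explicit structure from the start, a checkpoint is little more than a dump of that structure, and the hot inner loop is untouched:

  #include <cstddef>
  #include <cstdio>
  #include <vector>

  // All mutable simulation state lives in one plain struct, so a
  // checkpoint is just a dump of this object plus its array.
  struct SimState {
      long   step = 0;
      double time = 0.0;
      std::vector<double> particles;  // whatever the model evolves
  };

  bool save_checkpoint(const SimState& s, const char* path) {
      std::FILE* f = std::fopen(path, "wb");
      if (!f) return false;
      std::size_t n = s.particles.size();
      std::fwrite(&s.step, sizeof s.step, 1, f);
      std::fwrite(&s.time, sizeof s.time, 1, f);
      std::fwrite(&n, sizeof n, 1, f);
      std::fwrite(s.particles.data(), sizeof(double), n, f);
      std::fclose(f);
      return true;
  }

  bool load_checkpoint(SimState& s, const char* path) {
      std::FILE* f = std::fopen(path, "rb");
      if (!f) return false;
      std::size_t n = 0;
      bool ok = std::fread(&s.step, sizeof s.step, 1, f) == 1
             && std::fread(&s.time, sizeof s.time, 1, f) == 1
             && std::fread(&n, sizeof n, 1, f) == 1;
      if (ok) {
          s.particles.resize(n);
          ok = std::fread(s.particles.data(), sizeof(double), n, f) == n;
      }
      std::fclose(f);
      return ok;
  }

  int main() {
      SimState s;
      if (!load_checkpoint(s, "sim.ckpt"))   // resume if a checkpoint exists
          s.particles.assign(1000, 0.0);     // otherwise start fresh
      for (; s.step < 100000; ++s.step) {
          // ... advance the model one step ...
          if (s.step % 10000 == 0) save_checkpoint(s, "sim.ckpt");
      }
  }

The cost is a design constraint (the state lives in one place), not a slowdown of the inner loop; retrofitting this onto code whose state is scattered across globals and stack frames is the hard part.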
Databases (relational vs object) offer a rich source of examples too. If a type of transaction is difficult (or slow) then, rather than fix the problem, it is much easier to pretend that users don't really want that feature. As another example: users don't really want method-grained version control for Java, do they? Many tools use file-based versioning, so they assume users don't want finer granularity.
Of course, analysis in a vacuum is also a problem: "You want the software to determine the optimal route for your sales people to visit 1000 cities, in real time? Well, I'm sure our clever programmers can do that." (This is the travelling salesman problem, which is NP-hard; see the back-of-the-envelope note below.)
-- Dave Whipp
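A back-of-the-envelope note on that last example: a symmetric tour of n cities can be traversed in (n-1)!/2 distinct orders, and a few lines of C++ show why exhaustive search is hopeless.

  #include <cmath>
  #include <cstdio>

  int main() {
      const int n = 1000;  // cities to visit
      // Distinct tours in the symmetric case: (n-1)!/2.
      // lgamma(n) = ln((n-1)!), so the tour count is about
      // 10 to the power (ln((n-1)!) - ln 2) / ln 10.
      double digits = (std::lgamma(n) - std::log(2.0)) / std::log(10.0);
      std::printf("About 10^%.0f possible tours for %d cities\n", digits, n);
      return 0;
  }

That works out to roughly 10^2564 tours. Good heuristics can get near-optimal routes quickly, but "optimal, in real time" is a requirement to push back on, not to quietly write down.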