I'm putting together information for some folks around performance. So I'm coming up with a set of golden rules. My first one is:
It is what you don't test that breaks in production.
This isn't specific to WebSphere products either. This goes pretty much across the board.
I work with quite a few people on performance issues, and two types of problems come up again and again that could easily have been avoided had the testing been done properly.
(a) 80/20 rule
Some people live by an 80/20 rule. We'll test the 20% of the application that 80% of the users use. Um, what about the other 80% that goes untested? What if that brings down the site even if only a single user hits it? I'd rather keep the site up and test everything. Wouldn't you?
(b) Boundary value problems
Everything takes input. Not every application validates its input. This leads to problems like applications running unbounded database queries because a filter wasn't filled out correctly. Or someone fills in all 250 rows on a page and crashes the site because the testers missed that case. Every use case has boundaries. Test the zero case, the in-between case, and then infinity. To infinity and beyond, my friends! That is where we can take our sites if we do the testing right!
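To make the idea concrete, here is a minimal sketch of boundary-value checking. Everything in it is hypothetical and illustrative: the names (`validate_row_request`, `MAX_ROWS`) and the 250-row limit are made up for this example, not taken from any WebSphere API.

```python
# Hypothetical page handler helper: validate a row-count request so
# bad input can't turn into an unbounded database query.
MAX_ROWS = 250  # assumed page limit for illustration

def validate_row_request(requested):
    """Reject out-of-range input instead of passing it to the database."""
    if requested is None:
        raise ValueError("row count is required")
    if requested < 1:
        raise ValueError("row count must be at least 1")
    if requested > MAX_ROWS:
        raise ValueError(f"row count may not exceed {MAX_ROWS}")
    return requested

def boundary_cases():
    """Exercise zero, an in-between value, the maximum, and beyond it."""
    results = {}
    for n in (0, 10, MAX_ROWS, MAX_ROWS + 1):
        try:
            validate_row_request(n)
            results[n] = "accepted"
        except ValueError:
            results[n] = "rejected"
    return results
```

The values at the edges (0, the maximum, one past the maximum) are exactly the cases that slip through when only the happy path gets exercised, so a test plan should hit each of them explicitly.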