Workshop On Performance & Reliability (WOPR21) – Technical Debt


I’ve attended three prior WOPRs (4, 5, and 14), and this was the best one for me.
The content owner, Mais Tawfik Ashkar, chose the theme Technical Debt, which turned out to work quite well.

The first day was the most positive, with Dan Downing explaining the challenges of constructing an end-to-end architectural diagram for a widely distributed organization.
(Such a diagram is a prerequisite for the original task: performance testing end to end.)

Felipe Kuhn then gave an amazing description of Agile teams focusing on quality while building an entire product in two months. His definition of Technical Debt was inspiring:

Anything that slows us (the team) down (or isn’t speeding us up) is Technical Debt.

And:

Technical Debts are not stories, as they don’t deliver user value.

He provided numerous examples of the team investing in automation to build a quality product more quickly.

Matt Geis described an amazing evolution of test automation for partner testing: from long lists of setup instructions, to a single (4 GB) VM, to the Amazon Web Services cloud. He claims they increased technical debt by automating more and more of the process using more and more tools, including mock SSL servers (an SSL library recompiled with a special configuration) and a Jira plugin to auto-configure test suites for new partners.
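Matt didn’t share code, but to make the mock-server idea concrete, here is a minimal, hypothetical sketch of a mock HTTPS partner endpoint using only Python’s standard library. The handler, port, canned JSON response, and the server.pem certificate file are all my own assumptions for illustration; Matt’s actual approach recompiled the SSL library itself with a special config.

    # Hypothetical mock HTTPS partner endpoint (illustration only; not
    # Matt's recompiled-SSL-library approach). Assumes server.pem holds
    # a self-signed certificate plus its private key.
    import http.server
    import ssl

    class MockPartnerHandler(http.server.BaseHTTPRequestHandler):
        """Answers every POST with a canned partner response, so the
        test suite can run without a live partner system."""

        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            self.rfile.read(length)  # drain the request body
            body = b'{"status": "ACCEPTED", "partner": "mock"}'
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        server = http.server.HTTPServer(("localhost", 8443), MockPartnerHandler)
        context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
        context.load_cert_chain("server.pem")  # assumed self-signed cert
        server.socket = context.wrap_socket(server.socket, server_side=True)
        server.serve_forever()

A test suite would point its partner URL at https://localhost:8443 and trust the self-signed certificate, trading a live dependency for a deterministic stand-in.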

Jude McQuaid described his new destructive testing group at Salesforce.com, the WOPR host. Sounds like fun to me! I also appreciate Salesforce’s transparency about its data centers: it openly publishes their status at http://trust.salesforce.com

There were many other Experience Reports (ERs), including my own, all of which I learned from.

Beyond the ERs, we had a daily group brainstorming session on a topic. The final day’s topic was “test debt” and a special subset (initiated by Mais) called “test decay”. The group came up with a draft working definition:

Test decay occurs when testing becomes less effective over time.

After the fact, I researched this idea a little. Contrast it with Software Decay. Some of what we discussed is covered by Lessons Learned in Software Testing, Lesson 117, “Automated regression tests die”, which goes on to explain that “Regression tests decay for several reasons”. And even earlier, in a book I find too infrequently read, Marick’s The Craft of Software Testing discusses (page 225) how “to avoid test suite decay”.

Attendees (as best I know; probably incomplete and misspelled):

Mais Tawfik Ashkar
Goranka Bjedov
Dan Downing
Jane Fraser
Matt Geis
Andy Hohenner
Paul Holland
Dave Holt
Pam Holt
Felipe Kuhn
Reena Mathew
Jude McQuaid
John Meza
Eric Proegler
Keith Stobie
Tom
Mahesh
Ashok
