Tuesday, December 4, 2012

Fail Fast, Learn Fast, Act Fast


I read with some interest Alan Shalloway's blog "Why I Hate Fast Fail" -- I highly recommend you read it. Additionally, if you are reading this there is a good chance that you know me professionally and know I frequently say "Fast fail, fast fail...." The title of Alan's blog does not make me feel a need to "defend myself" -- quite the contrary, I agree with his basic point: the desire to be precise in our language about what we mean by "fast fail". Quoting from his post: "Fail fast is not our goal. Learn fast is." My corollary to his statement is that we seek to "act fast". My use of the term "fast fail" comes from seeing how difficult it is to learn when there are too many failures and/or cascading failures to analyze.

For example, suppose we had some tests that required a room to be illuminated. We might "measure" an individual's reading rate at various levels of light. Suppose the first "test case" is "turn on light switch near door" and no light comes on at all. If we waste time analyzing the "failure" of the subsequent reading-rate test (in a dark room) before we learn why the light switch test failed, we waste valuable learning time. Does the light fail to come on because of a faulty switch, the light bulb itself, the wiring, or something else? We need to quickly learn which it is, and act quickly to take corrective action so that we can repeat the test cases.

I have spent years watching otherwise talented engineers be so overwhelmed by "too many" failures that they dismiss the feedback (learning) that is possible from the testing process. Don't design your tests to fail quickly; design them so that a failure produces a learning opportunity that can't be ignored -- that must be acted upon. Once the value of these learning opportunities is realized, no competent software engineer will resist the chance to respond (act) quickly. I am now adjusting my mantra to "fail fast, learn fast, act fast".

Tuesday, September 11, 2012

"Continuous Testing" - Never Stop Testing

As I type this I am actively monitoring four running automated test projects my project team is leveraging (I should add there are more than four automated test projects for the team), and the thought hit me in the face like a brick: we have so much testing running simultaneously that I was feeling 'stressed' keeping track of all of it -- and about how it would stress our execution infrastructure. These are very good problems to be stressed about -- much like "too many good pitchers" in baseball. It indicates that we have reached an extremely important attitude shift on the project team.

 

Does the number of tests matter?

On an internal team Wiki I had written that the project team had over 10,000 automated test cases. Subsequent to that I have heard comments reacting to this fact with "it doesn't matter how many tests you have!" I agree with the speaker -- far more important is that you value the tests that you do have. How do you value them?
  • Do you understand the feedback they give?
  • Do you react to that feedback?
  • Do your team members understand and react?
  • Do you find it so valuable that you wish you had more?
 If you answered "yes" to more than one of these, then you have the 'right' number.

 

If not, can you see yourself getting there?

If you have 0 tests, that number does matter, and you have spoken volumes about your attitude toward testing. However, if you can see the value of Continuous Testing but can't see yourself getting "there", what do you do? It's a lot like writer's block. The toughest part of writing is staring at the blank piece of paper. To get past it, you just have to start writing. The same goes for testing: what is the first test you can write that you will value? To capture the essence of this idea, I tweeted the following mantra, "Test as you can, not as you can't".

 

Does automation matter?

In our resource-starved environment (money, people, machines), I don't see how a software product can be successfully tested in any other fashion. See my next post -- "Automate Everything".

Monday, August 6, 2012

Passing time on a plane -- is music "free"?


While flying to Las Vegas I noticed the electronic kiosk offering music in front of my seat. I could have used my iPod, but I did not want to run the battery down (preferring instead to write this blog post). It had a wide variety of music, and I was able to find 36 songs to add to my playlist -- more than enough to last my entire flight.

I was also curious about two things:
  1. Since I was reading a Fortune article about Megaupload -- was the music free? With the airlines trying to squeeze every cent out of us, I figured it would not be (the credit card scanner below was also a strong clue). It turns out to have been free save for the occasional ad -- well worth the price, in my opinion.
  2. The software itself was a bit clunky and unresponsive (we have become spoiled by rapid-response touch-control mobile devices). It took me some 40 minutes to build the playlist. Since I was both looking for some tracks I had not heard in some time and looking to be distracted, this was all a welcome relief. However, if I were looking to just relax, this would have been tiring. Movies of course came at the usual 'on demand' prices. There was also a price for games. Not to be missed by any data nut is the graphical flight tracker along with the flight data.


Without naming the airline, I should add that this was an Airbus A320, but there was no indication as to who the software vendor was. All in all it was a very pleasant diversion for a 3.5 hour trip.

Friday, April 13, 2012

Offsite -- Collaboration, Trust, and "Stamps" around Agile Topics


I have spent nearly an entire day at an offsite meeting with seven of my development colleagues. The day started with some discussion around communication, collaboration, and "bonding". I was not anticipating a very positive day with the direction the discussion was going. Midway through the day, we made what I consider some very positive progress around two critical topics:

* Defined Acceptance Criteria for Software Consumers
* A milestone based approach to Software Delivery based on defined acceptance criteria.

I considered these major breakthroughs. However, later in the day a specific topic came up that spawned some very passionate (read: angry) discussion around the topics of collaboration, trust, and "stamps". The latter term was suggested as a less inflammatory word than "trust". The word "stamps" is a reference to the long-ago retail practice of giving out "stamps" that could be cashed in later. The analogy is that we all collect "stamps" against each other to be "cashed in" at some later date. This behavior was acknowledged, but as a group we came up with no good process for all of us to cash in our stamps.

Unfortunately, I am leaving the offsite with the feeling that our "collective stamps" will kill any progress we may have made on Agile Topics -- collectively we don't seem to have "An Agile Mindset".

Kanban It Isn't -- Part II ... A developing maturity


In a previous post, I described how excited I was about what I called the "Kanban board". This was viewed similarly by some members of my team, which was affirmed when one of them demanded, "We need to have this on our Wiki". More encouraging was that we were able to get this done extremely quickly. Pictured below is the Wiki version of the Kanban board. Now if I can just get a large monitor to place on the wall to replace the white board!

Wednesday, March 7, 2012

Kanban it isn't but tis close enough

I arrived at work today after having one of the rougher days in recent memory. There is a white board in our lab that one of my team members has taken to updating regularly. I had not asked for this to be done, so I stopped to digest what was represented. The thought in the blog title came to mind: "Kanban it isn't but tis close enough". If you are not familiar with Kanban, here is a general Wikipedia definition.



 "A visual process management system that tells what to produce, when to produce it, and how much to produce."

It was most certainly:
  • visual
  • what we are attempting to produce
  • when it will/might be produced
  • whether it could be produced today

I hope others on my team found this as compelling as I did. Realizing, as I do many days but still need to be reminded of, that I have a great group -- my day suddenly brightened.