Thursday, May 30, 2013

How many heads can you roll off with this Automation?

...  This is exactly what one of the managers in a meeting asked me while I was discussing, with a group, the automation that we were developing and maintaining. Hold your breath - this is not some extract from a 10-15 year old fairy tale. It shows the dominant view held amongst business stakeholders, IT execs, consultants and, sadly, many test managers.

For those new to this field and to the topic - test automation (or simply automation) is the idea of one computer program "checking" the behavior of another computer program (called the application under test), in some sense "replicating" what a sufficiently disengaged or brain-dead tester would do in the name of testing the application. To drive the idea home - early automation tools such as WinRunner introduced the idea of "record and playback". Wow - what a way to simplify the really complex and difficult work of testing a software application.
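To make the "one program checking another" idea concrete, here is a minimal sketch in Python. The `apply_discount` function and its expected values are hypothetical, invented only for illustration - the point is that a scripted check mechanically compares observed output against expectations someone decided in advance, and nothing more.

```python
# A minimal, hypothetical automated check: one program verifying the
# behavior of another piece of code against pre-decided expectations.

def apply_discount(price, percent):
    """Stand-in for the 'application under test' (hypothetical)."""
    return round(price * (1 - percent / 100.0), 2)

def test_apply_discount():
    # The script can only confirm what it was told to expect; it cannot
    # notice anything that was not explicitly encoded, unlike a thinking tester.
    assert apply_discount(100.0, 10) == 90.0
    assert apply_discount(80.0, 25) == 60.0

if __name__ == "__main__":
    test_apply_discount()
    print("All scripted checks passed - which says nothing about what was never checked.")
```

Record-and-playback tools essentially generate scripts of this kind from captured UI actions, which is why they look so appealingly simple in a demo.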

Thanks to IT consultants and managers - whenever the problem of "speed" showed up in meetings, automation was proposed as the potential solution. This has grown to such insanity that today, for almost every problem in a software project, automation is the common solution. But very soon people in IT realized that doing automation requires money - money in addition to what you pay a tester.

Some clever fellow in a consulting company then shouted "Return on Investment" - and from that fateful day, the life of the tester, or of anyone who supports testers through automation, has never been the same. Since automation requires funding over and above what is spent on testing - execs obviously want their pound of flesh in return.

This leads to the popular equation. If, without automation, you need 5 testers to test a project/release, then with 50% automation you would need 50% fewer people. That is how automation supposedly pays for itself. Since software requires repeated testing when changes happen, automation once built can be run again and again without paying a human tester. That is how the conventional and most popular thinking on automation goes.
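Just to spell out the arithmetic behind this popular equation (the numbers below are hypothetical, and the point of this post is that the equation itself is misleading), here is the naive calculation in Python:

```python
# The "popular equation" with hypothetical numbers - illustrating the claim,
# not endorsing it.
testers_before = 5          # testers needed without automation
automation_coverage = 0.50  # fraction of testing claimed to be automated

# Naive claim: headcount shrinks in direct proportion to automation coverage.
testers_after_naive = testers_before * (1 - automation_coverage)
print(f"Naive headcount after automation: {testers_after_naive:.1f} testers")

# What the equation quietly ignores: the automation must be built, maintained
# whenever the application changes, and its results interpreted by people -
# none of which is free or one-time.
```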

While it is not very difficult to reason about why and when automation cannot reduce the number of testers needed - the very proposition of "automatic" testing, removing the need for some dumb tester staring at screens while the automated tests run, is simply irresistible.

I have fought many losing (or lost) battles explaining to my stakeholders why automation should not be thought of as a means to reduce the number of testers or the cost of testing. Every time I lost, I was made to understand that the execs had simply made an explicit choice not to reason, but to keep insisting that if automation cannot reduce the manpower required for testing, it is useless - or at least not worth investing in.

I am thinking of refusing to do automation if the right expectations are not set with stakeholders - will that work? Will I be given the right of first refusal to not do automation unless the right level of awareness exists?

But then - if you are a business leader or IT manager (not someone with a deep understanding of or appreciation for testing and automation) - you would believe whatever a consultant or tool vendor says.

As I close another pessimistic post on automation - I realize it is tough to be in automation, where everyone has an opinion (a strong one) and I have to force my agenda through.... tough...

But I have not given up - still trying to bring sanity to the mad world of test automation.

Shrini


Sunday, May 19, 2013

How to disagree elegantly and learn something in the process ...

I love my Zite iPad app - it pulls up amazing (and the latest) news on just about anything. By tuning it to topics like science, philosophy, mind-body, software, programming and critical thinking (topics of interest to me), I can get hours' worth of reading every day from this app. Thank you, Zite.

I tweeted about Daniel Dennett's thinking tools - an article in the Guardian. Lots of good stuff - take a look at it and, if possible, buy the book and read it.

The idea that most attracted me in this article is about "how to effectively criticize/argue with someone". Here is a quick paraphrase.

Three simple rules (attributed to social scientist Anatol Rapoport, as Dennett notes in the article):

1. Attempt to restate (re-express) your opponent's idea in your own words - so clearly that the opponent says, "I wish I could have expressed it like you did - that is precisely the idea."
2. State the points of agreement with your opponent's idea (especially if they are not matters of general/public agreement).
3. State anything new you have learned from the idea.

Only after doing 1, 2 and 3 may you offer any rebuttal or criticism.

Notice what 1, 2 and 3 will do to your opponent:
By #1, you show that you have understood the idea (perhaps even better than the opponent herself).
By #2, you establish an emotional connection with the opponent by explicitly stating which portions of the idea you agree with. This opens her up to considering your points - it is at this point that she starts actively listening to you.
By #3 - this is the big one - you show your humility and your desire to learn even while critiquing the idea.


Through this series of actions, you essentially convert a potentially adversarial exchange into a positive and collaborative interaction.

I will be putting these rules into action whenever I disagree with someone and offer opposition or criticism. Let me see how it goes.

Pretty sound advice, Daniel. Thanks.

Shrini

Tuesday, May 07, 2013

Book/Reading suggestions ...

A few days ago, a tester friend of mine approached me with a request to suggest some books for him to read. I responded with a small list that, on the face of it, looks unlikely for a software tester.
I thought I would share the list with you folks ...

Here it is:

1. An Introduction to General Systems Thinking by Gerald M. (Jerry) Weinberg

This book introduces the idea of "systems thinking". For a tester, I think it is most important to know and engage in general systems thinking as we engage in solving problems.


2. Surely You're Joking, Mr. Feynman! by Richard Feynman (as told to Ralph Leighton)

Richard Feynman is a hero for testers, in my opinion. This Nobel Prize-winning American physicist lived the life of a curious child all his life, exploring the world and never turning away from learning new things. He questioned things around him like a true tester. The encounters he describes in this book explain what it means to be a curious thinker. Although he openly hated philosophy and made fun of philosophers, we can forgive him for the enthusiasm he showed and the examples he left through his life demonstrating a human's thirst for knowledge and learning.

You can also watch the interview he gave for the BBC Horizon series, "Fun to Imagine" - look it up on YouTube.

3. Outliers by Malcolm Gladwell - This is not a book for testers in a direct sense, but a fascinating book that illustrates the kind of systems thinking Jerry's book (#1 above) talks about. This is one book to read from start to end. Each chapter is an illustration of how to look at publicly available information and create a whole new interpretation of it.
Other books from the same author that are worth reading are "The Tipping Point" and "What the Dog Saw"; also "The Turning Point" (this is a science book).

4. The God Particle by Leon Lederman

This is again not a testing book, not a systems thinking book, not a book about software. It is about the amazing journey of science in understanding the building blocks of our universe. I liked how it expresses and articulates heavy scientific stuff through metaphors and examples so that even a 7th grader can understand. What does this have to do with testing? Understanding a tough subject and explaining it in easy language - something testers do all the time: find tough bugs and demystify them for our stakeholders, including developers.

4. "How to think about science" - A CBC series of 14 interviews with scientists, philosophers, Writers - about emerging form of science. If we reckon testing as multidisciplinary - Look no beyond this. Download the series of interviews (mp3) and listen/absorb. These interviews left a long lasting impression on me about how think about an intellectual pursuit like science or software development or testing.

Shrini

Making a food item vs Solving a Puzzle - An attempt to characterize Testing Mindset

A disclaimer: I am going to make some sweeping generalizations about how testers and developers (a generic name that includes programmers, designers and business analysts) think and work. This is an attempt to characterize a /typical/ testing (or tester's) mindset - a set of dominant thinking patterns, attitudes, biases, choices and behaviors.

I was reading a bedtime story to my 9-year-old daughter from a book of "Akbar-Birbal" stories. In one story, King Akbar poses a puzzle to Birbal after narrating how "giving" typically works: under normal circumstances, the giver's hand is on top and the receiver's hand is below it. Akbar's puzzle: under what circumstances is the giver's hand at the bottom and the receiver's hand on top? How do you solve this puzzle? What goes on in your mind when you encounter stuff like this?

This got me thinking about how solving such puzzles/riddles works in general. When you start solving a puzzle like the one above, your mind is like water gushing out of a pipe - divergent thinking. You work towards solving the puzzle from the definition of the problem outward into vast, open exploration.

Different types of puzzle require different approaches to a solution - in some cases you know what the answer looks like, and in others you don't. A few examples:

1. A math problem - solve simultaneous algebraic equations or a differential equation
2. Solve a Sudoku
3. Play chess - from the initial position to a win
4. Play Scramble - how many words can you make from a set of jumbled letters?

Contrast solving puzzles with, say, cooking (or making) a food item from a recipe or with someone's help. Here you have a more or less definite end state, probably one you have seen before, so you know when you are done. You work through mostly known steps or incremental activities from start to end. In other words, you do convergent thinking. Many acts of "construction" go from a known set of conditions to a known end state - you go from, say, "requirements" to "working software".

Contrast that to a testing problem or solving a riddle.

Extending these two activities - cooking a food item and solving a puzzle - I think the former describes how developers work/think, whereas the latter characterizes a typical tester's way of thinking.

What do you think?


Support Keith - Find answers for questions about ISTQB and more....

Keith Klain is stirring up the world of testing with some smart and witty comments about testing on Twitter. I have enjoyed his discussions with Rex Black and others on ISTQB and other topics that are close to the hearts of testers - especially context-driven ones.

Here is what makes Keith worth a special mention - he is a business/technology leader (not a consultant) at a bank and heads a software testing group. Unlike other testing leaders, he talks like a practitioner who does testing day in and day out (not someone who manages someone who manages a team, a few of whom are testers). It is quite a welcome change in the world of business leaders we see around us.

Two things I want to bring to your attention about what Keith is doing.

1. Watch him debate with others on Twitter and notice how he gets people talking. In one tweet discussion about testing and confidence with Rex Black, Michael Bolton and others, Keith says (paraphrasing): "For a change, let us swap positions - how about you (Rex Black) arguing in favour of our side (that testing does not build confidence)!!!!"

In a debate, can you take a stand totally opposite to what you have believed all your life and see the world from that angle? Confirmation bias - the number one enemy of testers, or for that matter of any intellectual - can be beaten by hanging around with folks who think differently. Well said, Keith!!!!

2. Sign the petition that Keith has set up questioning some basic ideas about how ISTQB goes about doing its ("non-profit") business. First, read the petition and see if it makes sense - if it does, please sign it.

Follow Keith (@KeithKlain) on Twitter and watch out for the interesting debates he kicks off ...

Shrini