Blog Posts for the ‘Training’ Category

Common Languages Ain’t So Common

Tuesday, June 28th, 2011

A friend told me about a payment system he worked on once. In the system models (and in the source code), the person sending notification of a pending payment was the payer. The person who got that notice was called the payee. That person could designate someone else—the recipient—to pick up the money. The transfer agent would credit the account of the recipient, and debit the account of the person who sent notification—the payer, who at that point in the model suddenly became known as the sender. So, to make that clear: the payer sends email to the payee, who receives it. The sender pays money to the recipient (who accepts the payment). Got that clear?

It turns out there was a logical, historical reason for all this. Everything seemed okay at the beginning of the project; there was one entity named “payer” and another named “payee”. Payer A and Payee B exchanged both email and money, until someone realized that B might give someone else, C, the right to pick up the money. Needing another word for C, the development group settled on “recipient”, and then added “sender” to the model for symmetry, even though there was no real way for A to split into two roles as B had. Uh, so far.
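In code, the muddle might have looked something like this hypothetical sketch. (This is a reconstruction of my own, not the actual system; all of the class, field, and function names are invented for illustration.)

    # Hypothetical reconstruction of the payment model described above.
    # Every name here is invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class PendingPayment:
        payer: str      # sends the email notice of a pending payment
        payee: str      # receives that notice
        recipient: str  # whoever the payee designates to pick up the money

        @property
        def sender(self) -> str:
            # Added "for symmetry": the payer, relabeled at transfer time.
            # Same person, two names, depending on which step you're in.
            return self.payer

    def settle(payment: PendingPayment, accounts: dict, amount: int) -> None:
        # The transfer agent debits the "sender" and credits the "recipient",
        # even though the sender is just the payer wearing a different hat.
        accounts[payment.sender] -= amount
        accounts[payment.recipient] += amount

    # A pays B, but B designates C to pick up the money.
    accounts = {"A": 100, "B": 0, "C": 0}
    settle(PendingPayment(payer="A", payee="B", recipient="C"), accounts, 40)
    print(accounts)  # {'A': 60, 'B': 0, 'C': 40}

Four role names for three parties, two of which label the same person: exactly the sort of thing a “common language” is supposed to prevent, and didn’t.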

There’s a pro-certification argument that keeps coming back to the discussion like raccoons to a garage: the claim that, whatever its flaws, “at least certification training provides us with a common language for testing.” It’s bizarre enough that some people tout this rationalization; it’s even weirder that people accept it without argument. Fortunately, there’s an appropriate and accurate response: No, it doesn’t. The “common language” argument is riddled with problems, several of which on their own would be showstoppers.

  • Which certification training, specifically, gives us a common language for testing? Aren’t there several different certification tribes? Do they all speak the same language? Do they agree or disagree on the “common language”? What if we believe that certification tribes present (at best) a shallow understanding and a shallow description of the ideas in question?
  • Who is the “us” referred to in the claim? Some might argue that “us” refers to the testing “industry”, but there isn’t one. Testing is practiced in dozens of industries, each with its own contexts, problems, and jargon.
  • Maybe “us” refers to our organization, or our development shop. Yet within our own organization, which testers have attended the training? Of those, has everyone bought into the common language? Have people learned the material for practical purposes, or have they learned it simply to pass the certification exam? Who remembers it after the exam? For how long? Even if they remember it, do they always and ever after use the language that was taught in the class?
  • While we’re at it, have the programmers attended the classes? The managers? The product owners? Have they bought in too?
  • With that last question still hanging, who within the organization decides how we’ll label things? How does the idea of a universal language for testing fit with the notion of the self-organizing team? Shouldn’t choices about domain-specific terms in domain-specific teams be up to those teams, and specific to those domains?
  • What’s the difference between naming something and knowing something? It’s easy enough to remember a label, but what’s the underlying idea? Terms of art are labels for constructs—categories, concepts, ideas, thought-stuff. What’s in and what’s out with respect to a given category or label? Does a “common language” give us a deep understanding of such things? Please, please have a look at Richard Feynman’s take on differences between naming and knowing, http://www.youtube.com/watch?v=05WS0WN7zMQ.
  • The certification scheme draws representatives from over 25 different countries, and its materials must be translated into a roughly equivalent number of languages. Who translates? How good are the translations?
  • What happens when our understanding evolves? Exploratory testing, in some literature, is equated with “ad hoc” testing, or (worse) “error guessing”. In the 1990s, James Bach and Cem Kaner described exploratory testing as “simultaneous test design, test execution, and learning”. In 2006, participants in the Workshop on Heuristic and Exploratory Techniques discussed and elaborated their ideas on exploratory testing. Each contributed a piece to a definition synthesized by Cem Kaner: “Exploratory software testing is a style of software testing that emphasizes the personal freedom and responsibility of the individual tester to continually optimize the value of her work by treating test-related learning, test design, test execution, and test result interpretation as mutually supportive activities that run in parallel throughout the project.” That doesn’t roll off the tongue quite so quickly, but it’s a much more thorough treatment of the idea, identifying exploratory testing as an approach, a way that you do something, rather than something that you do. Exploratory work is going on all the time in any kind of complex cognitive activity, and our understanding of the work and of exploration itself evolves (as we’ve pointed out here, here, here, here, and here). Just as everyday, general-purpose languages adopt new words and ideas, so do the languages that we use in our crafts, in our communities, and with our clients.

In software development, we’re always solving new problems. Those new problems may require people to work with entirely new technological or business domains, or to bridge existing domains with new interactions and new relationships. What happens when people don’t have a common language for testing, or for anything else in that kind of development process? Answer: they work it out. As Peter Galison notes in his work on trading zones, “Cultures in interaction frequently establish contact languages, systems of discourse that can vary from the most function-specific jargons, through semispecific pidgins, to full-fledged creoles rich enough to support activities as complex as poetry and metalinguistic reflection.” Each person in a development group brings elements of his or her culture along for the ride; each project community develops its own culture and its own language.

Yes, we do need common languages for testing (note the plural), but that commonality should be local, not global. Anthropology shows us that meaningful language develops organically when people gather for a common purpose in a particular context. Just as we need testing that is specific to a given context, we need terms that are that way too. So instead of focusing training on memorizing glossary entries, let’s teach testers more about the relationships between words and ideas. Let’s challenge each other to speak and to listen precisely, and to ask better questions about the language we’re using, and to be wary of how words might be fooling us. And let us, like responsible professionals and thinking human beings, develop and refine language as it suits our purposes as we interact with our colleagues—which means rejecting the idea of having canned, context-free, and deeply problematic language imposed upon us.

Follow-up, 2014-08-24: This post has been slightly edited to respond to a troubling fact: the certificationists have transmogrified into the standardisers. Here’s a petition where you can add your voice to stop this egregious rent-seeking.

Questioning Test Cases, Part 2: Testers Learn, But Test Cases Don’t Teach

Wednesday, April 6th, 2011

In the last post, my LinkedIn correspondent provided a couple of reasons why she liked writing test cases, and why she thought it necessary to write them down in the early stages of the project. Then she gave a third reason:

When I’m on a project and I am the only one who knows how to test something, then I can’t move on to something new. I’d still be testing Cut/Copy/Paste if I were not able to pass that script on easily to a new recruit.

I often hear this kind of justification for test cases: “We need test cases so that testers can learn quickly how to test the product.” To me, that’s like saying that we need driving cases so that people can learn quickly how to drive in a new city. In a new city, a map or a general set of ideas might be helpful. Driving cases won’t, because to drive through a new city well, you have to know how to drive. In the same way, test cases are unlikely to be anything more than negligibly helpful. In order to learn quickly how to test a new program well, you have to know how to test.

It also seemed to me that she considers writing to be the One True Way to impart information, and reading to be the One True Way to receive it. Now, if I’m wrong about that, it would have to be because I’ve misinterpreted something she’s written. Fancy that!

So, for the sake of discussion, let’s say I were to hire an older domain expert, making a career change, who had no computer experience, and let’s use Cut/Copy/Paste as an example. With the new recruit, I’d do what I’ve done many times in coaching sessions. I’d sit down with her with a copy of Microsoft Word, and open up a document that had some text and some images in it. Then I’d say something like this:

“We’re going to learn how to move text around, cutting and pasting and making copies of it so we can save time on retyping stuff that the machine can do for us. Okay. Put your cursor there, and press Shift-RightArrow. Good. See what happened?”

(There would be lots of pauses. I’d pause after each question to give the trainee time to answer; frequently to let the trainee experiment; and often to let the trainee ask questions of her own. I won’t do that explicitly here; I’ll let you put most of the pauses in yourself, as you read. For sure, assume that there’s at least one after every question, and usually one after every sentence.)

“So try doing that a few more times. What happened? So you’ve highlighted the text; people call that marking or selecting the text. What happens when you add the Ctrl key, and press Ctrl-Shift-RightArrow at the same time? Right: it selects a word at a time. Now, try pressing Ctrl-X.

“Whoa, what happened just there?! Right: pressing Ctrl-X makes the selected text go away; you can see that. What you might not know is that the text goes onto the Windows Clipboard, a kind of temporary storage area for all kinds of stuff. So you haven’t erased it; it’s more like you’ve cut it out, and the computer still has a copy of it stashed somewhere.

“Now, put your cursor here, and hit Ctrl-V. What did that do? Yeah; it pasted the text you cut before. That is, it retrieved the text from the clipboard, and made a copy right where you pressed Ctrl-V. So, Ctrl-X cuts, and Ctrl-V pastes a copy of the thing you cut.

“Try that again. What happened? Right; another copy. So it’s a little different from regular cutting and pasting, in that you can make many copies of the same thing. When might that come in handy? Good idea! Yep, that’s a good idea too. Try it a few more times, in other places.

“What does Ctrl-V do? What does Ctrl-X do?

“Try selecting something else, and pressing Ctrl-X again. What happens when you paste now? Can you get the old one back? Well, there might be special add-ons that allow you to keep copies of lots of things you’ve cut or copied, but normally, you can only paste the last thing you put on the Clipboard.

“How are you going to remember that Ctrl-X means “cut”? That’s a great idea! Myself, I think of the X as a pair of open scissors, ready to cut—but your idea is great, and better than that, it’s your own, so you’ll remember it.

“How are you going to remember Ctrl-V as paste? Yeah, it’s tough to come up with a way to remember that, isn’t it? I think of it like the tip of one of those old glue bottles from grade school. On the other hand, what’s in between X and V on the keyboard? Now, go select some text again. Remember how? Good.

“This time, try pressing Ctrl-C, instead of Ctrl-X. Right; nothing seems to happen. And yet… move your cursor here, and try pasting. Yeah; it’s a copy, even though the original stays where it was. So Ctrl-C is Copy—C for copy, which makes some kind of sense, for a change—right there in between cut and paste. And there’s another way to remember; try pressing Alt-E. E stands for Edit, that’s right. What do you see listed on the menu, there?

“Now… there’s documentation for these cut-and-paste features; it’s in the Windows Help. Press F1. Great. Now look for ‘Keyboard Shortcuts’.

“There’s a list. How many different ways can you find of marking, cutting, and pasting text? Make me a list, and I’ll be back in ten minutes. Be ready to show me your report, and to demonstrate the ones you’ve discovered. Oh—and one more thing: create a list of at least five circumstances in which you might use this stuff, and we’ll talk about that too.”

It’s taken quite a while to type this example. It would take considerably less time to do it. Moreover, the trainee would learn more quickly and more deeply, because she’d be interacting with the product herself, experiencing the actions in her body, literally incorporating them. The questions, the answers, and the interactions make the learning more sticky. The practice makes it more sticky still.

I try to encourage people to create mnemonics to help remember. While conversing with the trainee, I’d also be observing her. She’ll be making plenty of mistakes; people do that when they’re learning. In fact, people should be encouraged to do that when they’re learning. (Brian Marick’s book Everyday Scripting with Ruby encourages deliberate mistakes, which is one of the reasons I like it so much.) When I see mistakes, I’ll give her a chance to feel the mistake, puzzle out a way around it, and then—if she needs it—help her get out of it. (“Let’s figure out how to fix that. Hit Alt-E again. What’s listed on the Edit menu? So what’s the keyboard shortcut for Undo?” “Oops, you pressed Ctrl-Z one time too many… What does Ctrl-Y do?”).

After a while, I won’t tell her how to do everything right away. I’ll start asking her to figure out some things on her own, and only help out when she’s stuck. That way she’ll be learning it for herself, which is more challenging but which has more lasting effect. I’ll keep trying to raise the bar, giving her more challenging assignments, increasing her independence while also increasing her responsibility to report. When I see her make a mistake, I might correct her right away, or I might let her continue with the mistake for a bit until she encounters some kind of minor consequence. That makes the learning even more sticky.

Finally, I’d cut the conversation short if she told me that she already knew how to cut and paste—but I’d get her to show me that she knew it, and I’d give her some more exercises so that I could observe her.

The trouble that I see with “passing on a script to a new recruit” is that most test scripts I’ve seen are very long on what to do or how to do it, and correspondingly short on why to do it and how to generalize it. That is, they’re short on motivation, and they’re short on risk, and they rarely include a mission to learn. In our teaching, when we’re working with test groups, and when we’re coaching individual testers, James and I put less emphasis on techniques. Instead, we focus on making sure that people have the skills to know what to do when they receive a testing mission, whether that mission is written down in detail or not.

If there are specific things that I want a tester to look for, I’ll provide specific instructions (either written or spoken), but mostly I want testers to set out on their own to discover things. I want to encourage testers to vary their work and their observations. Variation makes it more likely that they will find problems that would remain out of sight if we were to focus on the scripts. As a bonus, giving testers a concise, open-ended task keeps writing to a minimum, which saves on development cost, maintenance cost, and opportunity cost.

We’ll finish up this series in the next post.

Questioning Test Cases, Part 1

Monday, April 4th, 2011

Over the years, LinkedIn seems to have replaced comp.software.testing as the prime repository for wooly thinking and poorly conceived questions about testing.

Recently I was involved in a conversation with someone who, at least, seemed to be more articulate than most of the people on LinkedIn. Alas, I’ve since lost the thread, and after some searching I’ve been unable to find it. No matter: the points of the discussion are, to me, still worth addressing. She was asking a question about how much time to allocate to writing test cases before starting testing, and I questioned the usefulness of doing that. The first part of her reply went like this:

I would like to point out that I doubt anyone wants to write something that they don’t need to write.

I agree that most people probably don’t want to write things they don’t need to write. But they often feel compelled, or are compelled, to write things they don’t need to write.

I find value in writing test cases for a number of reasons. One is that I train more junior engineers in testing and it is a good method to have them execute tests that I have written so they learn how a good test plan is put together.

If that were so, wouldn’t your junior engineers learn even more from writing test cases themselves, and getting feedback on their design and their writing? There’s a feedback loop in the design of a test, the execution of a test, the interpretation of a test result, and the learning that happens between them; wouldn’t it be a good idea to keep the feedback loop—and the learning—as rapid as possible? Wouldn’t your junior engineers learn still more from actually testing—under your close supervision, at first, and then with the freedom and responsibility to act more independently as they gain skill? You might want to have a look at this article: http://www.developsense.com/articles/2008-06-KnowWhereYourWheelsAre.pdf.

There’s a common misconception that testing happens in the characters of a written test case. It doesn’t. Testing happens in the mind and actions of the tester. It happens in the design of the test, in the execution of the test, in the observation and interpretation of the outcome of the test. Testing happens in the discovery of problems, in the investigation of those problems, and the learning about those problems and the product. At most, a fraction of this can be written down.

A test is far less something you execute, and far more a line of inquiry that you follow. To me, a good test case is idea-stuff; it’s a question that we want to ask of the program, based on some motivating idea about discovery or risk. In my observation, in writing test cases, people generally write down the least important stuff. They appear to be trying to program the tester’s actions, rather than trying to prime the tester’s thinking and observation.

Moreover, a test plan is something quite different from a pile of test cases.

Secondly, [writing test cases] communicates the testing coverage with everyone involved in developing the software. If you are a contractor, this is very important since you want to leave with the client feeling like you did your job and they have the documentation to prove that they have done due diligence if they shop their company around or look for VC money.

Were you, as a contractor, given the mission to produce test scripts specifically, or is that a mission that you have inferred? Bear in mind, I’ve been witness to many takeovers, as a program manager at a company where our senior managers were ambitiously acquiring technologies, products and companies, and as a consultant to several companies that were taken over by larger companies. In no case did anyone ever ask to see any test case documentation. Those who are investigating the company to be acquired typically don’t go to that level of detail. In my experience, alas, due diligence largely doesn’t happen at all. I’m puzzled, too, by the appeal to the least likely instances in which people might interact with test documentation, rather than the everyday ones.

Meanwhile, there are many ways to communicate test coverage. (For example, see here, here, and here.) There are also many ways to fool yourself (and others) into believing that the more documentation the more coverage, or the more specific the documentation the more coverage—especially when that documentation is prospective, rather than retrospective. In Rapid Testing, we don’t encourage people to eliminate documentation. We encourage people to reduce documentation to what is actually necessary for the mission, and to eliminate wasteful documentation.

We focus on documenting test ideas concisely; on producing coverage outlines and risk lists that can be used to guide, rather than control, a tester and her line of inquiry; on producing records of the tester’s thought process, observations, risk ideas, motivations, and so forth (see below). The goal is to capture things far more important than the tester’s mechanical actions. If someone wants a record of those actions, we recommend video capture software (tools like BB TestAssistant or Camtasia). An automatic log allows the tester to focus on testing the product and recording the ideas, rather than splitting focus between operating the product and writing about operating the product.
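For what it’s worth, even the note-keeping side of such a record needn’t demand a heavyweight tool. Here’s a minimal sketch of my own (an illustration, not a tool we ship or endorse; the file layout and behaviour are assumptions) of a timestamped session-notes logger in Python:

    # sessionlog.py: a minimal, hypothetical timestamped session-notes logger.
    # Type a note and press Enter; it's appended to a dated log file with the
    # time, so observations, questions, and risk ideas get recorded with
    # almost no interruption to the testing itself. A blank line quits.

    import datetime
    import pathlib

    LOG_DIR = pathlib.Path("session-notes")

    def main() -> None:
        LOG_DIR.mkdir(exist_ok=True)
        log_file = LOG_DIR / (datetime.date.today().isoformat() + ".txt")
        print("Logging to", log_file, "- blank line to quit.")
        with log_file.open("a", encoding="utf-8") as log:
            while True:
                note = input("> ").strip()
                if not note:
                    break
                stamp = datetime.datetime.now().strftime("%H:%M:%S")
                log.write(stamp + "  " + note + "\n")
                log.flush()  # preserve the record even if the session ends abruptly

    if __name__ == "__main__":
        main()

A tester runs something like this alongside the product under test, jots thoughts as they occur, and the timestamped record accumulates for the debrief.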

You can find examples of the kind of test documentation I’m talking about here, in the appendices to the Rapid Software Testing class, starting at page 47. Note that the examples show varying degrees of polish and formal structure. Sometimes the documentation is highly informal, used to aid memory, to frame a conversation, or to trigger ideas. Sometimes it’s more formal and polished, when the audience is outside the test group. The overriding point is to fit the documentation to the mission and to the task at hand.

More to come…

Questions from Listeners (1): Handling Inexperienced Testers

Wednesday, May 19th, 2010

On April 19, 2010, I was interviewed by Gil Broza.  In preparation for that interview, we solicited questions from the listeners, and I promised to answer them either in the interview or in my blog.  Here’s the first one.

How to deal with un-experienced testers? is there a test approach that suits better for them?

Here’s what I’d do: I’d train them. I’d pair them up with more experienced testers (typically test leads or coaches). I’d start them with things that are relatively easy to test, gradually increasing the difficulty of the assignments and the responsibilities of the tester. I’d watch closely to make sure that the test leads and the novices were talking to each other a lot during testing sessions. I’d debrief them both together and individually. I’d have the novices read books, articles, and blog posts, and ask them for summary reviews, and then we’d talk about them. I’d give them experiential exercises, in which they and other members of the team would try to solve a testing puzzle, and then we’d talk about it afterwards. I’d reward them for demonstrating increasing skill: participating in Weekend Testing sessions; writing blog posts or articles for internal or external consumption; building tools and contributing to our development group or to the wider community.

Some might object that this approach is time-consuming. It certainly consumes some time, but it’s the fastest way I know to develop skill, and it’s similar in a general way to the training approaches for doctors, pilots, and skilled trades: involve them in the work, supervise them closely, and make them increasingly responsible for increasingly challenging work.

Here’s what I wouldn’t do: I wouldn’t give them a script to follow unsupervised and then presume that they’re going to do or learn the work. Everything that we know about learning and about testing suggests that both will be highly limited with this hands-off approach.  My friend Joey McAllister recently tweeted “The Barista at this Target Starbucks didn’t know if a mocha came with a shot of espresso in it. And she was working alone.” Most of the products that we’re testing aren’t simple cups of mocha. People are likely to suffer loss or harm if we take the approach that someone took with this barista.

How Can A Trainee Improve His (Her) Skills

Thursday, February 12th, 2009

A blogger on TestRepublic asks “How can a trainee improve his/her skill sets in testing?”

This is what I do. I recommend it to all trainees (or “freshers”, as they say in India).

Find something that interests you, or something that would be useful to you or to a client, or something that you must do, or a problem that you need to solve, or something that you think might be fun. Listen, talk, ask questions, read, write, watch, learn, do, practice, teach, study. Solicit feedback. Practice.

Think critically. Monitor your mental and emotional state. Hang around with people who inspire you on some level. Offer help to them, and ask them for help; more often than not, they’ll provide it. Practice.

Think systematically. Seek the avant garde. Defocus; look elsewhere or do something else for a while.

Practice. Observe the things in your environment; direct your focus to something to which you hadn’t paid attention before. Seek connections with stuff you already know. Look to the traditional. Refocus.

Learn, by practice, to do all of the above at the same time. Tell people about what you’ve discovered, and listen to what they tell you in return. Recognize and embrace how much more you still have to learn. Get used to that; learn to love it. Repeat the cycle continuously.

Practice.

It’s the same with any skill set. For me, it has worked for testing; it has worked for playing mandolin; it has worked for being a parent—even though there’s a universe of stuff that I still have to learn about all of those things. When I use the approach above, I make progress rapidly. When I don’t, I stall pretty quickly.

My friend and colleague James Bach has a similar approach for living and learning, and he’s written a book about it. It’s called Secrets of a Buccaneer-Scholar: How Self-Education and the Pursuit of Passion Can Lead to a Lifetime of Success.

These approaches are at the heart of the Rapid Software Testing mindset. They’re also a big part of what we try to teach people by example and by experience in the Rapid Software Testing course. It may sound as though there are lots of bits and pieces to cover—and there are—but they all fit together, and we give you exercises and practice in them to get you started. And these approaches seem to help people and get them inspired.

At conferences or association meetings, we present some of what we’ve learned in a formal way, but we also get up early in the morning and/or hang out in the pub in the evening, chatting with people, playing games, exchanging puzzles, trading testing stories. When we’re on the road, we try to contact other people in our network, and hang out with them. We blog, and we read blogs. We read in the forums, we write in the forums. We seek out passionate people from whom we can learn and whom we can teach. We point people to books and resources that we think would assist them in their quests to develop skill, and ask them to do the same for us. As a novice, you can do almost all of this stuff right away, and make goals of whatever is left.

In addition to Rapid Software Testing, one of the places that we regularly point new testers is the Black Box Software Testing course, available free for self-study at http://www.testingeducation.org/BBST, or in an instructor-led version from the Association for Software Testing. That course, co-authored by Cem Kaner and James Bach, but increasingly refined by collaboration between authors, instructors, and students, will give you lots of excellent knowledge and techniques and exercises.

The skill part—that comes with practice, and that’s up to you.