
Announcing Session Tester

UPDATE: Unfortunately, this is now a dead project. While we had amazing initial community support, the project waned after an economic downturn. We were unable to staff up a team to carry it forward. If you would like to take up the torch and start the project up again, contact me for the source code.

We’re pleased to announce a new open source, lightweight session-based testing management tool called Session Tester. Aaron West and I have been working on this project off and on for quite a while, and finally have a public beta release. We expect the beta release will be a little rough around the edges, but we’re pleased to have created something that we hope will help make session-based testing more accessible. We are following the open source model of “release early, release often” and value your feedback, which will help shape the tool as it emerges from beta. Bug reports and usability issues are very welcome.

It’s early days yet and the tool is quite simple. User feedback will influence what features are added to the finished product(s). That’s the beauty of the open source product development process, where the evolution of the product is transparent and guided by the user community.

Session Tester is written in Java, so it should run on any operating system that has a recent version of the Java Runtime Environment. We have tested it on Windows XP, Vista, Mac OS X (Intel) and Linux.

We developed this tool as an aid for testers who are using session-based test management. (Our initial interpretation of SBTM, outlined in the article, is more lightweight than what some might be used to.) It provides an easy way to time your sessions, remind you of your testing mission, and record your notes. It also provides an idea primer in case you get stuck, and a testing cheatsheet to review while you are in the thick of testing and would like some ideas.

Even testers who aren’t in regulated environments and don’t take detailed notes find the tool helps them add structure to their thinking when doing exploratory testing.

Getting Started

Download Session Tester 0.1 here. This is a beta release, so we don’t have proper installers yet. Be sure to have a recent JRE installed (at least JRE 6, update 6), or else Session Tester might not run properly.

If you are unsure about your JRE, go to the Sun site for a new version. We last used update 11, but anything recent should work: Java SE Downloads. There are also instructions on installation and usage in the user guide. Once you have a recent JRE, just unzip the download into a folder, double-click the executable jar file, and start testing.
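
If you’d rather check than guess, a trivial probe (equivalent to running java -version at a command prompt) prints the version the machine will use. This is just a convenience sketch, not part of Session Tester:

    // Prints the JRE version; Session Tester wants 1.6.0_06 or later.
    public class JreCheck {
        public static void main(String[] args) {
            System.out.println(System.getProperty("java.version"));
        }
    }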

This is a beta release, so we expect bug reports and usability issues, as well as feature requests. Feel free to share any of those ideas on our forum. Please contribute to the community there. We will also be looking for contributors to the project itself.

Current Features

  • a timer to keep track of your session length
  • note taking that is automatically saved and formatted into XML, with extra tags for more organization
  • optional reminders:
      • a mission reminder so you don’t lose focus
      • a session-end reminder so you can start winding your session down
  • an idea primer for those times when you get stuck trying to generate test ideas (thanks to Michael Bolton for introducing us to oblique strategies and contributing to the current list)
  • an exploratory testing cheatsheet, supplied by Elisabeth Hendrickson (check under the Help menu)

Currently, we are building the tool for individual testers to take notes, manage their sessions, prime ideas, and review their session files in XML format. Eventually, we will extend this to team management.
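
To give a sense of what a session file might hold, here is a minimal sketch using only the standard Java XML APIs. The element and attribute names are my own assumptions for illustration; they are not Session Tester’s actual schema.

    // Hypothetical sketch only: the session/mission/note element names
    // are assumptions, not Session Tester's actual file format.
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.OutputKeys;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;

    public class SessionNoteSketch {
        public static void main(String[] args) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().newDocument();
            Element session = doc.createElement("session");
            doc.appendChild(session);

            Element mission = doc.createElement("mission");
            mission.setTextContent("Explore the login flow for error handling");
            session.appendChild(mission);

            // An extra tag on a note is what makes notes easy to
            // organize and review after the session.
            Element note = doc.createElement("note");
            note.setAttribute("tag", "bug");
            note.setTextContent("Password field accepts 5,000 characters");
            session.appendChild(note);

            Transformer t = TransformerFactory.newInstance().newTransformer();
            t.setOutputProperty(OutputKeys.INDENT, "yes");
            t.transform(new DOMSource(doc), new StreamResult(System.out));
        }
    }

Because the output is plain XML, session files can be reviewed, diffed, or transformed with whatever XML tooling a team already uses.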

Project Goals

Our goal with this tool is to help train testers in session-based testing, and to make the practice easier for everyone. We also thought hard about the idea-generation part of testing, and have tried to turn the interruptions for note taking into opportunities for generating ideas and regaining focus, instead of mere annoyances.

Thanks

Thanks to Patrick Lightbody for generously providing the space and for supporting this project, as well as to James and Jon Bach for creating SBTM, Antony Marcano for the inspiration and useful feedback, and to Jared Quinert and Mike Kelly for their usability feedback. Thanks to Michael Bolton for the inspiration for the idea primer, and for contributing test phrases to it, and thanks to Elisabeth Hendrickson for granting us permission to provide her cheatsheet as a thinking reference for testers who are using the tool.

Questions?

Sign up and post questions, concerns, comments, queries, etc. on our Clearspace discussion forum.

We Need More Metaphors. How About Music?

This month, David Hussman and I collaborated on a piece for SQE’s Iterations newsletter. Check it out here: Iterations January 2009. We talk about the dominance of the manufacturing metaphor in software development, and propose music as an alternative. Be sure to read my “We Need More Metaphors” and David’s “Music and Metaphor” in the newsletter.

Both David and I have been thinking along these lines for a while; David has tirelessly presented on his coaching style using lessons from his days as a music producer and professional musician. I’ve written about music and testing in Exploratory Testing: Finding the Music of Software Investigation and, with Michael Bolton, in Testing and Music: Parallels in Practice, Skills and Learning. Watch for more in this space.

Testing is Overrated

I stumbled across a presentation by Luke Francl called: Testing is Overrated. He could have called it: “programmer testing is overrated” but that doesn’t have as much zing to it.

His slides offer a refreshing reminder to look for balance in your testing techniques and coverage models. What is especially refreshing is that this comes from a programmer who is studying testing effectiveness seriously.

Look through the slides and his handout. They provide a nice minimal, complementary and diverse set of testing ideas that when used together can be incredibly powerful.

(If you don’t feel like clicking the links, he recommends a combination of the following: programmer unit testing, skilled manual testing, usability testing, and code reviews.)

Update: Apparently I am behind the times, and Matt Heusser blogged about this very topic in August. It’s worth being reminded of though. 🙂

“Don’t Shout at your Disks” and Performance Testing

I get asked about performance and load testing a great deal. It’s fascinating work, and sometimes the strangest things can cause performance problems in a system. When people ask me what goes into performance and load testing, it’s difficult to explain simply. Here is a related video that provides a tiny glimpse into what performance latency can look like: Unusual disk latency

The rest of the story with performance testing, the part about how people actually find things like this in systems without relying on chance or a fluke, is the enormous amount of analysis involved (using basic probability, statistics and counting rules) and the use of the scientific method to try to explain data anomalies.

When you see something strange in the data, and recreate those conditions using simulation, you start to look at points in the system where failures can occur. When you observe, during an experiment, phenomena that trigger the strange behavior in the system, you work to replicate that accurately through emulation or simulation. That’s why, when you see the problem repeated for a bug report, you see someone knocking a device with their knee, or yelling at the disk drive. It looks crazy, random, and like it was achieved by luck after the fact, but getting to that point is quite logical: you do whatever it takes to simulate those conditions to make the problem repeatable.

Often, looking at performance data in various ways leads you to the problem areas. In other words, creating and staring at graphs, histograms, mean/median/mode/percentile data is the first place you notice an area to target. From there, you use tools to simulate those conditions, and observe parts of the system under simulation. When you see strange behavior, you figure out how to repeat it. Believe it or not, I’ve used my knee, cold soda pop cans, soldering irons (with hardware) and other software (load testing and others) to get systems to the desired problem state in a repeatable fashion.
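
To make the analysis part concrete, here is a minimal sketch of the kind of summary statistics I mean, over made-up latency samples. Notice how far the mean and 95th percentile drift from the median when a system misbehaves:

    // Summary statistics over raw response-time samples; the sample
    // values are invented for illustration.
    import java.util.Arrays;

    public class LatencySummary {
        public static void main(String[] args) {
            double[] millis = {12, 14, 13, 11, 250, 15, 12, 13, 14, 900};
            Arrays.sort(millis);

            double sum = 0;
            for (double m : millis) {
                sum += m;
            }
            double mean = sum / millis.length;
            double median = percentile(millis, 50);
            double p95 = percentile(millis, 95);

            // Prints: mean=125.4 ms, median=13.0 ms, p95=900.0 ms
            System.out.printf("mean=%.1f ms, median=%.1f ms, p95=%.1f ms%n",
                    mean, median, p95);
        }

        // Nearest-rank percentile over a sorted array.
        static double percentile(double[] sorted, double p) {
            int rank = (int) Math.ceil(p / 100.0 * sorted.length);
            return sorted[Math.max(0, rank - 1)];
        }
    }

A mean an order of magnitude above the median is exactly the kind of anomaly that sends you back to the system with simulations, soldering irons and knee knocks until the problem repeats on demand.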

I’m not really a performance tester by trade, and I get my advice from people like Scott Barber, Ben Simo, Mike Kelly and Danny Faught. Each one of them has helped me in my performance testing work this year. As Scott says, 20% of a performance tester’s work is generating load; the other 80% is the analysis, which can lead to strange and interesting conclusions, and the odd weird look from other team members. 🙂

Thanks to Aaron West for pointing out the video. It brought back several memories of various testing experiences I’ve had over the past couple of years.

Interesting Posts on Agile Challenges

Scrum Challenges

A couple of posts that describe how many teams are flailing and failing with Scrum:

I’ve observed similar patterns. However, the oft-heard Agilist response, “they are failing because they just don’t get Agile” or “if Agile failed for you, it’s your (or management’s) fault”, smacks of blaming the victim(s). An emerging response that I support is:

Maybe a pure form of ‘Agile’ isn’t appropriate for that team, in their context.

(Time for some process fusion?) Philippe Kruchten has a great talk on this: Situated Agility – Context Does Matter, a Lot.

It’s also very difficult dealing with the scorched earth of a failed Scrum project after the Scrum trainers have left and the team is struggling on their own, feeling humiliated. “Are we the only ones failing? Why do we hear all these wonderful reports of how Scrum would solve all process ills? What’s wrong with us? We’re trying…” It’s hard to get them to retain the good practices they learned from Scrum and to encourage them not to throw out everything and return to a system that wasn’t working before either, but is more familiar, so it feels safer.

Rumours of Practice

TDD: more of a rumour of practice than an actual practice? (Much like some of what is described in the two posts above.)

Roy Osherove: Goodbye mocks, Farewell stubs

My own observations about these and other Agile practices being more of a rumour of practice than an actual practice lead me to wonder if Agile practices are another flavour of bubble. Time will tell, but some of the behavior is troubling. It still galls me that many blindly parrot TDD as an unalloyed good practice, instead of treating TDD as another tool to think about using, particularly when people might be basing their conclusions on rumours rather than personal experience. This irrational exuberance is one reason why stock markets ramp up on empty speculation, real estate prices boom on over-valued properties (using mortgages that people can’t afford to pay back), and tulips are bought with abandon. (At least you can plant your tulip bulbs and enjoy beautiful flowers when the bubble bursts. What do you do with your old un-maintainable tests?)

My advice to those who may be struggling? Don’t worry about being “Agile” (particularly if you’re trying and failing); worry about providing value. That’s what really matters anyway. (That, and enjoying your work.) Providing value to the users of your software, and valuing the people you work with, is what is important. Value, coupled with the skill and interest level of the team members, will trump methodology in the long run.

A Post-Agilist Concept: End Methodology Wars

One of the Post-Agile ideals I have witnessed and encourage is the breaking down of walls between methodology camps. When teams apply practices, processes, rituals and tools from Agile methodologies and create a fusion with other, compatible processes in order to create value, interesting things occur. In spite of apparent differences, many good ideas can be gleaned from dissimilar processes, then applied and adapted on your team with great effect. This paper, Towards A Framework for Understanding the Relationships between Classical Software Engineering and Agile Methodologies, expresses a Reagan-esque “Mr. Process Zealot, tear down that wall!” ideal.

While I may not agree with all the details in the paper, it has some important concepts I want to point out. First, the authors describe the tension between Agile pundits and phased or linear “waterfall” methodology pundits. They point out that this tension is sometimes referred to as a “methodology war” and say that this behavior is harmful to software development communities. They also find evidence of compatibilities between seemingly incompatible methodologies, and introduce an interesting framework for analyzing software methodologies called “CHAPL.” (They get extra points for using a mnemonic, and for the “C” representing “contextual analysis”.)
An excerpt:

On one hand, [some] software engineers … dismiss agile methodologies and strongly advocate the value of classical [software engineering] practice, while others … insist that agile methodologies will replace Waterfall-like models and apply to all software projects. This heated debate is sometimes referred to as the “Methodology War” …
It appears that the typical characteristics of the debate are that the proponents of the conflicting methodologies:

  • describe each other in extreme and biased terms
  • devalue the opponents’ methodologies and/or practices
  • justify their own values through either experience-based explanation or inadequate comparisons between the methodologies.

We believe that this war is detrimental to [software engineering] practices. In order to end the Methodology War, some researchers have presented the similarities and compatibilities between the two methodologies.

Methodology wars are the inevitable outcome of process visionaries working against the grain to introduce new ideas, and of the resistance they face from more established process idealists. Sometimes radical behavior or extreme statements are an effective way to get attention for ideas that would otherwise be dismissed. Now that Agilism has become as well known as other process communities, it’s time to stop fighting, find the areas where we agree, and try to improve how we all develop software. Instead of posturing over which process movement is “best”, let’s focus on the value we can create together.

The paper was published at APSO 2008: Scrutinizing Agile Practices, or “Shoot out at Process Corral”, held in conjunction with the 30th International Conference on Software Engineering, Leipzig, Germany, 10–18 May 2008. Yes, I was on the program committee.

Software Development Process Fusion Part 2

What is it? The Short Version

Software development process fusion involves taking different kinds of processes and tools and utilizing a combination on your project to help you reach your goals. You aren’t just using one particular methodology or school of thought or toolset, you are using a combination of tools that fits your unique needs on your project to help create value.

What is it? The Very Long Version

In Part 1 of this series, I talked about fusion in music, from the early days of the genre, when it was somewhat controversial and aimed more at enthusiasts, to the present, when most music we hear on popular radio stations is a fusion of styles. On country stations, we hear rockabilly, pop, rock and roll, blues and traditional country fused together in many songs. Popular music now has influences from all kinds of cultures, and we are seeing hip hop music fused with traditional Indian music and pop. In my collection, Canadian artist Cat Jahnke includes folk, pop, rock, gospel and film music in her songwriting and performing. A more obvious fusion might be found in fellow Canadian artist Rebekah Higgs’ music, categorized in the “folktronica” genre, a combination of electronica and folk. Another Canadian group with a wide variety of styles fused together is the Duhks, who “…play a blend of Canadian soul, gospel, North American folk, Brazilian samba, old time country string band, zydeco, and Irish dance music…” according to Wikipedia.

These kinds of fusions of ideas are all around us. The fusions of styles from different traditions, cultures and ideas are due in part to our increasing interconnectedness and to mass media and communication. In the effort to create something new in the market, we often borrow something old or unfamiliar in our culture and mix it with the current and familiar. We have fusion cuisine, for example (a Chinese restaurant near our home became an Italian restaurant and serves delicious Asian-Italian fusion cuisine). We see it in exercise, with holistic training and regimes that combine Eastern, Western, kinesiological and spiritual elements. Often a combination of ideas helps us reach our ultimate goal, which isn’t to create a fusion of styles, or to adhere to just one style, but to achieve a desired effect or outcome.

The goal of each of these examples is quite clear. With music, the musician’s goal is to create something that resonates with them, an expression of their art and their personality. Their other goal is to produce something that is satisfactory and enjoyable for their audience. With restaurants, they want to provide fresh, delicious food. With exercise programs, the goal is better health and fitness. Another important underlying theme is financial success. We all need to make money somehow to live, and even though we may produce something that is wonderful, it may not be recognized by the market. Sometimes our goals in software development aren’t quite as clear, particularly for those of us down in the details of coding, testing, writing, etc. It can be hard to see the big picture and measure ourselves against it. It can also be hard to deal with the imprecise environment that our software is released into and instead cling to something that feels predictable and stable, like a well-defined process.

Software development process fusion involves taking different kinds of processes and tools and utilizing a combination on your project to help you reach your goals. You aren’t just using one particular methodology or school of thought or toolset, you are using a combination of tools that fits your unique needs on your project to help create value.

This paper is a recent example: Process fusion: An industrial case study on agile software product line engineering, which describes the fusion of two bodies of practice. I’d like to find more that identify several different schools of thought: iterative and incremental, Agile, phased or “waterfall”, spiral, user experience, and so on.

Process Mashups

A couple of years ago, I was interviewed about post-Agilism by a company that does industry analysis of the software field. One of the interviewers used an interesting term when she described the message she got from my work. She said something like this: “We really see that teams in the future will be less dogmatic about what particular process ideology they need to follow, and will be more focused on using different ideas to get the results they need. We’ll see all sorts of interesting process mashups as people combine different process ideas on their own projects to reach their particular goals for that particular project.” Wow. She got that from my writing? “Process mashup” wasn’t a term I had used, but it’s another way of explaining what I am trying to get across.

Mashup seems to be a relatively new term that describes combining different sources into one form. Wikipedia has some different examples of mashups.

Here in Canada, a fusion of ideas is built into our culture, since our society is modeled as a “cultural mosaic”, which means people retain and continue to practice their original culture when they move here to live. On CBC (Canadian Broadcasting Corporation) radio, there is an interesting show called Mashup, hosted by Geeta Nadkarni, that I enjoy listening to. The website describes the show:

Over the summer Mashup will explore what really happens when cultures intersect in love, at work and at play. You’ll hear from immigrants, second-generation Canadians, mixed-race Canadians, people who’ve been in Canada for decades – each with a personal story about how cultures collide in their daily lives. Canada is a country of mashups. People from different cultures find themselves living and working together here – bumping into different values, assumptions and different ways of doing things.

When I listen to the stories and challenges of how people overcome the collisions of culture, I see many parallels on software development teams. In fact, we tend to have our own little mishmash of cultures on our teams, thanks to our ability to collaborate using technology; there is often a shortage of skilled people in one particular area, so people from different countries frequently end up on the same teams. I see this culture mashup as a more accurate description of what most teams experience and how they implement processes, so why not embrace it?

We’re Doing That Anyway

Most teams I work with tend to use a blend of process ideas in practice. Often, we like to talk about our Scrum or XP process in the pure sense on mailing lists or at conferences or user group meetings, but what we are really doing is a blend of Scrum or XP, corporate culture, and practices we’ve learned through experience that seem to work for our project but don’t necessarily fit the process literature. Often teams apologize to me if they are doing something that isn’t by-the-book process-wise. I ask: “Is it working for you?” and if they say “yes”, I tell them not to worry about it.

It’s important to realize that pure “Agile” and pure “waterfall” don’t really exist on projects. They are ideals, or strawmen, depending on what your particular software religion is. (That includes my writing here; it is a model of software development that I and others find ideal. We strive towards the goal of using our process to serve us, rather than working to serve the process.) There is nothing wrong with ideals, but they can be carried too far. Many feel that Royce’s original “waterfall” paper described an ideal, and that process wonks got to that diagram in Figure 2 on about the second page or so, stopped reading, and adopted it on that alone, and the waterfall practice was born. Somewhere along the way, most teams that were focused on results figured out how to adapt their phased or “waterfall” approach to get the job done. Others got too caught up in the ideal of the process and created bureaucratic nightmares that produced more paper and procedures than working software.

We see the same thing with Agile extremism: the process is held up at the expense of the people on the project, who are blamed if anything goes wrong. Roles like testing and technical writing are marginalized (“there is no tester role on Agile projects”): tests are automated, and manual testing skills are dismissed after the fact as “being negative.” Testers are twisted into any role but testing, such as development or business analysis. Tests become “documentation” or “requirements” that drive development. While there’s nothing wrong with experimenting with ideas, it should not be at the cost of dehumanizing skilled people who are trying to deliver the best working software they can.

What exists on most successful teams I’ve worked with, who realize that they need to reach goals for the organization, for their users, for their teams, and for each individual, is usually a combination of changing process ideas and practices at work at any given time. Some are recognizable and named, and others are just what the team does in that environment.

In the Agile world, most process adoptions tend to be a blend of Scrum and XP. Some teams I’ve worked with couldn’t do a pure implementation of either because of their unique circumstances. One team couldn’t completely adopt XP because the physical layout of the building prevented them from arranging themselves all together. Shrinkwrap software companies often have trouble getting a real customer representative on their team, and often have a product manager, business analyst or someone else in a customer-facing role stand in. Sometimes teams are successful at delivering working software in spite of process adoption limitations, and sometimes they aren’t. (Usually, the failures I’ve witnessed are due to a lack of skills rather than a process failure.) There are a lot of perfectly good reasons why Agile process adoptions aren’t implemented in the purest sense and yet still succeed. (Hint: skill is usually a big factor.)

People have also adapted so-called “waterfall” or phased lifecycle approaches. Furthermore, there are different ways of viewing software development processes. Steve McConnell explains this in a comment on his blog post:

I think it’s important to remember that Waterfall and Agile aren’t the only two options. “Agile” is a very large umbrella that includes many, many practices. “Waterfall” is one specific way of approaching projects that’s in the broader family of “sequential” development practices. Staged delivery, spiral, and design to cost are three other members of the sequential family. I agree that waterfall will only rarely do better at providing predictability than agile practices will. But there are other non-Waterfall practices within the sequential family that eliminate 90%+ of the weaknesses of waterfall and that are more applicable than full-blown agile practices in many contexts. (By full blown, I mean like the project in the cautionary tale–fully iterative requirements, etc.)
…There is no One True Way. When people think about the fact that there’s software in toasters, airplanes, video games, movies, medical devices, and thousands of other places, it seems kind of obvious that the best approaches are going to arise when people pay close attention to the needs of their specific circumstances and then choose appropriate practices.

That’s contextualist thinking expressed eloquently, and it is easier to hang your hat on than the “doing what works for you” post-Agilism maxim.

Software Development Process Fusion – Know Your Goals

To get this fusion concept to work properly, it is incredibly important to know what your goals are for providing value to your customers while building value on your teams. Otherwise, you may end up with a mishmash of watered-down practices and have no way to measure whether they are helping you or not. Without an understanding of what success looks like, your team may end up with a “we’re doing what works for us” combination of process ideas that gets you no further than what you were doing before. I have seen this on countless teams adopting Agile processes. They thought adding daily standups, using iterative development and TDD, and getting rid of up-front planning and documentation would be enough for success, and they ended up worse off, value-wise, than with a heavyweight process implementation. My response to the “we’re only adopting what works for us” concept is: a) Have you tried it? and b) If you have, can you evaluate whether it is helping your project or not? If you can answer both of those, then that phrase is completely appropriate. If not, we need to be sure we really do know what works for us, and have a standard of measure to evaluate whether what we are adopting is helping us now; if something helped us in the past, is it still helping us now? Believe it or not, sometimes the best processes can become stale and ineffective over time. Can you tell what is working on your project?

One team that I was on in the pre-Agile era was ruthless with tools and processes. Our development team lead would always say, “Does the tool suck, or do we need more practice or training with it?” whenever a tool or process wasn’t working for us as advertised. Notice the people focus: he empowered us, and made sure that we had a way to measure our tool and process adoption against our project goals. If things weren’t working, the finger was first pointed at the tool or process, not at the people doing the work. We ended up with a combination of practices that evolved over time. We had clear goals on what we needed to do and what success looked like. We used:

  • an iterative and incremental delivery lifecycle
  • experimental programming/development
  • prototyping
  • strong customer involvement in planning and development
  • a strong emphasis on individuals developing their skills
  • frequent communication (standups, quality circles, pairing, collaborating, regular meetings with stakeholders and executives on goals and vision)
  • varied methods of developing requirements
  • varied methods of frequent testing, from the planning and idea phase at the project’s beginning, through product critique with serious exploratory testing on anything delivered, to the project’s end
  • varied automation in testing, build processes, and anything else that helped us be more productive

Anything went, within reason: as long as it wasn’t unethical, didn’t hurt someone, and didn’t threaten our deadlines, we were encouraged to experiment with different processes and tools in the ongoing effort to build the best software we possibly could, software that not only satisfied but impressed our user community. This was true “continuous improvement” in action. Sadly, most process ideals that I see completely miss out on most of this. They may have an iterative lifecycle, but don’t realize that the point is to help you deliver something your customer needs in stages to get their feedback, and to be able to adjust your plan as it hits the reality of the project. They do testing, but artificially constrain it by trying to automate everything, or severely constrain requirements by forcing them into “tests”. They talk to each other, but hold daily standups and iteration meetings whether they are really communicating anything useful or not.

The teams that seem to miss out on creating value over a sustained period of time are not open to ideas outside their favorite process, and belittle and marginalize people who have ideas on how to solve real problems. They look to the process to solve those tough problems, and cling to it instead of looking at the bigger picture. Successful teams I’ve worked with, on the other hand, adapt and change their process, and understand that the process is yet another tool in the software toolbox to help them reach their goals. Process isn’t king; skilled people are. (Lacking in skill? Invest in skill development before worrying about your process too much.)

On the team I described above, we didn’t care what role a person played as long as they provided a service that helped us create value in our product. We needed people to translate requirements and product vision from something vague to something concrete that programmers could work on. We used a variety of lightweight ways to express this, and didn’t have rules about it. If it worked, it worked, and we used it until it stopped working for us. The same went for testers. Those who were skilled at finding problems in designs and in the product, and who provided an information service, were valued and encouraged, no matter what tools or processes they used. The quality of their information was what was important. No one walked around saying “That’s not Agile!” or “That’s not [process we were using]!” to discourage you if you were doing something different. If it worked, the creativity was celebrated, not feared and driven out because it wasn’t recorded in some book somewhere.

When the Agile Manifesto came out, and processes like Scrum and XP were gaining traction, we tried the ideas and adapted them into our process fusion. Processes and tools that worked were retained, and surprisingly, some practices like TDD were jettisoned over time, with the focus moving towards developing programming skills, and some sort of lightweight code inspection process taking TDD’s place. We heard success stories from other teams who were doing wonders with things that had stopped working for us, and we wondered a bit why we were different, but at the end of the day, we were reaching our goals. We had stable, working software, a process that worked, satisfied customers, and a highly skilled team that valued each other and the diversity that individuals brought to it.

The Rule is There Are No Rules

I’ve seen too many process zealots or snake oil salesmen display bigotry towards others with different ideas that don’t fit their particular model. It’s easy to pick on the Agile movement because it’s a big fad right now, so there are a lot of readily available examples of people going around saying “That’s not Agile!” and creating an elitist club. Over my career, I’ve seen people in the Object Oriented movement do this, and some RAD folks looked down their noses at one team I was on because we didn’t use the “approved” prototyping tools they used. Teams at high CMM levels were also elitist snobs, as were some RUP practitioners, consultants and tool floggers. There are a lot of people out there who are more than happy to set an ideal standard of measure for us to live up to, make us feel guilty for our software “sins”, and then profit from telling us we’re doing it wrong. A wise theologian once said something like this: “Without sins, the priest would be out of work.” Next time you feel you are doing something wrong, or someone else makes you feel that way, consider how they might profit from making you feel that way. If you are creating value even though you’re “doing it wrong,” ignore them.

I’ve seen novel ideas for real-life project problems turned aside because they didn’t follow somebody’s idea of process rules. If a pure process adoption is your goal, then you may have to do that sort of thing, but if a successful product that delivers value is your goal, following arbitrary process rules can be a real hindrance. If the software is well developed, who cares that you did some up-front planning? Who cares if you didn’t use story cards? If the team has great communication, who cares if you don’t do daily standups? If testing is done well, who cares if it isn’t completely automated? If you are good at eliciting and expressing requirements, who cares if you didn’t use ATDD or some other Agile automated test ideal? If your code is stable and maintainable, who cares that you didn’t use TDD? If you deliver value, who cares that you needed some up-front design? If your software is usable, who cares that you didn’t use BDD, but used traditional user experience techniques instead? (I’m not discouraging you from trying any of those Agile practices; indeed, try what you like as you strive to improve your process, but do it on your own terms. Don’t feel pressured to try them just because it seems everyone else is doing it.)

As I mentioned earlier, we can put artificial bounds around what we do in software development, and invent rules that can impede our goals. Furthermore, rules that worked really well on some high profile project may not be appropriate for our project. Also, rigid rules can be a barrier to creativity and creating novel solutions, which are both the lifeblood of technological innovation.

My stance on all of this: if the particular process or process fusion you are using is working for you, do that. I really don’t care what it is, whether it is an Agile process, Cleanroom, RUP, Evo, or some phased “waterfall” variant. If you have a bang-up XP implementation that is working for you, your team and your customers, that’s great. Keep doing it. If you have a process fusion, don’t feel bad because someone says, “That isn’t Agile.” All I am encouraging is that you understand your goals, have a way to measure whether your tools and processes are helping you or not, and be open to other ideas when you need to adapt and change. Look at the history of software development and the ideas that have come before, and try to learn from as many different sources as possible. Enlarge your software development process toolbox, and try combinations of ideas. Others have done this before, so it isn’t really that radical. Google the term for more ideas.

Agilism all too often ends up with people being much more concerned with following “the rules” than with providing value and reaching goals. Merely following a good process in the hopes that all those tough problems will be solved by strict adherence to it may not work for you. There is a difference between understanding what you need to do and adapting as you go, and merely following a ritual without understanding the meaning behind it.

What Process Combinations Have Your Teams Created?

I see this sort of thing as having a future in software development processes, partly because successful teams I’ve worked on have always changed and adapted not only their plans, designs and code, but also their tools and processes. We’ve also seen a fusion of ideas become popular in other areas, and it seems like a natural evolution. First we work through various extremes, and then we find some sort of balance. I’d like to hear about the combinations and adaptations of processes on your team. One day I hope to hear of a team that says: “We created a process mashup like this: we learned how to measure performance requirements of our development efforts and software inspection from Evo, iteration planning and management from Scrum, continuous integration from XP, persona creation from the user experience world, user testing from Cleanroom, and a large variety of testing ideas from various schools of thought in testing, combined with this other stuff we do on our teams that isn’t written down in a book or talked about by experts.” Most importantly, what are you doing to create value for your customers and your team? Are you using a purist implementation of a process, or are you combining different process aspects to reach your goals?

We Need Testing Example Videos

A few days ago, Michael Bolton, Pradeep Soundararajan and I were sitting around talking about software testing. (Shocking, I know.) Pradeep brought something up that has been on my mind lately: we need more testing videos.

Importance of Visual Examples

Several years ago, I spent a good deal of time with James Bach over a period of a few days. This experience involved a lot of testing exercises, but I learned the most by watching James solve problems and test software.

When I first saw him invoke SFDPOT (the “San Francisco Depot” heuristic: Structure, Function, Data, Platform, Operations, Time) in real time while testing an application, instead of as a preliminary thinking tool beforehand, which is how I had been using it, I was blown away. I immediately wanted to do that myself. I’d been reading and following James’ work for several years up to then, but seeing him put his ideas into practice was a huge lesson for me. My testing benefited greatly.

Fast forward to my own consulting career. I frequently do live exploratory testing demonstrations during or at the end of my talks. Sometimes they crash and burn and I feel foolish because I said something dumb in the moment and was wrong (Heavens! Can you imagine?) and it was captured on video, and what will people think… but, even when that happens, a small number of people come up and thank me. I’ve even had programmer colleagues attend a talk and say that it all fell into place when they saw me do the demo at the end, and it all made so much sense when they saw it in action.

Video Format

I’ve seen some pretty good recordings that use a screen recorder like Test Explorer or BlueBerry Test Assistant. These aren’t bad, but they lack the human aspect, which prevents me from completely identifying with what is going on onscreen. I propose demos that use screen recording, but that also show different perspectives of the tester sitting at their machine, with audio of the demonstrator expressing their thoughts, perhaps prompted by a narrator. I want to see someone sitting at a computer, and hear them talk about what they are doing and why, alongside the recordings of the screen. I’ve seen some videos similar to this from one of the CASTs, but I want more.

We Need More Testing Videos

I don’t have the time to invest in this right now, so I’m asking people in the testing community to record demos, and make them available. (Maybe we could see if Antony or Rosie could host testing examples for learning, or link to them? Or perhaps someone would like to start a hosting service for example videos?)

I don’t care what flavor of testing you subscribe to, just show us what you enjoy. Show us how you think and test. Demonstrate your skill and use it as a teaching tool. Surprise us. Inspire us to try out your ideas for ourselves.

I’m Speaking in Toronto this Summer

I’ll be traveling to Toronto twice this summer for testing presentations. The first trip is for TASSQ, on June 24, where I’ll be presenting my “Man and Machine” talk about interactive automated testing. This is a different approach to automation that seems like common sense to some, and is deeply controversial to others (usually tool vendors and “automate all tests” types).

On July 15, I’ll be co-presenting with Michael Bolton at CAST 2008. We’ll be talking about parallels in testing and music, a topic I’ve written about before.

Michael has a nice blog post describing CAST that is worth a read.

I hope to see you in Toronto this summer, and finally meet some of you face-to-face. If you’re from the area, I hope you’ll come.