
Posts

Google testing blog comment...

I recently read a post on the Google Testing Blog titled: How Google Tests Software - Part Three. I added a comment to the post, but that comment has yet to appear. I thought I'd post my comment here in the meantime. (I've added some links here, for the curious.) “I agree that 'quality' cannot be 'tested in'. But the approach you describe appears to go ahead and attempt something just as, if not more, difficult. You suggest that a programmer will produce quality work by just coding 'better'. While a skilled and experienced programmer is capable of producing high quality software, who will tell them when they don't or can't? We are all potentially victims of the Dunning–Kruger effect, and as such we need co-workers to help. There are a host of biases that stop a programmer, product owner or project manager from questioning their work. The confirmation and congruence biases, to name just two. These are magnified by group-think

2.2250738585072012e-308

Meet my new friend 2.2250738585072012e-308. We've been hanging out recently. If you've not heard of him, he's about ten years old, but that's pretty old in [dog and in] software years. He's getting pretty famous in his old age, but he had humble beginnings as a lowly bug report on a Sun Microsystems website. It's rumoured he was first discovered back in 2001, but his big break didn't come until recently, when it was realised that he has the potential to be a key component of a Denial of Service attack that could bring down many Java-based systems [that accept floating point numbers as input]. This includes commonplace application servers like Tomcat, which accept floating point numbers as part of the HTTP protocol. 2.2250738585072012e-308 has now been placed firmly in my mental bag of tricks along with divide by zero, 2^32, null, imaginary numbers, localised floats and all the others that routinely get brought out to help me test and investigate software. Bu
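
For the curious, here's a minimal sketch (in Java, since that's where this bug bit) of the kind of probe a tester might use: feed the parser a handful of 'interesting' values, including my new friend, and flag anything that takes suspiciously long. The harness, the timeout and the input list are my own illustration, not part of the original bug report; the known behaviour is simply that, on the unpatched JREs of the time, Double.parseDouble() on this value would spin forever.

```java
import java.util.concurrent.*;

// A tester's probe: parse each tricky value with a timeout, and report
// anything that hangs instead of returning or throwing.
public class FloatProbe {
    public static void main(String[] args) throws Exception {
        String[] trickyInputs = {
            "2.2250738585072012e-308", // the value from the post
            "0", "-0.0", "NaN", "Infinity", "1e309"
        };
        // Daemon threads, so a parse stuck in a busy loop can't keep the JVM alive.
        ExecutorService pool = Executors.newCachedThreadPool(r -> {
            Thread t = new Thread(r);
            t.setDaemon(true);
            return t;
        });
        for (String input : trickyInputs) {
            Future<Double> result = pool.submit(() -> Double.parseDouble(input));
            try {
                System.out.println(input + " -> " + result.get(2, TimeUnit.SECONDS));
            } catch (TimeoutException e) {
                System.out.println(input + " -> parser appears to hang (possible DoS)");
                result.cancel(true); // won't actually stop a busy loop, but tidies the Future
            } catch (ExecutionException e) {
                System.out.println(input + " -> " + e.getCause());
            }
        }
        pool.shutdownNow();
    }
}
```

On a patched JVM every input comes back promptly; on a vulnerable one the first line never returns, which is exactly what an attacker would exploit via any HTTP header or parameter that gets parsed as a double.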

Why so negative?

Have you ever had trouble explaining to a non-tester why you appear to be intent on breaking their software? It can be difficult to explain why it's important. So I thought a video might help... If you want to read more about the scientific method, check out the hunky-dory hypothesis.

Believing you don't know

People want to believe. If you are a tester then you've probably seen this in your work. A new build of the software has been delivered. "Just check it works," you're told. It's 'a given' for most people that the software will work. This isn't unusual, it's normal human behaviour. Research has suggested that it's in our nature to believe first. Then, secondly, given the opportunity, we might have doubt cast upon those beliefs. We see this problem everywhere in our daily lives. Quack remedies thrive on our need to believe there is a simple [sugar] pill or even an mp3 file that will solve our medical problems. Software, like medicine, is meant to solve problems, make our lives or businesses better, healthier. Software releases are often presented to us as a one-build solution to a missing feature or a nasty bug in the code. As teams, we often under-estimate the 'unknown' areas of our work. We frequently under-estimate the time taken to

Into the testing hinterland.

Why do we refer to our ancestors as Cavemen? The evidence, of course! The cave paintings, the rubbish piles found in caves all round the world. It's simple: Cavemen lived in caves, they painted on the walls and threw rubbish into the corner of the cave. Thousands of years later we find the evidence, demonstrating they lived in caves. Hence the moniker 'caveman'. How many caves have you seen? Seriously, how many have you seen or even heard of? Now I'm lucky: as a former resident of Nottingham [in the UK], I've at least heard of a few. But if you think about it, you probably haven't seen that many. Even assuming you've seen a fair few, how many were dry, spacious and safe enough for human habitation? As you can guess, my point is: there probably isn't a great selection of prime cave real-estate available. It doesn't add up: the whole of mankind descended from cave [dwelling] men? Before you roll your eyes, and think I'm some sort of Creationist,

Just ban just!

Office meetings are interesting events. Seriously, even the most boring ones. There's usually an important reason for the meeting, even if that reason has been lost in the sands of Outlook repeat-booking, or the present attendees feel strongly otherwise. There's usually some kind of agenda, although as the meeting progresses, you may suspect the real agenda is hidden. But none of that is what captures my imagination. What sticks in my mind is the perspectives of the people in the room, especially when they are talking about a subject close to my heart, like testing. We can't help but give away our positions and perceived hierarchies when discussing what work we need to do, and how. For example, have you heard someone utter, in a meeting, words to the effect of: "Just create a fully automated framework for our regression test pack, to test the future releases." Then continue on, as if they had asked to "borrow a pen", or if you could "just adjust t

ID Skeptic

At a client site, a few years ago, I had an interesting discussion with a 'senior programmer'. Our discussion centred on a configurable home-page. A user could decide what news or other information, etc., they wanted to display on their home page. They'd start off with a generic page, and the user could add or remove certain types of content to customise their page. Once they saved the 'new look' site, their choices would be saved on the web-server. The company didn't want to force people to log in, or even make them sign up for an account. The goal was to keep it simple for the user. But they needed a way to uniquely identify the users, so they used an existing feature of the website. The first time a user came to the site, they were cookied with a 'unique' id 'code'. We could then use this identifier as a key in our database to store the details of what the users had configured their homepage to look like. The testers reading th
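
To make the setup concrete, here's a minimal sketch of the mechanism as I understood it, not the client's actual code: on a first visit, drop a cookie carrying a supposedly unique id, then use that id as the database key for the visitor's homepage choices. The servlet, the 'visitorId' cookie name and the UUID generation are hypothetical stand-ins; the post says the real id came from an existing feature of the site, which is rather the point of the skepticism.

```java
import java.io.IOException;
import java.util.UUID;
import javax.servlet.http.*;

// Sketch of the cookie-then-key pattern described in the post.
public class HomepageServlet extends HttpServlet {
    private static final String ID_COOKIE = "visitorId"; // hypothetical cookie name

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        String visitorId = findVisitorId(req);
        if (visitorId == null) {
            // Stand-in id generator; the real site reused an existing 'unique' code.
            visitorId = UUID.randomUUID().toString();
            Cookie cookie = new Cookie(ID_COOKIE, visitorId);
            cookie.setMaxAge(60 * 60 * 24 * 365); // persist for a year
            resp.addCookie(cookie);
        }
        // visitorId now keys the stored homepage configuration, e.g.
        // HomepageConfig config = configStore.load(visitorId);
        resp.getWriter().println("Homepage for visitor " + visitorId);
    }

    private String findVisitorId(HttpServletRequest req) {
        if (req.getCookies() == null) return null;
        for (Cookie c : req.getCookies()) {
            if (ID_COOKIE.equals(c.getName())) return c.getValue();
        }
        return null;
    }
}
```

The whole scheme rests on that id actually being unique and stable: cleared cookies, shared machines or colliding ids all mean one visitor sees (or overwrites) another's homepage.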