MMP11: How to Break Things Really Good

Jon Boyd 
Director of Digital Communications, North Park University


The audio for this podcast can be downloaded at http://2011.highedweb.org/presentations/MMP11.mp3



Announcer: This is one in a series of podcasts from the HighEd Web Conference in Austin 2011.

[Applause]

Jon Boyd: Since most of you can read, you already know that "How to Break Things Really Good" is the title of this session, and I really mean it. Breaking things can be very good. In fact, I have a slide about that: good is breaking things. There's a little bit of vandal in all of us, I think. You know, you at least like to step on the puddles when they're freshly frozen, break the ice, or throw rocks in the lake, or throw rocks at something else.

But that's not so much what I'm talking about, naturally enough. It's quality assurance that's good, and quality assurance testing requires that you break things, or at least try to. So that's why we're talking about breaking things today, and if it taps into your little inner vandal, that's OK too. I know I get a little buzzed when I really make something bad happen on the screen. Like, yes! I nailed it.
01:03 So if that appeals to you, I think you're in the right place. But breaking things when you test is good because you are doing it before your users get a chance to. That's the whole point of breaking things when you're testing, because your users will break things if you don't do it first. So that's what this session is about: some principles and techniques for testing web development projects to make sure they're ready for your users, because that's what you want.

Now, what kind of testing exactly am I talking about? Here are my assumptions, right or wrong, and even if some of these aren't quite true for you, it doesn't mean that you should leave, so please don't, because that would be depressing. But I assume that you launch elements that go outside or beyond your site's usual functionality, at least occasionally if not regularly.
02:00
You've always got new stuff, not just new content in an existing layout, but some new gizmo. Otherwise your testing is called copy editing. I assume that you are somehow responsible for overall quality in some way. That might be in your title and your job description, or it might just be because you care, but one way or another, if you feel responsible for quality, I'm assuming that's one of the reasons you're interested in this topic.

And then finally, I assume that you don't like things to suck. If you do like things to suck, then again, this session will really not be of much interest. By sucking I mean letting your users have problems with your site. And again, I have a slide about this: problems.
03:00
What kind of problems, thank you. What kind of problems am I talking about? I'm going to have to let your imagination run wild here, but I'm talking about everything, starting with things as basic as copy editing. In fact, all quality assurance testing should include at least copy editing, because typos in forms are just as stupid as typos in prose. So when you're testing, be sure to actually read the language as well.

But it's everything from there on up through what may be, for you, very sophisticated routines of usability and user experience. This session isn't really about those, because they're whole other topics, but it's really a wide range of things. The point is you want to solve problems before your users have them. And again, I notice that I have not assumed that you're in any particular department in your university. You might be in admissions, and you know that it'll be your prospect who will call, confused.
04:04
You might be in IT, and you know that you'll get the calls, you'll get the tickets. You might be in marketing. Any of us, no matter where we are, can catch flak for problems on our sites. So that's why I think it actually makes sense for testing, and some tips and principles on testing, to be useful for just about anybody.

Mostly, though, I am talking about projects with more technical moving parts, not just copy editing. Big things like online admission application forms, and forms of all kinds, actually. In fact, a lot of my examples are going to be drawn from forms because, I don't know about you, but we make a lot of forms. They're relatively easy to make these days, so we produce a lot of them. So they need a lot of testing, and they make good examples. But you might have payment processes. You might even have just multichannel communications that involve an interface between a printed piece that has a URL in it and how that goes when people actually connect.
05:08
You've got to test something like that even when it's totally straightforward, let alone when you get into social media and making sure links go to the right places, your analytics work, that kind of stuff. Standalone mobile apps. I mean, you've been at this conference a couple of days; you know the full range of stuff that might need testing. In fact, I'm sure you can tell me plenty of other examples. Anybody have examples of things you're testing these days, or things you wished you had tested before you launched them recently? Yeah, just call them out.

Audience 1: We just launched a new site at Providence that we tested.

Jon Boyd: Well there are many events going on probably so that's no problem.

Audience 2: There's so many, you know they were good but we're the ones entering an ID...

Jon Boyd: Right.

Audience 2: We don't want to do that because it's not their problem.
06:00
Jon Boyd: Right. There you go that's awesome. Yeah.

Audience 3: We just found out that our... they're turning this tiny campus thing that was created and not.

Jon Boyd: Ha... wait, that's my RSS feed you're talking about. I've seen that same thing. Yeah, it's like, oh no, I already missed it. I mean, we actually really got calls from people worried that they'd missed the event. It was three months out, but they thought they saw the timestamp for today.

Audience 3: We launched templates, and at first the content management system was not built for it. So you could add bold headers at the top and they would not appear.

Jon Boyd: Right. I know.

Audience 3: That looks kind of...

Jon Boyd: It looks fine in the CMS, right?

Audience 3: It was fine to see that, and not only that, they also didn't build the stuff in the background that generates navigation. So you had to do the navigation by hand.
07:00
Jon Boyd: OK. We're going to stop now because the nightmare scenarios are too disturbing to me; I'd like to sleep tonight. These are all great examples, though. You noticed how some of them are more technical, some of them are processes, some of them are editorial issues. There's a wide range here, so keep all of this stuff in your head as we talk about the rest of this.

Today I have four principles and seven techniques to share with you. That adds up to 11 because this is MMP11. Actually, no, it's just a coincidence... When you're wondering what to do when you're coming up on needing to launch something, whether it's small or large, I hope that these four principles and seven techniques might help you: keeping the principles in mind, and then the techniques to turn those principles into some practical steps. So there's one slide with all of them on here, because I know some of you like that, but don't worry, you don't have to write them all down. We're going to be going through them.
08:00
And in fact, as you may have noticed from the slide at the beginning, the slide deck is online, and all the links I mention are also online at that link, so you don't have to write anything down, at least if it's on the screen. So, four principles: don't assume it works; remember the humans; it's not paranoia if they really are trying to hack you; and let things break well. Those are the four principles, and then the seven techniques, which I'll flesh out a lot more: time, notes, instructions, variations and mutations, the end of the pipe, that little voice, and watching for value.

OK. So let's move through these. First of the four principles. And you'll notice I numbered them all; I thought about doing letters and numbers, so if it resets to one in the middle, you're not crazy, we're just on the techniques. Four principles. Number one: don't assume it works. In fact, this is principle number one; we could just stop right here.
09:01
I think, in my experience, this is the number one thing, literally, that a tester should keep in mind. And if you've watched people testing poorly, it's almost always this one that they forget. I have literally seen people pull up a form, having been told that it was ready, and go, oh, that looks good. Yeah, green light. They're assuming that it works. It could be a flat JPEG in an HTML page and, hey, it looks good.

In fact, I really maybe ought to strengthen this. I maybe ought to say assume that it doesn't work, but again, that's just a little too depressing, so... This is really the fundamental principle of this session. You probably know that all copy has typos, right? It is impossible to publish a book without at least one typo. It is impossible to do an alumni magazine, or just about anything with any kind of length, without one.
10:03
That is true of everything we make as humans. All software has bugs, period. There's an anthem for quality testers: it's Bob Dylan's song "Everything Is Broken." In fact, I've got the link here; go read the lyrics and stream the song if you don't know this one. It's a great song. He basically just goes on about how everything's broken. The first line, in fact, could have been written as if he were a coder, because the very first line of the song is "broken lines, broken strings," and I don't think he's actually talking about our kind of lines and strings. Cue that up in your head and hum along while you're doing your testing and you'll be just fine. Anyway, don't assume that it works.

Principle number two is remember the humans. And what I mean by this is that technology is personal, and it's relational, and it's emotional, and it's social. Yes, even for the humans who developed our web technology. They are real humans, and so they would like you to be friendly when you're testing. In fact, maybe not just friendly; it's better if you're friends with them.
11:20
So a couple of things I suggest to ease this. Meet face to face as often as you can with the people who are developing your web projects. Talk about what you love about the projects when you're testing. Sometimes it's tempting to report only the bugs, but if once in a while you relay, wow, this worked really well, or wow, you nailed this one, that helps. Another one that in my experience is very important is to admit your own ignorance about what they're good at, and ask them to help you come up to speed on how they approach a particular problem.
12:05
It's really easy to sit back and take potshots, especially when something turned out to be a judgment call on their part in terms of writing the code. But pay homage to their skills, with a Z at the end, and that usually helps, again under the heading of remembering the humans. And if all else fails, bring them food. This is very effective; I'm not kidding. Fortunately, we do a lot of baking in my house, and I walk over to IT a lot with cookies and brownies and stuff like that.

That means that every time I show up, they're not frightened that I'm bringing bad news, because it might be brownies this time. And actually, the other thing about remembering the humans is don't be afraid: if you do all these things, good developers will actually really appreciate that you're interacting with them and finding stuff that they didn't see.
13:05
Bad developers don't always, but you actually want to know which is which. Good developers are the ones you want to be working with in the first place anyway, and they actually love hearing your bug reports. OK. See, that was a principle, but I got into some tips and techniques there too, so this presentation's a mess.

Number three: it's not paranoia if they really are trying to hack you. Of course they are trying to hack you, right? Do we have large bodies of intelligent young people with lots of time on their hands, interested in mischief and getting ahead? Yes. It is inevitable that somebody is probing the edges of what you're doing, whether that's for a little gain, you know, they just want to make funny tweets on the university account, or whether they actually want to get into your payment system or something like that.
14:02
So this is one that, again, really could be not only a whole seminar in its own right but a whole workshop or a whole conference on its own. Some people really get into all the things you need to do to worry about security here. But keep in mind that you ought to be maybe a little more paranoid than you naturally are. Now, actually, I don't know how paranoid you are to start with, so some of you maybe should be a little less paranoid, but I think the marketing and enrollment people who might be here tend to be pretty happy and assume that things are going to go well, and you need to be just a little bit more concerned about the various purposes that the stuff you're testing could be put to.

And I have an example of this. Fortunately, it's one that didn't actually go live before we caught it. I worked for a non-profit where there was a great portal: you could log in, everybody had their own fundraising, and you could basically tap into the Oracle database and see how donations toward your work had been coming in.
15:20
I just noticed the easiest of all hacks, which is that my employee number was in the URL. And it's not hard to learn everybody else's employee number, because it's in the directory. So I tried the president's ID number in the URL, and then I covered my eyes, because I really wanted to be able to claim that I had not actually looked at his financial reports.

The first reaction of the guy who developed this was, like, oh, nobody's going to do that. I was like, well, I just did, and I thought of it, and I don't think I'm going to be the only one. So anyway, a little paranoia goes a long way.
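The fix for that kind of URL hack is an authorization check on the server, so the application never trusts an ID that arrives in the URL. Here is a minimal sketch of the idea; the names (`get_report`, `REPORTS`, the ID values) are hypothetical, not from any real portal:

```python
# Hypothetical in-memory "database" of financial reports, keyed by employee ID.
REPORTS = {
    "1001": {"owner": "1001", "total_donations": 5400},
    "0001": {"owner": "0001", "total_donations": 999999},  # the president's report
}

def get_report(session_employee_id: str, requested_id: str) -> dict:
    """Serve a report only if it belongs to the logged-in user.

    The requested_id may come straight from the URL, so it must be
    checked against the session, never trusted on its own.
    """
    report = REPORTS.get(requested_id)
    if report is None or report["owner"] != session_employee_id:
        # Break well: refuse cleanly instead of leaking someone else's data.
        raise PermissionError("You may only view your own report.")
    return report
```

With this check in place, pasting the president's ID into the URL gets a clean refusal instead of his financial reports.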
16:00
I've actually got another illustration of this, a slightly different example. It's kind of small; I don't know if you're going to be able to see it here. For a while we were looking at an online application system that did recommendations, and it was the same kind of thing: there were parameters right in the URL. So ID 12345, and then your recommender's email address was right here. Well, as the applicant, you know your number, and of course you'd entered the emails of the people recommending you, and yet we were going to have to claim that these recommendations were confidential even if the recommender had not waived their right to privacy. It would be super easy to URL-hack into the form these folks were using.

So again, it never went live. I suppose I shouldn't have put that right there next to my university's name. Do not associate North Park with that problem, because we didn't go live with it. OK, guys?
17:07
Finally, the fourth principle: let things break well. I don't know if you've heard this before, but things can break badly or they can break well. When something breaks well, there's some kind of fail-safe or backup, or at least the whole thing doesn't explode. But when things break badly, they either cascade or get out of hand, and you have way bigger problems than if it had simply stopped working.

The unfortunate, worst-case, classic example of something that broke badly is the O-rings on the shuttle booster rockets in the '80s. It turns out if they get cold, they shrink a little bit. No problem, they open up a little bit. Oh yeah, and then the whole thing explodes.
18:00
I mean, it's a problem that then becomes a much bigger problem because it breaks badly. And you've probably seen this too: a form can break well, where the form just reloads and says we weren't able to process your request or something. Or it can break badly: it goes blank and you don't even get a warning message, or the browser crashes or something like that. You're looking for ways during your testing to make sure that even if you can't fix everything, you can help things break well. A classic instance of this, which we'll talk about a little more as I go along, is validation errors on forms.

If you really want people to put in a five-digit zip code when they live in the US, then you can program it so that it says please enter a valid zip code if they've failed to do that. Whereas if you didn't do that, all of a sudden you don't have a zip code. You don't have any way to mail them at all. They've gotten away; maybe they didn't put in any email address, and you've got no prospect at all. That would be breaking badly in your case.
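That zip-code check can be sketched as per-field validation that returns messages instead of blowing up, and only demands a US-style zip when the country actually is the US. The field names here are hypothetical; a real form would have more fields and rules:

```python
def validate(form: dict) -> dict:
    """Return a dict mapping field name -> error message.

    An empty dict means the submission is valid. Returning messages
    (rather than raising or wiping the form) is what lets the form
    "break well": the user sees what to fix and keeps their entries.
    """
    errors = {}
    if not form.get("email", "").strip():
        errors["email"] = "Please enter an email address."
    # Only insist on a five-digit zip for US addresses.
    if form.get("country") == "US":
        zip_code = form.get("zip", "")
        if not (len(zip_code) == 5 and zip_code.isdigit()):
            errors["zip"] = "Please enter a valid five-digit zip code."
    return errors
```

Note the Canadian postal code passes untouched, which is exactly the "don't insist I have a zip code when I'm in Canada" complaint from later in the talk.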
19:01
So you can't expect perfection when you're testing. My advice is you should be happy with a little less than perfection, but you ought to be giving some thought to: well, at least does it break well? OK. Yeah, sometimes you do have to settle; eventually you really do have to launch, and again this principle comes into play.

OK, on to the techniques. Those are the four principles. Here are the seven techniques, and I'm going to be really interested in conversation when I'm done with these, because you may have other particular kinds of things, but these are the sorts of things that I keep in mind when I'm testing, and some things that have helped me at least understand what the problems are in testing. The first one is take the time to test. Somebody mentioned already, oh well, launch was coming up and we didn't actually have time to test. I mean, this is classic advice, right? Oh, thanks for the advice. Leave plenty of time. Don't leave home at the last minute to get to the wedding when you're supposed to be there in ten minutes.
20:12
So this is easy advice to give, hard advice to take, but that's why I put it in the techniques; you could call them disciplines if you want. If you need to figure out how to build this in as a discipline in your development cycles, your launch cycles, then that's what you have to do. If you can't consistently build in testing time before launch, you're going to need to learn how to postpone launch dates so that you can. It's just really important to do.

Testing is time-consuming. It's like copy editing or design reviews or anything else: when you're trying to get it right, you have to slow down. I don't know if you've ever seen a professional copy editor work. I've actually watched them work.
21:02
I worked at an amazing publishing company for a while, and the copy editors there were trained, and I've never seen this anywhere else, though it might be more widespread, to put a dot above every single word. Basically to prove, like the night watchman going around and punching in at all the stations: I put a dot here. I read this word. I read this word. Going through every one. Because, as you know, intelligent readers read whole sentences at a time, or whole paragraphs even, and the copy editor's job is actually to slow down.

When you're doing quality assurance testing, that's what your job is too: to slow down. It's not to fill out the form as fast as you can; that's what the users are going to do. Your job is to slow down, and you don't get any points for finishing first when you're testing. The other reason to take the time is you are going to find problems. You're going to need time to test. You're going to need time to report them well, and more on that in a second.
22:01
You're going to need time to get those things fixed, and you're going to need time to test the fixes, and also look for other stuff that might have broken while they were fixing things. All of that takes time. Now, how much time? That's a judgment call, and it depends on the project. For a big major site launch, you're going to want weeks at least; for a news story with an embedded video, you need an extra 15 minutes, maybe.

So in between those extremes, you're going to need to know how to make that call. But it will affect how you manage projects, and it should. I hope you've been to some of the other project management sessions here this week, because Attila the Hun, I don't know, he'd take your hands off or something like that if you missed it, I don't know what. Actually, speaking of taking the time: it's great to be able to test parts early if you can.
23:01
If you wait to test the whole thing until it's done, well, sometimes that's the only way to do it, but of course that slows you down if you have to go back and test stuff that you might have been testing all along. So work with your developer on whether there are little bits that you can test as you go. You still need to test the whole thing at the end to make sure the pieces are working well together, of course. But you can save some time that way.

And, you know, like I said about copy editors looking everywhere: force yourself to look everywhere. Read the browser title, the window title, right? Because that's actually important; that's going to show up in Google and other search results, and it's going to be important for your search. Don't just scan the page the way an actual user would. You've got to discipline yourself to slow down and take the time.

OK. Technique number two: take great notes and write great bug reports. This one probably goes a little bit without saying, but I want to share with you a couple of particular things about why you need to write this up.
24:08
This is again tied back to remember the humans. People are going to have to fix stuff. It's really frustrating if you just say, oh, the page doesn't work. OK, that's not a bug report; that is incoherent babble. So if you can learn how to write good bug reports, if you don't already know how, everybody's better off. It also gives you an archive: when it's time to test the fixes, you can go back and actually see what needs to get done.

Let me show you a couple of bug-noting resources I have for you. First of all is a fabulous blog post by a legendary developer named Joel Spolsky. It's got a title I cannot remember, but it's actually just about how to write bug reports. Let me give you the cheat sheet version. There are three things. You have to identify the steps to reproduce. This is the one that everybody always forgets. It's just like, oh, the page didn't load.
25:02
Well, on what platform? Where were you coming from? What link did you click on? All those things. You've got to identify as much as you can. This is also time-consuming, identifying the steps to reproduce. But if you can't reproduce it, the developers are not going to be able to reproduce it, so you'd better take the time to do it. Anyway, step one is identify the steps to reproduce. Two is what you expected to see, and three is what you saw instead. Anyway, it's all written up at this URL, that's 8TO.mebug-reports.
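Putting those three parts together, a bug report might look like this. The bug itself is invented for illustration; only the three-part shape comes from the talk:

```text
Title: Zip validation wipes out all other fields on the inquiry form

Steps to reproduce:
1. Open the inquiry form in Firefox 7 on Windows 7.
2. Fill in every field, but enter "ABCDE" as the zip code.
3. Click Submit.

What I expected to see:
The form redisplayed with my entries intact and an error message
next to the zip code field.

What I saw instead:
The form reloaded completely blank; all of my entries were lost
and no error message appeared.
```

Notice that the platform, the exact input, and the exact click are all spelled out, so a developer can replay the failure without guessing.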

A couple of tools that you might want to consider, in turn, for reporting bugs, a few kinds of platforms. Acrobat Pro and the annotation tools there can be very useful, especially if what you're talking about is something that's both print and online. Making PDFs and then using the commenting tools. You've probably all used the commenting tools, but if you haven't, in one of the newer versions you click on the Comment tab, and, sorry, I need to do it in mirror image.
26:13
The Comment tab in the menu of the new Acrobat interface, which I still hate, will open a panel on the left which shows you all the tools, and then also, as you add comments, they get shown there, so that even for little things, like striking through an empty space, they'll still show up in that list.

So consider that; it can be really useful, especially if what you're testing is mostly the human interface. Another one that I have liked and have used to good effect from time to time is a product called Notable. It's pretty cool. It's good for sharing bug reports online. It's actually pretty good at allowing you to mark up things, whether you're seeing them in the interface or in the source code behind it, or even in the CSS, and you can kind of circle stuff on the web page with a marker and then add comments, and then your colleagues can come and either comment on it or mark it as closed, whatever.
27:14
It can be pretty cool, especially for people who are working with an outsourced developer or something like that; it's easy to share stuff back and forth. I don't own stock in any of these companies, unfortunately, so I have no conflict of interest to declare. BugHerd is another cool one. I've only used the demo of this back when it was beta, and it was free and it was awesome, and then they started charging for it, and I'm too cheap, so I haven't been using it lately.

But it's even cooler: it's really just a plugin in your browser, and you actually look at the live site and drop little markers around, and people can then come back and you can assign them and stuff like that. It's pretty cool; you may want to check it out. And then another one, if you're looking for a way to do a little more traditional issue tracking, you know, with tickets that can get assigned and resolved: FogBugz is good, and for small teams it's even free, and otherwise it's quite reasonably priced.
28:14
I'm really embarrassed to admit that my wife and I use this at home for house projects. You know, as if my Lego iPhone case didn't cement my geek status, this totally does it. OK. So anyway, learn how to write good bug reports, and then use one of those tools to pass them along.

Technique number three: pay attention to the instructions on whatever it is you're testing, and then violate them. So of course you want to actually read the instructions to make sure things are clear; that's part of your testing. But then you really ought to ignore the instructions, or pretend that you hadn't read them at all. Do you know why? Because the users are going to do that very thing. So you might as well pretend you're a user, in a sense.
29:02
So if it says enter your phone number in this format, well, don't enter it in that format and see what happens. If it can't handle it when you don't put the hyphens in your phone number, and the whole form breaks, well, that's breaking badly. Again, you want to learn this, because people are not going to type the hyphens in.
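One way to survive the missing-hyphens test is to normalize whatever the user typed instead of rejecting it. A minimal sketch, assuming US ten-digit numbers; the function name is illustrative:

```python
def normalize_us_phone(raw: str) -> "str | None":
    """Reduce any reasonable phone entry to ten digits.

    Accepts hyphens, spaces, parentheses, dots, or nothing at all,
    and tolerates a leading "1" country code. Returns None only when
    the input genuinely can't be a US number, so the form can show a
    helpful error instead of breaking badly.
    """
    digits = "".join(ch for ch in raw if ch.isdigit())
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]  # drop the leading country code
    return digits if len(digits) == 10 else None
```

With this in front of the validator, "(773) 244 6200" and "773-244-6200" are the same number, and only truly unusable input gets an error message.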

So validation on forms is a classic thing: look for the little red asterisk and then leave that one blank, and see if it can handle that. And see what kind of error message you get. Can you go back and fill it in, or does it not identify what you did wrong? Does it wipe out other fields once you throw validation errors? That's one of my favorites: I spent all this time filling out this form, but I didn't put in United States, and now all of a sudden everything's gone. Man, that's a sure way to make users mad. And yeah, when I'm in Canada, don't insist that I have a five-digit zip code.
30:06
If fields are left blank, will you still be able to follow up? Like, if email's the only thing that's required on there but the form actually lets it through blank, the person thinks they have submitted their form, and they're going to be mad at you when you don't get back to them. This is the kind of thing you've got to be able to follow up on. And a form, or almost anything else, that needs a manual needs to be redone. You don't want to have lots of instructions, particularly on a simpler kind of thing.

OK. And this is actually a segue to technique number four, which on that earlier slide was called variations and mutations. Don't try it only once, especially if you did everything correctly. That's the other thing: we talked about having somebody pull it up, oh, that form looks great, no problem.
31:00
But I've also seen people pull it up, fill it in correctly, submit it, and think that it works. No. You've got to try it a couple of different ways. Again, violate the instructions, leave some stuff off. In testing, you're trying to find the edge cases. You're trying to find where things stop working. And of course everything has got an edge case; I mean, if you leave the whole form blank, of course you don't expect to get people's emails out of that. But you want to know where those edges are. So you've got to try multiple things. You've got to try things that are close: leave one digit off a credit card number, see if the whole thing shuts down because it failed to talk to the credit card merchant correctly, that kind of thing.

So be a weirdo. I mean, variations and mutations, right? This is how genetics works. Sometimes it goes well, sometimes it goes badly, and we got big brains because there was a mutation somewhere. So it's not all bad.
32:02
Technique number five: watch the other end of the pipe. Again, I'm talking largely about things where there's user input and you're trying to get something out of it, thinking about forms, but you've got to test these kinds of things. You might think, oh, well, I filled out the form six different ways and it seemed to work fine. Well, no. You've got to go and look in the admin interface and see whether things are coming through. You have to test the results of the actions, not just the actions themselves.

So I've got a couple of things that I'm talking about here. Confirmation pages. Again, these examples are mostly from forms, but you can imagine how this works. Confirmation pages and emails that go to the user: you've got to read those, right? Especially when you're using a vendor who's worked with other schools. I mean, we had one at North Park that we were testing: thank you for contacting Azusa Pacific University. And we're like, OK, that's actually not the idea we had for thanking them.
33:03
So that's something that came up at the other end of the pipe that we might not have noticed if we hadn't actually gone through and submitted the form and then looked at it. Equally important: notifications sent to you, especially if you're getting data by email or being notified that new data has been put in the database or something like that. You want to make sure that those are clear and complete and that you're getting the information you need out of there.
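A quick way to check that end of the pipe is to take a test submission's notification email and verify it still contains every field the staff count on. This is a sketch; the field labels and email body are hypothetical:

```python
# Labels the recruiters rely on seeing in every notification email
# (hypothetical; list whatever your own notifications must carry).
REQUIRED_FIELDS = ["Name:", "Email:", "Program of interest:"]

def missing_fields(notification_body: str) -> list:
    """Return the required labels absent from a notification email.

    An empty list means the notification is complete; anything else
    is a sign the other end of the pipe has quietly broken.
    """
    return [label for label in REQUIRED_FIELDS if label not in notification_body]
```

Run this against the email you receive after each test submission, and a silently broken notification shows up as a non-empty list instead of going unnoticed for weeks.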

Our forms recently broke in this regard. The email notifications that our recruiters were counting on for information were broken. Everything else was working fine; the users didn't care, so it was not as urgent a problem as it might have been, but we couldn't add any other forms without breaking the notifications we were getting, and that was bad. And then, thirdly, check what's actually in the admin interface. We had one form that was scrambling the social security numbers.
34:01
It was originally a security thing, and that was great, but then we actually didn't know what the social security numbers were, even in the secure interface. So you've got to check that. OK. Principle, I'm sorry, technique number six: listen to that nagging little voice. You know that little voice? Editors sometimes know this: the thing where on the third read you finally realize that sentence isn't working.

Or sometimes it's that little voice that says, no, it's not going to be OK. Well, it's true in technical stuff too. If it bugs you a little bit, you've got to tune in to that voice telling you things aren't going well, because, again, that's what some of your users are going to discover. This is a real-world example, I'm sorry to say. This is part of an event registration form we are currently using.
35:00
And when we were testing it we thought, well, yeah, there are three buttons at the bottom. Maybe that'll be confusing. No, it'll be OK. Nobody's going to be bothered by figuring out which of the three buttons to click. Well, it turns out they really are bothered, and everybody clicks the big blue one because it says Continue, right? Well, no. The one you actually have to click is the Check Availability button. So we're frustrated about this and we're working on it. But it's that nagging little voice that suggests there might be something for you to fix.

Finally, technique number seven: watch for value in what you're doing. This is partly to keep your own spirits up. You're taking extra time to do this. You're finding problems, which can be sad, but if you remember that there's value in what you're doing, not only is that good for you emotionally, I think, but it actually helps explain what you're doing to your clients, to your bosses, to anybody who's like, how come nothing's done yet? I can't believe you're sending it back again. Come on, we've got to launch.
36:09
If you remember that you're creating value here, both for you and your users, it really can be big value. This was one of the first things I did, and it was pure luck, so when I tell this story about myself it's through no virtue of my own. Right when I started at North Park, I was just trying to do some routine testing of stuff that was on the site, so I filled out the prospect inquiry form for the graduate programs, our bread-and-butter revenue stream, and then I asked, OK, who gets that? And I walked around and nobody seemed to have gotten it, and I was like, well, who gets these? And, oh, well, you know, Cherry should get it. Cherry, when's the last time you got one of these? I don't know, let me look. I don't have any in my inbox; I don't know when.
37:01
So again, through no virtue of my own, I just happened to stumble across a form that was silently failing. OK, that's a great example of failing badly, right? It doesn't have to blow up in the sky to fail badly. All these kids are filling out these forms, who knows how many, right? And, can you see what I'm saying? This is how bad this is. They're happy, la la la, confirmation message on the screen, they get the email, and we weren't getting the submissions. So we turned it back on, and in just the couple of weeks before that form was actually going to be retired anyway, 40 came in or something like that.

My point here on watching for value: I knew that I had just paid my whole salary for that whole year, because of the rate at which those hot prospects qualify and apply and yield for us. There was tons of revenue in just that one thing, so I felt happy.
38:00
Every time you fix something, you should be watching for the value. The other reason you should watch for it is because you're not testing to achieve perfection for perfection's sake. You may be a perfectionist naturally, and so you have a strong drive to do this. You do sometimes have to stop yourself from just wanting to polish everything until it's absolutely perfect. The point of doing this is to find where the value is for the user, so they have a good experience and don't get confused and go away, and to find the value for your institution, in ways like not losing those graduate prospects.

So whatever you do, make it fail before your users do. It's your job to make it fail. So break it really good. Here's my Twitter handle if you want to talk more on Twitter. You've got those couple of tags; I see some of you are even tweeting already. People seemed to be depressed by the fact that my wife and I use FogBugz at home.
39:01
And then that link there at the bottom, it's octome and mmp11 there at the end. At that link you'll find the PDF of the slides and links to all those URLs I mentioned. So I think we've got some time left for questions and/or comments, or to hear other techniques that you've got for breaking things when you're testing. Yeah.

Audience 4:  How much do you rely on automated testing and those kinds of tools? [inaudible]

[Cross-talk]

Jon Boyd:  Right. Are you talking about services that'll actually scan for broken links and that kind of thing?

Audience 4:  Yeah, link checking and that kind of stuff, right?
40:04
Jon Boyd:  Right. Yeah, I mean, that's great if you're sure that the automation works. Sometimes you're just stacking two things on top of each other that both need testing. We don't do a lot of that. Frankly, we like to hand-test stuff, so I don't have too much to say about that. There's another hand over here somewhere.

Audience 5:  That was me. You get a report from a user who, like, has done things nobody ever thought of.

Jon Boyd:  Right. Yeah. That's awesome. I'm glad there was a happy ending to that story, because there actually are some users who will just complain about everything, and so really you want to find the edge cases. You don't necessarily want to eliminate all the edge cases. There are some things that are not going to be worth fixing, or actually aren't as bad as they think.
41:01
But yeah, a person who's particular about detail can be great. In fact, I just read, I wish I had a citation for you, but I saw this online: there's a firm outside of Chicago that's employing people with Asperger syndrome who are obsessive about detail, willing to repeat things over and over again, and pay really close attention, and apparently they're awesome quality testers. So you might have some people on your staff who are suited for that.

With all due respect, I don't mean to make light of that, but I mean, this is a great idea. This firm is actually giving folks something that they're really good at, and finding the value they can contribute as testers. Yeah?

Audience 6:  Can I add: don't try it only once. We had a form for the firm that compiled all the security rules, and they were supposed to renew it annually. It sounds like the notification to renew broke.
42:05
Jon Boyd:  Oh yeah. Right.

Audience 6:  But nobody had gone and done something about that kind of thing in a long time. So don't just test it once; keep testing.

Jon Boyd:  Right. Yeah, that's a good point. Like the example I gave, when something's been working for years, you forget about it. Having some discipline where you actually go back and review, especially your core things, the ones that are legally important like that or financially important. You want those to work. Yeah?

Audience 6:  So what about accessibility testing?

Jon Boyd:  Yeah, again, that's a whole other subject. I don't myself know as much about it as I'd like, but it falls under the principle of remembering that you're looking for what users are going to experience. I thought the keynote yesterday was great about being aware of the kinds of experiences that people will have with your site.
43:05
It really raises the bar and increases the complexity of this task when you're doing technical testing, to realize, wow, what's a screen reader going to do with your forms? Yeah.

Audience 7:  Can you make a recommendation about typing garbage into form fields? We typed garbage into fields and it accepted it, like for the phone number, and then realized nobody had set up proper validation. You can just throw whatever you want in there. You don't have to...

Jon Boyd:  Right. Yeah, I mean, sometimes it's helpful to call yourself, you know, Test Boy or whatever, so that you can weed those entries out when it's time. But actually having junk in there, again, you'll learn. You'll learn how long your middle name can be. Just pound on the keyboard for a while, too. That's a really good one. I think we've got time for one question left.
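To make the "pound on the keyboard" idea concrete, here's a small Python sketch of junk-input testing: generate a handful of awkward values and see which ones a validator lets through. The validator shown is a made-up stand-in, not any real form API; the point is only the shape of the exercise.

```python
import random
import string

# A sketch of junk-input testing: throw awkward values at a field's
# validation and see what gets accepted. The "validator" is hypothetical.

def junk_values(max_len=5000):
    """A handful of awkward inputs worth trying in any text field."""
    keyboard_mash = "".join(random.choice(string.printable) for _ in range(200))
    return [
        "",                           # empty
        " " * 50,                     # whitespace only
        "a" * max_len,                # how long CAN my middle name be?
        "<script>alert(1)</script>",  # markup that should be escaped
        "O'Brien-Smith, Jr.",         # legitimate but punctuation-heavy
        keyboard_mash,                # random keyboard pounding
    ]

def accepts_anything(validate):
    """Return the junk inputs a validator lets through unchallenged."""
    return [v for v in junk_values() if validate(v)]

# Hypothetical phone-number validator that is far too permissive:
lax_phone = lambda s: len(s) > 0
print(len(accepts_anything(lax_phone)))  # prints 5: only "" is rejected
```

Note the last entry in the list: some punctuation-heavy input is perfectly legitimate, so the goal isn't to reject everything strange, it's to find out what the field actually does with each kind of strangeness.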
44:00
Audience 8:  We use WordPress. I'm on the content communication side, so I review things, and it might be copy editing or it might be a link that's messed up, something based on the RSS feed and not the category. It gets annoying, because when I point it out he just wants me to fix it, but I feel like if I fix it he's going to keep making the same mistake. So we've compromised: I point it out. But I feel like that undermines the motivation to not keep repeating the mistakes.

Jon Boyd:  Right. Yeah, especially if it's something you think he might not be aware of. You've got to work on it, but yeah, I mean, that's just the age-old problem of whether to give a man a fish or a fishing pole. Excellent. I think we're out of time.

Audience 9:  Thanks for the time. Thanks.

Jon Boyd:  Thanks a lot.

[Applause]