Thoughts on Strategy: Tracking Bugs

I’m beginning to love the challenges of this new job. I hope the love affair lasts.

There are a couple of things I’m digging about the new job. One is that the more strategic role I have in the new job’s Learning Center gives me the opportunity to blog professionally again. When I’m thinking through some of the issues I’m working on (content life cycle strategy, repository management, rapid learning development, aligning learning objectives to business performance goals, etc.), it helps a lot to have a place to journal. Hopefully, my sharing helps other learning, training and performance peers (and peeps, yo) who are in the same spot as me, or even further down the long tail.

So yesterday I was assigned as the “designer” on two courses currently in the last throes of review. What that means in this scenario is that I have to review the content both as an ISD and as a technical reviewer, making sure the content works as intended. We don’t have a process defined for this kind of review. We don’t have specific criteria to base a review on. We don’t have a defined format for the feedback such a review should produce. We don’t have post-review actions defined. It’s not that this team has never reviewed courses for delivery before. It’s just that there isn’t a clean, consistent (or necessarily well-understood) methodology for producing consistently high-quality training.

Guess whose job it is to set the process in place and get the stakeholders in such a process to take ownership of it? :) I love it. It’s a good challenge. It’s a good place to see the impact, both positive and negative, of what I’m bringing to the table. I think it will be pretty validating (or a really powerful gut-check, but let’s be optimistic).

Admittedly, I was never much interested in process, quality measures and things like that. But I never understood the impact of QA like I do right now. So I’m going to start with what I know and rely on the better practices I saw at CTC (and specifically ADL, which has probably the strongest quality process I’ve ever seen). The first thing I can do with minimal buy-in is on the tech side. I need to define a consistent methodology for tracking bugs and defects in these projects that can be shared, archived and searched quickly and easily. So I’m going to evaluate Bugzilla on my own this weekend, and if anyone has ideas on other tools that are pretty much off-the-shelf, please comment on this post. I can certainly build my own php/mysql bug tracking solution that would be adequate, but it’d sure save me some time if I didn’t have to build it.

Basically, what I need is to be able to track each page/screen of a content object, as well as global issues relating to the content object as a whole. The issues can range from technical (e.g., content not initializing the SCORM API) to grammatical to instructional (e.g., the branching of a given scenario needs to be redefined). I also need to be able to organize people around a tool, or customize how the tool is used around the QA team’s roles.
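If I do end up rolling my own php/mysql version, the data model is pretty simple. Here’s a rough sketch of what I have in mind; the table and column names are just my own placeholders, and the categories and roles would obviously need to match whatever the QA team actually settles on:

<?php
// Rough sketch only: hypothetical schema for a homegrown php/mysql tracker.
// A ticket belongs to a content object; page_id is NULL for global issues.
$pdo = new PDO('mysql:host=localhost;dbname=qa_tracker', 'qa_user', 'secret');

$pdo->exec("
  CREATE TABLE IF NOT EXISTS tickets (
    id INT AUTO_INCREMENT PRIMARY KEY,
    content_object VARCHAR(100) NOT NULL,    -- the course/SCO being reviewed
    page_id VARCHAR(100) NULL,               -- specific screen, or NULL for a global issue
    category ENUM('technical','grammatical','instructional') NOT NULL,
    reviewer_role VARCHAR(50) NOT NULL,      -- e.g. ISD, technical reviewer, editor
    description TEXT NOT NULL,
    status ENUM('open','in_progress','closed') NOT NULL DEFAULT 'open',
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
  )
");

The nice thing about keeping it this flat is that searching and archiving is just SQL, and reports by course, by screen, or by issue type fall out of simple queries.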

A must-have is a way to integrate with the content itself, so that each page can launch a contextually relevant feedback form and any reviewer can enter and view tickets related to the current screen.
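The simplest version of that hook I can picture is each screen carrying a link that passes its own identifiers along to the feedback form, so the form opens already scoped to that screen’s tickets. Again, this is just a sketch with made-up names (feedback.php, the parameter names), not anything built yet:

<?php
// Sketch of the per-screen hook: the page passes its own IDs to a (hypothetical)
// feedback.php form, which can then show and accept tickets for just this screen.
$contentObject = 'course_101';          // identifier of the content object under review
$pageId        = 'module2_screen05';    // identifier of the current page/screen

$feedbackUrl = 'feedback.php?' . http_build_query([
    'content_object' => $contentObject,
    'page_id'        => $pageId,
]);
?>
<!-- Dropped into every screen's template -->
<a href="<?php echo htmlspecialchars($feedbackUrl); ?>" target="_blank">Report an issue on this screen</a>

Whether I build it or buy it, that contextual launch is the piece I’m least willing to give up, because it’s what keeps reviewers from having to describe where they found the problem.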
