I’m beginning to love the challenges of this new job. I hope the love affair lasts.
There are a couple of things I’m digging about the new job. One is that the more strategic role I have in the new job’s Learning Center gives me the opportunity to blog professionally again. When I’m thinking through some of the issues I’m working on (content life cycle strategy, repository management, rapid learning development, aligning learning objectives to business performance goals, etc.), it helps a lot to have a place to journal. Hopefully, my sharing helps other learning, training and performance peers (and peeps, yo) who are in the same spot as me, or even further down the long tail.
So yesterday I was assigned as the “designer” on two courses currently in the last throes of review. What that means in this scenario is that I have to review the content both as an ISD and as a technical reviewer, making sure the content works as intended. We don’t have a process defined for this kind of review. We don’t have specific criteria to base a review on. We don’t have a format that feedback from such a review should follow. We don’t have post-review actions defined. It’s not that this team has never reviewed courses for delivery before; there’s just no clean, consistent (or necessarily well-understood) methodology for producing consistent quality training.
Guess whose job it is to put the process in place and get the stakeholders in that process to take ownership of it? :) I love it. It’s a good challenge. It’s a good place to see the impact, both positive and negative, of what I’m bringing to the table. I think it will be pretty validating (or a really powerful gut-check, but let’s be optimistic).
Admittedly, I was never much interested in process, quality measures and things like that. But I never understood the impact of QA like I do right now. So, I’m going to start with what I know and rely on the better practices I saw at CTC (and specifically ADL, which has probably the strongest quality process I’ve ever seen). The first thing I can do with minimal buy-in is on the tech side: I need to define a consistent methodology for tracking bugs and defects in these projects that can be shared, archived and searched quickly and easily. So I’m going to evaluate BugZilla on my own this weekend, and if anyone has ideas on other tools that are pretty much off-the-shelf, please comment on this post. I can certainly build my own php/mysql bug tracking solution that would be adequate, but it’d sure save me some time if I didn’t have to build it.
Basically, what I need is to be able to track each page/screen of a content object, as well as global issues relating to the content object as a whole. The issues can range from technical (e.g. content not initializing the SCORM API) to grammatical to instructional (e.g. the branching of a given scenario needs to be redefined). I also need to be able to organize people around a tool, or customize the tool’s use around the QA team’s roles.
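To make that concrete, here’s a rough sketch of the ticket shape I have in mind, in plain JavaScript. Every field and category name here is my own invention for illustration, not something pulled from BugZilla or any other tool:

```javascript
// A ticket is tied to a content object, and either to one screen of it
// or (when screen is omitted) to the object as a whole.
function createTicket({ contentObject, screen = null, category, description }) {
  const validCategories = ["technical", "grammatical", "instructional"];
  if (!validCategories.includes(category)) {
    throw new Error(`Unknown category: ${category}`);
  }
  return {
    contentObject,                                // e.g. a course or SCO id
    screen,                                       // null means a global issue
    scope: screen === null ? "global" : "screen", // derived, for filtering
    category,
    description,
    status: "open",
    created: new Date().toISOString(),
  };
}

// A screen-level technical issue:
const t1 = createTicket({
  contentObject: "course-101",
  screen: "page-04",
  category: "technical",
  description: "Content not initializing the SCORM API",
});

// A global instructional issue on the same course:
const t2 = createTicket({
  contentObject: "course-101",
  category: "instructional",
  description: "The branching of the main scenario needs to be redefined",
});
```

The point of the derived `scope` field is that a reviewer can pull up everything open on “page-04” or everything global on the course with a single filter, which is the searchability I’m after.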
A must-have is a way to integrate with the content so that each page can launch a contextually relevant feedback form, letting any reviewer enter and view tickets related to the current screen.
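One way that could work, sketched in JavaScript: each page builds a link to the tracker’s new-ticket form with the current course and screen baked in as query parameters, so the form arrives pre-filled. The tracker URL and parameter names below are made up for the sake of the sketch:

```javascript
// Build a new-ticket URL that carries the page context along.
function feedbackUrl(trackerBase, contentObject, screen) {
  const params = new URLSearchParams({
    content_object: contentObject,
    screen: screen,
  });
  return `${trackerBase}/new-ticket?${params.toString()}`;
}

// In the content itself, a reviewer-only button could then call something like:
//   window.open(feedbackUrl("https://tracker.example.com", courseId, pageId));
const url = feedbackUrl("https://tracker.example.com", "course-101", "page-04");
// → "https://tracker.example.com/new-ticket?content_object=course-101&screen=page-04"
```

The same two parameters could drive the “view tickets for this screen” side: the tracker just filters on `content_object` and `screen` when the form page loads.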