I've always been particularly observant of my surroundings. As a QA Analyst, I put this skill to good use by seeking out flaws in digital products. I also specialize in documenting defects, writing organized test cases, and suggesting usability improvements.
Below is the history of my experience working in Quality Assurance. Also, check out my Top Ten QA Tips.
Pearson is an expansive company that creates digital and print solutions for K-12 schools and higher education. I led the Courseware QA Team, which acted as "the last line of defense" for checking digital products for higher education.
My primary goal was to oversee the quality of our digital courseware products by implementing black box testing. When a new course was ready for testing, I would communicate with the Digital Producers and the Courseware QA Team to perform rounds of functional or regression testing on the course.
Because new digital textbooks and courses were constantly being released, the position also required proficiency in project management. When the workload grew beyond what the QA team could handle, we partnered with external QA vendors to perform some of the testing. Most of this communication was handled in Jira.
In this unique "QA/PM" hybrid role, I enjoyed overseeing multiple projects and ensuring that all questions were answered and everything ran smoothly. Since our team was growing rapidly, I also created a Process Guide that listed the steps and strategies for handling each round of QA.
American Eagle is an agency that creates websites and other web-based solutions for clients on a case-by-case basis. The company is well-known for their e-commerce sites and their comprehensive content management system.
In my role at Americaneagle.com, I learned to balance multiple projects while spotting patterns across the many websites I tested, which helped me test more efficiently.
I also tested customer change requests. Working across so many different websites broadened my exposure to various web designs and helped me develop a sense of what matters in e-commerce.
Blackboard is a company that creates web portal solutions for schools and colleges. I primarily worked with "Blackboard Engage" (also known as Edline), which is specifically designed for K-12 schools.
The first few months at Blackboard (Edline) were spent in a Technical Support role. This arrangement was a unique way to learn how the product worked and gain insight into the customer experience.
In addition to my normal testing duties, I took an interest in the usability of the product. Thanks to a close relationship with Technical Support and our customers, I regularly suggested usability enhancements to the development team that could improve the user experience.
Much of my time was spent working with developers, testing new builds, and logging defects in systems such as Jira and Bugzilla. I regularly used several testing techniques, including unit, integration, compatibility, usability, and especially regression testing.
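To make the regression testing mentioned above concrete, here is a minimal sketch using Python's built-in unittest module. Everything in it is illustrative: apply_discount is a hypothetical function under test, not code from any product I worked on.

```python
import unittest

def apply_discount(price, percent):
    """Hypothetical function under test: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class DiscountRegressionTests(unittest.TestCase):
    """Re-run these after every new build to confirm old behavior still holds."""

    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.00, 20), 80.00)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(apply_discount(59.99, 0), 59.99)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.00, 150)

# Run with: python -m unittest <this_file>
```

The point of a suite like this is repeatability: the same checks run against every build, so a fix in one area that quietly breaks another gets caught immediately.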
Having a clear QA plan of action is important, but keep it simple. Do not get lost in endless process "red tape."
Once a development/QA process has been decided on, follow up by writing simple documentation. Having the process in writing ensures that everyone is on the same page.
Create an organization chart of how the development team is structured. This helps clear up confusion on how each colleague relates to the project.
Write clear documentation for each project that states its goals, wireframes, specs, and the personnel involved.
Define the phases of a project's development and quality assurance. In an agile model, there is often room to go back and make changes, but define when and how this will occur.
Verify that everyone is actually following the established development/QA process. If not, take the time to identify which issues need to be addressed.
Check in with the development team periodically and ask whether the day-to-day maintenance of the project could be improved. Such "technical debt" should be addressed before it snowballs into an overall loss of quality.
Define a consistent priority system for defect and enhancement reports. A priority of "1" should mean exactly the same thing from one day to the next, and from one project to another.
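One lightweight way to keep priorities consistent is to define the scale once in shared tooling rather than in each tester's head. A minimal sketch using Python's Enum; the level names and their meanings are illustrative, not an industry standard.

```python
from enum import IntEnum

class Priority(IntEnum):
    """Shared defect priority scale: "1" means the same thing on every project."""
    P1_BLOCKER = 1   # release-blocking; drop everything and fix
    P2_CRITICAL = 2  # major feature broken, no workaround
    P3_MAJOR = 3     # feature impaired, workaround exists
    P4_MINOR = 4     # cosmetic or low-impact issue

def triage_label(p: Priority) -> str:
    """Render a consistent label for defect reports, e.g. in Jira summaries."""
    return f"P{p.value} ({p.name.split('_', 1)[1].title()})"
```

Because the scale lives in one place, a report filed as Priority.P2_CRITICAL carries the same urgency on every project, and triage_label produces the same wording everywhere.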
Decide which items to automate for testing. Automation pays off most for standard functionality that appears across multiple projects.
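The idea above can be sketched as a data-driven smoke test: a shared checklist of standard functionality run against every project. The check names, project names, and page contents below are all hypothetical stand-ins; in practice the pages would come from the sites under test.

```python
import unittest

# Hypothetical checklist of standard functionality shared by every project;
# checks like these, repeated across many sites, are good automation candidates.
STANDARD_CHECKS = {
    "login form present": lambda page: "login" in page,
    "search box present": lambda page: "search" in page,
    "contact link present": lambda page: "contact" in page,
}

# Stand-in for fetched page content from several projects.
PROJECT_PAGES = {
    "store-a": "login search contact",
    "store-b": "login search contact",
}

class StandardFunctionalitySmokeTest(unittest.TestCase):
    def test_shared_functionality(self):
        # subTest reports each (project, check) pair separately on failure.
        for project, page in PROJECT_PAGES.items():
            for name, check in STANDARD_CHECKS.items():
                with self.subTest(project=project, check=name):
                    self.assertTrue(check(page))

# Run with: python -m unittest <this_file>
```

Adding a new project then costs one entry in the data, not a new test suite, which is exactly why cross-project functionality is the best place to start automating.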
Do not spam the team with emails that do not pertain to them. This often discourages some employees from reading email at all.