
Tuesday, September 2, 2014

Hard sh** and smart sh**

Great GSD ("Get Sh.. Done," as the central poster in the ApexSQL Nis office puts it) means consistently having both short-term and long-term transparent results

Some teams work using Scrum, organizing around goals (the answer to "what?") and tasks (the answer to "how?"), while others have monthly goals / quotas to achieve by making daily and weekly progress in smaller increments

We often lose sight of the goals while focusing exclusively on finishing the tasks at hand no matter what, forgetting what is actually needed in the end and by when

Now I'm not saying that focusing on daily tasks is wrong - one of my own email quotes from the early days said:
"Focused, hard work is the real key to success. Keep your eyes on the goal, and just keep taking the next step towards completing it." - John Carmack
 
We must keep the end goal in mind and have daily deliverables at all times

But in order to grow, besides working hard we must also stay cognizant of the end goal at all times and be able to work smart to achieve it

GSD has two levels:
   1) Hard Sh**: daily deliverables, the things we must each do our part on and complete in order to make daily progress, e.g. fix stubborn bugs, test bug fixes and report new bugs, write TS articles for unfixed bugs, publish articles, communicate with stakeholders, share your progress daily, etc.
   2) Smart Sh**: make a dent in the universe by aligning all action with the end goals and completing them with a reasonable amount of effort in a reasonable amount of time

How does Smart Sh** translate to your everyday work?
   a) Dev teams should realize that customers have no use for 50 bug fixes sitting in code and would much rather have 20 bug fixes in a product build they can actually download - know when to cut and deliver.
   b) SQA should invest time to understand what makes the product useful, to learn it and use it the way customers do in order to test it well, to never assume they know everything, to write about it in a non-hamburger-helper way, and to excel in customer support.
   c) Everyone should invest time into ERF mentorship with new colleagues so that they can start contributing back and, in turn, save you time.
   d) Everyone should automate repetitive work - currently a huge soft spot in multiple teams.

What does automation have to do with working smart?
A lot. One of the funnier historical examples is several devs going rogue and writing software that fully automates sending Daily Scrum Summaries. While other teams were spending hours every week manually compiling good Scrum Summary emails, these devs took one day and created a two-click solution that in turn saved them days of time. I always received spotless Scrum Summaries from their teams, which was a mystery to me until I found out about the "plot" ;) (the devs are good guys - they were planning to distribute the software to everyone once we stopped sending these emails daily)
 
Time is a limited commodity, so by investing some of it to automate repetitive work (smart vs. hard), the devs gained more time to focus on what really matters: writing better code and delivering great products to the customers in order to make a dent in the universe. I'm also sure they had more fun in the process ;)
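To make the idea concrete, here is a minimal Python sketch of what such an automation could look like - compile everyone's status lines into one summary and email it in a single step. The team statuses, addresses and SMTP host are hypothetical placeholders, not the actual tool those devs built:

    # Minimal sketch of automating a Daily Scrum Summary email.
    # All names, addresses and the SMTP host are hypothetical placeholders.
    import smtplib
    from datetime import date
    from email.message import EmailMessage

    def build_summary(team_statuses):
        """Compile per-person status lines into one summary body."""
        lines = [f"Daily Scrum Summary - {date.today():%Y-%m-%d}", ""]
        for person, status in team_statuses.items():
            lines.append(f"{person}: {status}")
        return "\n".join(lines)

    def send_summary(body, sender, recipients, smtp_host="smtp.example.com"):
        """Send the compiled summary through a (placeholder) SMTP server."""
        msg = EmailMessage()
        msg["Subject"] = f"Daily Scrum Summary - {date.today():%Y-%m-%d}"
        msg["From"] = sender
        msg["To"] = ", ".join(recipients)
        msg.set_content(body)
        with smtplib.SMTP(smtp_host) as server:
            server.send_message(msg)

    if __name__ == "__main__":
        statuses = {
            "Dev A": "fixed a stubborn bug, started on the next feature",
            "Dev B": "tested the patch build, reported two new issues",
        }
        print(build_summary(statuses))  # preview; call send_summary(...) to actually send

Even a throwaway script like this turns an hour of weekly copy-pasting into a two-click routine, so the one-day investment pays for itself almost immediately.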

Ultimately, whatever we do, we must ask ourselves:
   A) Can I complete the task at hand more efficiently with less effort and time?
   B) What goal will be closer to completion when I finish this task today?
   C) What will I deliver to customers when the goal is completed?
   D) Will the customers pay me for the deliverable?
 
If no one will pay you in the end, why would you do it in the first place? ;)

Thursday, November 1, 2012

Optimize for success

"With great power comes great responsibility" - Stan Lee

Self-management can be a double-edged sword without discipline and a plan. Many teams, including QA, now have SMART goals to guide you; however, you must guide yourself on a daily basis in order to meet the monthly SMART goal expectations

Let's focus on the latest real-life use case:
   1) The weekly testing plan was defined: use up to 2 hours per day to test patches and focus on testing our new enterprise product for the remainder of the day
   2) The weekly testing plan was refined and approved - by the end of this week we'll have the 2nd testing round of the enterprise product wrapped up
   3) The week is almost over; however, the enterprise product's 2nd testing round has only just started and will slip by 3-4 days

What happened? Here's what I heard:

   A) "We had too many patches to test and this took a lot of time"
These patches were already planned for in the weekly test plan, so they are not an excuse to violate it. Our goal isn't to fully regression test each patch and verify its readiness for production, but to verify the few (usually 1-2) bug fixes, spot-test core functionality and get the patch out to the customers who requested it

   Solution: dedicate a fixed chunk of time to testing each patch build, up to 1 hour; first verify the fixed bugs, then spot-test core functionality and finally send the patch testing summary. If the whole QA team found no core functionality issues during this time, it is highly unlikely that the single customer who requested the patch will find any; if there are new issues, we'll quickly re-patch without wasting much time

   B) "We had too many Support team forwarded cases that we stuck with for a long time in order to not forward them to developers"
What I'm seeing is ~82% of Support-forwarded cases handled within the QA team, which is well above the 50% SMART goal; although this will be measured more precisely soon and is also an important goal, you must optimize your time better - spending 4-5 hrs stuck on a single support issue is overkill and will inevitably cause your other SMART goal (Zig score) to suffer and our products to be late to production due to testing delays

   Solution: dedicate a fixed chunk of time to sticking with a support issue, especially if you are already over the 50% SMART goal expectation for the month. Discover your own point of diminishing returns and use it to balance the SMART goals when you are making no progress with the support case at hand

   C) "We had many ad-hoc issues that piled up and took more time than actual testing, developers needed help with specific bugs, new team member needed guidance, there were team planning meetings, bugs needed priority corrected as we updated severity guidelines, Skyfall is premiering in the movies this week"

   Solutions:
   a) Dedicate a fixed chunk of time to the team planning meeting (learn from the Daily Scrum) - 30 min max
   b) When creating bugs, explain them in more detail so devs don't have to ask you to clarify them - this will save time for both you and the devs in the long run
   c) Don't go see the new James Bond movie until you have achieved the daily SMART goal of at least 50 Zigs
   d) Before finishing the workday, stop for a minute and ask yourself: "Have I met all my daily SMART goal expectations and are we on track with the weekly plan?" If the answer is yes, go ahead and have a scary night watching Paranormal Activity 4

Monday, October 8, 2012

Comprehensive test plan - PACT

A SMART goal provides much more internal team organization flexibility compared to Scrum, where the Product Owner defines clear individual goals and expectations; however, increased flexibility without planning can also cause disorganization and make your life more complicated

There is no single Product Owner for SMART, so why not treat a weekly team agreement / weekly plan as the "Product Owner" that clearly guides the "whats" (incremental goals, or "what needs to be done") in the team?

Define a weekly Priorities, Allocation, Continuity and Thresholds plan / a PACT to guide you as a team

Note that the points below are guidelines; you still need to define and send your own weekly PACT plan and table

Priorities
Every bug found brings you one step closer to accomplishing your SMART goal; however, some bugs need to be found before others:
   1) What are the main test priorities you need to work on? Check the production schedule for which products are expected to be delivered to testing; contact the developer teams for direct feedback in case there are unplanned changes to the schedule you can't see; also remember JIT

   2) How to prioritize main product testing? Focus on product ROI: enterprise products first, then developer tools and finally community (free) tools (see the prioritization sketch after this list)

   3) Any ad-hoc test priorities? Test patches and engines before regular product releases as this can usually be completed quickly; engines are usually needed for a specific main product release

   4) Quick-testing: always break away from the current task to check new installers and website content
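
As a rough illustration of the ordering rules above, here is a minimal Python sketch that sorts a hypothetical test queue so that patches and engines come before regular releases, and regular releases are ordered by product ROI tier; all product names and tier labels are made up for the example:

    # Minimal sketch of ordering a test queue by the priority rules above.
    # Build types, tiers and product names are hypothetical placeholders.

    BUILD_TYPE_RANK = {"patch": 0, "engine": 1, "release": 2}      # lower = test sooner
    PRODUCT_TIER_RANK = {"enterprise": 0, "developer": 1, "community": 2}

    def priority_key(build):
        """Patches/engines first, then regular releases ordered by ROI tier."""
        return (BUILD_TYPE_RANK[build["type"]], PRODUCT_TIER_RANK[build["tier"]])

    test_queue = [
        {"name": "Community tool C 1.2", "type": "release", "tier": "community"},
        {"name": "Enterprise product A 5.0", "type": "release", "tier": "enterprise"},
        {"name": "Enterprise product A patch", "type": "patch", "tier": "enterprise"},
        {"name": "Engine build for product A", "type": "engine", "tier": "enterprise"},
    ]

    for build in sorted(test_queue, key=priority_key):
        print(build["name"])
    # Enterprise product A patch
    # Engine build for product A
    # Enterprise product A 5.0
    # Community tool C 1.2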

Allocation
There are a finite number of test engineers and so many products and features to test. Parallelism is our enemy: we don't need test summaries for 4 products at once, but one test summary at a time, as soon as possible
   1) Unless you have a strong reason to believe you can improve efficiency by splitting the team to work in parallel on different test deliveries, have everyone test one deliverable at a time

   2) Make testing of a single deliverable's feature set circular in order to reduce the number of false negatives (and increase the number of Zigs) - tester A tests feature A while tester B tests feature B; then tester A tests feature B while tester B tests feature A (see the rotation sketch after this list)
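
A minimal Python sketch of that circular rotation, assuming each round simply shifts every tester to the next feature (tester and feature names are placeholders):

    # Minimal sketch of a circular (round-robin) test assignment.
    # Tester and feature names are hypothetical placeholders.

    def circular_assignments(testers, features):
        """Return one {tester: feature} map per round; across rounds each tester covers every feature."""
        rounds = []
        for offset in range(len(features)):
            assignment = {
                tester: features[(i + offset) % len(features)]
                for i, tester in enumerate(testers)
            }
            rounds.append(assignment)
        return rounds

    for n, assignment in enumerate(circular_assignments(["Tester A", "Tester B"],
                                                        ["Feature A", "Feature B"]), start=1):
        print(f"Round {n}: {assignment}")
    # Round 1: {'Tester A': 'Feature A', 'Tester B': 'Feature B'}
    # Round 2: {'Tester A': 'Feature B', 'Tester B': 'Feature A'}

This way a bug one tester misses in a feature gets a second, fresh pair of eyes in the next pass.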

Continuity
We can have between 1 and 5 product test rounds. The first test round is usually the one with the most low-hanging Zigs to pick and put in your Zig basket. However, the fifth test round is just as important as the first one, even though "what" to test is different
   1) Plan for the first test round to be the longest, especially if you have a completely new product to test; always specify how long the testing will last, as there won't be a second chance to [regression] test all features from scratch

   2) Push the testing into the next week as necessary but always specify why

   3) Subsequent test rounds should be short but long enough to cover all fixes and changes made by the developers since the last test summary

   4) The final round is always #3 (#5 for new products) - no matter what you find there, the product will be released, so think twice before deciding how much time to spend on this one, as you cannot extend the testing further

Thresholds
You have 3 new product builds to plan testing for, but how will you know when to stop testing one and move on to the next?

Actually, I'd like to hear some of your suggestions here and then I'll update this post; there are many ways to define thresholds in testing, but also don't forget that you have a SMART goal that must be achieved each month, as it is reset to 0

Testing the hell out of one product because it has easy-to-find Zigs while only glancing through another is not an option, as we will easily end up with false negatives for the latter: the bugs are always there, since developers inadvertently ensure this is true - you just need to find them