Posts Tagged ‘DoD’

Analytics

The short answer is yes – the product and team will definitely benefit from having web/app analytics tracking as part of the definition of done (DoD).

Typically, a separate analytics tracking story should only be written and played in one of two scenarios:

  1. There’s no existing analytics tracking, so there’s tracking debt to deal with, including the initial API integration
  2. A migration from one analytics provider to another

It’s important to bake analytics tracking into the feature’s actual acceptance criteria/DoD so that:

  1. It doesn’t get forgotten
  2. It forces analytics tracking to be included in the MVP and each product iteration by default
  3. It drives home that having tracking attached to a feature before it goes live is just as important as QAing, load testing, regression testing or code reviews
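To make "feature has analytics tracking" a testable acceptance criterion rather than an afterthought, the tracking call can live inside the feature code itself. Here's a minimal sketch: the `AnalyticsClient`, `submitSignupForm` and the `signup_submitted` event name are all hypothetical stand-ins – a real project would wrap its provider's SDK instead of an in-memory list.

```typescript
// Hypothetical in-memory analytics client. In production this would
// delegate to the analytics provider's SDK rather than store events locally.
type AnalyticsEvent = { name: string; properties: Record<string, unknown> };

class AnalyticsClient {
  readonly events: AnalyticsEvent[] = [];

  track(name: string, properties: Record<string, unknown> = {}): void {
    this.events.push({ name, properties });
  }
}

// Hypothetical feature: the tracking call is part of the feature's own
// code path, so the DoD item "feature has analytics tracking" can be
// verified with an ordinary test against the acceptance criteria.
function submitSignupForm(analytics: AnalyticsClient, email: string): boolean {
  const ok = email.includes("@");
  analytics.track("signup_submitted", { success: ok });
  return ok;
}
```

Because the event is fired from within the feature, QA can assert on it the same way they assert on any other acceptance criterion, instead of discovering missing tracking after launch.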

Unless you can measure a feature’s impact, it’s hard to celebrate success, prove or disprove the hypothesis, or know whether it delivered any business value. The purpose of product development isn’t to deliver stories or points; it’s to deliver outcomes.

Having a data-driven strategy isn’t the future, it’s now. The advertising industry adopted this analytics tracking philosophy over two decades ago, so including analytics tracking within the DoD will only help set the product and team in the right direction.

Done

Once the product backlog is in good shape and the product backlog items (PBIs) start moving into development, there’s a significant number of tasks to tick off before a feature can be marked as ‘done’.

Typically, a development team uses a ‘definition of done’ (DoD) as a checklist to ensure that no step gets missed before a feature is declared ‘done’. Each of those steps is essential, and skipping one could have considerable consequences for the business and its customers.

Some examples of what could be in a definition of done:

  • Code is reviewed by someone who didn’t work on the PBI
  • Code is deployed to test environment
  • Feature is tested against acceptance criteria
  • Feature passes regression testing
  • Feature passes smoke test
  • Feature is documented
  • Feature has analytics tracking
  • Feature is approved by the UX designer / stakeholder
  • Feature is approved by the Product Owner

Skipping any of the DoD steps before a feature is delivered to customers could result in critical bugs in the feature, knock-on bugs in other parts of the code, the product going down, or the wrong requirement being delivered. It’s essential to take the definition of done seriously, even if that means carrying the PBI over to the next sprint and potentially missing a sprint goal.