
VSA: tracking MR review iterations

Context

When creating a Value Stream in our Value Stream Analytics, we're given a limited set of Start and End events.

We have access to the following Start events related to Merge Requests:

  • created
  • first assigned
  • first commit time
  • first deployed to production
  • label was added
  • label was removed
  • last build finish time
  • merged
  • reviewer first assigned

Even so, these events give us little insight into the natural flow of MRs that go through multiple reviews and reviewers.

Possible states of a MR Review

For awareness, these are the current possible states of a MR review:

    value 'UNREVIEWED', value: 'unreviewed',
      description: 'Awaiting review from merge request reviewer.'
    value 'REVIEWED', value: 'reviewed',
      description: 'Merge request reviewer has reviewed.'
    value 'REQUESTED_CHANGES', value: 'requested_changes',
      description: 'Merge request reviewer has requested changes.'
    value 'APPROVED', value: 'approved',
      description: 'Merge request reviewer has approved the changes.'
    value 'UNAPPROVED', value: 'unapproved',
      description: 'Merge request reviewer removed their approval of the changes.'
    value 'REVIEW_STARTED', value: 'review_started',
      description: 'Merge request reviewer has started a review.'

Defined in https://gitlab.com/gitlab-org/gitlab/-/blob/master/app/graphql/types/merge_request_review_state_enum.rb#L8
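For quick reference, the same states can be captured as a plain set, e.g. for validating webhook payloads or exported data against the enum (a local sketch, not GitLab code):

```ruby
# The six review states from the enum above, as their underlying values.
REVIEW_STATES = %w[
  unreviewed reviewed requested_changes approved unapproved review_started
].freeze

# Hypothetical helper: check whether an arbitrary input is a known state.
def valid_review_state?(state)
  REVIEW_STATES.include?(state.to_s)
end

valid_review_state?(:approved)    # => true
valid_review_state?('dismissed')  # => false
```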

Need

When drilling into the value delivered through these reviews, we'd want to know:

  • time between the request of the review and the publishing of the outcome of that review
  • number of review cycles per MR
  • ...?
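Concretely, given timestamped review events for a single MR, both metrics above could be derived like this (a sketch; the event shape and field names are assumptions, not an existing API):

```ruby
require 'time'

# Hypothetical event stream for one MR: each review cycle is a
# :requested event followed (eventually) by a :published event.
events = [
  { action: :requested, at: Time.parse('2024-05-01 09:00 UTC') },
  { action: :published, at: Time.parse('2024-05-01 15:30 UTC') },
  { action: :requested, at: Time.parse('2024-05-02 10:00 UTC') },
  { action: :published, at: Time.parse('2024-05-02 12:00 UTC') }
]

# Pair each request with its published outcome to get per-cycle durations.
cycles = events.each_slice(2).map do |requested, published|
  published[:at] - requested[:at] # seconds between request and outcome
end

review_cycles = cycles.size                   # number of review cycles
avg_hours = cycles.sum / cycles.size / 3600.0 # mean time per review, in hours
puts "#{review_cycles} cycles, avg #{avg_hours} h per review"
```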

Proposal

Expose a series of events pertaining to the MR Review itself.

  • MR Review requested: when a reviewer is assigned to the MR
  • MR Review published: when a reviewer submits their review (with a comment, by requesting changes, or by approving).
  • what else...?
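If both events were exposed, they could back VSA stages directly. A sketch of two hypothetical stage definitions (the event names are invented for illustration and do not exist in GitLab today):

```ruby
# Hypothetical VSA stages built from the two proposed events.
REVIEW_STAGES = [
  {
    name: 'Review turnaround',
    start_event: :merge_request_review_requested, # reviewer assigned
    end_event: :merge_request_review_published    # review outcome submitted
  },
  {
    name: 'Post-review rework',
    start_event: :merge_request_review_published, # outcome received
    end_event: :merge_request_review_requested    # next review cycle begins
  }
].freeze

REVIEW_STAGES.map { |s| s[:name] }
# => ["Review turnaround", "Post-review rework"]
```

The second stage is what would make the number of review cycles per MR observable: each "published → requested" transition marks the start of another cycle.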

Doubts/Open questions

What's the best way to structure these events so they let us answer the questions in the Need section above?

Note about labels

For now, opening this with the group::code review label, as we're interested in supporting this as a Technical Roadmap effort that lets us better understand our teams' metrics. But if it's better moved to devops::plan / group::optimize, by all means.

Edited by André Luís