
MvF Author Guide

How to write a Methodology Verification Framework — structure, requirements, and submission process.

Who is this guide for?

This guide is for environmental scientists, methodology experts, and organizations proposing new dMRV methodologies for the Carrot Network. It walks through the process of writing a Methodology Verification Framework (MvF) from scratch.

Prerequisites

MvF Minimum Structure Reference

For the complete specification of what an MvF must contain — section by section — see the MvF Minimum Structure reference.

Before writing an MvF, you should have:

  • Domain expertise in the environmental claim your methodology addresses
  • Understanding of dMRV concepts and the Carrot dMRV Standard
  • Familiarity with the methodology catalog and existing methodologies
  • Knowledge of relevant international standards (e.g., UNFCCC CDM methodologies, IPCC guidelines)

Deliverables

An MvF submission includes four deliverables:

  1. Framework document — The complete specification defining scope, eligibility, validation rules, formulas, and data requirements.
  2. Traceability matrix — A structured mapping from every verification requirement to its data source and expected output.
  3. Verification guidelines — Instructions for Network Integrators on what data to collect and how to submit MassID documents.
  4. Evidence policy — A per-event specification of required evidence artifacts, provenance and retention expectations, digital vs. audit acceptance, and escalation triggers.

Step 1: Define the framework

Start by defining the core boundaries of your methodology:

  • Scope — Which waste types, treatment methods, and geographic regions does the methodology cover? Be specific about included and excluded materials.
  • Eligibility criteria — Which participants, facilities, and activities qualify? Define requirements for waste generators, haulers, processors, and recyclers.
  • Environmental claim — What is the measurable impact? (e.g., tons of waste diverted, CO2e prevented)

The scope definition must not overlap with existing methodologies. See the Colliding Methodologies policy.
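As a sketch, the boundary decisions above can be captured as structured data before being written up in prose. All field names and values here are hypothetical illustrations, not part of the MvF specification:

```python
# Hypothetical sketch of Step 1 boundary definitions as structured data.
# Field names and values are illustrative only, not part of the MvF spec.
framework_scope = {
    "scope": {
        "waste_types": ["PET bottles", "HDPE containers"],  # included materials
        "excluded": ["multilayer plastics"],                # explicitly excluded
        "treatment_methods": ["mechanical recycling"],
        "regions": ["BR"],                                  # geographic coverage
    },
    "eligibility": {
        "recyclers": "licensed facilities with valid accreditation",
        "haulers": "registered transporters with identified drivers",
    },
    "environmental_claim": {
        "metric": "tons of waste diverted from landfill",
        "unit": "t",
    },
}

def covers(scope: dict, waste_type: str, region: str) -> bool:
    """Check whether a shipment falls inside the methodology's boundaries."""
    s = scope["scope"]
    return waste_type in s["waste_types"] and region in s["regions"]
```

Writing the boundaries down this explicitly makes the later collision check concrete: two methodologies collide when their scope definitions can both cover the same shipment.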

Step 2: Specify validation rules

Each verification check must be defined as a rule with clear PASSED/FAILED criteria:

  • Rule name — A descriptive identifier (e.g., weighing, driver-identification)
  • What it verifies — The specific data element or condition being checked
  • PASSED condition — The exact criteria that must be met, with no ambiguity
  • FAILED condition — What causes the rule to fail, with specific error explanations

Reference existing framework rules in the methodology catalog for patterns. Many common verifications (actor identification, weighing, geolocation) already have established framework-level definitions that can serve as templates.

Rules fall into three categories:

  • Structure rules — Verify formal integrity and data completeness (e.g., required fields present, correct units). These are methodology-independent.
  • Methodology rules — Verify conformity with the methodology's criteria and parameters (e.g., eligible waste type, distance within project boundary).
  • Audit rules — Verify cross-event consistency and behavioral patterns that require deeper analysis (e.g., cumulative mass limits, duplication checks).

For the full 12-field rule specification, see Validation Rules & Controls.
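A minimal sketch of how a rule's fields and deterministic PASSED/FAILED evaluation fit together (the field names below are a simplified illustration, not the full 12-field specification):

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical, simplified rule record; the real MvF spec defines
# 12 fields (see Validation Rules & Controls).
@dataclass
class ValidationRule:
    name: str                      # descriptive identifier, e.g. "weighing"
    category: str                  # "structure" | "methodology" | "audit"
    verifies: str                  # data element or condition being checked
    check: Callable[[dict], bool]  # deterministic PASSED condition
    failure_reason: str            # specific explanation emitted on FAILED

    def evaluate(self, event: dict) -> str:
        return "PASSED" if self.check(event) else f"FAILED: {self.failure_reason}"

# Example: a structure rule requiring a positive net weight in kilograms.
weighing = ValidationRule(
    name="weighing",
    category="structure",
    verifies="net weight is present, positive, and expressed in kg",
    check=lambda e: e.get("unit") == "kg" and e.get("net_weight", 0) > 0,
    failure_reason="net weight missing, non-positive, or wrong unit",
)
```

The key property to aim for is that `check` is a pure function of the submitted data: two reviewers running the same rule on the same event must always reach the same verdict.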

Step 3: Create the traceability matrix

The traceability matrix links every verification requirement to its data source and expected outcome:

| Requirement | Data source | Expected output |
| --- | --- | --- |
| Waste generator is identified | MassID OPEN event | PASSED: waste origin consistent with events |
| Weight is verified | MassID WEIGHING event | PASSED: net weight validated |

This matrix ensures completeness — every requirement has a corresponding data source and verifiable outcome. The MvA will later implement each requirement as executable application rules.
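The completeness property is mechanical enough to check automatically. A sketch, treating matrix rows as plain records (the real matrix is a deliverable document, not code):

```python
# Hypothetical rows mirroring the traceability matrix above.
matrix = [
    {"requirement": "Waste generator is identified",
     "data_source": "MassID OPEN event",
     "expected_output": "PASSED: waste origin consistent with events"},
    {"requirement": "Weight is verified",
     "data_source": "MassID WEIGHING event",
     "expected_output": "PASSED: net weight validated"},
]

def completeness_gaps(rows: list[dict]) -> list[str]:
    """Return the requirements that lack a data source or expected output."""
    return [r["requirement"] for r in rows
            if not r.get("data_source") or not r.get("expected_output")]
```

An empty result means every requirement is traceable end to end; any returned requirement is a gap that must be closed before submission.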

Step 4: Write verification guidelines

Verification guidelines tell Network Integrators what data to collect and submit:

  • Which supply chain events are required (pick-up, drop-off, weighing, recycling)
  • What attributes each event must include
  • Which documents and attachments are needed (manifests, accreditation certificates)
  • Data format and submission requirements via the API
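The required-events portion of the guidelines amounts to a checklist an integrator can verify before submitting. A sketch, with illustrative event names (the actual event set is defined by your methodology):

```python
# Hypothetical required supply-chain events for a MassID submission.
# Event names here are illustrative placeholders.
REQUIRED_EVENTS = {"PICK_UP", "DROP_OFF", "WEIGHING", "RECYCLING"}

def missing_events(submitted: set[str]) -> set[str]:
    """Events still required before a MassID submission is complete."""
    return REQUIRED_EVENTS - submitted
```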

Step 5: Submit for review

Submit the complete MvF package (framework document, traceability matrix, verification guidelines, and evidence policy) to the Carrot Foundation for review.

The review process includes:

  1. Completeness check — Verifying all deliverables are present and well-structured
  2. Standard compliance — Ensuring alignment with the Carrot dMRV Standard
  3. Community review — The Community of Experts evaluates scientific rigor, feasibility, and potential collisions
  4. Approval — The Foundation approves the methodology for MvA development

Quality criteria

A strong MvF submission demonstrates:

  • Completeness — Every aspect of the verification is specified, with no gaps in the traceability matrix
  • Clarity — Rules are unambiguous and can be implemented as deterministic code
  • Testability — Every rule can be validated with concrete test cases
  • No collision — The scope does not overlap with existing methodologies
  • Standard compliance — All Carrot dMRV Standard requirements are met

These five criteria summarize the key expectations. For the complete six-dimension quality framework used during MvF accreditation — including the evaluation process, review cycles, and non-conformity typology — see Quality Criteria & Accreditation.
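To illustrate the testability criterion: a well-specified rule can be pinned down by concrete input/verdict pairs. A sketch using a hypothetical weighing condition (field names are illustrative):

```python
# Hypothetical PASSED condition for a weighing rule, with concrete
# test cases that make its behavior unambiguous.
def weighing_passes(event: dict) -> bool:
    return event.get("unit") == "kg" and event.get("net_weight", 0) > 0

cases = [
    ({"unit": "kg", "net_weight": 250.0}, True),   # valid weighing
    ({"unit": "kg", "net_weight": 0},     False),  # zero weight
    ({"unit": "lb", "net_weight": 250.0}, False),  # wrong unit
    ({},                                  False),  # missing fields
]
```

If you cannot write a table like this for a rule, its PASSED/FAILED criteria are not yet unambiguous enough to implement as deterministic code.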

Feedback and methodology proposals: method@carrot.eco

Learn about MvF · MvA Developer Guide
