Chapter 0 . Outcome & Value Realization
Why does any of this exist?
Ring 0 of RING:1000:2026 . The Core
Edition v0.1 . Draft for working group review . Lead author: Derris Taylor . Working group masthead pending ratification
1 . The Opening Forensic
In 2017, a Fortune 500 retailer announced a five-year, $2.4 billion technology transformation program. The program would consolidate seventeen legacy systems into a unified platform, modernize the data architecture, deliver real-time analytics to store managers, and generate, in the words of the announcement, "measurable lift across customer experience, supply chain efficiency, and operating margin." The press cycle was generous. The board ratified the budget. The CIO presented the program at three investor conferences in the next eighteen months.
In 2022, the program closed. The federation has reviewed the closeout report under non-disclosure terms with anonymized references. Of the seventeen legacy systems, eleven had been migrated. Four had been retired without replacement after a determination that the underlying business processes were themselves legacy. Two were still in production, with the migration path now under reconsideration. The unified platform existed and was operational. Real-time analytics had been deployed to a subset of stores.
The closeout report's value-realization analysis is the document the federation reads as the canonical Ring 0 forensic. The program had spent $2.1 billion against the original $2.4 billion budget, which by traditional execution metrics is a successful program. The value-realization analysis, conducted three years after the program's nominal completion, identified roughly $340 million in measurable revenue lift, $180 million in operating-margin improvement, and $90 million in cost-of-goods-sold reduction. The total measured value was approximately $610 million against $2.1 billion of investment.
The federation does not include this case to argue that the program failed. The program produced more than half a billion dollars of measurable value, modernized the data infrastructure, and produced organizational learning that informed subsequent investments. By many criteria the program was a success. The federation includes the case because the institution had no Ring 0 discipline that produced this analysis on a continuous cadence during the five years of execution. The $610 million figure was assembled retrospectively, after the program was nominally complete, through a six-month forensic exercise. During the program's execution, the institution's reporting was a series of milestone updates, capability deliveries, and progress percentages. None of those reports answered the Ring 0 question.
Why does any of this exist?
A practitioner reading this chapter will look at their own institution's largest active investment and ask whether the institution can answer the Ring 0 question, against that investment, today. If the answer requires a six-month forensic, the institution has not yet implemented Ring 0.
2 . The Doctrine
Ring 0 sits at the center of the methodology. It is the Core. The federation chose this position deliberately and held it through every working group review.
The Core is the value the other six rings exist to protect. Ring 6 denies the conditions that would compromise it. Ring 5 produces signal on its formation and exposure. Ring 4 attributes ownership of every component that supports it. Ring 3 enforces policy on every action that touches it. Ring 2 ensures every dollar that flows through it produces measurable work. Ring 1 governs the execution of every change that affects it. All six exist because the Core exists.
The doctrine of Ring 0 is the question.
Why does any of this exist? Every expenditure, every resource, every initiative, and every decision must ultimately answer one question: does this create measurable value, against a defined business outcome, that the institution can defend in front of the board, the investors, the regulators, and the field?
The phrasing stands exactly as the working group ratified it. The four audiences are deliberate. The Core's value claims must survive board review, investor scrutiny, regulatory inquiry, and the federation's professional review. Claims that survive only one or two are partial claims. Mature Ring 0 implementations produce claims that survive all four.
Three principles run through this chapter.
Value is what the institution earned. Spend is what the institution committed. The two are not the same and the institution must measure both with equal rigor. Most institutions measure spend with precision and value with adjectives.
Outcomes are upstream of activities. The institution does not measure activities and call them outcomes. A migration completed is an activity. A migration that produced measurable revenue lift, operating-margin improvement, or risk reduction is an outcome. Activity reporting that pretends to be outcome reporting is the most common Ring 0 failure mode.
Value realization is continuous. The institution does not assemble value claims retrospectively. Mature Ring 0 implementations measure value throughout the lifecycle of the investment, course-correct based on the measurement, and treat retrospective forensics as a forensic-of-the-forensic, not as the primary value claim.
3 . The Standard
Ten controls. Five mandatory. Three recommended. Two adaptive.
3.1 Value Stream Mapping
Category: Mapping. Enforcement: Mandatory.
Mapping costs to business value streams to understand true cost-to-value relationships. The control is the foundation of Ring 0 because value cannot be measured without first defining the value streams the institution operates.
A value stream is a sequence of activities that produces a defined business outcome. Customer acquisition is a value stream. Order fulfillment is a value stream. Product development is a value stream. Each stream has named owners, measurable inputs and outputs, and a defined contribution to the institution's overall outcomes.
Value stream mapping in the federation's standard requires the institution to publish a value-stream taxonomy, attribute every cost line in the institution's ledger to one or more value streams, and refresh the attribution continuously as new investments and activities come online. The mapping is the bridge between Ring 4 (Ownership & Attribution) and Ring 0. Ring 4 names who owns each cost; Ring 0 names which value stream each cost serves.
KPI. Value-stream attribution coverage. Target: 95 percent of costs attributed to one or more named value streams.
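The 3.1 coverage KPI can be sketched in a few lines. This is a minimal illustration, not a federation-specified implementation: the ledger shape, field names, and stream names are assumptions chosen for clarity.

```python
# Illustrative sketch of the 3.1 KPI: the fraction of ledger spend
# attributed to at least one named value stream. The ledger structure
# here is an assumption, not a federation-specified format.

def attribution_coverage(ledger):
    """ledger: list of dicts with 'amount' and 'streams' (a possibly empty list)."""
    total = sum(line["amount"] for line in ledger)
    attributed = sum(line["amount"] for line in ledger if line["streams"])
    return attributed / total if total else 0.0

ledger = [
    {"amount": 400_000, "streams": ["order-fulfillment"]},
    {"amount": 350_000, "streams": ["customer-acquisition", "product-development"]},
    {"amount": 250_000, "streams": []},  # unattributed: counts against the KPI
]

coverage = attribution_coverage(ledger)
print(f"{coverage:.0%} attributed; target is 95%")  # 75% attributed; target is 95%
```

Note that the KPI is weighted by dollars, not by line count: a single large unattributed line hurts coverage more than many small ones.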
3.2 ROI Tracking, Continuous
Category: Measurement. Enforcement: Mandatory.
Continuous tracking of return on investment for all significant expenditures. The control is the operating expression of the doctrine that value realization is continuous, not retrospective.
ROI in the federation's standard is not a single number computed at year-end. ROI is a continuous measurement that runs against every significant investment from the moment the investment is approved through its operational life. The measurement captures the investment's expected outcome (defined at approval per Ring 1.6 decision documentation), the realized outcome to date (measured continuously), and the variance between them.
The federation's reference threshold for "significant" is investments above the Ring 3 cost-override authority matrix's medium-tier band. Below the threshold, the institution may aggregate ROI by value stream rather than by individual investment. Above the threshold, each investment carries its own continuous ROI measurement.
KPI. ROI measurement coverage and freshness. Target: 100 percent of significant investments measured; freshness within one quarter of the most recent reporting period.
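A continuous ROI record per 3.2 carries three things: the expected outcome fixed at approval, the realized outcome to date, and the variance, plus a freshness check against the one-quarter target. The sketch below is illustrative; the field names and the 92-day freshness window are assumptions, not federation definitions.

```python
# Sketch of a 3.2 continuous ROI record: expected outcome at approval,
# realized outcome to date, variance, and a freshness check against the
# one-quarter target. Field names and the 92-day window are assumptions.
from datetime import date

def roi_record(expected_value, realized_to_date, invested_to_date,
               last_measured, today):
    variance = realized_to_date - expected_value
    roi = realized_to_date / invested_to_date if invested_to_date else 0.0
    days_stale = (today - last_measured).days
    return {
        "roi_to_date": round(roi, 2),
        "variance": variance,
        "fresh": days_stale <= 92,  # roughly one quarter
    }

rec = roi_record(
    expected_value=12_000_000,
    realized_to_date=7_500_000,
    invested_to_date=10_000_000,
    last_measured=date(2026, 1, 15),
    today=date(2026, 3, 1),
)
print(rec)  # {'roi_to_date': 0.75, 'variance': -4500000, 'fresh': True}
```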
3.3 Business Outcome KPIs
Category: Outcomes. Enforcement: Mandatory.
Linking financial metrics to business outcomes: revenue impact, customer value, operational efficiency. The control is the institution's commitment that financial reporting is anchored to business outcomes rather than to activities.
The KPIs the institution tracks under Ring 0 are not "deployment frequency" or "uptime" or "feature releases." Those are activity metrics. Ring 0 KPIs are revenue per customer, customer lifetime value, gross margin per product line, operating margin per business unit, and similar outcome-anchored measures. Each Ring 0 KPI is owned by a named executive, reported on the institution's standard cadence, and tied to a value stream from 3.1.
The federation publishes a reference KPI catalog with calibrated definitions and measurement methodologies. Practitioners pick the KPIs that match the institution's business model and document the choice for federation review.
KPI. Outcome-KPI coverage and methodology. Target: every value stream carries at least three outcome-anchored KPIs with documented methodology.
3.4 Value Realization Reviews
Category: Review. Enforcement: Mandatory.
Periodic reviews to verify that expected value from investments is being realized. The control is the operating cadence that converts ROI tracking and outcome KPIs into actionable institutional practice.
The reviews run on a published cadence. Quarterly for significant investments. Annual for the full investment portfolio. The review is structured: the investment's original expected outcome is read against the realized outcome to date, the variance is named, and the review produces one of three outputs: the investment is on track and continues, the investment is off-track and a course-correction is approved, or the investment is fundamentally not producing value and is sunset.
The federation's reference review template carries seven sections: original expected outcome, realized outcome to date, variance analysis, contributing factors, course-correction options, recommended decision, and the named decision-maker. The template is the institution's commitment that review meetings produce decisions, not status updates.
KPI. Review cadence and decision outputs. Target: 100 percent of significant investments reviewed quarterly with a documented decision.
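The seven-section review template can be reduced to a record whose decision field is constrained to the three outputs named above, which is what distinguishes a review from a status update. A minimal sketch, with field names that are illustrative rather than the federation's template:

```python
# Sketch of a 3.4 review record. The decision field is constrained to the
# three outputs named in the text; a record without a valid decision is a
# status update (failure mode M6). Field names are illustrative.
from dataclasses import dataclass

DECISIONS = {"continue", "course-correct", "sunset"}

@dataclass
class ValueRealizationReview:
    investment: str
    expected_outcome: str
    realized_outcome: str
    variance_analysis: str
    contributing_factors: str
    course_correction_options: str
    decision: str
    decision_maker: str

    def __post_init__(self):
        if self.decision not in DECISIONS:
            raise ValueError(f"decision must be one of {sorted(DECISIONS)}")

review = ValueRealizationReview(
    investment="store-analytics-rollout",          # hypothetical example
    expected_outcome="$40M revenue lift by Q4",
    realized_outcome="$18M lift to date",
    variance_analysis="-55% against plan",
    contributing_factors="adoption lag in pilot regions",
    course_correction_options="narrow rollout; retrain field staff",
    decision="course-correct",
    decision_maker="VP Retail Operations",
)
print(review.decision)  # course-correct
```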
3.5 Total Cost of Ownership
Category: Analysis. Enforcement: Mandatory.
Complete TCO analysis including hidden costs: migration, training, integration, operational overhead. The control is the institution's commitment that the cost number used in Ring 0 calculations is the actual cost, not the visible cost.
TCO in the federation's standard captures eleven cost categories: acquisition cost, implementation cost, integration cost, training cost, operational cost, support cost, security and compliance cost, opportunity cost, depreciation cost, exit cost, and risk-adjusted contingency cost. Most institutions track three to five of these categories with rigor and treat the others as overhead. The federation's standard requires explicit treatment of all eleven, with documented methodologies for each.
The control prevents the failure mode where investments appear cost-effective on a narrow accounting basis but are loss-making on a complete-cost basis. The Fortune 500 retailer forensic in Section 1 showed this pattern: the program's $2.4 billion budget covered visible costs, but the actual operational cost of the migration (training, parallel-system maintenance, integration overhead, exit costs of the retired systems) added another $700 million that did not appear in the original budget.
KPI. TCO completeness across the eleven categories. Target: every significant investment carries documented TCO analysis covering all eleven categories.
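A completeness check for the eleven categories is straightforward to mechanize: every category must carry an explicit figure (zero is allowed; absence is not). The category list below is taken from the text; the dollar figures are invented for illustration.

```python
# Sketch of the 3.5 completeness check. The eleven categories are quoted
# from the chapter; the figures are invented. A missing category is
# failure mode M5 and fails the analysis outright.

TCO_CATEGORIES = [
    "acquisition", "implementation", "integration", "training",
    "operational", "support", "security_and_compliance", "opportunity",
    "depreciation", "exit", "risk_adjusted_contingency",
]

def tco_total(analysis):
    missing = [c for c in TCO_CATEGORIES if c not in analysis]
    if missing:
        raise ValueError(f"incomplete TCO (failure mode M5): missing {missing}")
    return sum(analysis[c] for c in TCO_CATEGORIES)

analysis = dict.fromkeys(TCO_CATEGORIES, 0)   # explicit zeros, not omissions
analysis.update(acquisition=2_400, training=150, exit=90)  # $ thousands
print(tco_total(analysis))  # 2640
```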
3.6 Investment Portfolio Management
Category: Portfolio. Enforcement: Recommended.
Managing technology investments as a portfolio with diversification and risk management. The control extends 3.2 (ROI Tracking) and 3.4 (Value Realization Reviews) into a portfolio-level discipline.
A portfolio view evaluates the institution's full investment slate against three dimensions: risk distribution (concentration of investments in any single domain, vendor, or value stream), maturity distribution (the time-to-realization profile across investments), and outcome diversification (the spread of expected outcomes across the institution's strategic priorities). The portfolio is rebalanced on a published cadence, typically annually with a mid-cycle review.
KPI. Portfolio review cadence and rebalancing actions. Target: annual portfolio review with documented rebalancing decisions.
3.7 Value Erosion Detection
Category: Detection. Enforcement: Recommended.
Detecting when previously valuable investments begin losing their return. The control is Ring 0's recognition that value is not permanent. Investments that produced measurable returns in year one may produce diminishing returns in year three and negative returns in year five.
Erosion detection runs against the continuous ROI tracking and surfaces investments whose return curve is bending downward. The detection triggers a value-realization review (3.4) for the affected investment with a focused remediation question: is the erosion structural (the investment's underlying value driver has changed and the investment cannot be restored) or operational (the institution's execution against the investment has degraded and can be restored)? The two answers produce different remediation paths.
KPI. Erosion detection coverage and remediation rate. Target: 80 percent of significant investments under continuous erosion monitoring; detected erosion produces a documented remediation decision within one quarter.
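One simple way to operationalize "the return curve is bending downward" is a least-squares slope over the trailing quarterly value-per-dollar series. The window and the slope threshold below are illustrative assumptions; the standard does not prescribe a specific detector.

```python
# Sketch of a 3.7 erosion detector: flag an investment whose trailing
# quarterly value-per-dollar series has a negative least-squares slope.
# The four-quarter window and -0.02 threshold are assumptions.

def slope(series):
    """Least-squares slope of a series against its index."""
    n = len(series)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(series) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, series))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

def eroding(quarterly_vpd, threshold=-0.02):
    """True when the return curve bends downward past the threshold."""
    return slope(quarterly_vpd) < threshold

healthy = [1.10, 1.12, 1.11, 1.14]
eroded = [1.30, 1.18, 1.05, 0.92]
print(eroding(healthy), eroding(eroded))  # False True
```

A detector this simple is deliberately noisy; in practice the trigger feeds the 3.4 review, where a human decides whether the erosion is structural or operational.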
3.8 Strategic Alignment Scoring
Category: Strategy. Enforcement: Recommended.
Scoring expenditures based on alignment with strategic objectives and priorities. The control is the institution's commitment that capital flows to strategic priorities rather than to historical inertia.
Strategic alignment scoring evaluates each significant investment against the institution's published strategic priorities and produces an alignment rating. Investments with high alignment are protected and accelerated. Investments with low alignment are reviewed for sunset or repurposing. The score is one input to the portfolio rebalancing in 3.6.
KPI. Alignment-score coverage and movement. Target: every significant investment scored annually; the score moves measurably as the strategic priorities evolve.
3.9 Opportunity Cost Analysis
Category: Analysis. Enforcement: Adaptive.
Evaluating what value is being forgone by current investment allocation. Adaptive because opportunity-cost methodologies vary substantially with institutional context.
The principle is that every dollar committed to one investment is a dollar not committed to alternative investments. Mature Ring 0 implementations document the alternatives that were not chosen and revisit those alternatives periodically as the institution's information about each option improves. The control prevents the failure mode where institutions over-commit to early choices because the cost of revisiting the choice is not measured against the value of the alternatives forgone.
KPI. Opportunity-cost analysis coverage. Target: every major investment carries documented alternatives that were considered and rejected, with revisit triggers.
3.10 Continuous Value Optimization
Category: Optimization. Enforcement: Adaptive.
Iterative process of shifting investment from low-value to high-value activities based on outcome data. The control is the highest-maturity Ring 0 expression: the institution operates as a continuously rebalancing portfolio of investments rather than as a series of one-time decisions.
Continuous value optimization is the integration of every other Ring 0 control into a single operating discipline. The institution measures value (3.2, 3.3), reviews against the measurement (3.4), detects erosion (3.7), evaluates strategic alignment (3.8), considers opportunity cost (3.9), and rebalances the portfolio (3.6) continuously. The output is an institution that allocates capital with the same discipline a sophisticated investor allocates a portfolio.
KPI. Capital reallocation cadence and value movement. Target: measurable capital reallocation toward higher-value investments at the institution's strategic-cycle cadence; value-per-dollar trending positively over multi-year windows.
4 . The Pattern Library
Ring 0 across the five canonical stacks.
| Stack | Ring 0 Pattern |
|---|---|
| Public Cloud | Every dollar tied to an OKR or revenue line. PoCs with no value metric killed at midpoint. Unit economics per feature. Cloud spend reported per business unit with outcome attribution. |
| SaaS Portfolio | Every tool measured against productivity delta. Tools that save less than their cost are sunset. Value reported quarterly per team. License renewals subject to outcome review. |
| On-Prem and Hybrid | Capacity decisions tied to revenue forecasts. Migration projects justified by three-year TCO including labor. Sunsetting mandatory once the outcome is no longer measurable. |
| AI and ML | Model value per dollar tracked: retention lift, deflection, automation hours, revenue contribution per inference. Training-run ROI evaluated before the next training cycle is approved. |
| Data Platform | Every dataset measured for value realization. Products such as the twenty-dollar query that protects two million in annual value are documented and defended. Data-product retirement on outcome failure. |
5 . Industry Applications
Cloud Infrastructure. Investment-portfolio management against cloud-spend lines. ROI per workload class. Strategic-alignment scoring that cuts unaligned cloud spend. Continuous value optimization that reallocates capacity toward higher-return workloads.
Software Development. Feature-level value attribution. Engineering investment portfolio with ROI per feature. Strategic alignment scoring that prioritizes engineering capacity toward high-value product lines.
SaaS Portfolio. Per-tool ROI measurement. Renewal decisions subject to value-realization review. Portfolio management that consolidates overlapping tools when value attribution shows redundancy.
Government. Program-level outcome KPIs aligned to mission objectives. Value-stream mapping that ties appropriations to mission outcomes. Continuous review of programs against measurable mission impact rather than against budget execution.
Supply Chain. Vendor-relationship ROI measurement. Procurement decisions subject to TCO analysis across the eleven categories. Continuous value optimization across the supplier portfolio.
AI and ML Operations. Model-deployment ROI per inference. Training-run value review before next-cycle approval. Strategic alignment scoring for AI investments against the institution's AI strategy. Sunsetting models whose return has eroded.
6 . The Adversarial Audit
Five vectors.
Vector 1: "Show me an investment from last year that did not produce its expected outcome and walk me through the decision."
The practitioner picks an underperforming investment from the portfolio and walks the value-realization review chain: original expected outcome, realized outcome, variance, contributing factors, decision (continue, course-correct, sunset), and the documented decision-maker. The auditor verifies that the institution made an active decision rather than letting the investment drift.
Vector 2: "Reconcile this cost line to its value stream and outcome KPI."
The auditor picks an arbitrary cost line. The practitioner produces the value-stream attribution per 3.1, the outcome KPIs the value stream is measured against per 3.3, and the cost line's contribution to those KPIs. If the cost line is unattributed or unmeasured, Ring 0 has not been satisfied for that line.
Vector 3: "Walk me through the TCO analysis for your largest active investment."
The practitioner produces the TCO analysis covering all eleven categories. The auditor verifies that each category has a documented methodology, that the numbers reconcile to the institution's ledger, and that the analysis has been refreshed within the published cadence. Partial TCO analysis is failure mode M5.
Vector 4: "Demonstrate a value-erosion detection event from the last four quarters."
The practitioner produces an investment whose return curve was detected as bending downward, the trigger that surfaced the erosion, the review that followed, and the remediation decision. If the practitioner cannot produce such an event over four quarters across a substantial portfolio, 3.7 is either not implemented or not detecting.
Vector 5: "Reconcile this strategic-priority claim to investment allocation."
The auditor picks a strategic priority from the institution's published strategy. The practitioner produces the investments aligned to that priority, their alignment scores per 3.8, and their realized outcomes per 3.3. The auditor verifies that capital allocation reflects the strategic priority rather than historical inertia.
7 . The Working Capital Math
Ring 0's quantitative spine is the relationship between investment-portfolio value-per-dollar and institutional outcome capture.
For an institution with annualized investment portfolio $P$ and value-per-dollar realization rate $v$, the realized-value position is approximately:
$\text{Realized value} \approx P \times v$
The federation's calibration is that institutions in Phase 1 typically realize 0.4 to 0.7 dollars of measurable value per dollar of investment. Phase 4 institutions realize 1.1 to 1.5 dollars per dollar across a multi-year horizon. The compression of this gap is the highest-leverage Ring 0 work.
| Ring 0 Maturity | Value-per-Dollar Realization | Practical Posture |
|---|---|---|
| Phase 1 (Blind) | 0.4 to 0.7 | No value-stream mapping. ROI computed retrospectively, if at all. Reviews are status updates. Investments drift on inertia. |
| Phase 2 (Reactive) | 0.7 to 0.9 | Annual value reviews. Some TCO analysis on major investments. Portfolio managed by gut, not by signal. |
| Phase 3 (Coordinated) | 0.9 to 1.1 | Continuous ROI tracking on significant investments. Quarterly value reviews. Portfolio review annually. Outcome KPIs published. |
| Phase 4 (Proactive) | 1.1 to 1.5 | Full Ring 0 surface active. Value-erosion detection running. Strategic alignment scoring active. Portfolio rebalancing tied to outcomes. |
| Phase 5 (Adaptive) | Above 1.5 | Continuous value optimization across the portfolio. Capital reallocation tied to outcome signals. Investment decisions reviewed against opportunity cost. The institution operates as a sophisticated investor. |
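The Section 7 math can be made concrete with a short sketch: compute realized value as $P \times v$ and look up the maturity phase from the value-per-dollar bands. The band edges follow the table; the boundary handling (upper bound inclusive) is an assumption.

```python
# Sketch of the Section 7 math: realized value ≈ P × v, plus a phase
# lookup against the maturity bands. Inclusive upper bounds are an
# assumption; the standard does not specify boundary handling.

PHASE_BANDS = [  # (upper bound of v, phase label)
    (0.7, "Phase 1 (Blind)"),
    (0.9, "Phase 2 (Reactive)"),
    (1.1, "Phase 3 (Coordinated)"),
    (1.5, "Phase 4 (Proactive)"),
]

def realized_value(portfolio, v):
    return portfolio * v

def phase(v):
    for upper, label in PHASE_BANDS:
        if v <= upper:
            return label
    return "Phase 5 (Adaptive)"

P = 2_100_000_000   # the Section 1 program's spend
v = 610 / 2100      # its measured value-per-dollar, about 0.29
print(phase(v))     # Phase 1 (Blind)
```

Run against the Section 1 forensic, the program's roughly 0.29 dollars of measured value per dollar of spend places it squarely in the Phase 1 band, which is the chapter's point: execution success and value realization are different measurements.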
8 . The 13 Modes of Failure
M1. Activity reporting masquerading as outcome reporting. Remedy: outcome KPIs distinguished from activity metrics; reports must include outcome-anchored measures.
M2. Value claims assembled retrospectively rather than measured continuously. Remedy: continuous ROI tracking on significant investments.
M3. Value-stream mapping absent or stale. Remedy: annual refresh with continuous attribution as new investments come online.
M4. ROI tracking without documented methodology. Remedy: methodology published per investment class; claims without methodology fail review.
M5. TCO analysis covering only visible costs. Remedy: all eleven categories addressed; categories without explicit treatment carry an explanation rather than an omission.
M6. Value-realization reviews that are status updates. Remedy: review template enforced with required decision outputs.
M7. Strategic-priority claims unanchored to investment allocation. Remedy: alignment scoring per investment with measurable allocation toward priorities.
M8. Portfolio managed by historical inertia rather than by current signal. Remedy: continuous portfolio review with explicit rebalancing decisions.
M9. Value erosion undetected because the curve is not monitored. Remedy: continuous monitoring with named triggers and review cadence.
M10. Opportunity cost ignored because alternatives are not documented. Remedy: alternatives recorded at investment approval with revisit triggers.
M11. Outcome KPIs that drift to vanity metrics. Remedy: KPI methodology reviewed annually; KPIs that lose tie to business outcome are deprecated.
M12. Sunsetting investments treated as embarrassments rather than as portfolio discipline. Remedy: institutional norm that sunsetting is a feature of mature Ring 0, not a failure.
M13. Continuous value optimization claimed but capital never reallocates. Remedy: measurable reallocation evidence required for the claim.
9 . Sidebars
Sidebar 0.A . Why the Core is the last ring you read. Co-authored, signed at ratification. The federation's reading order is outside-in: Ring 6 to Ring 0. The Core is the last ring you read because the Core is what every other ring exists to protect. A practitioner who jumps to Ring 0 without first absorbing Ring 6 through Ring 1 will produce value claims that float free of the protection layers that make the value durable. The order is principled. Read the protection layers first. Read the Core last. By the time you arrive at the Core, you understand what the Core is asking the institution to defend.
Sidebar 0.B . The activity-versus-outcome confusion. Co-authored, signed at ratification. The most common Ring 0 failure mode is institutions that measure activities and call them outcomes. A migration completed is an activity. A platform deployed is an activity. A model trained is an activity. Each activity may have produced outcomes, but the activity itself is not the outcome. Mature Ring 0 implementations enforce this distinction in their reporting. Practitioners building Ring 0 should expect to revise the institution's reporting templates because most of them present activity metrics as outcome reports.
Sidebar 0.C . Sunsetting is a feature. Co-authored, signed at ratification. The federation reviews many Ring 0 implementations that struggle to sunset investments that have stopped producing measurable value. The struggle is cultural. Institutions treat sunsetting as an admission of failure. Mature Ring 0 implementations treat sunsetting as a sign of portfolio discipline. An institution that never sunsets investments is an institution that is accumulating dead capital. The federation's position is that sunsetting cadence is itself a Ring 0 KPI: institutions should sunset some non-trivial fraction of their portfolio every year as the operating reality and strategic priorities evolve.
10 . The Founder's Annotation Track
I want the reader to know that the Fortune 500 retailer forensic in Section 1 was the case the working group asked me to use over the alternatives I had drafted. My first draft used a public-company case that the federation could discuss without anonymization. The working group's dissent was that the anonymized case was structurally cleaner because it represented the modal pattern rather than the dramatic exception. I lost the editorial fight and the chapter is better for it. The pattern this case represents (a successful program with low value-per-dollar realization that the institution did not measure during execution) is the federation's most-encountered Ring 0 failure mode. I also want to flag that Section 3.5 (Total Cost of Ownership) lists eleven cost categories. The first draft listed seven. The working group expanded the list during ratification because the additional four (opportunity cost, exit cost, risk-adjusted contingency, security and compliance cost) are the categories institutions most commonly omit and most painfully encounter retrospectively. Practitioners should treat the eleven-category framework as the federation's calibrated minimum, not as exhaustive.
11 . The Capstone Artifact
The Ring 0 capstone is the Value Realization Pack for the candidate's organization.
The pack contains, at minimum:
- The value-stream taxonomy. The institution's named value streams with owners, inputs, outputs, and contribution to overall outcomes.
- The cost-to-value-stream attribution. Every cost line in the institution's ledger attributed to one or more value streams.
- The outcome-KPI catalog. The KPIs the institution tracks per value stream, with documented methodologies and named executive owners.
- The continuous ROI tracking evidence. The current state of every significant investment with expected outcome, realized outcome, and variance.
- The TCO analyses for the institution's three largest active investments, covering all eleven cost categories.
- The value-realization review cadence and recent review outputs (continue, course-correct, sunset decisions).
- The investment portfolio current state with diversification, maturity, and outcome dimensions.
- The value-erosion detection report.
- The strategic-alignment scoring for the investment portfolio.
- Documentation of opportunity-cost alternatives for the institution's three largest active investments.
- Evidence of continuous value optimization (capital reallocation events from the last four quarters).
Submitted, signed, and dated. Federation Standards Council reviews. Accepted packs are filed against the candidate's CFO-R credential and contribute to the federation's public corpus of Ring 0 reference implementations.
The Ring 0 capstone is the most consequential artifact the candidate produces during the credential. It is the artifact that ties the candidate's Ring 6 through Ring 1 work to the value the institution exists to create. Practitioners should treat the Value Realization Pack as the demonstration of mastery, not as the final paperwork.
12 . Doctrine Q&A
Fifteen calibrated questions. Forty-eight in the proctored examination.
Q1. A program closeout report produced a value-realization number through a six-month forensic exercise. Has Ring 0 been satisfied?
A. No. Retrospective value assembly is failure mode M2. Mature Ring 0 implementations measure value continuously throughout the investment lifecycle.
Q2. A team reports "twenty-three new features released this quarter." Is this a Ring 0 KPI?
A. No. Feature releases are an activity metric. Ring 0 KPIs are outcome-anchored: revenue contribution, customer impact, operating-margin movement. Failure mode M1.
Q3. A cost line in the institution's ledger is unattributed to any value stream. What does this signal?
A. Failure mode M3 if the value-stream mapping has not captured the line, or a Ring 4 attribution gap if the cost line has no owner. Either way, Ring 0 cannot evaluate the cost's contribution to value.
Q4. TCO for a major investment captures acquisition, implementation, and operational costs. Has 3.5 been satisfied?
A. No. 3.5 requires all eleven categories. Three categories is failure mode M5. Remediation is expanding the analysis to cover the missing categories with documented methodology.
Q5. Quarterly value-realization reviews produce status updates without documented decisions. Is 3.4 active?
A. No. 3.4 requires reviews to produce decisions. Status-update reviews are failure mode M6. Remediation is enforcing the review template with required decision outputs.
Q6. A strategic priority is published but the institution's investment allocation does not reflect the priority. What is failing?
A. Failure mode M7. The strategic-alignment scoring per 3.8 is either absent or not connected to portfolio rebalancing. Remediation is scoring every significant investment against the priority and reallocating accordingly.
Q7. A vendor relationship has been continuously funded for six years without value-realization review. Is the institution operating Ring 0 maturely?
A. No. Continuous funding without review is failure mode M8 (portfolio managed by inertia). The remediation is bringing the relationship into the review cadence and producing an explicit continue, course-correct, or sunset decision.
Q8. Value-per-dollar realization is reported at 0.85. What phase is the institution operating at?
A. Phase 2 (Reactive) approaching Phase 3. The federation's Phase 3 band is 0.9 to 1.1.
Q9. A sunset decision has been made for an investment that produced 1.2x value-per-dollar. Is the sunset appropriate?
A. Possibly. Value-per-dollar above 1.0 does not preclude sunset if the investment's value is eroding (3.7), if alternative uses of the capital have higher expected returns (3.9), or if strategic alignment has shifted (3.8). The sunset is appropriate if it is documented against one of these triggers.
Q10. Opportunity-cost analysis is performed only for new investments. Is 3.9 satisfied?
A. Partially. Mature 3.9 implementations revisit opportunity cost periodically for active investments as the institution's information about alternatives improves. New-investment-only analysis is a starting point, not the full implementation.
Q11. The institution's value-stream mapping was last refreshed two years ago. Is 3.1 current?
A. No. 3.1 requires annual refresh with continuous attribution as new investments come online. Two-year-old mapping is failure mode M3.
Q12. A value-erosion detection event triggered three quarters ago. The remediation review has not been completed. Is 3.7 active?
A. Detection is active; remediation is failing. 3.7 requires remediation decisions within one quarter of detection. Remediation pending three quarters is failure mode M9.
Q13. Continuous value optimization is claimed but no capital reallocation has occurred in the last year. Is 3.10 implemented?
A. No. Failure mode M13. The claim requires evidence of measurable capital reallocation toward higher-value investments. Without reallocation, the claim is documentation rather than implementation.
Q14. Outcome KPIs include "platform availability" and "deployment frequency." Are these acceptable Ring 0 KPIs?
A. No. Both are activity metrics. Platform availability supports outcomes but is not itself an outcome. Deployment frequency is an engineering throughput metric. Replace both with outcome-anchored measures (revenue impact during incident windows, customer-experience metrics tied to deployment quality).
Q15. What is the canonical Ring 0 forensic the federation uses to ground the chapter?
A. The 2017 Fortune 500 retailer technology transformation program, reviewed under non-disclosure with anonymized references. The federation's reference case for the modal Ring 0 failure mode: a successful program with low value-per-dollar realization that the institution did not measure during execution.
End of Chapter 0 . Edition v0.1 draft . Working group review pending . Ratification target Q3 2026 . Public comment window opens at vot.ifo4.org on the chapter publication date.
Closing note. The seven chapters of CFO-R taken together constitute the v0.1 manuscript of the federation's first credential textbook. Ring 6 through Ring 1 are the protection layers. Ring 0 is the Core. The reader who has finished all seven chapters has read the federation's working draft of how Financial Operations is governed, in the founder's voice, against real public forensics, with named controls, calibrated KPIs, and capstone artifacts that produce contributions to the federation registry. The next edition (v0.2) will incorporate working group ratification feedback, public comment from vot.ifo4.org, and the expanded methodology annexes the working group is preparing. Practitioners reading v0.1 should expect substantive revision. Practitioners studying for the proctored examination should expect that the doctrinal stakes, the canonical forensics, and the controls themselves will hold across editions; the elaborations and KPIs will refine.