Stop Implementing Features. Start Measuring Outcomes

Authored by: David Iscove – Director of Sales

Most marketing technology implementations fail to deliver their promised ROI, not because the platforms lack the capability to demonstrate value, but because the implementation lacks a measurement strategy.

You demo features. You build workflows around what’s possible. You train teams. You go live. And then, months later, when leadership asks “Did this actually move the needle?”, you’re left unable to report on the metrics that matter: speed-to-market, cost-per-asset, revenue contribution, campaign effectiveness.

This is the feature-first trap, and it’s costing marketing organizations millions in unrealized value. It’s why transformations run over budget, fall behind schedule, and deliver technology change without business impact. It’s why you can’t confidently answer the two questions marketing leaders care about: Did this tech make us better? And can we prove it?

The Feature-First Sequence (And Why It Fails)

Here’s how virtually every technology partner operates—and why it doesn’t work:

Vendors lead with capability. They pitch platforms and features: “Look at this AI-powered asset generation!”; “Check out these automated workflows!”; “Imagine the creative possibilities and efficiency gains!”

Implementations focus on rollout. In response, organizations build workflows around what’s been sold in. Configuration, adoption programs, training sessions, governance frameworks—all centered on maximizing feature utilization.

Go live becomes the milestone. Success is defined as features deployed against a targeted user base in a production context.

Measurement becomes an afterthought. Only then does reporting get prioritized, assembled from whatever operational metrics feature usage happens to generate: tasks completed, workflows launched, users active. These are tactical details about system health, not strategic metrics about business impact.

When technology is solely implemented from a feature-centric perspective, you’re limited to surface-level activity reporting. You can tell your CFO how many assets were created. You cannot tell them what revenue that generated. You can prove adoption. You cannot prove ROI.

This sequence seems logical because it mirrors how humans naturally approach problems. We build better tools, then figure out how to use them. But starting with features instead of outcomes creates a dangerous gap between what the platform can do and what your organization actually needs to measure.

There’s a better way. One that starts with measurement, not features. One that connects daily work to strategic outcomes. One that lets you prove ROI from day one and not as an afterthought.

What If We Inverted the Sequence?

Instead of this vendor-first, feature-centric approach, what if we started somewhere completely different? What if, before we ever looked at a demo or selected a platform, we asked an entirely different set of questions?

What are we actually trying to improve? Not “what could we improve if we had better tools,” but what are the specific, strategic problems we need to solve? Is it speed-to-market? Cost per campaign? Brand compliance? Creative throughput? Be specific about the business outcome, not the feature gap.

How will we know if we succeeded? If we solve those problems, what will be measurably different six months from now? What number will change? By how much? If you cannot articulate this, you have not defined success.

What’s our starting point? If we’re going to claim improvement, we need to know where we started. What’s our current performance? Do we even have that data? This baseline is non-negotiable.

What capabilities would enable those improvements? Notice the word “capabilities,” not “features.” If you need to reduce creative review cycles, the capability you need is “faster stakeholder collaboration and decision-making.” The features that enable that capability might include proofing tools, automated routing, mobile approvals, version control—but the capability is the goal, not the feature.

What features should we actually implement? Only now, after you’ve defined the problem, the measurement approach, the baseline, and the required capabilities, do we talk about specific features. And when we do, we implement only what we need to achieve the goal, not everything the platform can do.

This goals-first approach is the single most efficient and effective strategy marketing organizations can take to optimize technology selection, implementation, and adoption. When you lead with measurement, implementation becomes targeted. You stop building workflows because they’re possible and start building workflows because they’re provable. You focus resources on the configurations that move the metrics leadership actually cares about.

Most importantly, the daily work of content and creative teams becomes connected to the strategic priorities of the C-suite. With this direct link, marketing operations transforms from its traditional designation as a cost center into a value driver with quantifiable ROI. Instead of looking at project management and DAM platforms as simply managing work and assets, you unlock their true value as performance measurement systems.

The Strategic Pillar Framework: Goals-First in Action

This goals-first philosophy has been codified into a repeatable methodology: Omnicom Adobe Practice’s Strategic Pillar Framework. It’s built on a simple premise: transformation strategy should cascade from strategic intent down to operational evidence, not the other way around.

This seems obvious. Transformation is always a top-down initiative, but the difference lies in the granularity of that cascade. Far too often, transformation targets focus only on projects (Was a tool implemented by a given date – yes/no?). The only way to deliver a true understanding of business impact at an operational level is to go deeper: cascade business strategy into day-to-day, continuous reporting of work through a hierarchical relationship that directly links business goals to the atomic data points captured in tactical work.

The Seven-Level Hierarchy

Every engagement, whether it’s a Workfront implementation, a Content Supply Chain optimization effort, or a single generative AI pilot, can and should follow the same structured cascade:

  1. Strategic Pillars → What capability do we, as an organization, need to improve?
  2. Goals → What does success in improving the capability look like?
  3. Objectives → How do we quantify success and define when we need to achieve it?
  4. KPIs → What will we measure to know we’re moving towards success?
  5. Platform Strategies → How will we enable ourselves to measure success (i.e. what tools are used to measure success)?
  6. Services → Who will enable us and how will they deliver our ability to measure success?
  7. Metrics → What is the evidence we must capture in our day-to-day operations to report on that success?

This creates complete traceability: every metric ladders up to a service, which maps to a platform strategy, which supports specific KPIs that demonstrate achievement of objectives, deliver on goals, and align to strategic pillars. No orphaned features. No disconnected dashboards. No “we think this is working but can’t prove it.”
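In software terms, this traceability is just a chain of parent links: each level points to the level it supports, so any metric can be walked back to its pillar. Here is a minimal sketch, using hypothetical names and a generic data structure rather than any actual Omnicom or Adobe tooling:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    """One level in the cascade; `parent` points to the level it supports."""
    level: str                       # e.g. "Pillar", "Goal", ..., "Metric"
    name: str
    parent: Optional["Node"] = None

    def trace(self) -> list[str]:
        """Walk from this node up to the strategic pillar it evidences."""
        chain, node = [], self
        while node is not None:
            chain.append(f"{node.level}: {node.name}")
            node = node.parent
        return chain

# Hypothetical speed-to-market cascade (names are illustrative only)
pillar    = Node("Pillar", "Agility")
goal      = Node("Goal", "Faster campaign launches", pillar)
objective = Node("Objective", "Reduce launch time 40% by Q4", goal)
kpi       = Node("KPI", "Request-to-approval cycle time", objective)
metric    = Node("Metric", "Timestamp on each approval task", kpi)

for line in metric.trace():
    print(line)
```

The same walk works in reverse for planning: start from a pillar and enumerate everything downstream that must exist to evidence it.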

Eight Strategic Pillars, One Connected System

The framework organizes marketing operations transformation across eight capability pillars:

Core Operations:

  • Agility – Speed, adaptability, and time-to-market
  • Value – Cost efficiency and resource optimization
  • Governance – Compliance, brand consistency, and auditability
  • Scale – Content volume and production throughput

Outcomes & Enablement:

  • Intelligence – Data-driven insights and attribution
  • Experience – Personalized customer journey orchestration
  • Capability – Team skills, adoption, and organizational readiness
  • Collaboration – Cross-functional workflows and ecosystem integration

Each pillar is distinct but interconnected. Improving agility affects scale. Strengthening governance enables better intelligence. Collaboration amplifies capability development. What matters most is that each pillar has defined goals, measurable KPIs, and is mapped to specific technology platform enablers. Goals and specific KPIs may change from client to client or industry vertical, but the framework itself is consistent, built on a proven measurement architecture.

Real-World Example: Speed-to-Market

Here’s how the cascade works in practice with a client looking to improve speed-to-market. The specifics below are illustrative, but representative:

  • Strategic Pillar: Agility – speed, adaptability, and time-to-market
  • Goal: Launch campaigns faster without adding headcount
  • Objective: Reduce campaign launch time by 40% within twelve months
  • KPI: Request-to-approval cycle time
  • Platform Strategy: Workfront workflows with automated routing and online proofing
  • Services: Implementation and enablement partners configure the workflows and reporting dashboards
  • Metric: Timestamps captured on every request, review round, and approval task

Notice how every element connects. The success metric directly measures achievement of the strategic objective. The platform strategy is explicitly designed to support those metrics. Everything is traceable, intentional, and measurable.

Three Phases: From Planning to Proof

Understanding this framework is one thing. Applying it is another. Here’s how to operationalize it across three critical phases of your transformation:

Planning Phase: Before You Select Technology

Ask which pillars you’re trying to strengthen. If you’re focused on Agility and Scale, your platform strategy looks very different than if you’re prioritizing Governance and Intelligence. Don’t select technology based on “what’s hot” or “what our peers are using.” Select it based on which pillars need strengthening to achieve your strategic goals.

Implementation Phase: Before You Build Workflows

Define the KPIs you’re targeting. If you’re measuring “request-to-approval cycle time,” your workflow design focuses on reducing handoffs and review rounds. If you’re measuring “compliance pass rate,” your workflow design focuses on approval gates and audit trails. Your workflows should be architected to move the specific metrics that matter, not to leverage every capability the platform offers.
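To make that concrete, a KPI like request-to-approval cycle time reduces to simple date arithmetic over exported work items. This is a minimal sketch assuming a hypothetical export format; the field names on any real platform will differ:

```python
from datetime import datetime
from statistics import mean

# Hypothetical export of work items: when each request was submitted
# and when it received final approval.
items = [
    {"requested": "2024-03-01", "approved": "2024-03-11"},
    {"requested": "2024-03-05", "approved": "2024-03-12"},
    {"requested": "2024-03-10", "approved": "2024-03-24"},
]

def cycle_days(item: dict) -> int:
    """Days elapsed between request submission and final approval."""
    fmt = "%Y-%m-%d"
    start = datetime.strptime(item["requested"], fmt)
    end = datetime.strptime(item["approved"], fmt)
    return (end - start).days

# The KPI: average request-to-approval cycle time, in days.
avg_cycle = mean(cycle_days(i) for i in items)
print(f"Average cycle time: {avg_cycle:.1f} days")
```

If the workflow is designed around this KPI from the start, the timestamps it needs are captured automatically as work happens, rather than reconstructed after the fact.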

Proof Phase: Before You Report Progress & When You Prove ROI

Establish your baseline from day one. Maximum impact requires establishing a rigorous baseline, documenting historical performance, current-state workflows, and pre-transformation costs (including Cost Per Approved Asset when feasible). Every “after” claim should be grounded in a documented “before.”
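Cost Per Approved Asset itself is simple division once the inputs are documented. A sketch with made-up figures, purely for illustration:

```python
# Hypothetical pre-transformation figures (replace with documented actuals)
quarterly_creative_cost = 450_000.0   # labor + tooling + agency fees, USD
approved_assets = 600                 # assets that cleared final approval

cost_per_approved_asset = quarterly_creative_cost / approved_assets
print(f"Baseline Cost Per Approved Asset: ${cost_per_approved_asset:,.2f}")

# Re-run the identical calculation post-transformation so every
# "after" claim is grounded in this documented "before".
```

The hard part is not the arithmetic but the discipline: agreeing what counts as cost and what counts as an approved asset, then holding that definition constant across the before and after.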

Then, when reporting results, reach beyond the immediate data. Don’t just report “45% reduction in approval time.” The same workflow can also support reporting such as “Achieved Strategic Objective 1.2 (Reduce Campaign Launch Time by 40%), enabling 3× content throughput without additional headcount, directly supporting the Scale Pillar and contributing $2.1M in avoided hiring costs.”

That’s the difference between a project report and a business case. That’s what transforms CFO skepticism into CFO advocacy.

From Implementation to Transformation

The Strategic Pillar Framework doesn’t eliminate the need for great implementation. Platforms still need to be configured thoughtfully. Creative workflows still need to be optimized. Teams still need training and adoption support.

But it changes the context in which this work happens.

You’re no longer implementing a platform. You’re strengthening specific capability pillars that have explicit strategic goals, quantified objectives, and measurable KPIs—with the platform as the technical enabler, not the end goal.

This shift transforms perceived value:

  • Before: “We implemented Workfront to manage tasks.”
  • After: “We enabled Agility and Governance, achieving 35% faster time-to-market and 98% compliance, enabled by our Workfront + AEM integration.”

One is a technology deliverable. The other is a business outcome. One justifies a budget. The other justifies an expansion of scope and investment.

Instead of another platform implementation that delivers features without business impact, you now have a repeatable methodology that delivers transformation—one that connects every feature to a strategic outcome, every tool to a measurable goal, and every implementation to a quantifiable ROI.

What’s Next: Deep-Diving the Pillars

In the coming weeks, we’ll publish deep-dives into each of the eight Strategic Pillars, exploring:

  • Pillar-specific goals and KPIs – What does success look like in each domain?
  • Platform enablement strategies – How Adobe GenStudio components (Workfront, AEM, Creative Cloud, Analytics, GenStudio for Performance Marketing) support each pillar
  • Real-world measurement examples – How clients are tracking and proving impact
  • Cross-pillar dependencies – How strengthening one pillar accelerates others

About this article: This article was co-created with GenAI tools. From stress-testing the framework logic to refining the narrative flow, I use GenAI as a thinking partner—helping me articulate complex operational concepts more clearly, write more efficiently, and focus on the strategic insights that matter. The ideas, experience, and methodology are mine; the AI helps me communicate them at a faster pace.
