Last week, Forrester released its Wave report evaluating US digital marketing agencies on strategy and execution. Brian Morrissey has a good summary in AdWeek if you don't want to pay the $1749.
I wasn't able to find the full contents of the report for free online, but I did find a complete copy of the Q2 '09 Forrester Wave rating the web design chops of digital marketing agencies. After dissecting it a bit, I have some questions about the methodology and the overall usefulness of these reports.
The Envelope Please
Per Forrester, the top agencies for "transaction-led web projects" (read: ecommerce) are Sapient, Razorfish (who just got acquired by Publicis, resolving Microsoft's conflict of interest but not Razorfish's -- see my update to this post), imc2, IconNicholson (yeah, I don't know either), and IBM Interactive.
As for "image-led web projects" (read: branding), the top shops were Sapient, Razorfish, IBM Interactive, imc2, and Organic.
Forrester used the following 18 criteria to score agencies:
- User research
- Persona creation
- Persona application
- Design process
- Skills and staffing
- Cross-office consistency
- Collaboration abilities
- User experience
- Brand image experience
- Satisfaction of reference clients
- Market positioning
- Clarity of vision
- Emerging web technologies
- Industry focus
- Billable staff as of Q4 2008
- Revenues (2008)
- Revenue growth (2008 over 2007)
- Number of North American offices
The rest of the report was based on interviews with agency staffers and 2 client references offered by each shop.
Better than Nothing
While I applaud any and all efforts to bring sense to the nonsense that is the agency/client RFP process, I think the Forrester report is incomplete, at best, and flawed, at worst.
For one thing, the process for selecting which agencies it includes in the report is questionable. Per footnote #6, Forrester "invited 29 shops from Advertising Age's top 50 digital agencies by web design revenue to participate" along with 2 from a previous report, before picking "the 20 largest."
So, right off the bat Forrester assumes that the best agencies are the ones with the most revenue. (Unfortunately, many clients make this mistake too when building their list of shops to RFP, but more on that later.)
Even after settling on 20 shops, it's quite telling that 2 dropped out of the Forrester study because they "had commitments that prevented them from dedicating the required time and resources."
Indeed, it takes time and resources to contribute to -- I mean, manipulate -- the results of this report. First, you pick the best 2 websites and 1 persona you worked on in the past year -- clearly not representative of your entire body of work, just your cherry-picked top outputs. Then you offer up 2 client references -- obviously favoring those who are best buds, relatives, or otherwise inclined to say only glowing things about you. Finally, you pull your best people off their current projects to make time for Forrester's intensive interviews.
Geez, this Forrester Wave report sounds a lot like the Agency/Client RFP process itself, aye?
Losing the Forrest for the Trees
Don't get me wrong, I think the Forrester Wave reports are valuable resources for clients trying to decide among the shops evaluated. My issue is that not all the shops are evaluated. And, furthermore, the criteria are too complicated and subjective to be truly meaningful.
As an aside, last year, while I was at Resolution Media, I had my team inquire as to how we could be included in the Forrester Wave Search Marketing Agency Report. Resolution Media is rated by AdAge as a top 20 agency by revenue, so I thought that would be enough to make the list. Turns out, our inquiry was returned by a salesperson looking to sell us a research subscription. It was unclear whether or not this was a prerequisite for inclusion in the Wave reports, but we declined and instead took them up on an offer to speak directly with one of the analysts responsible for compiling the report. Let's just say we never heard from anyone but the salesperson again.
Now, I realize that doing comprehensive research on 18 criteria across the hundreds, nay, thousands of digital marketing agencies out there is simply not feasible. However, I'd argue that all the various points of differentiation (Cross-office consistency, Measurement, Collaboration abilities, User experience, etc.) can be boiled down to one key metric/benchmark: client satisfaction.
I won't make you go back and read all 4,000+ words of my client-agency RFP manifesto but here was one of my suggestions:
"I’m a big fan of the Ultimate Question as a way to cut through all the clutter and get to the heart of how good a job a company is doing. Responses to the question, 'How likely is it that you would recommend this company to a friend or colleague?' tell us more about that company than any exhaustive questionnaire or fancy demo ever will.
We need a system whereby all agencies are required to create a 3rd-party audited list of current and former clients including duration of engagement (ie, number of years under contract) and breakdown of service offerings. Then, each quarter, a 3rd party asks all clients to anonymously rate their agencies on the Ultimate Question. The resulting list would be sortable by agency service offering and client category and be made available to the general public for a nominal fee (to cover the survey hosting costs).
This model is really nothing more than a peer rating system for agencies -- not unlike Yelp for restaurants or eBay for merchandise sellers. Just as LinkedIn is the Facebook for business, I guess I’m suggesting a review site for marketing communications agencies. I firmly believe in the power of transparent and self-policing communities."
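For readers who haven't seen it, the arithmetic behind the Ultimate Question (Fred Reichheld's Net Promoter Score) is dead simple: respondents rate you 0-10, scores of 9-10 count as promoters, 0-6 as detractors, and the score is the percentage of promoters minus the percentage of detractors. Here's a minimal Python sketch of that calculation -- the function name and sample ratings are mine for illustration, not anything from Forrester or my earlier post:

```python
def net_promoter_score(ratings):
    """Compute NPS from a list of 0-10 "would you recommend?" ratings.

    Promoters score 9-10, detractors 0-6, passives 7-8 (counted in the
    denominator but neither added nor subtracted).
    Returns a value from -100 (all detractors) to +100 (all promoters).
    """
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Example survey: 5 promoters, 3 passives, 2 detractors out of 10 clients
print(net_promoter_score([10, 9, 9, 10, 9, 8, 7, 8, 4, 6]))  # 30.0
```

A quarterly run of exactly this calculation, per agency, over third-party-audited client lists is all the "methodology" the system I'm proposing would need.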
Rest assured, folks, I'm working on it! In the meantime, please take the Forrester Wave reports (and the plethora of agency press releases they touch off) with a grain of blue ocean sea salt.