Is Server to Server a Panacea for Publishers?
Header bidding has proven a lucrative innovation for publishers, helping them increase revenue by as much as 70 percent. Today, most publishers have adopted header bidding as a cornerstone of their monetization strategy, and more companies are capable of bidding via the header, signaling that the technology is here to stay.
Or is it?
Publishers appear intent on inviting more header bidders to participate in their auctions, hoping to achieve even greater prosperity. But there's limited real estate in a publisher's header, and header bidding has increasingly become the scapegoat for slow-loading pages and ads.
So, where do we go from here?
Publishers are now testing server-to-server (s2s) solutions, moving their partners' code off their pages and into outside servers, all in an effort to improve page speed. There is no debate: moving code off the page will accomplish this.
The question is, by how much, and what is being sacrificed?
Perhaps the single biggest benefit to publishers is the ability to add a seemingly unlimited number of bidders through the server. After all, if publishers are able to increase revenue by adding five header bidders, it stands to reason that they can increase revenue even more by inviting more bidders to the auction. One early adopter of server-to-server said the added auction pressure has increased revenue by 20 to 30 percent. And interestingly, the jump in revenue was directly attributed to AdX.
Sounds great, right? So, what are publishers waiting for? Not so fast… Let’s take a closer look.
First off, header bidding works on the client side and has only gained speed over the years. Header bidders, especially the veterans, understand that latency is a concern and have worked to accelerate their "round trip" times (the amount of time it takes to get back to the page with a decision). Many partners now return bids in less than 100 milliseconds.
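To make the round-trip dynamic concrete, here is a minimal sketch of how a client-side auction treats slow bidders. The bidder names, CPMs, and the 300-millisecond timeout are hypothetical, chosen only for illustration: any bid that returns after the publisher's timeout is simply excluded from the auction, which is why partners work so hard on round-trip times.

```python
# Minimal sketch (hypothetical data): a client-side header auction that
# only considers bids returned before the publisher's timeout.

TIMEOUT_MS = 300  # illustrative client-side auction timeout

bids = [
    {"bidder": "partnerA", "cpm": 2.10, "round_trip_ms": 95},
    {"bidder": "partnerB", "cpm": 2.75, "round_trip_ms": 340},  # misses the timeout
    {"bidder": "partnerC", "cpm": 1.90, "round_trip_ms": 120},
]

def run_auction(bids, timeout_ms):
    """Keep only bids that beat the timeout, then take the highest CPM."""
    eligible = [b for b in bids if b["round_trip_ms"] <= timeout_ms]
    return max(eligible, key=lambda b: b["cpm"]) if eligible else None

winner = run_auction(bids, TIMEOUT_MS)
```

Note that partnerB has the highest CPM but loses anyway: its 340-millisecond round trip arrives after the timeout, so the lower partnerA bid wins. That lost revenue is the latency pressure the veterans have been engineering against.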
Second, publisher ad ops and developer teams are more knowledgeable. Many teams have tested and developed an ideal vendor mix flush with unique demand and fast partners. They understand that server-to-server is not a magic bullet for page speed issues because header bidding is not always the culprit. Typically, the combination of heavy JavaScript implementations across their pages, videos, uncompressed images, legacy code, and a failure to follow best practices produces the perfect recipe for slow-loading pages.
I advise publishers to look at how their own content may be occupying browser resources. On heavy pages, no matter how fast header bidders come back to the page, they still have to wait in the queue for resources to free up. It's easy to point the finger at third-party bidders, but they're not always the culprits.
Third, servers aren’t free to maintain. If a publisher is hosting its own server, server costs can be tens of thousands of dollars a month.
Fourth, outsourcing server maintenance to a third party creates consternation amongst demand partners. Currently, these servers are owned and maintained by companies also participating in the header auctions they are paying to host. The auction dynamics are a black box, even more so than those occurring within a client side wrapper solution. Why would these companies volunteer to host these auctions if they didn’t stand to gain from doing so? Remember – there’s no such thing as a free lunch. In order to achieve true transparency and for server-to-server adoption to accelerate, publishers must demand that the third party provide log level reporting on auction mechanics (bid speed per partner, bid rate per partner, etc.) to both demand sources and publishers.
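The log-level metrics mentioned above are straightforward to compute once the host actually exposes the data. The sketch below is hypothetical: the record layout and partner names are invented for illustration, but it shows the two metrics named in the text, bid rate per partner and bid speed per partner, derived from raw auction records.

```python
# Hypothetical log-level auction records: one entry per bid request sent
# to a demand partner, noting whether it bid and how long it took.
from collections import defaultdict

records = [
    {"partner": "dspA", "bid": True,  "latency_ms": 80},
    {"partner": "dspA", "bid": False, "latency_ms": 120},
    {"partner": "dspB", "bid": True,  "latency_ms": 200},
    {"partner": "dspB", "bid": True,  "latency_ms": 180},
]

def partner_stats(records):
    """Aggregate per-partner bid rate and average bid speed from raw logs."""
    totals = defaultdict(lambda: {"requests": 0, "bids": 0, "latency_ms": 0})
    for r in records:
        t = totals[r["partner"]]
        t["requests"] += 1
        t["bids"] += int(r["bid"])
        t["latency_ms"] += r["latency_ms"]
    return {
        partner: {
            "bid_rate": t["bids"] / t["requests"],
            "avg_latency_ms": t["latency_ms"] / t["requests"],
        }
        for partner, t in totals.items()
    }

stats = partner_stats(records)
```

With reporting like this in hand, both publishers and demand sources can verify that the hosted auction is treating every partner evenhandedly, which is exactly the transparency the black-box concern calls for.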
Lastly, if I’m a publisher, and one, large company is responsible for hosting the auctions that drive over half of my indirect revenue, and something breaks, who is going to answer my call at 3 a.m.?
So, how are publishers going to play it in 2017? Lacking absolute assurance that they'll match the revenue they're recouping from their client-side setups, publishers won't rip out their header code in one fell swoop. Publishers will also need to ensure that bidder data fidelity is maintained in the server. The last thing publishers want is demand partners bidding less frequently because the data they once collected on-page to make intelligent ad decisions has been compromised.
Likely, several publishers will start by adopting a preferred server-to-server vendor, and hope to get the majority of their demand partners integrated through that one connection. If a minority of bidders won’t play nicely, they may allow some to remain on the page, likely within a wrapper. If the majority of their demand partners balk, they may set up their own server.
In conclusion, I caution publishers: When considering a server-side solution, don't throw the baby out with the bathwater. Server-to-server is an innovative approach to header bidding that can help improve some aspects of their businesses (e.g., moving some slow, high-revenue partners off the page), but it might not be right for every publisher or every demand partner.