Following up on an internal training session at Media.Monks, this article introduces two key tactics you can use to support and grow your business through digital marketing on the Google Marketing Platform. The audience is intentionally broad with the view of sharing the “what” and the “why” across the full spectrum of digital marketer roles.
These techniques are exciting, as Google has published data demonstrating that double-digit percentage uplift in conversion value is possible. Results clearly depend on having the very best data, the best modeling capabilities and the best activation strategy, which is where Media.Monks teams play an essential role.
Who is this for? Everyone!
Are you in digital marketing as an “analytics person”? Primarily data focused? Technical? You’ll know about enhanced conversions (EC) and value-based bidding (VBB), but beyond the tagging, do you know what’s going on in the media systems and what it’s actually for?
Or are you a “non-technical” marketer? Your talents for campaign setup and management don’t overlap with tagging. Again, you’re across EC and VBB but where does the data come from? Why’s it so tricky to get right? What’s the hold up with the tags?
Regardless of our role specifics, we all need as full an understanding of the solutions as possible. We need to get a handle on what happens “on the other side” so we can deliver the very best solutions for clients, and for users. Here’s the scoop you need. This is relevant to people on the Search Ads/Display & Video/Campaign Manager side as well as those on the Google Analytics/Google Tag Manager side. Here’s an opportunity to share knowledge… LFG.
Set the scene.
Cookie atrophy is a poorly kept secret. Browser tech continues to erode cookie usage. Third-party cookies are scheduled for deprecation from Chrome in 2024, and Chrome’s dominant market share makes that significant for marketers. That doesn’t mean we’re on safe ground when it comes to first-party cookies, though; just check through the details on Cookie Status to see the reality.
As the volume of data with sufficient signal quality diminishes, we can still use modeling techniques to mitigate gaps in the data, but modeling is not a robust solution in isolation. We continue to make every effort to maintain data volume, whilst evolving our tactics to improve efficiency.
This is where EC becomes a playbook entry to maximize observable conversions, while VBB drives greater efficiency by enabling optimization for value rather than volume.
Maximize observable data.
If we have less data, we must have better data quality. By that, we mean clean data in which conversions and channels are clearly visible, so the data retains its utility even when it’s incomplete. Where we have holes due to browser tech and cookie loss, for example, we can still use first-party data to improve conversion accuracy. Enhanced conversions help us see more conversion data, in a privacy-safe manner.
What it does.
Basically, on the conversion/sale/thank you page, a tag will fire; let’s say a floodlight tag for simplicity. The user’s email address is hashed (one-way encoded using the SHA-256 algorithm) and added to the tag data, which is then sent to Google. The hashed value is matched against Google’s data to recover conversions that are absent from your data set.
You can use a range of values in addition to, or instead of, the email address. The email address is normally fine. It’s hashed, so no third party (not even Google) sees the data and it’s deleted after use. Google has published in-depth details on how the data is used, and this is essential reading for your teams.
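The normalize-then-hash step looks like this. A minimal Python sketch of the trim-and-lowercase normalization followed by SHA-256; Google’s documentation covers the complete normalization rules, so treat this as illustrative rather than the exact specification:

```python
import hashlib

def normalize_and_hash(email: str) -> str:
    """Normalize an email address (trim whitespace, lowercase),
    then hash it with SHA-256. Only the hash leaves the page."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Differently formatted input for the same user yields the same hash:
print(normalize_and_hash("  Jane.Doe@Example.com "))
print(normalize_and_hash("jane.doe@example.com"))
```

Normalization matters: without it, trivial formatting differences would produce entirely different hashes and the match against Google’s data would fail.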
Use best practices for tagging.
Ideally, you’d expose pre-hashed personally identifiable information (PII) in a dataLayer variable, which can be picked up easily by Google Tag Manager (GTM) and added to the floodlight.
You can scrape the Document Object Model (DOM) to extract the data, but this is not a robust, long-term solution. You can use Google tag instead of GTM if a tag management system is not available. For offline conversions (leads), you can also upload conversion data via an API.
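For the offline (lead) route, the important step on your side is assembling a privacy-safe record before anything is uploaded. A sketch in Python; the field names here are illustrative placeholders, not a specific API schema:

```python
import hashlib
from datetime import datetime, timezone

def build_offline_conversion(email: str, value: float, currency: str = "EUR") -> dict:
    """Assemble a conversion record for an offline (lead) upload.
    The identifier is hashed before it ever leaves your systems;
    field names are illustrative, not a specific API schema."""
    hashed = hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()
    return {
        "hashed_email": hashed,
        "conversion_value": value,
        "currency_code": currency,
        "conversion_time": datetime.now(timezone.utc).isoformat(),
    }

record = build_offline_conversion("lead@example.com", 120.0)
```

The same normalize-then-hash discipline applies here as on-page: the raw email should never appear in the upload.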
Collaboration is key.
Tech, data, media and legal teams should work closely together to correctly implement, and then validate, changes in data volumes.
Make sure you know the conversion page path, and that the PII variable is available. Scraping the DOM might be okay for a proof of concept, but don’t rely on it as a permanent solution.
Media teams need to make simple configuration changes and then report accurately on conversion volume changes. Use your data science teams to establish causality and validate EC is working. Liaise with your media teams regularly after rolling out EC to maintain scrutiny on the data volumes and changes. Be impatient for action (get it done!), but patient for results—manage expectations regarding timing, change may take weeks.
Using value-based bid optimization.
As we progress along the path of digital maturity, our tactics adapt and evolve. Where it’s normal and fine to optimize for click volume in the early days, the optimization KPI changes as our business grows. We aim to reduce cost, grow revenue, build ROI and ultimately optimize for long-term profit.
Optimizing a campaign for click volume was a brute-force budget tactic. Optimizing for value (profit stems from value) is a more precise allocation of budget. How the budget is allocated is the clever part.
Optimize for value.
Consider an ecommerce site where the obvious valuable outcome is a sale. There are other outcomes that serve as signals to indicate a user may be a valuable customer: viewing a product detail page, adding to cart, starting a checkout. All actions lead to the conversion, all with varying degrees of value. As each outcome is completed, fire a floodlight to inform GMP that the user has performed a “high-value action” worth €x. These actions and values are then used to automatically optimize the bid for the user.
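The action-to-value mapping can be sketched as a simple lookup. The action names and euro amounts below are placeholders, not recommendations; real values come from your own margin and funnel analysis:

```python
# Placeholder values for each "high-value action" on the path to a sale.
ACTION_VALUES = {
    "view_product_detail": 0.50,
    "add_to_cart": 2.00,
    "begin_checkout": 5.00,
    "purchase": 25.00,
}

def action_value(action: str) -> float:
    """Value to attach to the floodlight when a user completes an action;
    unknown actions carry no value."""
    return ACTION_VALUES.get(action, 0.0)
```

Each completed action fires its floodlight with the looked-up value attached, and the bidding system takes it from there.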
Previously, defining the values associated with an action was a matter of experimentation. Now you can use an online calculator to refine these numbers.
This approach to value-based bidding needs a level of data volume and quality that EC delivers, and pairing EC with VBB is extremely powerful. It has few moving parts, but the values are static, commercial values that don’t always reflect the user’s likely behavior. To address this, let’s look back at an older solution to see how we can level up this approach.
Using coarse-grained optimization.
Previously, we’ve used machine learning to build a predictive model that answers the question “How likely is user X to convert?” At scale, the model output is imported into GMP as an audience, and we use this to guide where the budget is spent. A simple approach here is to build a set of audiences from the model output to drive bid optimizations:
- “No hopers”, with the lowest propensity to convert: bid €0.
- “Dead certs”, with the highest propensity; they’ll likely convert anyway: bid low or €0.
- “Floating voters”, with medium propensity; they need convincing: bid the maximum.
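A minimal sketch of that three-way split, with made-up thresholds and bid amounts:

```python
def bid_for_propensity(propensity: float, max_bid: float = 1.50) -> float:
    """Map a modeled conversion propensity to a bid, mirroring the three
    audiences above. Thresholds and amounts are illustrative, not tuned."""
    if propensity < 0.10:   # "no hopers": don't spend on them
        return 0.0
    if propensity > 0.90:   # "dead certs": likely to convert anyway
        return 0.0
    return max_bid          # "floating voters": worth convincing
```

Note how crude the segmentation is: a user at 0.11 propensity gets the same treatment as one at 0.89, which is exactly the coarseness problem discussed next.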
This technique has delivered great results in the past. There are shortcomings, however. With three audiences, the segmentation by propensity is quite coarse. As the number of audiences ramps up, there is more to compute and more to maintain in terms of infrastructure. The user needs to revisit the site to “get cookied” and be included in a remarketing audience.
There is a more modern approach that addresses the shortcomings from these techniques.
Modeled VBB optimization goes even further.
We’ll now blend these two solutions with server-side data collection (sGTM). Server-side data collection has a number of key features that make it very appropriate for use here:
- First, it allows data enrichment in private—we can introduce profit as a value for optimization without exposing margin data to third parties.
- Additionally, server-side data collection extends the lifetime of your first-party cookies. They’re set from your own domain in a way that prevents third-party exposure; browsers like this and take a less harsh view of them. This is better for your first-party data quality.
- There is no need for the user to revisit the site to establish audience membership; all cookie-setting happens in the pixel execution.
So now, we can fire floodlights for our sales conversions, attach per-item profit data at the server level and optimize bids based on user profitability. Awesome, but what about the predictive model output?
At the server-side data collection point, sGTM can integrate with other Google Cloud Platform (GCP) components. As well as extracting profit data, we can interrogate a propensity model, and for each high-value action per user, ask what the propensity is for the user to convert. The predictive score is then attached to the floodlight to drive VBB.
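As a sketch of that enrichment step, the model lookup and the expected-profit formula below are illustrative assumptions, not a specific sGTM or GCP API:

```python
def score_user(user_id: str) -> float:
    """Stand-in for a call to a propensity model hosted on GCP
    (e.g. a prediction endpoint or a database lookup)."""
    return 0.62  # hypothetical modeled propensity for this user

def enriched_conversion_value(user_id: str, item_profits: list[float]) -> float:
    """Combine private per-item profit with the modeled propensity to
    produce the value attached to the floodlight for bidding."""
    return round(sum(item_profits) * score_user(user_id), 2)
```

Because this runs server-side, the raw margin figures and the model internals stay private; only the resulting value is passed on for bid optimization.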
This has fewer moving parts than the older solution. It addresses the coarse-grained audience problem by delivering per-user scoring as the data is collected. Again, we team this up with EC to maximize conversion visibility and drive powerful marketing optimizations.
Optimize your marketing with EC and VBB.
These techniques have existed in isolation for some time. With a broader understanding of the data requirements, and the activation of the data, we’re all in a better position to use privacy-first marketing optimizations to deliver efficiencies for clients, and ultimately, a better, more useful online experience for consumers.