Behind the Brilliance: Testing and Optimizing Your ABM Programs with David Tam
I continue to be impressed with the level of skill and talent of our customers at Engagio.
Recently, we invited David Tam, Director of Marketing at OneLogin, to talk to our team about how he’s executing ABM. Not only was I impressed with his knowledge of Engagio, but I loved his rigor and discipline around optimizing ABM campaigns and programs.
That’s why I wanted to feature him on this edition of Behind the Brilliance, where I highlight some of the sharpest marketers around. If you love ABM, and if you love testing and optimizing your marketing programs, you’re going to love this interview.
So, without further ado, please enjoy my conversation with David.
– – –
Brandon: How does your ABM fit into your overall go-to-market (GTM)?
David: At OneLogin, we shifted to target account selling as the primary approach to prospecting and closing business internally. GTM at OneLogin is made up of the sales, marketing, and business development teams. Together, we view ABM really as an ABR (Account Based Revenue) approach where each team is responsible for moving the needle for our target accounts.
Brandon: How does ABM drive collaborative selling across the company?
David: Collaboration starts at the very beginning, with our Chief Revenue Officer and Chief Marketing Officer introducing an account-based approach to the entire company. During the account selection process, account executives select accounts using inputs from marketing insights and technology, along with considerations from partner relationships. During this process, we hold office hours with reps from each team covering how to select accounts, create account plans, and plan marketing coverage, plus tips on using tools like Engagio to track, measure, and execute. We also review the progress reps are making on target accounts in weekly meetings with the key players from each team.
Brandon: How do you think about testing campaigns in ABM given that you have lower volume?
David: Some of the same testing methodologies still carry over from general inbound and outbound marketing tactics, such as A/B testing messaging, copy, and creative. But where ABM is different is being able to test more personal approaches, such as crafting messaging, copy, and creative for specific accounts in ads, direct mailers, and your landing pages. Using intent data, like engagement minutes, we know what activity each engaged account is doing. This allows us to show them custom messaging rather than grouping them with a much larger cohort and using generic messaging.
Brandon: How do you measure the success of your tests?
David: With ABM, the goal is ultimately to create revenue. But to get there, we must first look to create awareness. Once we get in front of our target accounts and gain recognition and awareness, we want to engage people within those accounts. The touchpoints we use can include advertising, events, marketing emails, inbound traffic to our landing pages, outbound calls and emails, voicemails, social interactions, and engagement through our partners and partner marketing programs.
Testing for success ultimately means identifying the orchestrated sequences, or plays, that get someone to speak with us, and then determining which of those plays result in new and recurring revenue.
Brandon: How much budget do you allocate to testing ABM campaigns before rolling it out in a full program?
David: Budgeting depends on multiple variables, but the framework for planning starts with determining the impact (for example, a revenue goal or a number of programs needed to warm up accounts for sales) you require from your target accounts. Next, determine how many accounts you will target, and give your best estimate for how many engagements you need from each account to have an impact on the outcome you’re seeking.
Once all this is determined, cross-match the engagement goals with the actual cost of the tactics needed. Finally, validate the cost of the programs against the expected revenue and determine whether you are within the sales efficiency ratio needed to keep the program profitable. There are lots of assumptions you will make in the beginning, but you will validate those assumptions along the way.
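David's framework above is essentially a back-of-the-envelope calculation, which can be sketched in a few lines. All of the numbers below, and the sales-efficiency threshold itself, are hypothetical placeholders for illustration, not OneLogin's actual figures:

```python
# Hypothetical ABM budget check following the framework described above:
# engagements needed -> program cost -> cost-to-revenue (sales efficiency) ratio.

def abm_budget_check(revenue_goal, target_accounts, engagements_per_account,
                     cost_per_engagement, max_cost_to_revenue_ratio=0.25):
    """Return (total program cost, True if within the efficiency threshold)."""
    total_engagements = target_accounts * engagements_per_account
    total_cost = total_engagements * cost_per_engagement
    efficiency_ratio = total_cost / revenue_goal  # cost per dollar of expected revenue
    return total_cost, efficiency_ratio <= max_cost_to_revenue_ratio

cost, within_budget = abm_budget_check(
    revenue_goal=1_000_000,      # assumed revenue target
    target_accounts=100,         # assumed number of target accounts
    engagements_per_account=8,   # assumed engagements needed per account
    cost_per_engagement=150,     # assumed blended cost per engagement
)
# 100 accounts * 8 engagements * $150 = $120,000, a 0.12 cost-to-revenue ratio.
```

As David notes, every input here starts as an assumption; the point of the model is to make those assumptions explicit so they can be validated as the program runs.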
Brandon: How do you take what you’ve learned in your tests and roll it out to improve overall outbound marketing?
David: It is my personal opinion that smart marketers can balance intuitive optimizations with tested ones. There is a time and place for intuition, and it can be difficult swimming in the ocean of data these days. A method I use to help with this is immersing myself in a hypothetical situation where I'm on the receiving end of a campaign or program, relative to the entire buying cycle. If something needs to happen regardless of testing to improve or even fix an experience, just do it.
For testing and rollout, it is important to run A/B tests where the variants are truly A or B, so you can be confident in the results once they pass significance testing. I'm also conscious of waiting too long on tests, so it's smart to conduct multiple non-conflicting tests. This way, we can continue to execute tests and optimize our outcomes without having to wait for one test to finish before the next begins. In short, we're optimizing our testing schedule, and thus optimizing our learning. This applies to demand gen efforts as well as our ABM efforts through web testing, digital advertising, email sequences, and messaging/copy.
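The significance check David mentions is commonly done with a two-proportion z-test when comparing conversion rates between variants A and B. The sketch below uses only the standard library; the conversion counts in the usage example are invented for illustration, and this is one standard method, not necessarily the exact test David's team uses:

```python
import math

def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Compare conversion rates of variants A and B.
    Returns the z statistic and the two-sided p-value."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no difference).
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via math.erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: variant B converts 48/400 vs. A's 30/400.
z, p = two_proportion_z_test(30, 400, 48, 400)
significant = p < 0.05  # typical significance level; choose yours up front
```

Keeping variants "truly A or B," as David puts it, matters because this test assumes the two samples differ only in the variant shown; overlapping, conflicting tests on the same audience violate that assumption, while non-conflicting tests can safely run in parallel.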
– – –
I hope you enjoyed that as much as I did. If you would like to learn more about how to test and optimize your ABM campaign, check out Engagio.com.