*Let me start with a few disclaimers. This is a single, very small test comparing how different Facebook Ad Objectives perform. I’m publishing the results from this one small test because I haven’t seen any comparable articles or results anywhere. This data is extremely interesting for organizations and developers working with Facebook Messenger, and I’m hoping that more people will run tests and share their results.
*If anyone reading has done similar tests, please share results in the comments or link to posts or data.
*Finally, the dollar costs in this article are high and should only be used to compare these ad sets to each other. The tests were run with new Facebook Pages and not-very-refined target audiences. Experienced Facebook advertisers should be able to beat their normal cost per lead using Messenger; the costs in this article don’t come close to what an experienced marketer or active Page can achieve.
The test consisted of 3 Facebook Ad tests that routed users into Messenger. Once the clicker arrived in Messenger and started a conversation, they were asked to respond with their email address in order to become a lead. The test focused on which Facebook Ad Objective was best at starting conversations and if there was a difference in conversion rates (emails collected) based on the Ad Objective.
Steps to a successful conversion
Messenger Ads have a unique user-flow. In order to understand how the results are tracked, it’s first important to visualize the user experience.
Everything starts with the user seeing the Facebook Ad in their Facebook feed, in Instagram or in Messenger itself. When the user clicks the ad, that user will route into Facebook Messenger and a new conversation window will be opened. In this case, since the ad was from @Mssg, the clicker routed into Messenger and the conversation had @Mssg entered as the Page that the clicker is messaging with.
The Ad also sends the user a Welcome Message. This is part of the Ad, and its goal is to get the user to start the conversation. Clicking the Ad simply routes the user to Messenger; the conversation doesn’t start until the clicker engages with the Welcome Message.
Once the conversation starts the user is asked to reply with their email address. Receiving a valid email address is considered a successful conversion.
Listing the steps of the flow looks something like:
- User sees the ad
- User clicks the ad
- User engages with the Welcome Message, which starts the conversation = Conversation Rate
- User responds with their email address = full conversion
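The steps above boil down to two ratios: the Conversation Rate (step 3 over step 2) and the full conversion rate (step 4 over step 3). A minimal sketch, using placeholder counts rather than data from this test:

```python
def funnel_rates(ad_clicks, conversations_started, emails_collected):
    """Compute the two rates tracked in this test."""
    conversation_rate = conversations_started / ad_clicks            # step 3 / step 2
    full_conversion_rate = emails_collected / conversations_started  # step 4 / step 3
    return conversation_rate, full_conversion_rate

# Placeholder numbers, for illustration only
conv_rate, email_rate = funnel_rates(
    ad_clicks=100, conversations_started=25, emails_collected=12
)
print(f"Conversation Rate: {conv_rate:.0%}")   # 25%
print(f"Email conversion:  {email_rate:.0%}")  # 48%
```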
Facebook Ad Objectives that route to Messenger
Facebook has many Ad Objectives, but only 3 of them will route the clicker into Messenger.
Traffic Ads: Facebook is optimizing for users to click the Ad. They aren’t paying as much attention to users starting the conversation once they route to Messenger and receive the Welcome Message.
Messenger Ads: Facebook is optimizing for users that start conversations with the Page after clicking the ad.
Conversion Ads: Facebook is optimizing for users to convert. Most of the time, converting means reaching a specific page on the website. The conversion event relies on a Facebook Pixel tracker on the page after the conversion happens.
With Messenger, the conversation doesn’t happen on a webpage. @Mssg has built the Facebook Pixel into the conversation, and it triggers when the user replies with a valid email address. So this test is optimizing for users to respond with a valid email.
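In practice, triggering an in-conversation conversion comes down to validating the user’s reply and then firing the Pixel event. A minimal sketch of that check, with the Pixel call stubbed out (`fire_pixel_event` is a hypothetical placeholder, not @Mssg’s actual implementation):

```python
import re

# Simple pattern: something@something.tld -- good enough for a lead-capture check
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def fire_pixel_event(email):
    """Hypothetical placeholder for the Facebook Pixel / conversion call."""
    print(f"conversion recorded for {email}")

def handle_reply(text):
    """Return True (and fire the conversion) if the reply is a valid email."""
    candidate = text.strip()
    if EMAIL_RE.match(candidate):
        fire_pixel_event(candidate)
        return True
    return False

handle_reply("jane@example.com")  # valid reply, conversion fires
handle_reply("not an email")      # ignored, no conversion
```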
The test consisted of 3 campaigns with Traffic, Messages and Conversions as the respective objectives. The overall goal was collecting email addresses, but where possible each step along the way was tracked.
Traffic Ad Results: The Traffic Objective had the largest reach and drove the most impressions by a wide margin. Traffic reached 4x more people and drove 8x more impressions than the Conversion Objective ad. Compared to the Messages Objective, the Traffic ad reached 3x more people and drove 4x more impressions.
The cost per click on this ad was $1.38. Again, that’s probably higher than what an experienced FB Ad buyer would see. After the user clicks the Ad, the next step in the flow is starting the conversation. The “Conversation Rate” is the percentage of Ad-clickers that engage with the Welcome Message and start the conversation. This ad experienced a 12% Conversation Rate, which is very low: only 12% of the clickers started a conversation. Not good. A good Conversation Rate would be 25% to 50% or even higher.
Of the people that started the conversation, 16% responded with their email address. Again, that’s very low; most campaigns should see an email response rate of more than 50%. Overall, the cost per email from this campaign was the worst of any ad in the test: $80.44 per email.
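The cost per email follows directly from the cost per click and the two rates: each email costs one click divided by the share of clicks that make it all the way through the funnel. Plugging in the rounded figures above won’t reproduce the reported $80.44 exactly (the real rates weren’t round numbers), but it shows the relationship:

```python
def cost_per_email(cpc, conversation_rate, email_rate):
    """Cost per email = cost per click / (share of clicks that fully convert)."""
    return cpc / (conversation_rate * email_rate)

# Rounded figures from the Traffic ad: $1.38 CPC, 12% conversation, 16% email
print(f"${cost_per_email(1.38, 0.12, 0.16):.2f}")  # roughly $72 with rounded inputs
```

The takeaway from the formula: halving the cost per email doesn’t require cheaper clicks; doubling either rate does the same job.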
Messenger Ads: The biggest surprise of the test overall was missing data. Ads Manager simply didn’t show stats that would be expected. For the Messenger Ads (Messages Objective), Ads Manager showed the number of “Messaging Replies”, but the interface doesn’t show the number of clicks on the ad.
Furthermore, Ads Manager was showing a number of conversations 62% higher than what we experienced. It’s frustrating because @Mssg sees every incoming message via the API, and Facebook’s own Page inbox agreed with the @Mssg numbers, both well below what Ads Manager reported. Facebook charges based on impressions, so the inaccurate stats don’t affect the cost; it’s just an annoying inconsistency.
Since Ads Manager doesn’t report the number of Ad clicks, it’s not possible to calculate a Conversation Rate and compare this to the Traffic Ad. The test generated a strong, but within normal range, email conversion rate of 62.5%: 62.5% of the conversations resulted in collecting an email. With the Messages Objective, the cost per email address was $15.15, much better than the Traffic Ad.
Conversion Objective: The trend of Ads Manager missing data continued. With this conversion ad test, there was a clear disconnect on the Facebook platform. Facebook didn’t show a single conversion in Ads Manager, but conversions were happening in @Mssg. Once again, another part of Facebook was tracking the conversions, and Ads Manager was the only interface not showing them.
Ad Conversions are based on the Facebook Pixel, which is mapped to a Custom Conversion. Facebook tracks Custom Conversions on a separate page, and during the test these Custom Conversions were registering there. In Ads Manager, no Custom Conversions were showing.
Again, without knowing how many people clicked on the ad, it’s not possible to know what percentage of clickers started a conversation. This ad saw an incredibly high conversion rate to email: 100% of the people that started conversations provided an email address. But very few people started conversations, and most likely Facebook’s optimization never kicked in because Facebook wasn’t registering any successes. It was clear that something wasn’t working correctly, and I ended the ad before all the budget was wasted. The cost per email for Conversion Ads was $38.68.
The biggest takeaway is that Ads Manager reporting for these new types of ads is not working well. Most of the data is missing or clearly incorrect, which makes optimizing the ads hard. For many clients the overall numbers (cost per email or cost per transaction) will be good enough that not getting the full data picture is OK.
The Messages Ad Objective worked best, but this objective drove the best conversations, not the most conversations. The Traffic Objective and the Messages Objective drove about the same number of conversations, but the Messages Objective conversations converted at a much higher rate. This drove the difference in cost per email. More testing and higher volumes would be needed to know if this is a pattern or a fluke.
Starting conversations cost effectively is one of the first big steps to finding success on Messenger. Hopefully the people working in this space can share knowledge and best practices to help bring more attention and marketing budgets to messaging.
Thanks for reading. Please test yourself and share.