How to Build a Conversion Campaign Through Intercom


This is more of a blog post for myself than for you, the reader, as open rates have been something I've been quite unhappy with until recently. We dropped the ball for a while on trial optimization, and it's primarily my fault.

Funnel optimization is the single most important thing you can do to get from five-figure monthly recurring revenue to six-figure monthly recurring revenue.

In my opinion, there are three fundamental keys to a successful SaaS marketing campaign:

  • acquisition – getting customers in the door;
  • conversion – getting them to sign up for your trial and become paying customers; and
  • retention – keeping those customers and getting them to tell their friends about your product/service.

Our site optimization has been going very well, to the point where I'd say we don't have many improvements left to make there. Outside of a few tweaks, though, we hadn't optimized our email sequence to any meaningful degree in comparison.

This was due to a few reasons:

  1. We had two different sources for sending emails: SendGrid and Intercom. We mostly send transactional emails through SendGrid and marketing emails through Intercom.
  2. Intercom is a new type of ESP (email service provider). Unlike other ESPs that focus on 'Open Rates' and 'Click Through Rates,' Intercom lets you send messages based on specific actions and measure results based on other actions. As an example, if a user installed your software and the next action you wanted them to take was to add an employee, you'd send them exactly that message and track whether it resulted in them completing that goal. It's pretty awesome, and until recently you'd have had to build that capability yourself.
  3. Until recently, Intercom lacked a split-testing and optimization feature, which made optimization really difficult. We used the beta split tester, but it didn't work 100% of the time, so we decided to hold off until the system worked reliably.

As of the publication of this article, their split tester still isn't 'great,' but it's good enough to start running simple tests. It's really hard to run multiple tests on a single message and impossible to run two completely separate sequences, but I'm told they're working on it.

I'm going to give you a step-by-step approach to how we took our email sequence from approximately a 20% open rate and 30% CTR to a 30%+ open rate and 40% CTR.

Build your initial sequence based on your first instincts

This kind of sucks, but if you're using a metrics-based ESP such as Intercom, Customer.io, or your own custom program, you unfortunately have to start with your best guess at an email sequence. You will suck at this and you'll have to redo it all later, but you need that initial baseline to begin experimentation.

A background in ad copy and email sequences will help a lot here, and if you can afford it, hire somebody to write your sequences for a few grand. The sequence should be structured around your users signing up, performing a few actions (installing the software, adding users, uploading code, etc.), and then converting before your 30-day trial is up.

Figure out the taxonomy of your non-converters

This is a pretty long process, but I'd say it's more important than any of the other steps we took.

Ideally, you'll need to find 50 people who didn't stick with your product and convince them to have a phone conversation with you. You need to talk to them on the phone: not through email, not through a survey, but on the PHONE. I was able to just email our customers and get calls, but you may have to bribe them with free extensions, money… sex… basically whatever you have to offer. You really can't get the details of why somebody quit from emails thrown back and forth.

The conversation should center around three central questions:

  1. How did you come to use [our product/service]?
  2. How did the trial go?
  3. Why didn’t you end up [buying/using] it?

I used a semi-structured interview process, and I made sure to ask follow-up questions.

As an example, if a customer says they decided to go with another company, ask them why they went with that company. Was there anything you could have done to win them instead? Throw all your notes into a Google Doc and link it to the customer file you've developed for that user (we use Intercom for this) so you can cross-check the interviews against the interactions they had in the app.

Based on this information you may change the structure of your email sequence, or discover that you've been tracking the wrong barriers to conversion. You can also segment this out by customer type and see whether the barriers differ between customer groups. As an example, larger customers may want an API, whereas smaller customers may not be able to afford you.

Code your data

This part takes a while. You need to take all the information you collected in your interviews and tie it back to the information you've collected through Intercom, your database, etc., to come up with an overall dataset you can analyze. You should compare the people who are happy with your product to the people who tried it but didn't convert. Collect all the relevant data you believe impacts your customers' decision to convert.

For example, I collected the following variables through Intercom and overlaid that information with my semi-structured interviews:

  • Signed Up Date
  • Last Logged In
  • Does your Credit Card Exist?
  • Activated Users
  • Deactivated Users
  • Users they tried to activate but haven’t installed the software
  • Overall User Count
  • Did you delete your account?
  • Hours Tracked By Company
  • Last Successful Payment Amount
  • Payment Status
  • Did the Company owner personally install the software?
  • [Interview Question Trials] How did you come to use us?
  • [Interview Question Trials] How did the Trial Go?
  • [Interview Question Trials] Why didn’t you end up buying it?

To code this, you need to export the information from Intercom and then put it into an Excel file alongside your interview answers. I'd get somebody who can do data entry to put this together. I've included an example of the instructions I put together for my data entry team when they built my report; it shouldn't take somebody more than a few hours to do.
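To make the coding step concrete, here is a minimal Python sketch of merging a behavioral export with coded interview answers into one record per customer. All field names, answers, and barrier categories below are invented for illustration; they are not Intercom's actual export schema.

```python
# Hypothetical rows exported from your ESP or database (illustrative fields).
intercom_export = [
    {"email": "a@x.com", "installed_software": True,  "user_count": 12},
    {"email": "b@y.com", "installed_software": False, "user_count": 3},
    {"email": "c@z.com", "installed_software": True,  "user_count": 25},
]

# Free-text interview answers to "Why didn't you end up buying it?",
# keyed by the customer's email.
interviews = {
    "a@x.com": "too expensive",
    "b@y.com": "never installed",
}

# Map each free-text answer onto a small set of barrier codes.
barrier_codes = {"too expensive": "price", "never installed": "activation"}

def build_dataset(export, notes, codes):
    """Overlay coded interview answers onto the behavioral export."""
    rows = []
    for record in export:
        answer = notes.get(record["email"])  # None if never interviewed
        rows.append({**record,
                     "why_not_buy": answer,
                     "barrier_code": codes.get(answer)})
    return rows

dataset = build_dataset(intercom_export, interviews, barrier_codes)
```

In a spreadsheet this is just a lookup column, but having the merged dataset in one place is what lets you compare converters against non-converters in the next step.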

What are your company's key activation points?

From this data you should be able to figure out where your customers are hitting a brick wall.

I was able to find some really obvious activation points, which I confirmed quantitatively before trying to fix them. I looked at two separate groups: people who took a 30-day trial and didn't purchase the product, and people who did. Most of the metrics we looked at didn't produce any large differences; for example, 86% of buyers and 63% of trials accessed the dashboard within the first 30 days.

The biggest difference I saw between trials and buyers was whether they installed the software themselves: 38% of trials installed the software during their trial, compared to a whopping 86% of buyers.

This tells me that once somebody has installed the software (under the current conditions), we have an 80%+ chance of converting them into a customer. So if they haven't installed the software, there really isn't any point sending them any message other than one pushing them to install.
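Before acting on a gap like 38% vs. 86%, it's worth checking that it isn't just noise. A standard two-proportion z-test does the job; the sample sizes below are invented for illustration, since only the percentages are reported above.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-statistic for comparing two conversion-style rates."""
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (success_b / n_b - success_a / n_a) / se

# Hypothetical sample sizes (50 per group) matching the reported rates:
z = two_proportion_z(success_a=19, n_a=50,   # trials: 38% installed
                     success_b=43, n_b=50)   # buyers: 86% installed

# |z| > 1.96 means the gap is significant at the 5% level.
significant = abs(z) > 1.96
```

Even with only 50 people per group, a spread this wide comes out highly significant, which is what justifies focusing the whole sequence on getting the install done.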

Figure out your customers' psychological triggers

Now that you've collected your qualitative interviews and backed them up with quantitative data, you need to figure out the psychological triggers stopping customers from performing the actions you want them to take. For me, that meant getting them to install the software and asking 'why' they wouldn't.

Here are a few examples I came up with after emailing customers to ask why they didn't install the software:

  • I don’t care enough to install this software
  • I don’t get what this software does
  • I didn’t really know what I was signing up for and I now know I don’t want this
  • I want to become more productive but can’t actually come up with the discipline to start using this thing
  • I lost the installation file, got lost during the installation process, or something else about onboarding confused me

You can test each of these excuses individually, which is time-consuming, or do what I did and simply keep asking customers what they didn't get about the installation process, to the point of becoming respectfully annoying.

Figure out who you can save and who you should leave behind

This is a tough one, and a lot of people will probably disagree with me, but I think you should do an opportunity-cost analysis on particular customers based on how far they've gotten through your email sequence.

As an example, we had a big customer who was ready to buy if only our app worked on Chromebooks. We looked at the cost to build that out and told them we couldn't do it. If a lead doesn't fit into your funnel, abandon it and focus your energy on the customers that matter.

Another example: I email every person who has accessed the dashboard and activated more than 20 users in the first 30 days to set up a personal phone call, as those users are obviously engaged. I usually do one or two calls a week off those numbers. I could increase the number of calls, but that means I can't apply my time to other things. I'm going to split test this idea to see if it lifts conversions for smaller accounts, and then weigh that opportunity cost against the rest of my work.

Run your email experiments

Try to optimize around your users' key activation points, but don't just 'try' things. If you have data in place, randomly changing your sequences will destroy your results.

At this point you should know the open and click-through rates for all your messages, and the rate at which your success variable is reached (installing the software yourself, buying the product, etc.). Testing should be done in the following order: title tags, content, different emails, different messaging types.

Title Tag Change

Version A: Video with three key features of Time Doctor
Version B: I’ve put together our team management tutorials for you

Content Changes

Version A: Content in the pop up: Here is how the Projects Page works.
Version B: Content in the pop up: Here's a really quick video showing you how the Projects page works.

Messaging type change

Version A: Small pop up
Version B: Much bigger popup that’s directly in the user’s face
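Whatever tool runs the split, the bookkeeping behind declaring a winner is simple. Here is a hypothetical sketch that compares variants by open rate once each has collected a minimum sample; the variant names and counts are made up, and in practice you'd also want a significance check before switching.

```python
def rate(successes, sends):
    """Success rate (opens, clicks, installs...) per message sent."""
    return successes / sends if sends else 0.0

# Made-up results for two variants of one message.
variants = {
    "A": {"sends": 400, "opens": 88},   # 22% open rate
    "B": {"sends": 400, "opens": 124},  # 31% open rate
}

def pick_winner(variants, min_sends=200):
    """Return the variant with the highest open rate, ignoring any
    variant that hasn't yet collected a minimum sample."""
    eligible = {name: v for name, v in variants.items()
                if v["sends"] >= min_sends}
    return max(eligible,
               key=lambda name: rate(eligible[name]["opens"],
                                     eligible[name]["sends"]))

winner = pick_winner(variants)
```

The `min_sends` guard matters: early in a test the leading variant flips around a lot, and crowning a winner off a handful of sends is how you end up 'optimizing' into noise.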

The same strategy won’t work on everybody

Once you figure out how the majority of your customers convert, you need to start segmenting cohorts. See if you can get more granular with your messaging.

As an example, specifically target customers you gave up on earlier, or customers with another key activation point you'd like to work on. Just remember not to message people too much: we were sending 10 emails over the 30-day trial and I got a lot of negative feedback. We've pulled it back to about five, and that seems to work quite well for us.

Testing emails is a continuous process

Make sure to keep testing your results, and keep innovating. For every 10 experiments we try, we get one or two big wins. Testing emails is so easy to do nowadays, and it's probably the second most important optimization behind split testing your site.

Finally…

These are the things we did to increase our open rates, CTRs, and interactions by a third during 30-day trials. I hope this post has been helpful to you.

Please let me know if you have any questions in the comments below.


About Liam McIvor Martin

Liam Martin is a co-founder of Time Doctor—a time tracking and productivity monitoring software designed for tracking hours and productivity of remote teams.
