What if the diverse streams of data connected with your enterprise PPC campaigns could be integrated and analyzed, across platforms and by your own staff, to create more effective campaign management plans?
Since Google essentially owns the internet, a steady flow of traffic to your site (and thus qualified leads and other high-priority individuals) is dependent upon one of two things being true. Either you rank well in Google for your primary keywords or you purchase traffic through its AdWords program. The challenge is that it's very difficult (or takes a long time) to hit the first page of Google for all your target keywords. As a result, companies pump big budgets into their pay-per-click (PPC) advertising campaigns.
With monthly spends in the tens or hundreds of thousands of dollars, it's important to optimize your PPC campaigns to determine the most effective ads and identify your most valuable site visitors. The typical approach to improving a PPC campaign relies on A/B testing. Two versions of an ad are served to search engine traffic, and their conversion rates are eventually compared. The more successful ad gets funded, and this testing continues throughout the life of a campaign. In some instances, more sophisticated multivariate testing occurs when companies are running large numbers of ads.
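The comparison step in an A/B test is usually a two-proportion significance check: given clicks and conversions for each ad variant, is the difference in conversion rates large enough to act on? Here is a minimal sketch in plain Python; the function name and all traffic numbers are hypothetical, chosen only to illustrate the calculation.

```python
from math import sqrt

def ab_test_z(conv_a, total_a, conv_b, total_b):
    """Two-proportion z-test: is ad B's conversion rate
    significantly different from ad A's?"""
    p_a = conv_a / total_a
    p_b = conv_b / total_b
    # Pooled conversion rate under the null hypothesis of no difference
    p = (conv_a + conv_b) / (total_a + total_b)
    se = sqrt(p * (1 - p) * (1 / total_a + 1 / total_b))
    return (p_b - p_a) / se

# Hypothetical traffic: ad A converts 200 of 10,000 clicks,
# ad B converts 260 of 10,000 clicks.
z = ab_test_z(200, 10_000, 260, 10_000)
print(f"z = {z:.2f}")  # prints z = 2.83; |z| > 1.96 is significant at the 95% level
```

The catch, as the next paragraph explains, is that even a statistically sound comparison like this only tells you about past clicks.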
A/B testing has severe limitations. First, all the data is historical: by the time you get it, it's out of date. You've learned what convinced people to convert in the past. To a certain extent, this is likely predictive of future behavior. Yet given the rate of market change, external factors can influence buying and clicking decisions on a day-by-day basis. Second, this testing happens in isolation. You can't tie your PPC conversion rates to specific behaviors on your website. Did the people who clicked your ad download your white paper or make a purchase? What if you could tie their PPC and website behavior to their longer-term behavioral and consumption data? You could then build models that help you predict not only which ads convert best, but which ads convert the people who will take action and become profitable customers in the future.
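The idea of scoring prospects on future value rather than immediate conversion can be sketched as a simple classifier. The toy model below (pure Python, no libraries) combines a PPC feature with on-site behaviors to predict whether a visitor will become a profitable customer; the features, labels, and every number are invented purely for illustration.

```python
from math import exp

# Each row: [saw_ad_variant_B, downloaded_whitepaper, pages_viewed / 10]
# Label: 1 if the visitor later became a profitable customer.
# All values here are made up for illustration only.
X = [
    [1, 1, 0.8], [1, 1, 0.6], [0, 1, 0.5], [1, 0, 0.4],
    [0, 0, 0.2], [0, 0, 0.1], [1, 0, 0.3], [0, 1, 0.7],
]
y = [1, 1, 1, 0, 0, 0, 0, 1]

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

# Plain logistic regression fitted by stochastic gradient descent.
w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.5
for _ in range(2000):
    for xi, yi in zip(X, y):
        p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
        err = p - yi
        w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
        b -= lr * err

def predict(xi):
    """Probability that a visitor becomes a profitable customer."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)

# Score a new visitor who saw ad B and downloaded the white paper.
print(predict([1, 1, 0.7]))
```

In practice this would be trained on integrated PPC, web analytics, and CRM data rather than a handful of rows, but the shape of the exercise is the same: the model's output is a value score you can feed back into bid and budget decisions.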
As an example of how fast the space is changing, one major travel player was generating vast quantities of data on an hourly basis. All this data was sent to a data warehouse and purged two weeks later. The landscape in that industry changes so quickly that 14-day-old data is deemed useless for purposes of effective PPC campaign management. Therefore, the integrated analysis and implementation of insights that go into crafting a truly effective PPC campaign must happen on a timeline of days, not weeks or months.
In another case study presented at the Predictive Analytics World Conference, we learned about an online education portal that increased its revenues by $1 million using predictive modeling to drive its PPC campaigns. One in three high school students visited the site at some point, but there was a distinct lifecycle of product consumption. Successful PPC campaign management relied on identifying and converting the right prospect to a very specific landing page: applying to college for sports scholarships, applying to college but worried about GPA, applying to college and hoping to get into a competitive major, and so on.
Their old system of well-developed multivariate testing determined ad popularity across demographics and in connection with these specific themes. Yet using advanced predictive modeling techniques, their team built hundreds of models for each ad to develop an even more refined targeting strategy. The results speak for themselves: a 25% increase in response rate, which translates to a revenue jump of $1 million every 1.5 years.
Anything less than predictive analytics that incorporates all your relevant data for enterprise-level PPC is leaving money on the table. But orchestrating a major systems integration and data mining exercise is a formidable task. In previous generations, this would involve major infrastructure upgrades, coordinating and collaborating across multiple vendors, bringing in and convincing diverse internal stakeholders, and justifying the return on a massive budget. Times have changed. Check out the variety of tools on the market that allow companies to seamlessly build models and manipulate information across various pools of data without the costly and time-wasting effort of moving data into new sandbox environments.
Author: Liz Alton
Liz Alton is a business and technology writer. Her writing has appeared in the Huffington Post, USA Today, Social Media Today, Technorati, PolicyMic, and more.