Junk traffic can also suck the life out of your campaign. Most, but not all, pay-per-click providers distribute a portion of your budget across several search engines and other sites via their search partner and content networks. While you certainly want your ads displayed on Google and/or Bing, you may not want them showing up and generating clicks from some of the deeper, darker corners of the Internet. The resulting traffic may look fine in high-level statistics reports, but you have to separate out partner-network campaigns and manage them carefully if you're going to get your money's worth.
A rich snippet contains more information than a normal snippet does, such as pictures, reviews, or customer ratings. You can recognize a rich snippet as any organic search result that provides more than the page title, the URL, and the meta description. Site operators can add structured data markup to their HTML to help search engines understand the site and to qualify for a rich snippet. The search listing for the Starbucks app, for example, includes customer ratings and pricing within its description.
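As a concrete illustration, here is a minimal sketch of the kind of schema.org structured data that can make a page eligible for a rich snippet showing ratings and pricing. The product name, rating figures, and price are hypothetical placeholders; real markup is typically embedded in the page's HTML as a JSON-LD script block, which the Python snippet below prints out.

    import json

    # A minimal sketch (hypothetical values) of schema.org structured data
    # describing a product with customer ratings and a price -- the kinds
    # of details search engines can surface in a rich snippet.
    structured_data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Example Product",
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": "4.6",
            "reviewCount": "128",
        },
        "offers": {
            "@type": "Offer",
            "price": "19.99",
            "priceCurrency": "USD",
        },
    }

    # Emit the markup as a JSON-LD block for the page's HTML <head>.
    print('<script type="application/ld+json">')
    print(json.dumps(structured_data, indent=2))
    print("</script>")

Search engines read this machine-readable description alongside the visible page content, which is what lets them decorate the plain blue-link snippet with stars and prices.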
Paid search, the lead- and traffic-generation medium, has become a cornerstone for billion-dollar organizations, and yet it has remained virtually unchanged. Some may argue that "unchanged" isn't quite the right description given industry and tactic changes, such as the introduction of Quality Score, the Bing/Yahoo deal, and Enhanced Campaigns. However, one thing that has not changed in paid search is what makes up a campaign: keywords, ad text, and landing pages.
Lost IS (budget), aka "budget too low" – Do your campaigns have daily or monthly budget caps set? If so, are they hitting those caps frequently? Budget caps help pace PPC spend, but they can also suppress your ads from being shown if set too low. Google calls this "throttling": AdWords won't serve your ads every time they are eligible to be shown, in an effort to pace your account evenly through the daily budget.
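To make the pacing idea concrete, here is a toy Python sketch of budget throttling. It illustrates the general mechanism only, not Google's actual algorithm: when spend runs ahead of the fraction of the day elapsed, the ad enters fewer of its eligible auctions, and serving stops entirely once the cap is reached. All figures are hypothetical.

    import random

    # Toy model of daily-budget throttling (NOT Google's actual algorithm).
    # All numbers below are hypothetical placeholders.
    daily_budget = 50.0    # daily cap, in dollars
    avg_cpc = 1.25         # average cost per click
    click_rate = 0.05      # chance a served impression is clicked
    spend = 0.0

    for hour in range(24):
        # How far ahead of schedule is spend? Positive = pacing too fast.
        pace = spend / daily_budget - hour / 24
        # Serve every eligible auction while on pace; throttle when ahead.
        serve_probability = 1.0 if pace <= 0 else max(0.0, 1.0 - 4 * pace)
        for _ in range(200):  # hypothetical eligible auctions this hour
            if spend + avg_cpc > daily_budget:
                break  # cap reached: the ad stops serving for the day
            if random.random() < serve_probability and random.random() < click_rate:
                spend += avg_cpc  # pay for the click

    print(f"spent ${spend:.2f} of a ${daily_budget:.2f} daily budget")

Run it a few times and you'll see spend land at or just under the cap rather than blowing through it in the first hours, which is the behavior throttling is meant to produce; the impressions skipped along the way are what show up as lost impression share (budget).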
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages proved unreliable, however, because the webmaster's choice of keywords in the meta tag could be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]