Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters needed only to submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
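The crawl-extract-index loop described above can be sketched in a few lines of Python. This is a toy illustration, not how any real search engine works: the network fetch is replaced by a static dictionary of pages so the example runs offline, and the "index" is just a word-to-URLs map.

```python
# Minimal sketch of the spider/indexer/scheduler pipeline described above.
# PAGES stands in for the Web; a real spider would fetch over HTTP.
from html.parser import HTMLParser
from collections import defaultdict

PAGES = {
    "http://example.com/": '<a href="http://example.com/about">About</a> welcome page',
    "http://example.com/about": "about our site",
}

class PageParser(HTMLParser):
    """Extract outgoing links and the words a page contains."""
    def __init__(self):
        super().__init__()
        self.links, self.words = [], []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)
    def handle_data(self, data):
        self.words.extend(data.lower().split())

def crawl(seed):
    index = defaultdict(set)   # the "indexer": word -> URLs containing it
    scheduler = [seed]         # URLs queued for crawling at a later date
    seen = set()
    while scheduler:
        url = scheduler.pop(0)
        if url in seen or url not in PAGES:
            continue
        seen.add(url)
        parser = PageParser()
        parser.feed(PAGES[url])          # "download" and parse the page
        for word in parser.words:
            index[word].add(url)         # record where each word occurs
        scheduler.extend(parser.links)   # extracted links are crawled next
    return index

index = crawl("http://example.com/")
```

Real indexers also store word positions and weights, as the paragraph notes; this sketch keeps only the word-to-page mapping.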
Holy Engagement! This was an awesome post, full of great info… and then I realized that 3/4 of the actual page was comments… which is even better for shares, SEO and overall engagement. I was lucky enough to attend an event where Neil Patel was giving some great blogging training and a lot of what you covered was there. https://www.thatbloggingthing.com/69-blogging-secrets-i-stole-from-neil-patel/ The simple fact that you comment back is awesome.
Hi, my name is Dimitrios and I am responsible for Crave Culinaire’s digital marketing. I would like to drive more traffic to Crave’s blog. Since Crave Culinaire is the only catering company that provides molecular cuisine, I thought about creating a blog post about that. The influencers in this niche have great success in utilizing recipes on their blogs. I will share some recipes of Brian Roland, owner and head chef of Crave Culinaire.
Google Analytics is free to use, and the insights gleaned from it can help you to drive further traffic to your website. Use tracked links for your marketing campaigns and regularly check your website analytics. This will enable you to identify which strategies and types of content work, which ones need improvement, and which ones you should not waste your time on.
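In practice, a "tracked link" is usually just a URL with the standard Google Analytics UTM query parameters appended, so Analytics can attribute the visit to a campaign. A minimal sketch (the base URL and campaign values are made-up examples):

```python
from urllib.parse import urlencode, urlparse, parse_qs

def tracked_url(base, source, medium, campaign):
    """Append the standard Google Analytics UTM parameters to a URL."""
    params = urlencode({
        "utm_source": source,      # who is sending the traffic, e.g. "newsletter"
        "utm_medium": medium,      # channel, e.g. "email", "social", "cpc"
        "utm_campaign": campaign,  # the campaign name you'll see in reports
    })
    sep = "&" if urlparse(base).query else "?"  # don't clobber existing params
    return base + sep + params

link = tracked_url("https://example.com/blog", "newsletter", "email", "spring_launch")
```

Each channel gets its own `utm_source`/`utm_medium` pair, which is what lets the Analytics reports separate the strategies that work from the ones that waste your time.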
The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[40] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links,[41] in addition to its URL submission console.[42] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[43] however, this practice was discontinued in 2009.
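An XML Sitemap of the kind submitted through Search Console is just a list of `<url>` entries in the sitemaps.org namespace. A minimal sketch that generates one with the Python standard library (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML Sitemap string per the sitemaps.org 0.9 schema."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page  # <loc> holds the page address
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap(["https://example.com/", "https://example.com/contact"])
```

The full schema also allows optional `<lastmod>`, `<changefreq>`, and `<priority>` children per `<url>`; the sketch keeps only the required `<loc>`.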
Product images. If you think images don't play a role, think again. When many consumers search for products in the search engines, not only are they looking at the "Web" results, but they're also looking at the "images" results. If you have quality images of that product on your site -- and the files' names contain relevant keywords -- these images will rank well in search engines. This avenue will drive a lot of traffic to your site, as potential customers will click on that image to find your store.
Hey Brian, I landed on this blog while visiting via blog land. I must appreciate your effort to put up such informative content. As an Internet Marketing Consultant, I would like to add a few thoughts of my own to your valuable content. There are many people who want a HUGE amount of traffic in no time at all. But as per my experience, SEO has become a SLOW-BUT-STEADY process in recent times. After so many algorithm updates from Google, I think if we do anything wrong with our websites, we will pay for it. So without taking any risk, we need to work ethically so that the website will slowly gain authority and grab the targeted traffic. What do you think mate? I am eagerly looking forward to your reply and would love to see more valuable write-ups from your side. Why don’t you write about some important points about the Hummingbird updates of Google? It would be a good read. Right brother? 🙂
On a dating niche site I took the ‘ego-bait’ post one step further and had sexy girls perform a dance and strip to reveal the names of the major bloggers in my niche written on their bodies. As you can imagine it got a lot of attention from the big players in my niche and my audience, and is a little more creative way of getting links, shares and traffic.

All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
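The root → topic listing → specific page hierarchy described above maps naturally onto URL paths. A toy sketch (the section, category, and product names are invented for illustration):

```python
# Toy sketch: derive URL paths from a root -> category -> product hierarchy.
site = {
    "products": {                                  # topic listing under the root
        "cameras": ["dslr-x100", "compact-z5"],    # specific product pages
        "lenses": ["wide-angle-10mm"],
    },
}

def paths(tree):
    """Flatten the hierarchy into the URL paths a visitor would navigate."""
    out = ["/"]  # the root page, where most visitors start
    for section, categories in tree.items():
        out.append(f"/{section}/")
        for category, pages in categories.items():
            out.append(f"/{section}/{category}/")
            out.extend(f"/{section}/{category}/{page}" for page in pages)
    return out

urls = paths(site)
```

Every specific page here is reachable from the root in three clicks, which is the kind of general-to-specific path the paragraph recommends planning for.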

Google has recently changed how you can use the Google Keyword Planner. Before, everyone who signed up could see the search volume for keywords. Now, it only shows estimates. There is a way to get around this. You need to create a Google Adwords campaign. The amount you spend doesn’t matter. After you do that, you will regain access to the search volume.
Nice work Laura! This is going to be a great series. I'm working my way through SEOmoz's Advanced SEO Training Series (videos) Vol. 1 & 2 to build upon the advice and guidance that you and your team provided to me during my time at Yahoo!. Now many others will benefit from your knowledge, experience and passion for SEO strategy and tactics. Best wishes for great success in your new role.

Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner34 that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report35.


I feel I have great content…but most of it is within my email marketing campaign instead of my blogs. I’ve used my blogs to include links to my email marketing campaigns to lead to my product. In your opinion, should my blog content be the priority? I find my marketing emails sound more like a blog than just a “tip” or a reason to grab people to my list.
Really great article. In some cases, when you are creating strategies, there are many SEO companies that focus more on finding link building opportunities than on actually creating visual assets or other link-earning content. Do you think more emphasis should be placed on creating these assets first, based on understanding hot topics or refining a customer's pain points?
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.

As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia and the Czech Republic where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.
You have also mentioned Quuu for article sharing and driving traffic. I have been using Quuu for quite some time now and I don’t think they’re worth it. While the content does get shared a lot, there are hardly any clicks to the site. Even for the clicks that do come through, the average time on site is like 0.02 seconds, compared to more than 2 minutes for other sources of traffic on my website. I have heard a few guys having a similar experience with Quuu, so I thought I should let you know.
Creating high quality content takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill. Content should be factually accurate, clearly written, and comprehensive. So, for example, if you describe your page as a recipe, provide a complete recipe that is easy to follow, rather than just a set of ingredients or a basic description of the dish.