Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly using the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" versions of a hostname (for example, "www.example.com" versus just "example.com"). When adding your website to Search Console, we recommend adding both the http:// and https:// versions, as well as the "www" and "non-www" versions.
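If you want every variant to resolve to one preferred address, a server-level 301 redirect keeps things consistent. The following is a minimal sketch only, assuming an nginx server and a preferred https:// non-www host; the domain and certificate paths are placeholders, not values from this article.

```nginx
# Hypothetical nginx sketch: send http:// and "www" traffic to https://example.com
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://example.com$request_uri;      # permanent redirect to the https non-www host
}

server {
    listen 443 ssl;
    server_name www.example.com;
    ssl_certificate     /etc/ssl/example.pem;        # placeholder certificate paths
    ssl_certificate_key /etc/ssl/example.key;
    return 301 https://example.com$request_uri;
}
```

Even with redirects like these in place, keeping all four variants registered in Search Console lets you see data reported against any of them.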
Take the 10 pillar topics you came up with in Step 1 and create a web page for each one that outlines the topic at a high level -- using the long-tail keywords you came up with for each cluster in Step 2. A pillar page on SEO, for example, can describe SEO in brief sections that introduce keyword research, image optimization, SEO strategy, and other subtopics as they are identified. Think of each pillar page as a table of contents, where you're briefing your readers on subtopics you'll elaborate on in blog posts.
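Purely as an illustration, a pillar page's "table of contents" can be as simple as short sections that each link out to the cluster post covering that subtopic; the headings and URLs below are invented examples, not part of any particular strategy.

```html
<!-- Hypothetical pillar-page excerpt: brief sections that link out to cluster posts -->
<section id="keyword-research">
  <h2>Keyword research</h2>
  <p>A two- or three-sentence overview of the subtopic goes here.</p>
  <a href="/blog/keyword-research-guide">Read the full keyword research guide</a>
</section>

<section id="image-optimization">
  <h2>Image optimization</h2>
  <p>A two- or three-sentence overview of the subtopic goes here.</p>
  <a href="/blog/image-optimization-guide">Read the full image optimization guide</a>
</section>
```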
For some reason I had to delete some pages. These pages use the .html suffix, so I blocked them in robots.txt with Disallow: /*.html, but it's been almost a year and I've found that Googlebot still crawls these pages often. How can I quickly get Google to completely remove these pages? I have also removed these URLs in Google Webmaster Tools via Google Index -> Remove URLs, but Google still crawls these pages.
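For reference, the robots.txt rule described above would look something like the snippet below (Googlebot supports the * wildcard in Disallow paths). Keep in mind that a Disallow rule only asks crawlers not to fetch the URLs; it is not, by itself, a removal from the index.

```
# robots.txt rule as described in the question above
User-agent: *
Disallow: /*.html
```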
Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there's no minimum or maximum length for the text in a description meta tag, we recommend making sure that it's long enough to be fully shown in Search (note that users may see differently sized snippets depending on how and where they search), and that it contains all the relevant information users would need to determine whether the page will be useful and relevant to them.
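As a purely invented example, a description meta tag sits in the page's <head> and reads like a one- or two-sentence summary of the page; the business and wording below are made up.

```html
<!-- Hypothetical description meta tag: a concise, informative summary that may appear as the search snippet -->
<meta name="description" content="Example Bakery bakes sourdough every morning and delivers across Springfield the same day. See the menu, prices, and delivery areas.">
```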
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
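As a hedged illustration, the kind of robots.txt rules that cause this problem are blanket blocks on asset directories; the directory names below are common conventions rather than paths from any particular site. Removing such rules (or adding matching Allow rules) restores Googlebot's access.

```
# Rules like these prevent Googlebot from fetching the CSS, JavaScript, and images needed to render the page
User-agent: *
Disallow: /css/
Disallow: /js/
Disallow: /images/
```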
Meta tags. Meta tags still play a vital role in SEO. If you type any keyword into a search engine, you'll see how that keyword is reflected in the title for that page. Google looks at your page title as a signal of relevance for that keyword. The same holds true for the description of that page. (Don't worry about the keywords meta tag -- Google has publicly said that it doesn't pay attention to that tag, since it has been abused by webmasters trying to rank for certain keywords.)
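To make the distinction concrete, here is an invented <head> excerpt showing the tags in question: the title and description are the ones search engines surface, while the keywords tag is the one Google has said it ignores.

```html
<head>
  <title>How to Repot a Cactus Without Getting Hurt | Example Garden Blog</title>
  <meta name="description" content="A step-by-step guide to repotting a cactus safely, including the tools, gloves, and soil mix to use.">
  <!-- The keywords meta tag below is the one Google ignores; it's shown only for contrast -->
  <meta name="keywords" content="cactus, repotting, succulents">
</head>
```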

I am a little confused about your first point. Sorry if it is a simple one to understand and I'm just missing it. What good would finding dead links on Wikipedia do for my personal website? I thought you would explain how to find dead links faster within my own site… but it seems that your tip is way more valuable than that. I just don't quite understand what I should do to positively affect MY site with this. Any help would be great 🙂 THANKS!
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
Hi Brian, Awesome content as ever! I'm very interested in your idea of creating an 'uber' resource list or expert roundup post, i.e. linking out to lots of other authorities in my niche within one post. But should you always create 'no-follow' links to these authority sites to prevent juice from passing to them? And similarly, if you sprinkle a few outbound authority links in other posts, should they all be 'no-follow', or do you think big G ignores 'no-follow' these days?

Current search engine optimization focuses on techniques such as making sure that each web page has appropriate title tags and that the content is not "thin" or low-quality. High-quality content is original, authoritative, factual, grammatically correct, and engaging to users. Poorly edited articles with spelling and grammatical errors will be demoted by search engines. For more information on thin content see More Guidance on Building High-quality Sites.
Another way to increase traffic to your website is to get listed in free online directories and review sites. For most of these sites, your profile will have a link to your website, so actively updating these listings and getting positive reviews is likely to result in more website traffic. In addition, many directories like Yelp have strong domain authority on Google. There’s a chance that your business’s free Yelp page could rank high for relevant searches.
Expertise and authoritativeness of a site increase its quality. Be sure that content on your site is created or edited by people with expertise in the topic. For example, citing expert or experienced sources can help users understand the expertise behind an article. Representing well-established consensus in pages on scientific topics is a good practice if such consensus exists.

I read your post on my mobile phone while on a bus trip, and it stirred me, because lately I've been doing SEO the poor man's way: blog commenting, social bookmarking, forum signatures, directory submission, etc. I don't know if any of these things still work today, since I've been practicing them since 2008. These 25 SEO tactics that you have shared got my interest. Actually, I am planning to make a new site right now after reading this one. I realized that maybe I've been doing so much spamming lately that my site is still not ranking for my desired keywords. You have also pointed out that Keyword Planner isn't the only way to get keywords, since there are other sources like, as you said, Wikipedia and the like. I am planning to use this article as my guide in starting a new one. I bookmarked it… honestly 🙂 And since I have read a lot of articles with SEO tips from other sites, I can compare them to your tactics, and this is more interesting and exciting. I want to build a quality site that can generate income for me for years to come. THANK YOU FOR BEING GENEROUS WITH YOUR KNOWLEDGE. I will try to communicate with you through email, and I hope you can coach me, Brian… please 🙂
