Yesterday I was redoing our process for ideas and Alltop was a part of it. I have always known it was a bit spammy (some of my grey sites are featured), but now it seems way too bad. You have places like the New York Times next to some random AdSense blog. Guy Kawasaki really needs to start giving some sort of influence ranking, or at least culling the total crap ones.
When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
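As a minimal sketch, a robots.txt along these lines keeps those assets crawlable (the /css/, /js/, and /images/ paths are hypothetical; use whatever directories your site actually serves them from):

    # robots.txt: don't block the files Googlebot needs to render pages
    User-agent: Googlebot
    Allow: /css/
    Allow: /js/
    Allow: /images/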
Search engines find and catalog web pages through spidering (also known as web crawling) software. Spidering software "crawls" through the internet and grabs information from websites, which is then used to build search engine indexes. Unfortunately, not all search engine spidering software works the same way, so what gives a page a high ranking on one search engine may not necessarily give it a high ranking on another. Note that rather than waiting for a search engine to discover a newly created page, web designers can submit the page directly to search engines for cataloging.

In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[33] On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing than before, Google Caffeine was a change to the way Google updated its index in order to make content show up on Google more quickly than before. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[34] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[35]
Loading speed is one of the significant factors in Google’s search algorithm. If your website loads slowly, you need to fix it: a slow site leaves a bad impression on your visitors. Make sure your website is search engine friendly and loads quickly.
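As an illustrative sketch (the file names below are hypothetical), a few on-page tweaks that commonly help: declare image dimensions and lazy-load images below the fold, and defer non-critical scripts so they don’t block rendering.

    <!-- Lazy-load below-the-fold images and declare their dimensions -->
    <img src="/images/product.jpg" width="1200" height="600" loading="lazy" alt="Product photo">
    <!-- Defer non-critical JavaScript so it doesn't block page rendering -->
    <script src="/js/app.js" defer></script>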
Brian, great post as always! Question: do you consider authority sites (industry portals) a form of “influencer marketing”, e.g. guest blogging, etc.? In some niches there are not many individuals who are influencers (outside of journalists), but there are sites that those in the industry respect. I am in the digital video space, and for me one site is actually a magazine that is building a very strong digital presence. Thanks, keep up the good work!
Structured data is code that you can add to your sites' pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
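For example, a minimal JSON-LD snippet marking a page up as an article might look like the sketch below (the headline, author, and date are made-up placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "10 Ways to Speed Up Your Website",
      "author": { "@type": "Person", "name": "Jane Doe" },
      "datePublished": "2021-06-01"
    }
    </script>

Google’s Rich Results Test is a handy way to check that markup like this is being read the way you intended.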
Hi Brian, I’m so glad I found Backlinko! I’m downloading all the free guides you’re offering and taking notes. I started a blog last year, and I’ll just call it my “learning blog.” You helped me understand that I need to change how I think about content creation (think keyword and topic, research it, THEN create content). So that will be the first strategy I implement for the new blog I plan on launching in the fall.

Thank you Brian. I am so brand spanking new to all this and I am really struggling with understanding it all. I have tried to read so many things to help my website, and this was the first article to really make sense. However, being an urban street menswear online store, I feel like my niche is too broad? Ahh, I feel like I am drowning; maybe I need to do your course! Thanks again for the read, I will be doing a lot more, that’s for sure.
Google has recently changed how you can use the Google Keyword Planner. Before, everyone who signed up could see the search volume for keywords. Now, it only shows estimates. There is a way to get around this: create a Google AdWords campaign. The amount you spend doesn’t matter. After you do that, you will regain access to the search volume.

We now have a dedicated SEO strategist who, among other things, develops 90-day plans for our websites. 90 days isn’t long-term planning, but at least we have a strategic objective for the quarter. He also works closely with our UX team to identify the target audience - the crew that does the persona research and focus groups prior to the wireframe stage.

Creating high quality content takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill. Content should be factually accurate, clearly written, and comprehensive. So, for example, if you describe your page as a recipe, provide a complete recipe that is easy to follow, rather than just a set of ingredients or a basic description of the dish.
Thanks Jure. That actually makes sense. Exactly: I’ve tested lowering the number of tips in a few posts and it’s helped CTR/organic traffic. One thing to keep in mind is that the number can also be the year, a time (like how long it will take to find what someone needs), a percentage (like 25% off), etc. It doesn’t have to be the number of tips, classified ads, etc.
Regarding internal linking, I believe that when two links point to the same internal page, and one of those links is in the group I mentioned above, only the one which feeds the algorithm with more information will be counted. In sites that have the menu before the content, that will be the second link. I think that’s the smart way for them to analyse all the links and better understand the destination page’s content. And they are smart 😉.
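To illustrate the scenario described above (the URL and anchor text are hypothetical, and counting only one of the two links is the commenter’s hypothesis rather than anything Google has confirmed):

    <!-- Menu link comes first in the HTML, with generic anchor text -->
    <nav><a href="/blue-widgets/">Products</a></nav>
    <!-- In-content link to the same page, with more descriptive anchor text -->
    <p>Read our full guide to <a href="/blue-widgets/">handmade blue widgets</a>.</p>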
Expertise and authoritativeness of a site increases its quality. Be sure that content on your site is created or edited by people with expertise in the topic. For example, providing expert or experienced sources can help users understand articles’ expertise. Representing well-established consensus in pages on scientific topics is a good practice if such consensus exists.

It’s rare to come across new SEO tips worth trying. And this post has tons of them. I know that’s true BECAUSE…I actually read it all the way to the end and downloaded the PDF. What makes these great is that so many are little multi-step strategies, not just the one-off things that clients often stumble across and ask if they are truly good for SEO. But there are also some nice one-off tips that I can easily start using without ramping up a new project.


The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[40] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links[41] in addition to their URL submission console.[42] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[43] however, this practice was discontinued in 2009.
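A bare-bones sitemap file, with placeholder URLs, looks something like this; submitting it through Search Console helps pages with no inbound links get discovered:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2021-06-01</lastmod>
      </url>
      <url>
        <loc>https://example.com/page-with-no-inbound-links/</loc>
      </url>
    </urlset>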
Fantastic stuff, as usual, Brian. The First Link Priority Rule is always one that causes me great angst. I often get torn between search engines and usability when it comes to the main navigation bar. And I’ve never known what the heck to do about the “Home” link. You can hardly target your keywords with that one without it being awkward.
Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly using the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" version (for example, "www.example.com" or just "example.com"). When adding your website to Search Console, we recommend adding both http:// and https:// versions, as well as the "www" and "non-www" versions.
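One common way to consolidate those variants is a 301 redirect to a single preferred version. A sketch using Apache mod_rewrite, assuming the preferred version is https without “www” and using example.com as a placeholder domain:

    RewriteEngine On
    # Send http:// and www. requests to the canonical https://example.com version
    RewriteCond %{HTTPS} off [OR]
    RewriteCond %{HTTP_HOST} ^www\. [NC]
    RewriteRule ^ https://example.com%{REQUEST_URI} [L,R=301]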
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10][dubious – discuss] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Altavista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
So I’m not good at English. Well, I’m running a new vacation rental and travel website in French. But it seems that in the francophone area people are reluctant to give backlinks. I do need links to rank because I have strong competitors. So I’ve decided to ask anglophone website owners for those links. Since my content is in French, I thought I could ask for links to pages containing only photos of tourist spots. What do you think of that?
Hi Brian, awesome content as ever! I’m very interested in your idea of creating an ‘uber’ resource list or expert roundup post, i.e. linking out to lots of other authorities in my niche within one post. But should you always create ‘nofollow’ links to these authority sites to prevent juice from passing to them? And similarly, if you sprinkle a few outbound authority links in other posts, should they all be ‘nofollow’, or do you think big G ignores ‘nofollow’ these days?
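For reference, the difference is just the rel attribute on the outbound link (the URL below is a placeholder): a nofollow link hints that no endorsement should pass, while a plain link passes it normally.

    <!-- Outbound link marked nofollow: hints that no "juice"/endorsement should pass -->
    <a href="https://authority-example.com/resource" rel="nofollow">Industry resource</a>
    <!-- Regular followed link -->
    <a href="https://authority-example.com/resource">Industry resource</a>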