Holy Engagement! This was an awesome post, full of great info… and then I realized that 3/4 of the actual page was comments… which is even better for shares, SEO and overall engagement. I was lucky enough to attend an event where Neil Patel was giving some great blogging training and a lot of what you covered was there. https://www.thatbloggingthing.com/69-blogging-secrets-i-stole-from-neil-patel/ The simple fact that you comment back is awesome.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
Ever heard of Maslow's hierarchy of needs? It's a theory of psychology that prioritizes the most fundamental human needs (like air, water, and physical safety) over more advanced needs (like esteem and social belonging). The theory is that you can't achieve the needs at the top without ensuring the more fundamental needs are met first. Love doesn't matter if you don't have food.

Great post; your knowledge and innovative approach never fail to amaze me! This is certainly the first time I’ve heard someone suggest the Wikipedia dead-link technique. It’s great that you’re getting people to think outside of the box. Sites like Reddit are great for finding keywords and can also be used for link building, although this can be difficult to get right. Even if you don’t succeed at using it for link building, it’s still a really valuable platform for gathering useful information. Thanks!
Provide full functionality on all devices. Mobile users expect the same functionality - such as commenting and check-out - and content on mobile as well as on all other devices that your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata - such as titles, descriptions, link-elements, and other meta-tags - on all versions of the pages.
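One common way to meet the "same metadata on all versions" requirement when a site serves separate desktop and mobile URLs is the rel="alternate"/rel="canonical" annotation pair. A minimal sketch; the example.com and m.example.com URLs are placeholders:

```html
<!-- Head of the desktop page, https://www.example.com/page (hypothetical URL) -->
<title>Page title, identical on both versions</title>
<meta name="description" content="Same description served to mobile and desktop.">
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- Head of the corresponding mobile page, https://m.example.com/page -->
<link rel="canonical" href="https://www.example.com/page">
```

The annotations tell crawlers the two URLs are the same content in two presentations, so titles, descriptions, and structured data stay unified across versions.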
Hey Brian, this article is really awesome. Seriously, you have covered all 21 points, many of which I have never read anywhere else on the internet. Everyone shares the basics, but here you have shared awesome info, especially the Facebook keyword research, the 1,000+ words per post tip, and the Wikipedia ideas, which are really good and helpful. I learned many things from this article. Keep sharing this kind of info. Thanks!
Like you, I am a scientist, and like you did in the past, I am currently working on translating great scientific literature into tips. In my case it’s child development research into play tips for parents. I can already see that the outcome of my experiment is going to be the same as yours: great content, but who cares? I hadn’t even thought about my key influencers. I know some important ones, but I don’t see how they would share my content. I thought I was writing content for my potential customers. Is your “SEO that works” course the same as the “content that gets results” course? Sorry if I sound a bit dim asking that question.
Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report.
Great article as always. My wife is about to start a business about teaching (mainly) Mums how to film and edit little movies of their loved ones for posterity (www.lovethelittlethings.com launching soon). We have always struggled with thinking of and targeting relevant keywords because keywords like ‘videography’ and ‘family movies’ don’t really sum up what she is about. Your article ties in with other learnings we have come across: we obviously need to reach out to the right people and get them to share to get her product out there, because purely focusing on keywords I don’t think will get us anywhere.

Many people create and launch their own blogs on a daily basis, then make plans and strategies to earn money from them. These days almost everyone is blogging, and Google is making its policies stricter day by day. Some people engage in invalid activity on their blogs because they want to inflate their visitor numbers. With its latest technology, Google captures every activity that happens on a blog, can detect invalid activity within seconds, and will then deindex the blog on the spot.


The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[40] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links[41] in addition to their URL submission console.[42] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[43] however, this practice was discontinued in 2009.
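For reference, an XML Sitemap submitted through Google Search Console is just a plain XML file in the Sitemaps protocol format. A minimal sketch, with placeholder URL and date:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- A page crawlers cannot reach by following links can still be listed here -->
    <loc>https://www.example.com/page-with-no-inbound-links/</loc>
    <lastmod>2009-06-01</lastmod>
  </url>
</urlset>
```

Listing a URL does not guarantee it will be indexed, but it ensures the crawler at least discovers it.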
Hi Brian, I absolutely love your content. My competitors and influencers are very strong; most of them are government bodies, are supported by government, or are travel guides known worldwide. I constantly follow them and engage with them: like, share, comment, etc. They share photos that are submitted to them, and I do photography myself, which takes hours, and I still can’t reach a big audience… Any idea what I could create that my influencers would love to share? (It’s hard to find out what they care about; they get hundreds of photos submitted daily and collaborate with other big names…) Please help me.
Not sure exactly why, perhaps I used a number too big and since my page is about classifieds, it probably seemed too much to browse through 1500 ads, I assume? Somewhat like you would post 800 tips for better ranking? Don’t know, will try to change things a bit and see how it goes, but you really gave me some new suggestions to go for with this article. Thanks again 🙂
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[15] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[16] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[17]

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
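A minimal robots.txt reflecting the advice above, blocking shopping-cart and internal-search pages, might look like this (the directory paths are illustrative, not a standard):

```
User-agent: *
Disallow: /cart/
Disallow: /search/
```

Note that robots.txt only controls crawling; a page blocked here can still appear in an index if other sites link to it, which is why the robots noindex meta tag exists as a separate mechanism.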
“Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.”
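In practice, the two requests in that guideline amount to a couple of lines on the syndicating site's copy of the article (the URLs are placeholders):

```html
<!-- In the <head> of the syndicated copy: keep this version out of the index -->
<meta name="robots" content="noindex">

<!-- In the article body: attribution linking back to the original -->
<p>Originally published at
  <a href="https://www.example.com/original-article">example.com</a>.</p>
```

The noindex tag removes the duplicate from contention, and the link back helps search engines identify which version is the original.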
Think of it this way: the more specific your content, the more specific the needs of your audience are -- and the more likely you'll convert this traffic into leads. This is how Google finds value in the websites it crawls; the pages that dig into the inner workings of a general topic are seen as the best answer to a person's query, and will rank higher.

We often see posts on how to get blog topic ideas or ideas for creating visuals, but nobody ever talks about new link building ideas. Some of the ways you showed here are absolutely new to me. You know what, I think you should write a post on how to come up with your own link building ideas: where to start, how to proceed, how to know it’s foolproof. It surely comes with lots of experiments, but the point is starting. I know it sounds weird, but I know you will come up with something 🙂
Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term.[7] On May 2, 2007,[8] Jason Gambert attempted to trademark the term SEO by convincing the Trademark Office in Arizona[9] that SEO is a "process" involving manipulation of keywords and not a "marketing service."
We are a professionally managed social media marketing company that has long been involved in the Internet Marketing industry. Our staff have years of professional experience in related fields. With the help of the latest social marketing research, search engine optimization and algorithm analysis concepts, we have produced some of the most effective web services in the world to date.
On the other hand, I'd like to know how many people constitute your new experience as an independent consultant. In fact, as others noted in the comments here, what you suggest is perfect especially for an in-house SEO situation or for a web marketing agency with at least 5-8 people working in it. Even if all you say is correct, and hopefully what everybody should do, I honestly find it quite difficult to dedicate the amount of time and dedication needed to check all the steps described in your post. Or, at least, I cannot imagine myself doing it for all my clients.
In this excellent post, SEO and Digital Trends in 2017, Gianluca Fiorelli writes, "In a mobile-only world, the relevance of local search is even higher. This seems to be the strategic reason both for an update like Possum and all the tests we see in local, and also for the acquisition of a company like Urban Engines, whose purpose is to analyze the 'Internet of Moving Things.'"
Like many SEOs, I was hired with one vague responsibility: to set up an SEO program and achieve results. Like many SEOs, we jumped right in and started spewing out SEO audits, rewriting title tags, offering up link suggestions, rewriting URLs and so on. And like many SEOs we promised results. But what we didn’t do, until that fateful launch, was develop a comprehensive strategy.

This toolbar is based on the LRT Power*Trust metric that we’ve been using to identify spammy and great links in LinkResearchTools and Link Detox since 2012; the free browser extension was launched just recently. It helps you promptly evaluate the power and trustworthiness of a website or page as you browse, far more precisely than Google PageRank ever did.
I have been trying to produce more content because I believed the lack of traffic was due to the small amount of content, but after reading your blog post, I’m beginning to doubt whether or not this is quality content. I will definitely do more research on influencers in my niche; now I have to figure out how to get their attention with my kind of content.
If you do great work with your search engine optimization (SEO), it could mean a significant amount of revenue for your business. At the same time, however, it is also an ongoing initiative. Once you generate a steady stream of traffic from SEO, you need to constantly be maintaining and improving your SEO in order to keep those rankings you worked so hard for.

Search engines find and catalog web pages through spidering (also known as webcrawling) software. Spidering software "crawls" through the internet and grabs information from websites which is used to build search engine indexes. Unfortunately, not all search engine spidering software works the same way, so what gives a page a high ranking on one search engine may not necessarily give it a high ranking on another. Note that rather than waiting for a search engine to discover a newly created page, web designers can submit the page directly to search engines for cataloging.
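The crawl-and-index loop described above can be sketched in a few lines of Python. This is a toy model over an in-memory "web" (a dict mapping URL to HTML), not a real spider, but it shows why a page with no inbound links never enters the index unless it is submitted directly:

```python
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags, the way a spider would."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def crawl(pages, start):
    """Breadth-first crawl over an in-memory 'web' (url -> HTML string)."""
    index, queue, seen = {}, deque([start]), {start}
    while queue:
        url = queue.popleft()
        html = pages.get(url, "")
        index[url] = html          # a real engine would tokenize and store terms
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:  # follow outbound links to discover new pages
            if link in pages and link not in seen:
                seen.add(link)
                queue.append(link)
    return index

pages = {
    "/": '<a href="/about">About</a> <a href="/blog">Blog</a>',
    "/about": "About us",
    "/blog": '<a href="/">Home</a>',
    "/orphan": "No page links here, so crawling alone never finds it",
}
index = crawl(pages, "/")
print(sorted(index))  # -> ['/', '/about', '/blog']
```

The "/orphan" page is exactly the case the text describes: it exists, but since no crawled page links to it, only direct submission would get it cataloged.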

In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
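The random-surfer model translates directly into the standard power-iteration computation. A minimal sketch in Python: the damping factor 0.85 is the conventional choice, and the three-page link graph is made up for illustration:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration PageRank over a dict of page -> list of outbound links."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}      # start with uniform rank
    for _ in range(iterations):
        # (1 - damping) is the chance the surfer jumps to a random page
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = rank[p] / len(outs)  # rank flows evenly along out-links
                for q in outs:
                    new[q] += damping * share
            else:
                # dangling page: its rank is spread over all pages
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
rank = pagerank(links)
print(max(rank, key=rank.get))  # "c": it is linked from both "a" and "b"
```

This matches the description in the text: a page linked from strong pages accumulates more rank, because the random surfer is more likely to arrive there.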
