I am a little confused on your first point. Sorry if it is a simple one to understand and I’m just missing it. What good would finding dead links on Wiki do for my personal website? I thought you would explain how to find dead links faster within my own site… but it seems that your tip is way more valuable than that. I just don’t quite understand what I do to positively affect MY site with this. Any help would be great 🙂 THANKS!
Next on the list we have guest posting. It is one of the ways to create backlinks, and it helps your website rank on search engines like Google. Keep one thing in mind: craft quality content for guest posting. Don't write posts only to get backlinks; focus on quality as well. If your posts are helpful and interesting in the eyes of your readers, they will come to your website.
Studies have shown that top placement in search engines generally provides a more favorable return on investment compared to traditional forms of advertising such as snail mail, radio commercials, and television. Search engine optimization is the primary method of earning top-10 search engine placement. Learn more about the search engine optimization process and discuss an SEO strategy for your site when you contact a search engine specialist today.
When referring to the homepage, a trailing slash after the hostname is optional since it leads to the same content ("https://example.com/" is the same as "https://example.com"). For the path and filename, a trailing slash would be seen as a different URL (signaling either a file or a directory), for example, "https://example.com/fish" is not the same as "https://example.com/fish/".
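To make that distinction concrete, here is a minimal sketch in Python; the function name and example URLs are my own for illustration. It treats a bare hostname and hostname-plus-slash as the same address while leaving deeper paths untouched:

```python
from urllib.parse import urlsplit

def normalize_homepage(url: str) -> str:
    """Treat 'https://example.com' and 'https://example.com/' as the same URL,
    but leave trailing slashes on deeper paths alone, since '/fish' and '/fish/'
    can point to different content."""
    parts = urlsplit(url)
    if parts.path in ("", "/"):          # homepage: the trailing slash is optional
        return f"{parts.scheme}://{parts.netloc}/"
    return url                            # any other path: keep exactly as given

print(normalize_homepage("https://example.com"))        # -> https://example.com/
print(normalize_homepage("https://example.com/fish"))   # unchanged
print(normalize_homepage("https://example.com/fish/"))  # unchanged
```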
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Altavista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
On one specific project, one of the SEOs on my team was brought in during the wireframe stage. The entire product team held SEO-specific meetings every week to go over specific recommendations, taking them very seriously and leaning on every word our team said. We were thrilled. We were hailing their efforts, promising big wins for the relaunch, and even hyping up the launch and its projected SEO results in the company SEO newsletter.
#16 is interesting because no one really knows about it. A former colleague and I ran a test on it about 4 years ago and published our results, which concluded what you are saying. Since then I've been careful to follow this rule. The only issue is that oftentimes using the exact keyword doesn't "work" for navigation anchor texts. But with a little CSS trickery you can get the code for the nav bar to sit lower in the code, prioritizing contextual links. I've also seen sites add links to 3-5 specific and important internal pages with keyword-rich anchor texts at the very top of the page in order to get those important internal links indexed first.
I really enjoyed your post. I'm building my own business from the ground up making custom furniture, lighting, and home decor. It took me a year to launch my website, and now I'm trying to invite more traffic and find ways for clients and interested parties to share my content and start buying my products. I liked the idea of Share triggers… I'm going to be incorporating that into my social media strategies. Any advice would go a long way. Thanks again Brian
Holy Engagement! This was an awesome post, full of great info… and then I realized that 3/4 of the actual page was comments… which is even better for shares, SEO and overall engagement. I was lucky enough to attend an event where Neil Patel was giving some great blogging training and a lot of what you covered was there. https://www.thatbloggingthing.com/69-blogging-secrets-i-stole-from-neil-patel/ The simple fact that you comment back is awesome.
As we discussed above, do not copy content from other websites or blogs. You need to write your own articles regularly: come up with new ideas and fresh angles, then write them up and publish them on your blog. Billions of searches happen every day, and Google gives priority to pages and links that are actively performing and updated on a daily basis. So write something new and update your blog every day.
Use your keyword list to determine how many different pillar pages you should create. Ultimately, the number of topics for which you create pillar pages should coincide with how many different products, offerings, and locations your business has. This will make it much easier for your prospects and customers to find you in search engines no matter what keywords they use.
Brian hello! First off I want to THANK YOU for this fantastic post. I can’t emphasize that enough. I have this bookmarked and keep going through it to help boost our blog. I totally nerded out on this, especially the LSI keywords which made my day. I know, pathetic, right? But when so much changes in SEO all the time, these kinds of posts are so helpful. So thanks for this. So no question – just praise, hope that’s ok 😁
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
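As a rough sketch of how you might spot pages that are missing a description meta tag, here is a small Python check. It assumes the `requests` and `beautifulsoup4` packages are installed, and the URL is purely illustrative:

```python
import requests
from bs4 import BeautifulSoup

def has_meta_description(url: str) -> bool:
    """Return True if the page declares a non-empty description meta tag."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    return bool(tag and tag.get("content", "").strip())

# Hypothetical example URL:
print(has_meta_description("https://example.com/"))
```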
The goal of SEO is to get a web page high search engine ranking. The better a web page's search engine optimization, the higher a ranking it will achieve in search result listings. (Note that SEO is not the only factor that determines search engine page ranks.) This is especially critical because most people who use search engines only look at the first page or two of the search results, so for a page to get high traffic from a search engine, it has to be listed on those first two pages, and the higher the rank, the closer a page is to the number one listing, the better.
Like you, I am a scientist, and like you did in the past, I am currently working on translating great scientific literature into tips. In my case it's child development research into play tips for parents. I can already see that the outcome of my experiment is going to be the same as yours: great content, but who cares. I hadn't even thought about my key influencers. I know some important ones, but don't see how they would share my content. I thought I was writing content for my potential customers. Is your SEO That Works course the same as the content that gets results course? Sorry if I sound a bit dim asking that question.
Expertise and authoritativeness of a site increases its quality. Be sure that content on your site is created or edited by people with expertise in the topic. For example, providing expert or experienced sources can help users understand articles’ expertise. Representing well-established consensus in pages on scientific topics is a good practice if such consensus exists.
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[68][69]
Hi! I really found this article to be valuable and helpful for improving our SEO techniques. But I am just wondering about the dead links: does that mean we can contact those who have dead links about recreating the page? How does it improve the SEO of my own website? Can they add a citations or thank-you section that links to our website?

Thanks for the great post. I am confused about the #1 idea about Wikipedia dead links… it seems like you didn't finish explaining what you're supposed to do with the link once you've found it. You said to put the dead link into Ahrefs, and you found a bunch of sites to contact… but then what? What do you contact them about, and how do you get your page in as the link? I'm obviously not getting something 🙁

Specifics: Be as specific as you can with your recommendations. For example, if you're suggesting partnering with meal home-delivery sites, find out which ones are going to provide the most relevant info, at what cost if possible, and what the ideal partnership would look like for content and SEO purposes. Even provide contact information if you can.

Yep and sometimes it’s just being a little creative. I’ve started a little blog on seo/wordpress just for fun actually… no great content on it like here though… but because the competition is so tough in these niches I decided to take another approach. I created a few WordPress plugins that users can download for free from wordpress.org… and of course these link to my site so this gets me visitors each day.
Add relevant links back to your site. Throughout your answer, sprinkle a few relevant links back to your website. The more relevant they are to the question, the more clicks and traffic they will generate. You can also finish your answers with a link to your lead magnet, concluding with something like this: “Want to know more about how to start a business? Check out my free checklist with 10 steps for starting your first business!” and a link to the lead magnet (in this example, the checklist).
Brian, great post as always! Question: Do you consider authority sites (industry portals) a form of “influencer marketing?” e.g. guest blogging, etc? In some niches there are not so many individuals who are influencers (outside of journalists) but there are sites that those in the industry respect. I am in the digital video space and for me one site is actually a magazine that is building a very strong digital presence. Thanks, keep up the good work!
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
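To illustrate how a well-behaved crawler consults robots.txt before fetching a page, here is a minimal sketch using Python's standard-library parser; the domain and paths are made up for the example:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # illustrative domain
rp.read()  # fetch and parse the site's robots.txt

# Internal search results are a typical thing to keep out of the index,
# while normal product pages usually remain crawlable:
print(rp.can_fetch("*", "https://example.com/search?q=shoes"))
print(rp.can_fetch("*", "https://example.com/products/blue-widget"))
```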
Although this is a step-by-step series, everyone's methods will (and should) vary, so it really depends on how much time you think it will take (if you're billing hourly).  What tools do you have at your disposal vs. how much researching for information will you have to do on your own? Will you have to pay for research reports or companies? Do you pay a monthly service for data or research?
Structured data is code that you can add to your sites' pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
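For instance, structured data is commonly expressed as JSON-LD embedded in the page. The sketch below generates such a block in Python; the product details are invented purely for illustration:

```python
import json

structured_data = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Hand-made walnut coffee table",        # hypothetical product
    "description": "Custom furniture built to order.",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "USD",
        "price": "450.00",
    },
}

# The JSON-LD block goes inside a <script> tag in the page's HTML:
print(f'<script type="application/ld+json">{json.dumps(structured_data)}</script>')
```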
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.
Each organic search engine ranking places emphasis on variable factors such as the design and layout, keyword density, and the number of relevant sites linking to it. Search engines constantly update and refine their ranking algorithms in order to index the most relevant sites, and a number of other variables also have an impact on search engine placement.
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
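As one possible sketch of serving a friendlier 404 page, here is a minimal Flask handler. Flask is an assumption (any web framework offers an equivalent hook), and the template file name is made up; a `templates/404.html` with links back to your homepage and popular content would need to exist:

```python
from flask import Flask, render_template

app = Flask(__name__)

@app.errorhandler(404)
def page_not_found(error):
    # Render a custom page that guides the visitor back to working content,
    # and keep the 404 status code so search engines don't index the error page.
    return render_template("404.html"), 404
```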

Hey Brian, I must say it's awesome content you are sharing. My question to you is: how did you transform from a nutrition expert to an SEO master? I mean, both subjects are poles apart, so how did you learn SEO? Can you share your story? I find myself in a similar situation: I am an engineer by profession and I am starting an ecommerce business (the niche is apparel), with no experience whatsoever in blog writing or SEO. If you can point me to some resources where I can improve my skills, that would be a huge help.
Brian, I recently found your blog by following OKDork.com. Just want to say you're really amazing with the content you put out here. It's so helpful, especially for someone like me who is just starting out. I'm currently writing posts for a blog I plan to launch later this year. I think my niche is a little too broad and I have to figure out how to narrow it down. I essentially want to write about my current journey of overcoming my fears to start accomplishing the dreams I have for blogging, business, and travel. In doing so, I will share the best tips, tools, and tactics I can find, as well as what worked, what didn't and why.
All kinds of people are using different social applications. If you are active and perform well there, you will increase visitors to your blog; social media is the key to growing them. If you want more visitors, write quality content and share it on Facebook, Instagram, Twitter, and LinkedIn. If you are not able to write good content, then hire someone who can write it and post to social media on a daily basis. That way you don't have to worry about growing your blog traffic yourself. Through social media, most people will find out what your site is about, and your social media posts will take them directly to your blog, where they can learn more and find other information that might help them.

I second Rand's comment! Congrats on moving from the corporate world to independent consulting. This is my goal for the near future. I too have been testing the waters of independent consulting, but it doesn't quite pay the bills yet! Sometimes I feel like I should find a mentor who has been where I am now and is where I want to go. Perhaps I'll find a few in this community over time!
I read your post on my mobile phone during a bus trip, and it struck me because I've been doing SEO the poor man's way lately: blog commenting, social bookmarking, forum signatures, directory submission, etc. I don't know if any of these things still work today, since I've been practicing them since 2008. These 25 SEO tactics that you have shared got my interest. Actually, I am planning to make a new site right now after reading this one. I found out that maybe I've been doing a lot of spamming lately, and that's why my site is still not ranking for my desired keywords. And you have pointed out that Keyword Planner is not the only way to find keywords, since there are other sources like, as you have said, Wikipedia and the like. I am planning to use this article as my guide in starting a new site. I bookmarked it… honestly 🙂 And since I have read a lot of articles with SEO tips on other sites, I can compare them to your tactics, and this is more interesting and exciting. I want to build a quality site that can generate income for me for years to come. THANK YOU FOR BEING GENEROUS WITH YOUR KNOWLEDGE. I will try to communicate with you through email and I hope you can coach me Brian… please 🙂
This post and the Skyscraper Technique changed my mind about how I approach SEO. I'm not a marketing expert and I haven't ranked sites that monetize really well; I'm just a guy trying to get some projects moving, and I'm not even in the marketing business. I just wanted to say that the way you write makes the information accessible, even if you're not a native English speaker like myself.
Loading speed is one of the significant factors in Google's search algorithm. If your website has a slow loading time, you need to improve it; a slow-loading website will leave a bad impression on your visitors. Ensure that your website is search engine friendly and loads fast.
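As a crude spot-check of server response time, here is a small Python sketch; it assumes the `requests` package is installed and uses an illustrative URL. A real speed audit would also look at render time, image weight, and so on (for example with PageSpeed Insights):

```python
import requests

url = "https://example.com/"  # illustrative URL
response = requests.get(url, timeout=30)

# elapsed measures time from sending the request to receiving the headers
print(f"{url} answered in {response.elapsed.total_seconds():.2f}s "
      f"({len(response.content) / 1024:.0f} KB transferred)")
```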
We now have a dedicated SEO strategist who, among other things, develops 90-day plans for our websites. 90 days isn't long-term planning, but at least we have a strategic objective for the quarter. He also works closely with our UX team to identify the target audience - the crew that does the persona research and focus groups prior to the wireframe stage.
Great article. My site has been up for several years now but I rebranded and switched from Blogger to WordPress about a year ago because I was told the reason why my traffic is so low is because I was using the wrong platform. I still haven’t seen an increase in my traffic and am very frustrated. I write in the health, fitness and parenting niche and I have over 30 experts that write for me, but I still don’t have the page views I would like. My paychecks are small and I am very frustrated. How do I find out what influencers in my niche are talking about and what they would like to share? I read tons of blogs, but most of them just review products or write about their kids, not a whole lot of similar articles. Where do I begin to find sharable content in my niche?
Traditionally, defining a target audience involves determining their age, sex, geographic locations, and especially their needs (aka pain points). Check out usability.gov’s description of personas and how to do task analysis & scenarios for more details, or better yet, read Vanessa Fox’s upcoming book about personas related to search and conversion.
In our research with what we have done for ourselves and our clients, there is a definite correlation between content greater than 1,000 words and better rankings. In fact, we are finding amazing ranking jumps when you have content over 3,000 words, about 12 original images (images not found anywhere else online), 1 H1 (not keyword stuffed), 12 sub-headlines (H2), 12 relevant internal links, 6 relevant external links and 1 bullet list. I know it sounds like a lot of work and a Big Mac recipe, but this does work.
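For anyone who wants to audit a page against counts like those, here is a rough sketch in Python. It assumes `requests` and `beautifulsoup4` are installed, the URL is hypothetical, and the internal/external split is a simple heuristic:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

url = "https://example.com/long-form-guide"  # hypothetical page to audit
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
domain = urlparse(url).netloc

words = len(soup.get_text(" ", strip=True).split())
links = [a["href"] for a in soup.find_all("a", href=True)]
internal = sum(1 for href in links if domain in href or href.startswith("/"))

print(f"words: {words}")
print(f"images: {len(soup.find_all('img'))}")
print(f"h1: {len(soup.find_all('h1'))}, h2: {len(soup.find_all('h2'))}")
print(f"internal links: {internal}, external links: {len(links) - internal}")
```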
If you own, manage, monetize, or promote online content via Google Search, this guide is meant for you. You might be the owner of a growing and thriving business, the webmaster of a dozen sites, the SEO specialist in a Web agency or a DIY SEO ninja passionate about the mechanics of Search: this guide is meant for you. If you're interested in having a complete overview of the basics of SEO according to our best practices, you are indeed in the right place. This guide won't provide any secrets that'll automatically rank your site first in Google (sorry!), but following the best practices outlined below will hopefully make it easier for search engines to crawl, index and understand your content.
Consistent Domains: If you type in www.example.com, but then you type in just example.com and it does not redirect to www.example.com, the search engines are seeing two different sites. This isn't effective for your overall SEO efforts, as it will dilute your inbound links: some external sites will link to www.example.com and others to example.com.
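A quick way to check whether both hosts end up at the same canonical URL is sketched below in Python; it assumes the `requests` package is installed, and the domain is only an example:

```python
import requests

for start in ("https://example.com/", "https://www.example.com/"):
    final = requests.get(start, timeout=10, allow_redirects=True).url
    print(f"{start} -> {final}")

# If the two final URLs differ, inbound links (and their value) are being
# split between two versions of the site.
```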
Tip: Along with delicious I search on scoop.it for similar opportunities. If they liked an article related to a year.. say 2013 and you update the resource to 2014 chances are they’ll share it. Kind of a twist on your delicious + sky scraper technique. You don’t even have to make the content much different or better, just updated! Got some fantastic links recently because of it.

Not sure exactly why, perhaps I used a number too big and since my page is about classifieds, it probably seemed too much to browse through 1500 ads, I assume? Somewhat like you would post 800 tips for better ranking? Don’t know, will try to change things a bit and see how it goes, but you really gave me some new suggestions to go for with this article. Thanks again 🙂