After you have identified your target keywords, you need to create a page targeting each one. This is known as SEO content. In many cases it makes sense to publish a blog post targeting a keyword, but base that decision on search intent. If your target keyword phrase is "buy black Nike shoes", a long-form piece of content doesn't make sense: the searcher is ready to purchase and wants a product page, not an article.
Provide full functionality on all devices. Mobile users expect the same functionality (such as commenting and check-out) and the same content on mobile as on every other device your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata (such as titles, descriptions, link elements, and other meta tags) on all versions of the pages.
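As a concrete illustration of keeping metadata identical across versions of a page, here is a hypothetical head section (the URL, title, and description are placeholders, not from any real site) showing the elements the paragraph above mentions:

```html
<head>
  <!-- Title and description should match on desktop and mobile versions -->
  <title>Black Running Shoes | Example Store</title>
  <meta name="description" content="Shop our full range of black running shoes.">
  <!-- Canonical link element tells search engines which URL is authoritative -->
  <link rel="canonical" href="https://example.com/shoes/black">
</head>
```

If your mobile pages serve a stripped-down head without these elements, search engines indexing the mobile version may lose that metadata.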
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
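The leak described above is easy to see in a concrete file. A hypothetical robots.txt (the paths are invented for illustration) that tries to hide two directories:

```
User-agent: *
Disallow: /private-reports/
Disallow: /admin/
```

A compliant crawler will skip those directories, but the file itself is public at /robots.txt, so anyone reading it now knows those paths exist, and any browser can still request them directly. For genuinely sensitive content, use authentication on the server instead.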
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
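The "random surfer" model described above can be sketched in a few lines. This is a minimal power-iteration implementation on a toy link graph (the four pages and their links are invented for illustration, and 0.85 is the damping factor commonly cited for PageRank):

```python
# Toy link graph: each page maps to the pages it links out to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = list(links)
n = len(pages)
d = 0.85  # damping: probability the random surfer follows a link vs. jumping anywhere

# Start with equal rank everywhere, then iterate until scores settle.
rank = {p: 1.0 / n for p in pages}
for _ in range(100):
    new_rank = {p: (1.0 - d) / n for p in pages}
    for page, outlinks in links.items():
        share = d * rank[page] / len(outlinks)  # rank is split among outbound links
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

# Page "C" has the most inbound links, so it ends up with the highest score,
# which also makes a link *from* C worth more than a link from the others.
```

This also shows why "some links are stronger than others": a link from the high-PageRank page C passes more rank to its target than a link from D does.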
I am a little confused on your first point. Sorry if it is a simple one to understand and I’m just missing it. What good would finding dead links on Wiki do for my personal website? I thought you would explain how to find dead links faster within my own site… but it seems that your tip is way more valuable than that. I just don’t quite understand what I do to positively affect MY site with this. Any help would be great 🙂 THANKS!
Hi Brian! Very good and exactly what I was looking for. I have a problem though: we are creating the first video editing software that edits video WHILE FILMING. We are video geeks with a lot of experience; however, we are trying to appeal to GoPro users and video tutorial makers, and we have little knowledge in that field. Any suggestions on how we write about that if we have no idea about the space?
Page and Brin founded Google in 1998.[23] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[24] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[25]

Let me tell you a story. Early in my tenure at Yahoo, we tried to get into the site dev process in the early stages in order to work SEO into the Product Requirements Documents (PRDs) before wireframing began. But as a fairly new horizontal group not reporting into any of the products, this was often difficult. Nay, damn near impossible. So usually we made friends with the product teams and got in where we could.
Specifics: Be as specific as you can with your recommendations. For example, if you're suggesting partnering with meal home delivery sites, find out which ones are going to provide the most relevant info, at what cost (if possible), and what the ideal partnership would look like for content and SEO purposes. Even provide contact information if you can.
Wow, I wish I had the comments you do. So you're saying that re-visiting and re-writing old posts garnered 111% more traffic? I feel like I go back from time to time to do this, mostly to keep information current. This tip makes me want to revisit all my old posts to see what could be updated. That's a lot of low-hanging fruit. Thanks for this one.