Really, it's just a matter of getting creative: grab a cup of coffee and think for a minute about what resources you have for gaining insight into your visitors (or target markets) and their needs before you dive in. Think about how much time it might take you (or what the reports would cost if you're going to buy market research), and tack that onto your billing as an optional service.
Whatever industry you’re in, chances are there are at least one or two major conventions and conferences that are relevant to your business. Attending these events is a good idea – speaking at them is even better. Even a halfway decent speaking engagement is an excellent way to establish yourself as a thought leader in your industry and gain significant exposure for your site.
While short-tail keywords are often searched more frequently, it is more difficult to rank for them on search engines. Targeting long-tail keywords, on the other hand, gives you a better chance of ranking higher (even on the first page) for queries specific to your products and services—and higher ranking means more traffic. Plus, as search engines and voice-to-text capabilities advance, people are using more specific phrases to search online. There are many free tools available to help you find keywords to target, such as Answer the Public.
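One simple way to act on this advice is to sort a candidate keyword list by phrase length, since long-tail queries are usually three or more words. The sketch below is a minimal, hypothetical illustration (the keyword list and the three-word threshold are assumptions, not a rule from any tool):

```python
# Minimal sketch: partition a hypothetical keyword list into short-tail
# and long-tail candidates by word count (threshold is an assumption).

def split_keywords(keywords, min_long_tail_words=3):
    """Return (short_tail, long_tail) lists, split by word count."""
    short_tail, long_tail = [], []
    for kw in keywords:
        target = long_tail if len(kw.split()) >= min_long_tail_words else short_tail
        target.append(kw)
    return short_tail, long_tail

keywords = [
    "shoes",
    "running shoes",
    "best trail running shoes for flat feet",
    "waterproof hiking boots under 100 dollars",
]
short, long_tail = split_keywords(keywords)
print(short)      # ['shoes', 'running shoes']
print(long_tail)  # ['best trail running shoes for flat feet',
                  #  'waterproof hiking boots under 100 dollars']
```

In practice you would feed this the export from a keyword tool such as Answer the Public, then prioritize the long-tail list for content ideas.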
Fortunately, Google puts more weight on the anchor text of external links anyway. So as long as some of your external links have your target anchors, you’re probably OK with a “Home” button. In fact, I’ve ranked homepages with a “Home” anchor text nav button for some seriously competitive terms. So it’s not a make-or-break ranking signal by any means.
The strength of your link profile isn’t solely determined by how many sites link back to you – it can also be affected by your internal linking structure. When creating and publishing content, be sure to keep an eye out for opportunities for internal links. This not only helps with SEO, but also results in a better, more useful experience for the user – the cornerstone of increasing traffic to your website.
Many blogging software packages automatically nofollow user comments, but those that don't can most likely be manually edited to do this. This advice also goes for other areas of your site that may involve user-generated content, such as guest books, forums, shout-boards, referrer listings, etc. If you're willing to vouch for links added by third parties (for example, if a commenter is trusted on your site), then there's no need to use nofollow on those links; however, linking to sites that Google considers spammy can affect the reputation of your own site. The Webmaster Help Center has more tips on avoiding comment spam, for example by using CAPTCHAs and turning on comment moderation.
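If your platform doesn't nofollow comments for you, it can be done at render time by rewriting the user-submitted HTML. Here's a minimal sketch using only Python's standard library; the `add_nofollow` helper and the sample comment are illustrative, not part of any blogging package's API:

```python
# Sketch of server-side nofollow enforcement for user-generated HTML.
# Every <a> tag in the fragment is rewritten to carry rel="nofollow".
from html.parser import HTMLParser

class NofollowRewriter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.out = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            # Drop any existing rel attribute, then force rel="nofollow".
            attrs = [(k, v) for k, v in attrs if k != "rel"]
            attrs.append(("rel", "nofollow"))
        attr_text = "".join(f' {k}="{v}"' for k, v in attrs)
        self.out.append(f"<{tag}{attr_text}>")

    def handle_endtag(self, tag):
        self.out.append(f"</{tag}>")

    def handle_data(self, data):
        self.out.append(data)

def add_nofollow(html_fragment):
    rewriter = NofollowRewriter()
    rewriter.feed(html_fragment)
    rewriter.close()
    return "".join(rewriter.out)

comment = '<p>Check out <a href="http://example.com">my site</a>!</p>'
print(add_nofollow(comment))
# <p>Check out <a href="http://example.com" rel="nofollow">my site</a>!</p>
```

A real implementation would also sanitize the markup (strip scripts, whitelist tags), but the nofollow rewrite itself is this simple.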
Just ridiculously good as usual, Brian. You continue to set the bar higher and higher each time I see a new post from you, well done. A quick question regarding point 16 about Google only counting the first anchor to a page: what is your opinion about links that go to DIFFERENT pages on the same site? I believe they pass equal weighting, but it would be good to get your opinion.
Attempting to replace a dead link with your own is easily and routinely identified as spam by the Wikipedia community, which expects dead links to be replaced with equivalent links at archive.org. Persistent attempts will quickly get your account blocked, and your website can be blacklisted (the Wikipedia blacklist is public, and there is evidence that Google uses it to determine rankings), which will have negative SEO consequences.
Brian, I recently found your blog by following OKDork.com. Just want to say you’re really amazing with the content you put out here. It’s so helpful, especially for someone like me who is just starting out. I’m currently writing posts for a blog I plan to launch later this year. I think my niche is a little too broad and I have to figure out how to narrow it down. I essentially want to write about my current journey of overcoming my fears to start accomplishing the dreams I have for blogging, business, and travel. In doing so, I will share the best tips, tools, and tactics I can find, as well as what worked, what didn’t, and why.
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results. On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing than before, Google Caffeine was a change to the way Google updated its index in order to make things show up more quickly on Google. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..." Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.
Search engines find and catalog web pages through spidering (also known as web crawling) software. Spidering software "crawls" through the internet and grabs information from websites, which is used to build search engine indexes. Unfortunately, not all search engine spidering software works the same way, so what gives a page a high ranking on one search engine may not necessarily give it a high ranking on another. Note that rather than waiting for a search engine to discover a newly created page, web designers can submit the page directly to search engines for cataloging.
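The core "grab information" step a spider performs can be sketched in a few lines: parse a fetched page and collect its outgoing links so they can be queued for crawling. This toy example uses only the Python standard library and a made-up page; real crawlers add fetching, deduplication, robots.txt checks, and rate limiting:

```python
# Toy illustration of the spidering step: extract outgoing links from a
# page's HTML so a crawler could queue them. Stdlib only; network fetching
# and crawl politeness are deliberately omitted.
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative URLs against the page's own URL.
                    self.links.append(urljoin(self.base_url, value))

def extract_links(html_text, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html_text)
    return parser.links

page = '<a href="/about">About</a> <a href="http://other.example/">Other</a>'
print(extract_links(page, "http://example.com/index.html"))
# ['http://example.com/about', 'http://other.example/']
```

The differences between engines mentioned above come later in the pipeline (what gets indexed and how it's ranked); the link-following step itself is broadly the same everywhere.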
Black hat SEO involves techniques such as paying to post links to a website on link farms, stuffing the metadata with unrelated keywords, and using text that is invisible to readers to attract search engines. These and many other black hat SEO tactics may boost traffic, but search engines frown on the use of such measures. Search engines may punish sites that employ these methods by reducing their page rank or delisting them from search results.
See the screenshot below for some of the sections of specific recommendations you can add, which will provide the meat of the document. Keep in mind this is a very flexible document – add recommendations that make sense (for example, you may not always have specific design considerations for a project). Remember, it will be different every time you do it.
Thank you Brian. I am so brand spanking new to all this and I am really struggling with understanding it all. I have tried to read so many things to help my website, and this was the first article to really make sense. However, being an urban, street menswear online store, I feel like my niche is too broad?.. Ahh, feel like I am drowning – maybe I need to do your course! Thanks again for the read, I will be doing a lot more, that's for sure.
This toolbar is based on the LRT Power*Trust metric that we’ve been using to identify spammy and great links in LinkResearchTools and Link Detox since 2012, and the free browser extension was just recently launched. It helps you promptly evaluate the power and trustworthiness of a website or page during your web browsing, far more precisely than Google PageRank ever did.
Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
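Since alt text carries that anchor-like weight, it's worth auditing pages for images that are missing it. Here's a small, stdlib-only sketch; the `images_missing_alt` helper and the sample markup are illustrative assumptions, not a standard tool:

```python
# Small sketch: audit an HTML fragment for <img> tags that lack alt text,
# which matters for both accessibility and image search.
from html.parser import HTMLParser

class AltTextAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_dict = dict(attrs)
            # Flag images with no alt attribute or an empty one.
            if not attr_dict.get("alt"):
                self.missing_alt.append(attr_dict.get("src", "(no src)"))

    def handle_startendtag(self, tag, attrs):
        # Also cover self-closing <img ... /> syntax.
        self.handle_starttag(tag, attrs)

def images_missing_alt(html_text):
    auditor = AltTextAuditor()
    auditor.feed(html_text)
    return auditor.missing_alt

page = '<img src="logo.png" alt="Acme logo"><img src="banner.jpg">'
print(images_missing_alt(page))  # ['banner.jpg']
```

Run over your templates or rendered pages, this gives you a quick list of filenames to go back and describe properly.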
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
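You can see the advisory nature of robots.txt directly with Python's standard `urllib.robotparser`: it tells a well-behaved crawler what to skip, but nothing enforces the answer. The rules and URLs below are made up for illustration:

```python
# robots.txt is only advisory: robotparser tells a polite crawler to stay
# out of /private/, but a non-compliant client can simply ignore it and
# request the URL anyway. Example rules and paths are hypothetical.
import urllib.robotparser

rules = """\
User-agent: *
Disallow: /private/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A polite crawler checks before fetching...
print(rp.can_fetch("*", "http://example.com/private/report.html"))  # False
print(rp.can_fetch("*", "http://example.com/public/index.html"))    # True

# ...but the server will still serve /private/report.html to any browser
# or script that requests it, unless real access control (authentication)
# is in place.
```

This is exactly why the passage above recommends authentication, not robots.txt, for anything genuinely sensitive.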
Great post. I know experienced people read most of this stuff and think “I know that already”… but actually there are lots of things we tend to forget even though we know them. So it's always good to read posts like this. What I liked most was the broken link solution – not only creating a substitute for the broken link, but actually going beyond that. I know some people do this as an SEO technique, but it's actually also useful for the internet, since you repair those broken links that others find elsewhere.
Elna, I love it when pro bloggers write how-to posts that are highly valuable to their readers. This one is top notch … as you will see by how I share my NAME and blog with this comment. What a brilliant idea that I could never have thought of on my own EVER. This one is getting pinned all over the place. I love sharing content that really helps people.