Excellent post Brian. I think the point about writing content that appeals to influencers is spot on. Could you recommend some good, manual strategies for spotting influencers in boring B2B niches where influencers aren't really talking much online? Is it a good idea to rely on newspaper articles to get a feel for what a particular industry is talking about? Would love to hear your thoughts on that.
This information hits the mark. “If you want your content to go viral, write content that influencers in your niche will want to share.” I love the information about share triggers too. I’m wondering, though, if you could share your insights on how influencers manage to build such vast followings. At some point, they had to start without the support of other influencers. It would seem that they found a way to take their passion directly to a “ready” world. Excellent insights. Thanks for sharing.
See the screenshot below for some of the sections of specific recommendations you can add, which provide the meat of the document. Keep in mind this is a very flexible document – add recommendations that make sense (for example, you may not always have specific design considerations for a project). Remember, it will be different every time you do it.
The best way to avoid this is proactively asking the right questions. Ask about resource support. Ask about historic roadblocks. Ask to be introduced to other players who otherwise hide behind an email here and there. Ask about the company's temperature regarding a bigger SEO strategy vs. short, quick-hit campaigns. Don't be your own biggest obstacle — I've never heard of anyone getting angry about over-communication unless it paralyzes progress.
There are a number of ways to optimize your website for conversion—such as by including calls to action and lead capture forms in the right places, providing the information your visitors are seeking, and making navigation easy and intuitive. But the first step is to attract the right visitors to your site in the first place. Your goal when it comes to website traffic is to drive more qualified visitors to your site: those who are most likely to convert into leads and customers.
Another example of when the “nofollow” attribute can come in handy is widget links. If you are using a third party's widget to enrich the experience of your site and engage users, check whether it contains any links that you did not intend to place on your site along with the widget. Some widgets may add links to your site that are not your editorial choice and contain anchor text that you as a webmaster may not control. If removing such unwanted links from the widget is not possible, you can always disable them with the “nofollow” attribute. If you create a widget for functionality or content that you provide, make sure to include nofollow on the links in the default code snippet.
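As a rough illustration of the advice above, here is what a third-party widget embed might look like once its bundled link has been marked with nofollow (the widget, vendor domain, and markup are all hypothetical):

```html
<!-- Hypothetical third-party widget embed code. The vendor's
     self-promotional link carries rel="nofollow" so it does not
     pass ranking credit the site owner never intended to give. -->
<div class="example-weather-widget">
  <p>Today's forecast: sunny, 21°C</p>
  <a href="https://widget-vendor.example.com/" rel="nofollow">
    Powered by Example Weather
  </a>
</div>
```

If the widget's default snippet ships without the attribute and you cannot edit it, the passage's fallback applies: remove the widget or ask the vendor for a nofollow version.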
Provide full functionality on all devices. Mobile users expect the same functionality - such as commenting and check-out - and the same content on mobile as on every other device your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata - such as titles, descriptions, link elements, and other meta tags - on all versions of the pages.
There is no magic formula for content marketing success, despite what some would have you believe. For this reason, vary the length and format of your content to make it as appealing as possible to different kinds of readers. Intersperse shorter, news-based blog posts with long-form content as well as video, infographics and data-driven pieces for maximum impact.
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients. Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban. Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.
Search engine optimization (SEO) is often about making small modifications to parts of your website. When viewed individually, these changes might seem like incremental improvements, but when combined with other optimizations, they could have a noticeable impact on your site's user experience and performance in organic search results. You're likely already familiar with many of the topics in this guide, because they're essential ingredients for any web page, but you may not be making the most out of them.
Since heading tags typically make the text they contain larger than normal text on the page, they give users a visual cue that this text is important and can help them understand something about the type of content underneath the heading. Using multiple heading sizes in order creates a hierarchical structure for your content, making it easier for users to navigate through your document.
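The hierarchy described above might look like this in practice (the page topic and section names are invented for illustration):

```html
<!-- Hypothetical outline: heading levels used in descending order,
     so each smaller heading nests under the larger one before it. -->
<h1>Skydiving Guide</h1>
  <h2>Getting Started</h2>
    <h3>Choosing a Drop Zone</h3>
    <h3>Your First Tandem Jump</h3>
  <h2>Safety Equipment</h2>
```

Skipping levels (say, jumping from an h1 straight to an h4) breaks this visual and structural hierarchy, so it is generally avoided.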
Wow Brian, you have solved my problem. A few days back I was looking for ways to increase traffic on my tech blog, and I found this blog post while looking for possible tricks to increase traffic. I must say that a few of the tricks mentioned above really worked for me. For example, I updated a few old posts on my blog, tried the broken link building technique, and reposted my content on Medium.
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
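A minimal robots.txt along the lines described above might read as follows; the path names are hypothetical and would need to match the site's actual URL structure:

```
# Hypothetical robots.txt, served from the domain root.
# Blocks all crawlers from internal search results and cart pages,
# the two examples of pages typically kept out of crawling.
User-agent: *
Disallow: /search
Disallow: /cart
```

Note the caveat from the passage: robots.txt only requests that compliant crawlers not crawl these URLs; pages that must be kept out of the index itself are better handled with the robots noindex meta tag on the page.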
I consulted a few years ago, before Yahoo and CNET, and my clients were all small businesses, even friends' sites. No matter the size of the project, you can still try to get some insight into your target audiences and what they need or want. I mentioned in a previous comment that I once used Search to determine sentiment on a site vs. its competitors by searching for a feature the site and its competitors all had, along with "like", "love", "hate", "wish", etc. I also took note of who the people were who said those things and where they were talking (forums, twitter, etc). It's a hacked manual approach and although not nearly as quality as a good market research report, at least I have a little bit of insight before going out to make site recommendations based solely on tags & links. If you're recommending the site build things that people want (and fix or remove things that they don't), you're more likely to gain links and traffic naturally.
Imagine that you've created the definitive Web site on a subject -- we'll use skydiving as an example. Your site is so new that it's not even listed on any SERPs yet, so your first step is to submit your site to search engines like Google and Yahoo. The Web pages on your skydiving site include useful information, exciting photographs and helpful links guiding visitors to other resources. Even with the best information about skydiving on the Web, your site may not crack the top page of results on major search engines. When people search for the term "skydiving," they could end up going to inferior Web sites because yours isn't in the top results.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
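The spider-then-indexer pipeline described above can be sketched in miniature. This is an illustrative toy, not any real engine's implementation: the "downloaded page" is an inline string, and only the link-extraction step is shown, using Python's standard-library HTML parser.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags, the way a spider
    extracts links from a fetched page for later scheduling."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A toy stand-in for a page the spider has downloaded and stored.
page = ('<html><body>'
        '<a href="/about">About</a> '
        '<a href="https://example.com/">Home</a>'
        '</body></html>')

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # URLs that would be queued for a later crawl
```

A real indexer would also record the page's words, their positions, and per-word weights, as the passage notes; this sketch covers only the link-extraction half of that job.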
Lastly, it's important to remember that paralysis by over-thinking is a real issue some struggle with. There's no pill for it (yet). Predicting perfection is a fool's errand. Get as close as you can within a reasonable timeframe, and prepare for future iteration. If you're working through your plan and discover a soft spot at any time, simply pivot. Building your strategy takes many hours of upfront work, but it's not too hard to tweak as you go.