Brian, I recently found your blog by following OKDork.com. I just want to say you're really amazing with the content you put out here. It's so helpful, especially for someone like me who is just starting out. I'm currently writing posts for a blog I plan to launch later this year. I think my niche is a little too broad and I have to figure out how to narrow it down. I essentially want to write about my current journey of overcoming my fears to start accomplishing the dreams I have for blogging, business, and travel. Along the way, I will share the best tips, tools, and tactics I can find, as well as what worked, what didn't, and why.
This is an awesome post, one of my favorites, and I'm commenting here for the first time. I'm Abhishek, founder of CouponMaal, and I want to know more about one of the points above: relaunching your old posts. When we relaunch an old post, is there any difference between updating the date, time, and year versus keeping the previous date, time, and year? Does it matter or not?
You have also mentioned Quuu for article sharing and driving traffic. I have been using Quuu for quite some time now and I don't think they're worth it. While the content does get shared a lot, there are hardly any clicks to the site. Even for the clicks that do come through, the average time on site is something like 0.02 seconds, compared to more than 2 minutes for other traffic sources on my website. I have heard a few other people describe a similar experience with Quuu, so I thought I should let you know.
If you are using Responsive Web Design, use the meta name="viewport" tag to tell the browser how to scale the content. If you use Dynamic Serving, use the Vary HTTP header to signal that your response changes depending on the user agent. If you are using separate URLs, signal the relationship between the two URLs by tagging them with rel="canonical" and rel="alternate" link elements.
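As a rough sketch of the markup involved (example.com and m.example.com are placeholder hostnames, not from the original text):

```html
<!-- Responsive Web Design: one URL, one HTML payload; the viewport meta
     tag tells the browser how to scale the page on small screens -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<!-- Separate URLs: on the desktop page, point to the mobile version... -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">
<!-- ...and on the mobile page, point back to the desktop version -->
<link rel="canonical" href="https://www.example.com/page">
```

For Dynamic Serving there is nothing to add to the HTML; instead the server sends an HTTP response header such as `Vary: User-Agent` so caches and crawlers know the response body differs by device.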
Search engines use complex mathematical algorithms to interpret which websites a user seeks. In this diagram, where each bubble represents a website and the arrows represent links, programs sometimes called spiders examine which sites link to which other sites. Websites receiving more inbound links, or links from stronger sites, are presumed to be more important and more likely to be what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. And the links "carry through": website C, even though it has only one inbound link, has an inbound link from a highly popular site (B), while site E does not.
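The "inbound links carry weight, and that weight carries through" idea can be sketched with a toy PageRank-style iteration. The graph below is hypothetical, loosely mirroring the B/C/E example (several sites link to B, and B links to C):

```python
# Toy illustration of link-based ranking (a simplified PageRank).
# This hypothetical graph loosely mirrors the example in the text:
# A, C, and D link to B; B links to C; E receives and sends no links.
links = {
    "A": ["B"],
    "B": ["C"],
    "C": ["B"],
    "D": ["B"],
    "E": [],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:                    # dangling page: spread evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:                               # share rank among link targets
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new[target] += share
        rank = new
    return rank

ranks = pagerank(links)
# B, with many inbound links, accumulates the most rank; C benefits
# from its single link coming from the strong page B.
```

The "carry through" effect is visible in the result: C outranks pages like E even though C has only one inbound link, because that link comes from the highest-ranked page.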
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
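A minimal robots.txt covering the two cases mentioned above might look like this (the /cart/ and /search/ paths are hypothetical examples, not from the original text):

```
# Served from the domain root, e.g. https://example.com/robots.txt
User-agent: *
Disallow: /cart/      # shopping-cart pages
Disallow: /search/    # internal search results
```

Compliant crawlers fetch this file before anything else on the site and skip the listed paths.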
Btw, I was always under the impression that Digg and Delicious were dying, but I was really mistaken. Your (and Jason's) thinking is foolproof though. If these guys are already curating content, there's no reason they wouldn't want to do more of just that! SEO has become a lot of chasing and pestering; it's good of you to remind us that there are people out there just waiting to share stuff, too. :)
AdWords and Facebook are further vehicles for reaching the appropriate audiences with more refined messaging. I think it's important to create personas for your current visitors and the type of visitors you want to attract. It can also be valuable to create personas of those you don't want to attract, to keep in the back of your mind as your content and advertising calendar is built following the delivery of your overall strategy.
Google Analytics is an invaluable source of data on just about every conceivable aspect of your site, from your most popular pages to visitor demographics. Keep a close eye on your Analytics data, and use this information to inform your promotional and content strategies. Pay attention to which posts and pages are proving the most popular. Inspect visitor data to see how, where, and when your site traffic arrives.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
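Python's standard library makes the advisory nature of this concrete: urllib.robotparser implements the check a well-behaved crawler performs, but nothing forces any client to call it. The /private/ path below is a made-up example:

```python
# robots.txt is an honor system: a well-behaved client consults it,
# but the server still serves any page on request.
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A compliant crawler checks before fetching each URL...
allowed_private = rp.can_fetch("*", "https://example.com/private/report.html")
allowed_public = rp.can_fetch("*", "https://example.com/public/page.html")
# allowed_private is False and allowed_public is True -- but a rogue
# client can simply skip this check and request the blocked URL anyway,
# and any curious user can read the "hidden" paths in robots.txt itself.
```

This is why the text above recommends real access controls, not robots.txt, for genuinely sensitive content.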
Think of this document as a living communication between you and your client or boss. It is a document you should refer to often. It keeps all parties on the same page and aligned. I recommend sharing it in a collaborative platform so updates are shared between all viewers without having to constantly send out new copies (nothing sucks the life out of efficiency faster than "versioning" issues).
Search engines may penalize sites they discover using black or grey hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices. Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's search engine results page.
“In conclusion, this research illuminates how content characteristics shape whether it becomes viral. When attempting to generate word of mouth, marketers often try targeting “influentials,” or opinion leaders (i.e., some small set of special people who, whether through having more social ties or being more persuasive, theoretically have more influence than others). Although this approach is pervasive, recent research has cast doubt on its value (Bakshy et al. 2011; Watts 2007) and suggests that it is far from cost effective. Rather than targeting “special” people, the current research suggests that it may be more beneficial to focus on crafting contagious content. By considering how psychological processes shape social transmission, it is possible to gain deeper insight into collective outcomes, such as what becomes viral.”
Use your keyword list to determine how many different pillar pages you should create. Ultimately, the number of topics for which you create pillar pages should coincide with how many different products, offerings, and locations your business has. This will make it much easier for your prospects and customers to find you in search engines no matter what keywords they use.
Awesome tips, Brian. I always enjoy your posts. My question is: how can I boost traffic significantly if my keyword has pretty low search volume (around 100 monthly searches, based on Keyword Planner)? I've been trying to expand my keyword list to include broader terms like "customer experience," but as you know, that is super competitive. Do you have any suggestions for me? Thanks in advance.
While with search advertising, you’re paying to show up in the top spot for relevant searches, with social media advertising you are paying to show up in relevant feeds. With both forms of advertising, you can specify the type of audience in front of which you’d like to appear, but with more psychographic data, social media offers superb targeting.
The days when internet browsing was done exclusively on desktop PCs are long gone. Today, more people than ever before are using mobile devices to access the web, and if you force your visitors to pinch and scroll their way around your site, you’re basically telling them to go elsewhere. Ensure that your website is accessible and comfortably viewable across a range of devices, including smaller smartphones.
I can feel the excitement in your writing, and thanks for all this free info. You know how to get loyal subscribers! I believe you are one of the best in the business: no upselling, just honesty. It's so refreshing. I can't keep up with you! I have only just finished the awesome piece of content you told me to write, and I'm just about to modify it and then finally start promoting. I will be looking at this also. THANK YOU. PS: I couldn't make your last course, but I will get on board for the next one.
Maybe this feels a bit too scattershot for you. BuzzSumo also allows you to find and observe influencers. What are they sharing? By clicking the "view links shared" button, you'll get a display of all the unique pages shared. Sometimes "influencers" share all types of varying content across many topics. But sometimes, they're pretty specific in the themes they share. Look for the latter in this competitive research stage.
Squidoo is a website full of 100% user-generated content that allows you to create what's called a "lens." A lens is a page about a specific topic that you choose to write about (usually something you're knowledgeable in). After creating your lens, other people can find it by searching for terms and keywords related to it. Let me just start off by saying Squidoo is an absolute powerhouse in the search engines. It's very easy to rank Squidoo lenses for competitive terms that would prove to be a challenge for websites with less authority. Creating a lens on Squidoo gives you 2 traffic opportunities:
Hey Brian, I landed on this blog while wandering around blog land. I must appreciate your effort to put up such informative content. As an Internet Marketing Consultant, I would like to add a few thoughts of my own to your valuable content. There are many people who want a HUGE amount of traffic in no time at all. But in my experience, SEO has become a SLOW-BUT-STEADY process in recent times. After so many Google algorithm updates, I think if we do anything wrong with our websites, we will pay for it. So without taking any risks, we need to work ethically so that the website slowly gains authority and attracts the targeted traffic. What do you think, mate? I am eagerly looking forward to your reply and would love to see more valuable write-ups from your side. Why don't you write about some important points of Google's Hummingbird update? It would be a good read. Right, brother? 🙂