A navigational page is a simple page on your site that displays the structure of your website, and usually consists of a hierarchical listing of the pages on your site. Visitors may visit this page if they are having problems finding pages on your site. While search engines will also visit this page, getting good crawl coverage of the pages on your site, it's mainly aimed at human visitors.
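If you generate such a page programmatically, the hierarchical listing can be rendered from whatever page structure you already track. Below is a minimal sketch in Python, assuming a hypothetical nested structure of page titles and URLs (the names are illustrative, not tied to any particular CMS):

```python
# Render a simple hierarchical navigational page from a nested site structure.
# The structure below is a hypothetical example, not from any real CMS.
site = {
    "Home": ("/", {
        "Products": ("/products/", {
            "Widgets": ("/products/widgets/", {}),
            "Gadgets": ("/products/gadgets/", {}),
        }),
        "About": ("/about/", {}),
        "Contact": ("/contact/", {}),
    }),
}

def render(tree, indent=0):
    """Emit a nested HTML unordered list, one <li> per page."""
    pad = "  " * indent
    lines = [f"{pad}<ul>"]
    for title, (url, children) in tree.items():
        lines.append(f'{pad}  <li><a href="{url}">{title}</a>')
        if children:
            lines.extend(render(children, indent + 2))
        lines.append(f"{pad}  </li>")
    lines.append(f"{pad}</ul>")
    return lines

print("\n".join(render(site)))
```

Keeping this page generated from the same data that drives your navigation means it never drifts out of date as pages are added or removed.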
Black hat SEO involves techniques such as paying to post links to a website on link farms, stuffing the metadata with nonrelated keywords, and using text that is invisible to readers to attract search engines. These and many other black hat SEO tactics may boost traffic, but search engines frown on the use of such measures. Search engines may punish sites that employ these methods by reducing their page rank or delisting them from search results.
I completely agree that definition of a target audience is a great first step, but would ask whether adding competitors to the analysis (mentioned here as a later step) helps draw out who your target audience would be via comparisons, i.e. showing who you are and who you are not. I would be very interested to hear opinions on how this tactic can be used within the overall step in coordination with targeted keyword discovery.
Google Analytics is free to use, and the insights gleaned from it can help you to drive further traffic to your website. Use tracked links for your marketing campaigns and regularly check your website analytics. This will enable you to identify which strategies and types of content work, which ones need improvement, and which ones you should not waste your time on.
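In Google Analytics, "tracked links" usually means campaign URLs tagged with the standard UTM parameters (utm_source, utm_medium, utm_campaign). Here is a small sketch of building such a link with Python's standard library; the campaign values and domain are placeholders:

```python
from urllib.parse import urlencode, urlparse, urlunparse

def tag_link(url, source, medium, campaign):
    """Append standard UTM campaign parameters to a landing-page URL."""
    parts = urlparse(url)
    params = urlencode({
        "utm_source": source,      # where the traffic comes from, e.g. "newsletter"
        "utm_medium": medium,      # the channel, e.g. "email"
        "utm_campaign": campaign,  # the campaign name, e.g. "spring_sale"
    })
    query = f"{parts.query}&{params}" if parts.query else params
    return urlunparse(parts._replace(query=query))

# Example with a placeholder domain:
print(tag_link("https://www.example.com/landing", "newsletter", "email", "spring_sale"))
```

Consistent source/medium/campaign naming is what makes the resulting reports readable, so it is worth generating these links from one place rather than typing them by hand.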
Whatever industry you’re in, chances are there are at least one or two major conventions and conferences that are relevant to your business. Attending these events is a good idea – speaking at them is even better. Even a halfway decent speaking engagement is an excellent way to establish yourself as a thought leader in your industry and gain significant exposure for your site.
Current search engine optimization focuses on techniques such as making sure that each web page has appropriate title tags and that the content is not "thin" or low-quality. High-quality content is original, authoritative, factual, grammatically correct, and engaging to users. Poorly edited articles with spelling and grammatical errors will be demoted by search engines. For more information on thin content see More Guidance on Building High-quality Sites.
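As a quick way to audit the "appropriate title tags" point, you can scan your pages' HTML for a missing or empty <title>. A minimal sketch using only Python's standard-library HTML parser; the inline HTML at the bottom is just a placeholder for pages you would fetch from your own site:

```python
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    """Collect the text inside the first <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.title:
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def check_title(html):
    """Return the page title, or None if it is missing or empty."""
    parser = TitleExtractor()
    parser.feed(html)
    return parser.title.strip() or None

print(check_title("<html><head><title>Widgets | Example Co</title></head></html>"))
print(check_title("<html><head></head><body>No title here</body></html>"))  # prints None
```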

Yesterday I was redoing our process for ideas and Alltop was a part of it. Now, I have always known it was a bit spammy (some of my grey sites are featured), but now it seems way too bad. You have places like the New York Times next to random AdSense blog X. Guy Kawasaki really needs to start giving some sort of influence ranking, or at least culling the total crap ones.


I consulted a few years ago, before Yahoo and CNET, and my clients were all small businesses, even friends' sites. No matter the size of the project, you can still try to get some insight into your target audiences and what they need or want. I mentioned in a previous comment that I once used Search to determine sentiment on a site vs. its competitors by searching for a feature the site and its competitors all had, along with "like", "love", "hate", "wish", etc. I also took note of who the people were who said those things and where they were talking (forums, Twitter, etc.). It's a hacked manual approach, and although not nearly as good as a proper market research report, at least I have a little bit of insight before going out to make site recommendations based solely on tags & links. If you're recommending that the site build things people want (and fix or remove things they don't), you're more likely to gain links and traffic naturally.
Thanks for bringing up this point - I agree, Eric - competitive positioning can help you determine the value that you bring to the table that your competitors don't. I'm all for it. Nielsen does some reports that provide awareness, likelihood to recommend, sentiment, and other insights for your site/brand and your competitors. You can also pull some of that type of insight out of social listening platforms like NetBase, SM2, Radian6, Dow Jones, Nielsen, and so many others. I've even done some hacked competitive sentiment comparisons before using Search: searching for [brand or feature] + "like", "love", "hate", "wish", etc.

Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to its index. In fact, the vast majority of sites listed in its results aren't manually submitted for inclusion, but found and added automatically when Google crawls the web. Learn how Google discovers, crawls, and serves web pages.
Getting more website visitors does not happen overnight. It takes some effort but we’ve eliminated the hard part for you: knowing what to do in the first place. By using Google My Business and the other safe channels listed above, you can get the right visitors coming to your site and more importantly, more of those visitors converting into customers.
I definitely learned tons of new things from your post. This post is old, but I didn't get the chance to read all of it earlier. I'm totally amazed that these things actually exist in the SEO field. What I liked most is the dead links scenario on Wikipedia, the Flippa thing, the Reddit keyword research, and lastly, the Facebook ad keyword research. It's like Facebook is actually being trolled into providing us keywords while thinking it is promoting ads.
The goal of SEO is to earn a web page a high search engine ranking. The better a web page's search engine optimization, the higher a ranking it will achieve in search result listings. (Note that SEO is not the only factor that determines search engine page ranks.) This is especially critical because most people who use search engines only look at the first page or two of the search results, so for a page to get high traffic from a search engine, it has to be listed on those first two pages, and the closer it is to the number one listing, the better. And if your business sells products or services over the internet, then whatever your web page's rank is, you want your website to be listed before your competitors' websites.
By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
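For context, keyword density is simply the share of words on a page that match a target term, which is exactly why it was trivial for webmasters to inflate. A rough sketch of the calculation in Python:

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in the text that match the keyword (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

page = "Cheap widgets! Buy cheap widgets here. Widgets, widgets, widgets."
print(f"{keyword_density(page, 'widgets'):.0%}")  # a keyword-stuffed page scores very high
```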
The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[40] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links[41] in addition to their URL submission console.[42] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[43] however, this practice was discontinued in 2009.
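The XML Sitemap format referenced above is the standard sitemaps.org protocol: an <urlset> of <url> entries, each carrying a <loc>. Below is a minimal sketch that writes such a file with Python's standard library (the URLs are placeholders); the resulting file can then be submitted through Google Search Console:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls, path="sitemap.xml"):
    """Write a minimal sitemaps.org-style sitemap listing the given URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

# Placeholder URLs; in practice, list every page you want crawled.
build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/about/",
])
```

Optional fields such as <lastmod> can be added per URL, but a plain list of locations is already enough for search engines to discover pages that are not reachable by following links.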