Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them; it does not prevent your server from delivering those pages to any browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, with no title or snippet) if links to those URLs happen to exist somewhere on the Internet (such as in referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard may simply ignore your robots.txt. Finally, a curious user could examine the directories or subdirectories listed in your robots.txt file and guess the URL of the content that you don't want seen.
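To see why a robots.txt file can actually advertise what you are trying to hide, consider a minimal sketch (the paths here are hypothetical examples, not a recommendation):

```
# Illustrative robots.txt only — /private/ and /board-minutes/ are
# made-up paths showing how Disallow lines reveal directory names
User-agent: *
Disallow: /private/
Disallow: /board-minutes/
```

Anyone can fetch this file at `/robots.txt` and read the disallowed paths directly, which is exactly why access control on the server, not robots.txt, is the right tool for confidential content.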
“When we came to Brick Marketing initially, we had a small subset of challenges we didn’t have the bandwidth to tackle in house. Our idea was simply to send out the work and be done with it. A one-shot deal. What we found midway into the first project was that Nick Stamoulis and Brick Marketing had a depth of understanding and approach to solving our Search Engine Marketing problems that we had not considered; solutions that dramatically improved our search engine ranking position on terms and increased the overall size of our index listing (by more than 25% in the first two months). In short order we expanded our horizons and enlisted his talents to take on refining and improving ROI on our rather expensive Pay Per Click campaigns, as well as having him consult on microsite projects and blogs. Nick Stamoulis of Brick Marketing helped us understand what works and why, and helped us maintain our dominant position in the SERPs, despite the market’s constant resetting and ever-changing drama. I could not have gotten through this year without Brick Marketing’s assistance and advice. I couldn’t give a stronger recommendation; they are simply great!”
Master strategic marketing concepts and tools to address brand communication in a digital world. This Specialization explores several aspects of the new digital marketing environment, including topics such as digital marketing analytics, search engine optimization, social media marketing, and 3D printing. When you complete the Digital Marketing Specialization, you will have a richer understanding of the foundations of the new digital marketing landscape and will acquire a new set of stories, concepts, and tools to help you digitally create, distribute, promote, and price products and services. In 2016, this was one of the top 10 specializations in terms of enrollments. INC Magazine rated the first course, Marketing in a Digital World, as one of The 10 Hottest Online Classes for Professionals in 2015. In addition, this course was also ranked in the top five courses across multiple MOOC providers. Finally, the Digital Marketing Certificate was the most coveted certificate on Coursera in 2015. Get more updates on the specialization at http://digitalmarketingprofs.com/ This Specialization is part of the University of Illinois Master of Business Administration degree program, the iMBA. Learn more about admission into the program here.
Google AdWords helps your business get found by the target audience searching for specific terms related to your brand, products, and content. You can find out how to set up your Google AdWords account and set your budget here. First, let’s take a look at the benefits to help you decide whether AdWords will help you achieve your digital marketing objectives and enable you to reach your ideal audience.
To answer your question, David, take a look at Jim’s comment below. Yes, you can and SHOULD optimize PR by directing link equity at important pages and internally linking within a theme. PageRank is a core part of the Google ranking algo. We don’t get visibility into PageRank as a number or score, but you need to know about the concept in order to direct your internal, strategic linking and navigation.
Let’s assume that it is a logarithmic, base-10 scale, and that it takes 10 properly linked new pages to move a site’s important page up 1 toolbar point. It will then take 100 new pages to move it up another point, 1,000 new pages to move it up one more, 10,000 to the next, and so on. That’s why moving up at the lower end is much easier than at the higher end.
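The arithmetic above can be sketched in a few lines of Python. Note this is purely the hypothetical log-base-10 model assumed in the text, not Google's actual formula; the starting cost of 10 pages is the assumption stated above.

```python
# Hypothetical log-base-10 toolbar model from the text:
# each additional point costs 10x the pages of the previous one.
def pages_for_point(point, base_pages=10):
    """Pages of properly linked new content needed to gain the Nth
    additional toolbar point, under the assumed base-10 model."""
    return base_pages * 10 ** (point - 1)

for n in range(1, 5):
    print(f"point +{n}: {pages_for_point(n)} pages")
# points 1 through 4 cost 10, 100, 1000, and 10000 pages respectively
```

The exponential growth in cost per point is what makes the jump from, say, 2 to 3 trivial compared with the jump from 6 to 7.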
You can focus on your targets so you can write targeted ad copy and bid/budget appropriately. You can do this based on categories, URLs, page titles, or page content. For example, you could set a target for all URLs with “purple-shoes” in the string. That would allow you to know all searches and ads will be about purple shoes, so you could write ad copy and bid accordingly.
The Open Directory Project (ODP) is a Web directory maintained by a large staff of volunteers. Each volunteer oversees a category, and together volunteers list and categorize Web sites into a huge, comprehensive directory. Because a real person evaluates and categorizes each page within the directory, search engines like Google use the ODP as a database for search results. Getting a site listed on the ODP often means it will show up on Google.
Baseline ranking assessment. You need to understand where you are now in order to accurately assess your future rankings. Keep a simple Excel sheet to start the process. Check weekly to begin. As you get more comfortable, check every 30 to 45 days. You should see improvements in website traffic, a key indicator of progress for your keywords. Some optimizers will say that rankings are dead. Yes, traffic and conversions are more important, but we use rankings as an indicator.
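A plain CSV file works just as well as the Excel sheet suggested above for logging a ranking baseline. The sketch below is one minimal way to do it; the keywords, positions, and file name are hypothetical examples you would replace with your own data.

```python
import csv
from datetime import date

# Minimal ranking log: one dated row per keyword per check.
# Keywords and positions here are made-up illustrative values.
def log_rankings(path, rankings):
    """Append today's keyword positions to a CSV baseline file."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for keyword, position in rankings.items():
            writer.writerow([date.today().isoformat(), keyword, position])

log_rankings("rankings.csv", {"purple shoes": 14, "buy purple shoes": 22})
```

Running this weekly at first, then every 30 to 45 days, builds exactly the kind of dated baseline you can chart alongside traffic and conversions.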
If your company is business-to-business (B2B), your digital marketing efforts are likely to be centered around online lead generation, with the end goal being for someone to speak to a salesperson. For that reason, the role of your marketing strategy is to attract and convert the highest quality leads for your salespeople via your website and supporting digital channels.
Just like the world’s markets, information is affected by supply and demand. The best content is that which does the best job of supplying the largest demand. It might take the form of an XKCD comic that is supplying nerd jokes to a large group of technologists or it might be a Wikipedia article that explains to the world the definition of Web 2.0. It can be a video, an image, a sound, or text, but it must supply a demand in order to be considered good content.
For example, to implement PPC using Google AdWords, you'll bid against other companies in your industry to appear at the top of Google's search results for keywords associated with your business. Depending on the competitiveness of the keyword, this can be reasonably affordable or extremely expensive, which is why it's a good idea to focus on building your organic reach, too.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches. Web content providers also manipulated other attributes within the HTML source of a page in an attempt to rank well in search engines. By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.
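The keyword meta tag mentioned above is an ordinary HTML element; the sketch below shows its shape, with made-up content illustrating how easily it could misrepresent a page:

```html
<!-- Keyword meta tag as early engines read it. The content value is
     an invented example: nothing forced it to match the page itself,
     which is exactly the abuse vector described above. -->
<meta name="keywords" content="cheap flights, hotels, rental cars">
```

Because engines took this self-declared list at face value, a page about anything at all could claim these keywords, which is why modern engines largely ignore the tag for ranking.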