Display Network – This network consists of millions of sites that agree to show Google text, image, and video ads. These ads appear within a site’s content and rely on audience and demographic targeting rather than traditional keyword-based targeting. For example, a user may visit a blog about the history of coffee tables. Even though the user isn’t necessarily in buying mode, the content is relevant to coffee tables. The user may or may not click the ad, but is now aware of the brand.
“I had been impressed for a long time with the content that Brick Marketing was sharing in their informative blog posts and articles. I chatted with Nick Stamoulis a couple times and decided that he was the expert I wanted to work with. I have worked with Brick Marketing for about six months and they have helped us resolve several SEO-related issues pertaining to our website. Our account rep is always just an email away with answers to any questions I have and suggestions for how we can improve what we’re doing. Brick Marketing is ‘solid’ when it comes to support for SEO marketing advice. I definitely recommend them if you want to feel more secure about how your website is performing in searches and have the confidence that everything being done to improve your rank is white hat and legit.”
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters needed only to submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. The extracted links are then placed into a scheduler for crawling at a later date.
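The spider–indexer–scheduler loop described above can be sketched in a few lines of Python. This is a toy illustration, not how any real search engine is implemented: the `PAGES` dictionary stands in for actual HTTP downloads, and the "index" is a bare inverted index mapping each word to the set of URLs containing it. All names (`PAGES`, `crawl`, `PageParser`) are hypothetical.

```python
from collections import deque
from html.parser import HTMLParser

# Toy in-memory "web" standing in for real HTTP fetches (hypothetical content).
PAGES = {
    "http://example.com/": '<a href="http://example.com/about">About coffee '
                           'tables</a> the history of coffee tables',
    "http://example.com/about": "coffee tables are furniture",
}

class PageParser(HTMLParser):
    """Extracts outgoing links and visible words from one page."""
    def __init__(self):
        super().__init__()
        self.links, self.words = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.words.extend(data.split())

def crawl(seed):
    scheduler = deque([seed])   # URLs queued for crawling at a later date
    seen = set()
    index = {}                  # inverted index: word -> set of URLs
    while scheduler:
        url = scheduler.popleft()
        if url in seen or url not in PAGES:
            continue
        seen.add(url)
        html = PAGES[url]       # a real spider would download the page here
        parser = PageParser()
        parser.feed(html)
        # Indexer step: record which words appear on which page.
        for word in parser.words:
            index.setdefault(word.lower(), set()).add(url)
        # Scheduler step: queue the extracted links for a later crawl.
        for link in parser.links:
            if link not in seen:
                scheduler.append(link)
    return index

index = crawl("http://example.com/")
# "coffee" appears on both toy pages, so a query for it returns both URLs.
print(sorted(index["coffee"]))
```

A production crawler also layers on politeness delays, robots.txt handling, deduplication, and per-word position and weight data, but the core loop (download, index, schedule new links) is the same shape.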