Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients. Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban. Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.
Great post! Your knowledge and innovative approach never fail to amaze me! This is certainly the first time I’ve heard someone suggest the Wikipedia dead-link technique. It’s great that you’re getting people to think outside of the box. Sites like Reddit are great for finding keywords and can also be used for link building, although this can be difficult to get right. Even if you don’t succeed at using it for link building, it’s still a really valuable platform for getting useful information. Thanks!
Hello Brian, I am planning to start my blog soon and I’m in the preparation phase (investigating, learning, etc…). I have read a lot of books and posts about SEO, and I can say that this is the best post so far. It’s not even a book, and you covered more than the books do. I would like to thank you for sharing your knowledge with me and the rest of the world; that’s one of the most appreciated things someone can do. Even if you do it for your own “good,” you shared it! As soon as I start my site I’ll write an article about you!!
Hi Brian, I am a young business owner who has had 4 different websites in the last 2 years, but none of them were as successful as I would have liked due to a lack of SEO. Now I am in the process of starting another business, and I felt it was time for me to learn about SEO myself. I must say the information you have provided is invaluable and extremely helpful!! I am learning on the go, and you are my biggest contributor. Thank you, Sir!
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches. Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines. By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.
SEO is also about making your search engine result relevant to the user's search query so that more people click the result when it is shown in search. In this process, snippets of text and metadata are optimized to ensure your snippet of information is appealing in the context of the search query, in order to obtain a high CTR (click-through rate) from search results.
Studies have shown that top placement in search engines generally provides a more favorable return on investment than traditional forms of advertising such as direct mail, radio commercials, and television. Search engine optimization is the primary method of earning top-10 search engine placement. Learn more about the search engine optimization process and discuss an SEO strategy for your site by contacting a search engine specialist today.
Stickers are essentially mini-posters, and advertisers have been using them for decades to get the word out without technically breaking the law. They hand them out to teams who then go out and plaster them over public buildings, bus stops and street signs. When the authorities complain, they say “oh, we only gave them to our customers. We have no control over where they put them.”
Backlinks. If content is king, then backlinks are queen. Remember, it's not about which site has the most links, but which has the most quality links pointing back to its website. Build backlinks by submitting monthly or bi-monthly press releases on any exciting company news, and by contacting popular blogs in your niche to see how you can work together to earn a backlink from their website. Create the best possible product site you can, so that people talking about the products you sell will link back. Try creating graphics or newsworthy content that will persuade bloggers and news websites to link to that content.
Make it as easy as possible for website visitors to connect with you by adding a live chat box to your homepage. Include a name and photo in the chat box so that users know they are talking to a real, live person and not just an automated robot. When there is nobody to monitor the live chat, be sure to mention that, by saying something along the lines of, “Nobody is here right now but feel free to leave a message and we will get back to you shortly!”
Having a different description meta tag for each page helps both users and Google, especially in searches where users may bring up multiple pages on your domain (for example, searches using the site: operator). If your site has thousands or even millions of pages, hand-crafting description meta tags probably isn't feasible. In this case, you could automatically generate description meta tags based on each page's content.
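A minimal sketch of auto-generating description meta tags from page content, assuming each page is available as plain text. The 160-character limit and the word-boundary trimming rule here are illustrative choices, not a Google requirement:

```python
import html

def make_meta_description(page_text: str, max_len: int = 160) -> str:
    """Build a description meta tag from the leading text of a page."""
    # Collapse runs of whitespace into single spaces.
    summary = " ".join(page_text.split())
    if len(summary) > max_len:
        # Cut at the last word boundary that fits, then add an ellipsis.
        summary = summary[:max_len].rsplit(" ", 1)[0] + "…"
    # Escape the text so it is safe inside an HTML attribute.
    return f'<meta name="description" content="{html.escape(summary)}">'

tag = make_meta_description(
    "Our garden center stocks over 200 varieties of perennials. "
    "Visit us for expert planting advice and seasonal discounts."
)
print(tag)
```

In practice the input would come from your CMS or page templates, and you would still hand-write descriptions for your most important pages.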
Content gaps – make an inventory of the site’s key content assets. Is the site lacking any foundational/cornerstone content pieces, content types, or relevant topic areas that haven’t been covered? What topics or content do your competitors have that you are missing? Can you beat your competitors’ information-rich content assets? Useful guides on Content Gap Analysis:
Like the hundreds of people already, I thought this was an amazing post. You have a great way of breaking things down into ways that the average reader will be able to understand and make actionable. I think this is a great resource for our readers, so I included it in my monthly roundup of the best SEO, social media, and content marketing articles. https://www.northcutt.com/blog/2014/02/january-resource-round-up-the-best-of-seo-social-media-and-content-marketing/
We expect advertisements to be visible. However, you should not let advertisements distract users or prevent them from consuming the site content. For example, advertisements, supplementary content, or interstitial pages (pages displayed before or after the content the user is expecting) should not make it difficult to use the website. Learn more about this topic.38
2. Targeted Keyword Discovery: Ideally you’ll want to do keyword research based on what the audience wants, not solely on what content the site already has (or plans to have without audience targeting), which may be limited. I can do keyword research on health conditions and drugs (content I have on my site) and determine what the general population is searching for and optimize my current content, or I can cast my net wide and look at what my target audience wants first, then do my keyword research. You may find there are needs that your site is not meeting. Knowing my senior audience is primarily interested in prescription drug plans and cheap blood pressure medication, I can first make sure I’m providing that content, and then further determine the top keywords in these areas (in the next article, Step 2), and use those terms in relevant, high-visibility areas on my site.
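The gap between what the audience searches for and what the site already covers can be sketched as a simple set comparison. The keyword lists below are made-up placeholders; in practice they would come from a keyword research tool and an inventory of your existing pages:

```python
# Hypothetical data: terms the target audience searches for vs. topics
# the site already covers. Real lists would come from keyword research
# tools and a crawl of the site's existing content.
audience_keywords = {
    "prescription drug plans",
    "cheap blood pressure medication",
    "medicare part d enrollment",
    "health conditions",
}
site_keywords = {
    "health conditions",
    "drug interactions",
}

# Needs the site is not yet meeting — candidates for new content.
content_gaps = audience_keywords - site_keywords
# Topics the site covers that the audience also wants — optimize these first.
overlap = audience_keywords & site_keywords

print(sorted(content_gaps))
print(sorted(overlap))
```

Starting from the audience's terms rather than the site's existing content is what surfaces the unmet needs the paragraph above describes.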
For some reason I had to delete some pages. These pages use the .html suffix, so I blocked them in robots.txt using Disallow: /*.html, but it’s been almost a year and I find that Google’s robot still often crawls these pages. How can I quickly get Google to completely remove these pages? I have also removed these URLs from Google Webmaster Tools via Google Index → Remove URLs, but Google still crawls these pages.
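One likely reason for this behavior, worth noting as a hedged aside: a robots.txt Disallow rule only blocks crawling; it does not remove pages that are already indexed, and the Remove URLs tool is temporary. A common approach is to let the pages be recrawled and signal removal directly, roughly along these lines:

```
# robots.txt as described in the question — this stops crawling but
# does NOT deindex pages that are already in the index:
User-agent: *
Disallow: /*.html

# Alternative sketch: drop the Disallow rule so Googlebot can recrawl,
# and have each deleted page either return HTTP 410 Gone, or serve a
# noindex directive in its HTML head:
<meta name="robots" content="noindex">
```

Once the pages have been recrawled and dropped from the index, the Disallow rule can be restored if desired.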