To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
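As a minimal sketch (the paths and domain are hypothetical, not prescriptive), a robots.txt that keeps well-behaved crawlers out of a shopping cart and internal search results might look like this:

```text
# https://example.com/robots.txt (example paths)
User-agent: *
Disallow: /cart/
Disallow: /search
```

For page-level exclusion, the equivalent robots meta tag goes in the page's <head> element instead.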
Yep, and sometimes it’s just about being a little creative. I’ve started a little blog on SEO/WordPress just for fun, actually… no great content on it like here, though… but because the competition is so tough in these niches, I decided to take another approach. I created a few WordPress plugins that users can download for free from wordpress.org… and of course these link to my site, so this gets me visitors each day.
You mentioned: "many times clients have already done this work. Ask them for copies of their market research reports when you start a project. It will save you a ton of time and effort!" We do this with most of our clients. Like you said, we have found that around 75% of them have some kind of market research done, which saves a lot of time and helps in setting up the right SEO strategy.

If you haven’t used software like BuzzSumo to check out what your competitors are up to, you’re at a huge disadvantage. These services aggregate the social performance of specific sites and content to provide you with an at-a-glance view of what topics are resonating with readers and, most importantly, making the rounds on social media. Find out what people are reading (and talking about), and emulate that kind of content to bring traffic to your website.
Google is the most popular spider-driven search engine. Its database currently has about 4 billion pages indexed, and it is known for finding the most relevant information. When Google spiders the web, it finds sites by traveling through links. The more sites that link to you, the more important the engines believe your content to be. You should focus on getting many important sites to link to your site. You can do this in many ways: submit to online directories, exchange links with business partners and industry-related sites, or participate in link building.

Studies have shown that top placement in search engines generally provides a more favorable return on investment than traditional forms of advertising such as direct mail, radio commercials, and television. Search engine optimization is the primary method of earning top-10 search engine placement. Learn more about the search engine optimization process and discuss an SEO strategy for your site by contacting a search engine specialist today.
2. Targeted Keyword Discovery: Ideally you’ll want to do keyword research based on what the audience wants, not solely on what content the site already has (or plans to have sans audience targeting), which may be limited. I can do keyword research on health conditions and drugs (content I have on my site) and determine what the general population is searching for and optimize my current content, or I can cast my net wide and look at what my target audience wants first, then do my keyword research. You may find there are needs that your site is not meeting. Knowing my senior audience is primarily interested in prescription drug plans and cheap blood pressure medication, I can first make sure I’m providing that content, then determine the top keywords in these areas (in the next article, Step 2), and use those terms in relevant, high-visibility areas on my site.
So many businesses are focused on attracting new customers through content marketing that they forget about more traditional methods. Email marketing can be a powerful tool, and even a moderately successful email blast can result in a significant uptick in traffic. Just be careful not to bombard people with relentless emails about every single update in your business. Also, don’t overlook the power of word-of-mouth marketing, especially from people who are already enjoying your products or services. A friendly email reminder about a new service or product can help you boost your traffic, too.
Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
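The two sides of that point can be sketched in a few lines of Python using the standard library's robots.txt parser. The file contents and paths below are assumptions for illustration: a compliant crawler consults the rules before fetching, yet the same file openly lists the very paths the webmaster hoped to hide.

```python
from urllib import robotparser

# Hypothetical robots.txt for example.com (contents assumed for illustration).
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /private-reports/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A well-behaved crawler checks permission before fetching a URL...
print(rp.can_fetch("*", "https://example.com/admin/login"))  # False
print(rp.can_fetch("*", "https://example.com/about"))        # True

# ...but robots.txt itself is publicly readable, so a curious user
# can simply enumerate the "hidden" paths it names.
hidden = [line.split(":", 1)[1].strip()
          for line in ROBOTS_TXT.splitlines()
          if line.lower().startswith("disallow:")]
print(hidden)  # ['/admin/', '/private-reports/']
```

Nothing here stops a browser, a rogue crawler, or a curious reader from requesting the disallowed URLs directly; access control has to happen on the server.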
Instead, in this instance, we started at wireframe stage, plopping in keywords and meta tags. Of course, the site really needed those things, and although it launched technically “optimized”, it wasn’t enough to provide a better product than our top competitor(s). A product that people want to visit, revisit, email to friends, share on social networks, and link to more than our competitors. It wasn’t even enough to move up in the rankings.
Just ridiculously good as usual, Brian; you continue to set the bar higher each time I see a new post from you. Well done. A quick point regarding point 16 about Google only counting the first anchor to a page: what is your opinion about links that go to DIFFERENT pages on the same site? I believe they pass equal weighting, but it would be good to get your opinion.
Brian, I recently found your blog by following OKDork.com. Just want to say you’re really amazing with the content you put out here. It’s so helpful, especially for someone like me who is just starting out. I’m currently writing posts for a blog I plan to launch later this year. I think my niche is a little too broad and I have to figure out how to narrow it down. I essentially want to write about my current journey of overcoming my fears to start accomplishing the dreams I have for blogging, business, and travel. In doing so, I will share the best tips, tools, and tactics I can find, as well as what worked, what didn’t, and why.
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
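A hedged illustration of that pitfall (directory names are assumed, not from any real site): a blanket Disallow over an assets directory hides the CSS and JavaScript Googlebot needs to render the page, while Google's Allow extension lets you carve the rendering resources back out.

```text
# Problematic: blocks everything under /assets/, including CSS and JS,
# so Googlebot cannot render the page or judge mobile-friendliness.
#   User-agent: *
#   Disallow: /assets/

# Better: keep page resources crawlable, block only what must be blocked.
User-agent: *
Allow: /assets/css/
Allow: /assets/js/
Disallow: /assets/private/
```

Note that Allow is a widely supported extension to the original Robots Exclusion Standard (Google honors it), not part of the original specification.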
Very good tips on traffic generation. However, for those with time constraints, building backlinks is a quick method, along with another easy one: blog commenting. I would second most of the people who commented in support of guest posting. Yahoo Answers may get mixed responses depending on the topic. If you have time, you can also post on related forums and try video marketing.

Hi Brian! Very good and exactly what I was looking for. I have a problem though, we are creating the first video editing software that edits video WHILE FILMING. We are video geeks with a lot of experience, however we are trying to appeal to GoPro users and video tutorial makers but we have little knowledge in that field. Any suggestions on how we write about that if we have no idea about the space?
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites have copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system which punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search,' where the system pays more attention to each word in the query in order to match pages to the meaning of the query rather than to a few individual words.[39] With regard to the changes this makes to search engine optimization for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to produce high-quality results and rely on 'trusted' authors.
While with search advertising, you’re paying to show up in the top spot for relevant searches, with social media advertising you are paying to show up in relevant feeds. With both forms of advertising, you can specify the type of audience in front of which you’d like to appear, but with more psychographic data, social media offers superb targeting.

Content gaps – make an inventory of the site’s key content assets: is the site lacking any foundational/cornerstone content pieces, content types that don’t yet exist, or relevant topic areas that haven’t been covered? What topics or content do your competitors cover that you don’t? Can you beat your competitors’ information-rich content assets? Useful guides on content gap analysis:


Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or that involve deception. One black hat technique uses hidden text, either as text colored similarly to the background, placed in an invisible div, or positioned off-screen. Another method serves a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This sits between the black hat and white hat approaches: the methods employed avoid the site being penalized but do not produce the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.