He started by finding an offer that resonated with, and was relevant to, his audience. In his case, his blog was dedicated to teaching people how to use a piece of software called “Sublime Text,” so he simply offered a license to the software as the giveaway prize. By doing this, he not only increased the giveaway’s chances of success, since his incentive was relevant, but also ensured the quality of the subscribers, since they were people genuinely interested in his content. It’s easy to give people an iPad or an iPhone, but how relevant will those subscribers be to you at the end of the day?
Just a suggestion, but maybe you could write an article about generating traffic to a brand new blog. As you know, when you start out you have only a couple of posts and very little credibility with other bloggers, and the search engines take considerable time to send any meaningful traffic. It would be interesting to know how Brian Dean approaches that dilemma!
Thanks Brian. I’ve had an “a-ha” moment thanks to you! Great advice. I knew that backlinks would improve organic rankings for our client-targeted landing pages, but I never knew the way to earn them was to get influencers to link to your blog posts. I always just assumed it was great content that users wanted to share with others. It was driving me mad that people love my content but never share it enough. Now I know!
Instead, in this instance, we started at the wireframe stage, plopping in keywords and meta tags. Of course, the site really needed those things, and although it launched technically “optimized,” that wasn’t enough to make it a better product than our top competitors: a product that people want to visit, revisit, email to friends, share on social networks, and link to more than the alternatives. It wasn’t even enough to move up in the rankings.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
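To make this concrete: a robots.txt file is nothing more than a plain-text list of requests. A hypothetical one (the directory names here are made up) might look like this:

    User-agent: *
    Disallow: /private/
    Disallow: /internal-reports/

A well-behaved crawler reads this and skips those paths, but anyone can fetch yoursite.com/robots.txt and see exactly which directories you would rather keep quiet. If material is truly confidential, protect it with server-side authentication instead of (or in addition to) a robots.txt rule.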
You also mentioned Quuu for article sharing and driving traffic. I have been using Quuu for quite some time now and I don’t think they’re worth it. While the content does get shared a lot, there are hardly any clicks to the site. Even for the clicks that do come through, the average time on site is something like 0.02 seconds, compared to more than 2 minutes for other sources of traffic on my website. I have heard of a few other people having a similar experience with Quuu, so I thought I should let you know.
Whatever industry you’re in, chances are there are at least one or two major conventions and conferences that are relevant to your business. Attending these events is a good idea – speaking at them is even better. Even a halfway decent speaking engagement is an excellent way to establish yourself as a thought leader in your industry and gain significant exposure for your site.

It’s not enough to just share content through social channels – you need to actively participate in the community, too. Got a Twitter account? Then join in group discussions with relevant hashtags. Is your audience leaving comments on your Facebook posts? Answer questions and engage with your readers. Nothing turns people off quicker than using social media as a broadcast channel – use social media as it was intended and actually interact with your fans.
Like the hundreds of people already commenting, I thought this was an amazing post. You have a great way of breaking things down so the average reader can understand them and act on them. I think this is a great resource for our readers, so I included it in my monthly roundup of the best SEO, social media, and content marketing articles. https://www.northcutt.com/blog/2014/02/january-resource-round-up-the-best-of-seo-social-media-and-content-marketing/
For some reason I had to delete some pages. These pages use the .html suffix, so I blocked them in robots.txt with Disallow: /*.html, but it’s been almost a year and I’ve found that Googlebot still crawls these pages often. How can I quickly get Google to remove these pages completely? I have already removed the URLs in Google Webmaster Tools via Google Index -> Remove URLs, but Google still crawls them.
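For anyone in the same situation: the usual explanation is that the robots.txt block itself is the problem. Because Googlebot is forbidden from crawling the pages, it can never see that they are gone, and the Remove URLs tool only hides them temporarily. One common fix, sketched here under the assumption of an Apache server where every deleted page ended in .html (adjust the pattern if other .html pages still exist):

    # .htaccess: after removing the Disallow: /*.html rule from robots.txt,
    # tell crawlers the deleted pages are permanently gone
    RedirectMatch 410 \.html$

Once Google can crawl the URLs again and receives 404/410 responses (or sees a noindex meta tag), it will drop them from the index on its own.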
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
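As an illustration only (a toy, not how any real engine works at scale), the spider/indexer/scheduler split described above can be sketched in a few lines of Python:

    # Toy crawler: fetch a page, extract its links and words, and
    # queue the links for a later crawl.
    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkParser(HTMLParser):
        def __init__(self, base_url):
            super().__init__()
            self.base_url = base_url
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(urljoin(self.base_url, value))

    def crawl(seed_url, max_pages=5):
        schedule = deque([seed_url])  # the "scheduler"
        seen, index = set(), {}
        while schedule and len(seen) < max_pages:
            url = schedule.popleft()
            if url in seen:
                continue
            seen.add(url)
            html = urlopen(url).read().decode("utf-8", errors="replace")
            parser = LinkParser(url)   # the "spider" step: extract links
            parser.feed(html)
            index[url] = html.split()  # the "indexer" step: record words
            schedule.extend(parser.links)
        return index

A production crawler adds politeness delays, robots.txt checks, and a far smarter scheduler, but the fetch, extract, index, and schedule loop is the same.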
There are many times when you post a small quote or a phrase in your blog post that you believe people would love to tweet. ClickToTweet helps you do just that. Simply create a pre-made Tweet on ClickToTweet.com, generate a unique link, and put it on your website so that people can just click it to tweet it. Sounds simple. It is, and it is one of the most popular strategies for generating buzz on Twitter.
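If you’re curious what such a link looks like under the hood, it is essentially a pre-filled Twitter web intent URL. A hand-rolled equivalent (the quote text here is just an example) would be:

    <a href="https://twitter.com/intent/tweet?text=Content%20is%20king%2C%20but%20promotion%20is%20queen.">
      Tweet this quote
    </a>

ClickToTweet wraps this in a short link and adds click tracking so you can see how often the quote actually gets tweeted.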
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
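For a hypothetical site, that hierarchy might translate into URLs like:

    example.com/                        (root page)
    example.com/guides/                 (related topic listing)
    example.com/guides/link-building/   (specific topic)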
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
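As a quick sketch, assuming an Apache server (the file name is hypothetical), pointing the server at a custom error page is a one-line configuration:

    # .htaccess: serve /404.html for any request that isn't found
    ErrorDocument 404 /404.html

Because the path is local, Apache still returns the real 404 status code along with the friendly page, which keeps search engines from indexing the error page itself.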
Having an industry influencer publish a blog post on your site, or turning an interview with them into a blog post, can help to drive traffic both through organic search and via that influencer promoting the content to their audience (see the backlinks section above). This can also add more variety to your content and show your visitors that you are active in your field.

Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[18][19] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and track the web pages' index status.
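The sitemap both programs accept is a small XML file; a minimal one for a single page (the domain and date are placeholders) looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2024-01-15</lastmod>
      </url>
    </urlset>

You then submit the sitemap's URL in Google Search Console or Bing Webmaster Tools, or point crawlers at it with a Sitemap: line in robots.txt.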
To gain more customer engagement, your website must reach its visitors efficiently. Obviously, you want visitors to read your site content, fill out your forms, and click through on your calls to action (CTAs) when they arrive on your web page. These features put user engagement in motion, but it is just as essential to follow up with in-depth analysis of how visitors actually use them.
I have been trying to produce more content because I believed the lack of traffic was due to the small amount of content, but after reading your blog post I’m beginning to doubt whether or not it is quality content. I will definitely do more research on influencers in my niche; now I have to figure out how to get their attention with my kind of content.
Give customers easy ways to access the translated version of your website. If they can’t, they will bounce without engaging. You can add the “hreflang” attribute to the website’s code to ensure that the appropriately translated version of the website appears in the search engines. Both Google and Yandex recognize it.
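A minimal sketch of what that looks like in a page’s <head> (the URLs are placeholders for your own language versions):

    <link rel="alternate" hreflang="en" href="https://example.com/en/" />
    <link rel="alternate" hreflang="fr" href="https://example.com/fr/" />
    <link rel="alternate" hreflang="x-default" href="https://example.com/" />

The x-default entry tells search engines which version to show users whose language doesn’t match any listed alternative.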

Getting traffic is always important, but one should not worry too much; nothing happens overnight. I read this article and genuinely tried to form my own impression of the post, which naturally creates a link to my blog. But don’t try too hard with backlinks in mind, because you always get caught one way or another; Panda and Penguin are two such examples.

Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.
