If you read our previous blog post, you will know that we were hit by the latest Google Penguin update. We suspect it was because we had used blogging networks in the past, although on further analysis it could also relate to our early link building strategies, where we focused on backlinking with the same keywords over and over again. Or it could even be the number of keywords we have on those pages.
Honestly, it’s just too difficult to say what the exact reason might be – we’ve tried so many things over the years (before we knew better) and I guess it was going to catch up with us in the end. That’s not to say we did anything black hat, but these days even what was once accepted as white hat is becoming unethical in the eyes of Google.
You’ll see a lot of posts on the net about Penguin and how to get through it, but when it comes down to it nobody really has all the answers because nobody knows exactly what Google does. I’ve heard so many conflicting theories – some say it’s all about the backlinks, while others say it has nothing to do with backlinks and it’s all about the on-page SEO. If they had all the answers, they wouldn’t be online writing blog posts about it – they’d be off on their own private island somewhere, logging in only to check their bank balance.
We of course don’t have all the answers either. That doesn’t sound very comforting, does it? But it’s all just part of the wonderful world of internet marketing. However, we can at the very least get our information straight from the horse’s mouth – Google themselves – instead of relying on theories that may or may not be correct.
Now this is easier said than done, of course, because Google isn’t particularly forthcoming about what they do. They tend to be very vague about things, and we can only get snippets of information from them, if anything at all. But sometimes those little snippets are enough. So below you will only find information based on what Google themselves have actually said.
But First…Why This Update Didn’t Really Work
I’m all for Google cleaning up their search results and only getting the high quality sites ranking. I really want to see this happen, but at this point, with all the Panda and Penguin updates, nothing has really changed. Sure, some sites have moved up and some down, but we still see scraper sites ranked above legitimate sites, thin-content sites ranked above quality sites, and sites full of spammy ads and not much else beating out well-developed content sites.
I really don’t think this update did the trick. And as always, there are plenty of people who were innocently hit. Google’s goal with this update was to target over-optimization, both on page and off. Too many spammy links to your site and you were penalized; too many keywords on the page and you were penalized. Unfortunately, however, that is all they looked at. They still didn’t look at the content. Just because a page has a lot of keywords and a lot of spammy links pointing to it doesn’t mean the content is bad. It might be fantastic content, but Google doesn’t know that – their search engine still isn’t sophisticated enough to compare pages for quality and usefulness. So penalizing a page simply for spammy links just doesn’t work, in my opinion.
But there’s no point whingeing about it. You could spend hours on forums and blogs commenting on how bad this all is depending on your situation. The only way to get around it is either to comply with Google or to move on to some other form of traffic generation.
Where to Start
In order to clean up our sites to get them ranking again we need to start from the very beginning with Google’s Webmaster Guidelines. The reason we need to start here is that Google explicitly stated that the Penguin change “decrease(s) rankings for sites that we believe are violating Google’s existing ‘quality guidelines’”.
So when they refer to those ‘quality guidelines’ they are talking about their Google Webmaster Guidelines. According to Google, these guidelines will help them “find, index and rank your site”. There are three sections to these guidelines and I’ll summarize the main points for each:
1. Design and Content Guidelines
- ensure that each page on your site can be reached from at least one other page
- use a site map to link to important pages
- keep the links on a page to a ‘reasonable number’ – they don’t say what that is
- create a useful, information-rich site
- use words on your pages that people would use to find your site
- use more text links than images for important links or content
- ensure you use descriptive and accurate title tags and ALT attributes
- check for broken links and correct HTML
- for those using dynamic pages (i.e. the URL contains a ? character), be aware that not all search engines can crawl these pages.
2. Technical Guidelines
- Google recommend using the Lynx browser to view your site since it will display it the way most search engines see it. If some content is missing then search engines may have trouble viewing it. (I had a quick look at the Lynx site and it looks complicated so I haven’t really tried it yet. I did instead find a Lynx viewer where all you need to do is type in your URL and it does the same job. There’s also a rough do-it-yourself sketch just after this list.)
- Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. (I have no idea what this means but some of you might.)
- Make use of the robots.txt file on your web server. (Most people’s robots.txt file would be fine but Google wants to ensure that you don’t have their search engine blocked – there’s a quick way to check this in the sketch after this list.) You can read more about it here – Block or remove pages using robots.txt file and here Robots.txt FAQs.
- Ensure your web server supports the If-Modified-Since HTTP header as this allows Google to tell whether your content has changed since they last crawled the site. (I simply typed ‘If-Modified-Since HTTP header Hostgator’ into Google to find out if our hosting company supports this. They do! You can do the same search with your hosting company, contact them directly and ask, or test it yourself with the sketch after this list.)
- Ensure that advertisements do not affect search engine rankings. (I’m not quite sure what Google are getting at here but I think it has something to do with paid links. In other words, if you are selling paid ads on your site then add the rel="nofollow" attribute to those links.)
- If you use a content management system ensure that pages can be read by Google.
- Monitor and optimize site performance and load times. (You can read more about this here: Performance Best Practices.)
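Since I mentioned the Lynx browser above, here is a very rough Python sketch of the same idea: it fetches a page and strips out the markup so you can see roughly what text a crawler is left with. It is only an approximation of what Google actually sees, and the URL is just a placeholder – swap in one of your own pages.

from html.parser import HTMLParser
from urllib.request import urlopen

class TextExtractor(HTMLParser):
    # Collects the visible text of a page, skipping <script> and <style> blocks.
    def __init__(self):
        super().__init__()
        self.skip = False
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False

    def handle_data(self, data):
        if not self.skip and data.strip():
            self.chunks.append(data.strip())

# "https://www.example.com/" is a placeholder - use one of your own pages.
html = urlopen("https://www.example.com/").read().decode("utf-8", errors="ignore")
parser = TextExtractor()
parser.feed(html)
print("\n".join(parser.chunks))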
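And for the robots.txt point, here is a quick sketch using Python’s built-in robot parser to confirm that Googlebot isn’t blocked from the pages you care about. The domain and paths are placeholders.

from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()  # fetches and parses the robots.txt file

# Paths you want Google to be able to crawl (placeholders).
for path in ("/", "/reviews/", "/blog/"):
    allowed = rp.can_fetch("Googlebot", "https://www.example.com" + path)
    print(f"Googlebot allowed to crawl {path}: {allowed}")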
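Finally, rather than searching for your host by name, you can test If-Modified-Since support directly. This sketch grabs the Last-Modified date the server reports for a page, then asks for the page again with that date in an If-Modified-Since header; a server that supports the header should answer ‘304 Not Modified’ instead of resending the whole page. Again, the URL is a placeholder.

from urllib.request import Request, urlopen
from urllib.error import HTTPError

URL = "https://www.example.com/"   # placeholder - use one of your own pages

# First request: grab the Last-Modified date the server reports (if any).
first = urlopen(URL)
last_modified = first.headers.get("Last-Modified")

if last_modified is None:
    print("The server did not send a Last-Modified header for this page.")
else:
    # Second request: only send the page if it changed since that date.
    req = Request(URL, headers={"If-Modified-Since": last_modified})
    try:
        second = urlopen(req)
        print("Full page sent again (status", second.getcode(), ") - header may be ignored")
    except HTTPError as err:
        if err.code == 304:
            print("304 Not Modified - the If-Modified-Since header is supported")
        else:
            print("Unexpected response:", err.code)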
3. Quality Guidelines
- Make pages primarily for users, not for search engines.
- Don’t deceive your users by cloaking, i.e. showing search engines different content than you show your visitors.
- Avoid tricks intended to improve search engine rankings.
- Don’t participate in link schemes designed to increase your search engine rankings.
- Avoid linking to web spammers or ‘bad neighborhoods’ as your own ranking may be adversely affected. (Interesting that even linking to a poor quality site could affect your ranking).
- Don’t use unauthorized computer programs to submit pages, check rankings etc. (Not sure how you know what is authorized or not authorized. Google doesn’t elaborate.)
- Avoid hidden text or links.
- Don’t use irrelevant keywords on your pages.
- Don’t create multiple pages, domains or subdomains with duplicate content.
- If you have an affiliate site, make sure your site adds value and provides unique and relevant content.
You can read the full version of the Google Webmaster Guidelines here.
As you can see, Google provides us with a basic overview of what they are after in their Webmaster Guidelines. It’s worth going through each of them to see whether you comply.
In this respect, we are on our own because unfortunately Google doesn’t elaborate too much on anything. We might think we are complying with their guidelines, but how do we really know? They say, for instance, to ‘keep your links on a page to a reasonable number’. What is ‘reasonable’? Who knows? You have to really dig around to find the answer, which I managed to do. It was in a post by Matt Cutts (Google engineer) written in 2009. He mentioned that a page should preferably hold fewer than 100 links. Of course, the post is old, so that figure could well be out of date by now – there could be a whole new number we’re supposed to stay under.
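If you want a rough idea of how close a page is to that old 100-link figure, here is a small Python sketch that counts the links on a page. The URL is just a placeholder.

from html.parser import HTMLParser
from urllib.request import urlopen

class LinkCounter(HTMLParser):
    # Counts <a href="..."> tags on a page.
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

# "https://www.example.com/" is a placeholder - use one of your own pages.
html = urlopen("https://www.example.com/").read().decode("utf-8", errors="ignore")
counter = LinkCounter()
counter.feed(html)
print(f"Found {counter.count} links on the page")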
The Next Step
Once you have gone through the Webmaster Guidelines and ensured that you comply with each of them…as best as you can, the next step is to figure out what this Penguin update was all about. By doing this we can get a better insight into exactly what we need to do to clean up our sites. As we mentioned in the opening paragraphs, everyone has their own theory about what happened with this update, but we want to hear what Google has to say. Here is what we found:
1. The Official Word
On April 24, 2012 Google published a blog post indicating that an update was imminent and this one was going to “reduce webspam and promote high quality content”. In the post they provided a couple of examples of keyword stuffing, spun content and outgoing links on a page that lead to unrelated content.
Apart from that, they didn’t impart too much information so at this point we were left in the dark. Considering we don’t keyword stuff, use spun content or link out to unrelated pages, this blog post didn’t help in the slightest.
Source: Another step to reward high quality sites
2. Matt Cutts Interview with Search Engine Land
On May 10, 2012 Danny Sullivan from Search Engine Land interviewed Matt Cutts (Google Engineer). In the interview, Matt Cutts said that the Penguin update was designed to be quite precise and act against pages when there was an extremely high confidence of spam being involved. Matt Cutts said, “The bottom line is, try to resolve what you can” and you will know if you have done the right thing the next time Penguin updates.
Again, we don’t really have much to go on here. Just a few snippets of information…just clean up the spam and you may be back ranking again when Penguin next updates…hmm, easier said than done when you don’t really know what the problem is to begin with.
Plus, Matt Cutts’ flippant statement “that you may need to start all over with a fresh site” if you can’t recover from the Penguin update just shows how far removed he is from the rest of us. He obviously doesn’t know the amount of work that goes into it all. And what about small business owners who have created a branded website? You can’t tell me that they should start up a whole new website.
Source: Two Weeks In, Google Talks Penguin Update, Ways to Recover and Negative SEO
3. Google Updates Penguin Again
On May 26, 2012 Matt Cutts announces on his Twitter account that Google has pushed through a second Penguin data refresh. So if you weren’t hit the first time then you could have been hit the second time. Alternatively, if you made positive changes to your site since the first update, you might have seen an increase in traffic.
Does that mean we will see monthly Penguin updates? Hopefully, because it will mean that anyone affected by the Penguin update will be able to make changes to their site and not have to hang out for months waiting for the next update.
4. Matt Cutts Interviewed at SMX (Search Marketing Expo)
On June 5, 2012 Matt Cutts speaks at the SMX conference in Seattle. He informs the audience that Google’s definition of a Google penalty is something that is done manually. In other words, someone manually looks at the site and deems it to be bad. The Penguin update however was not a penalty but an algorithm change which is why you cannot submit your site for reconsideration.
Matt also spoke about negative SEO and said that they are considering whether to create a system where you can disavow a link to your site. This would be fantastic, but to me that says yes, negative SEO exists. Why would they bother creating a system otherwise? And if you’re wondering what negative SEO is, it is simply a way of killing a competitor by blasting their site with spammy backlinks.
One of the most interesting things to come out of this interview was when he was asked a question about wpmu.org, a reputable site that was hit badly by the Penguin update. The site’s owners went to the Sydney Morning Herald (an Australian newspaper), which in turn interviewed Matt Cutts about it.
In the SMX interview, Matt Cutts’ response was:
“They didn’t rank as high after Penguin, they made their case, and I thought it was a good case. We were able to remove about 500,000 instances of links, and that helped them.”
Now if you look at that response, Matt Cutts is effectively saying that it was their backlinks that caused the problem, and by removing 500,000 links their ranking improved. The site had created free WordPress themes and added a link back to their website in the footer. This is what got them hit, as a lot of spammy sites used the theme.
Now this is all very nice for wpmu.org, who were able to get to the press first and turn their site into a high profile case. But the rest of us are left with spammy links to our sites which in most cases are no fault of our own, and we are stuck trying to figure out how to get them removed. Sure, we can email each and every site, but do you think a spammy scraper website owner is going to give a toss about removing a link on their site? The whole reason they have a spammy site is that they couldn’t be bothered working on it to begin with. They just want to automate it all, sit back and never touch it again.
Another point Matt Cutts made that is of interest to us concerns affiliate links. He did say that they handle affiliate links okay, but if you are in any way worried about them, then add nofollow to them.
And one more thing that is interesting to note: Matt says that Google does not look at Google Analytics in its rankings.
Source: You & A with Matt Cutts – SMX Advanced 2012
and Matt Cutts on Penalties vs Algorithm Changes, A Disavow-This-Link Tool and More
Now What?
That’s about all I have been able to find so far on the Penguin update that comes straight from a Google representative. If you know of anything else, please let us know in the comments below.
From what I can tell from all of this, the Penguin update focused on two things:
1. Links
2. On page SEO
Of the two, I personally think that the links are the main factor. In other words, if you have spammy sites linking to you in quantity then you are likely to be affected. If you have used blogging networks, or if you have paid someone to get hundreds of thousands of links to your site overnight, then you would likely have felt the effects of this update.
I do also suspect that it could be the anchor text used in those links so try to avoid using the same anchor text over and over when backlinking.
Also if you link out from your site to totally unrelated sites then this could also affect you. So if you accept guest articles then ensure the links in those articles are to related sites. We often get sent articles for our sites and the article might be about dogs for instance which would be suitable for our dog site but the links in the article go to a credit card or insurance company. Don’t accept these articles. Keep them related to your site topic.
If you think that spammy backlinks are your problem then go to your Google Webmaster Tools and take the following steps to view your backlinks:
1. Click Traffic from the menu sidebar.
2. Click Links to Your Site from the drop-down that appears.
You can assess each link, and if you deem the linking site to be spam you can always email the owner to see if they will remove it.
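If you export that list of links (Webmaster Tools lets you download it), a quick way to spot the worst offenders is to group the linking URLs by domain and sort by volume. Here is a rough Python sketch; the file name links.csv and the assumption that the first column holds the linking URL are just guesses about the export, so adjust them to match whatever your download actually looks like.

import csv
from collections import Counter
from urllib.parse import urlparse

domains = Counter()
with open("links.csv", newline="") as f:          # assumed export file name
    for row in csv.reader(f):
        if not row or not row[0].startswith("http"):
            continue                               # skip headers and blank rows
        domains[urlparse(row[0]).netloc] += 1      # tally by linking domain

# The domains linking to you most often float to the top.
for domain, count in domains.most_common(20):
    print(f"{count:5d}  {domain}")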
As for the on page SEO, this has always been a problem for Google, but perhaps they are cracking down a little harder. In this case, I would simply check your reviews and articles, and if they sound unnatural to you when you read them aloud then you are probably using too many keywords on the page. We’ve said for a long time now to just write naturally…throw in a few keywords to help Google find your pages but don’t overdo it.
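If you’d rather put a rough number on it than just read the page aloud, you can count how often a phrase appears against the total word count. Here is a small Python sketch; the sample text and keyword are placeholders, and the percentage is only a rough guide, not anything Google has published.

import re

def keyword_density(text, phrase):
    # Share of the page's words taken up by the keyword phrase, as a percentage.
    words = re.findall(r"[a-z']+", text.lower())
    hits = text.lower().count(phrase.lower())
    return 100.0 * hits * len(phrase.split()) / max(len(words), 1)

page_text = "Dog training takes patience. Good dog training starts early."  # placeholder
print(f"Density: {keyword_density(page_text, 'dog training'):.1f}%")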
I think Google have a love-hate relationship with SEO. They need it because it helps them find relevant pages on the net for their search engine, but at the same time it causes them all sorts of grief as webmasters use every SEO tip and trick in the book to try to manipulate their rankings.
What We Are Doing About It
We personally want to stick with Google but at the same time we want to focus on other forms of traffic. This was our goal before we went overseas and is something we are still looking into. But for the moment we want to get our sites back and ranking well in Google. Fortunately we weren’t hit too hard but it was enough to give us a jolt and get us focused again.
We started by doing nothing, and you may think that a little odd, but we have learned from years of experience that when it comes to Google updates you don’t make any changes straight away if you have been hit. It’s always best to leave things alone for a while because oftentimes you will find yourself ranking again.
So after waiting a month or so we realized that our traffic wasn’t going to improve so we started to make some slight changes. Nothing major, just some changes to a few pages on our sites. Fortunately we can take our time with this and not make too many drastic changes at once.
We tried those blogging networks at one point but gave up on them pretty quickly because we realized they didn’t work very well. Plus we just didn’t feel comfortable about the sites the articles were going on. They just looked spammy to us and we wanted to be associated with quality wherever possible.
The other problem is spammy sites that link to us, and that is out of our control. We noticed that some sites link to us hundreds of times. I’m not sure why…we didn’t ask them to. One link would be enough, but they link to us from all sorts of pages in their ‘Further Information’ sections. This doesn’t help us if their site is just a scraper site of sorts.
However, we will try to contact the major offenders to see if they will remove the links and see how we go.
We are hoping that Google implements their ‘disavow’ option which will allow us to reject a site that is linking to us. But who knows when this will be – Matt Cutts says ‘maybe a month, or two, or three’. We will see.
Hi Wanda & Paula, I’m thankful for these great instructions. I’d like to add a point to these great points: Google wants us to build a brand name, i.e. develop an authority website in the SERPs, not to work on ranking for a specific keyword.
Good point Fred. We’ve never built websites (or even pages for that matter) that focus on just one keyword. I know a lot of people do that. They create mini-sites and build hundreds of them just based on keywords.
We have too many websites as it is and we know that doesn’t work. Focusing on one website and making it the best on whatever topic you choose is the best way to go about it.
Hello Paula,
Pretty long post. Thanks for all the info. It would have been too difficult for me to research all this on my own. I faced the Panda update problem for a site where I created a mini-site concentrating on pages rather than posts, and then categories of pages with keywords. I think Google doesn’t like this stuff anymore. I think rightfully so, because I wasn’t deserving of a ranking. I wasn’t adding any value doing things this way.
Yes, I think mini-sites are not the way to go anymore. Some people had success with them but Google wants to see quality and it’s often difficult to provide quality with a mini-site.
Another excellent post – thank you for sharing your clear, concise plans for getting past the Penguin updates. While we would all like to get our traffic elsewhere, we still have to do our best to play by the big G’s rules, and your approach seems to be the most logical. While the forums are full of “the sky is falling” kind of threads every time Google pushes through another update, it’s good to see that I am not the only one who is taking slow, determined steps to work through these updates and regain or improve rankings.
I know what you mean about those forums. During my research for this post I went through quite a number of different forums and blogs, and many people just want to get on there to whinge. I understand that, but at some point you just have to move on…whingeing about it forever isn’t going to help.
We have a ton of mini-sites, including some Amazon-based scrapers, and the traffic/rankings on most of those have dropped considerably since the end of April, so we have to assume we’ve been Penguinated.
Like you say, I believe authority sites are the way to go now (and probably always were).
Are you going to be moving to authority sites Mark? Or are you going to try and resurrect your mini-sites?
Don’t think it’s worth the effort with micro sites…authority all the way, though it costs!
It’s interesting to look at the SERPs right now: my old competitors and myself (in my best niche) are all not ranking…except a new site I built for taking screen shots for my Duct Tape SEO book.
(Yes, I know how self-serving that sounds. But it’s a fact all the same.)
Then I started looking at my competitors that are ranking now – as well as my other sites using my newer methods – and there are some patterns emerging.
One of the biggest coincidences is that a heavy reliance on article marketing seems to be a shared trait among all the affected sites.
One of my competitors has been ranking from thousands of articles – but not the spun kind, these were legitimate articles in “reputable” directories…now he’s nowhere to be found.
Since he and I ranked for the same terms, I know that’s a loss of thousands of dollars a month – and he had a much larger site than mine by a long shot: much more authority, everything. He wasn’t just an affiliate.
I think in reverse-engineering SEO practices (which it seems is what drove Penguin: finding our “SEO best practices”), Google not only de-indexed blog networks (BMR, lots of ALN, etc.), but I highly suspect they’ve also taken a hard stand against EzineArticles and the like.
Panda was reportedly going to do that – and when Panda came out in 2011 EZA lost a lot of traffic…but the backlinks in those articles still counted for ranking (far as I could see – I kept using them and it kept working).
Then Penguin rolls out – and in a comment made at SMX recently, Matt Cutts said that Penguin was supposed to take care of issues that Panda didn’t address.
I’ve more to say on the issue but that’s fodder for my next post. Good round-up here of what’s happened, though.
We used article marketing in the past as well, so that could be another thing that affected us. The problem is that article marketing was considered completely legitimate in the past and now it’s not.
We gave up on it sometime back because it didn’t really work too well for us but Google doesn’t see that. They just bundle everything up and assume we are spammers even though we haven’t done any of those things for a long time now. That’s why they really need to get the disavow tool going. We’d love to just be able to remove all of those poor quality sites that link to us.
Hello Wanda & Paula, thanks for the detailed information on cleaning up a site after the Google Penguin update. By the way, mine is a new blog and it wasn’t hit by Penguin.
Excellent, you are in a good position now since your site is new. As long as you follow along with Google’s guidelines you should do well.
Hi Wanda and Paula, thanks for the information. As you say, it is very difficult to really know what has affected different sites because it could be a number of different things, and the only ones who really know are Google, and they don’t want us to know in many ways.
I have three sites that have been hit and I think it was because I used a couple of link schemes to help build up some back links. Nothing too big and in fact I had stopped using them long before Penguin hit.
But if that was the reason my sites got hit, how do I go about putting that situation right again? I can’t undo what has already been done, and because I no longer use these schemes I have no way of knowing how to contact all the sites that might have a link to me and ask them to remove it.
It is a real problem, and I just can’t get my head round how to go about making it right again.
Any ideas?
You’re in exactly the same boat as us Nigel. I would suggest going into Google Webmaster Tools and adding your site if you haven’t already done so. You will be able to see the sites that are linking to your site.
You will have to go to each site, find their contact us page and email them to ask them to remove the link.
Google just doesn’t want to make it easy for us. We are just hanging out for that disavow tool…assuming they actually implement it of course.
Dear Paula and Wanda…
John from Boston here. Once again you’ve written a very informative article but there are some new people, and some not so new people like me, that just can’t seem to figure out what Google is doing, or worse yet, what they may do in the future.
There are some teachers who instruct their students by telling them not to do it this way… or… not to do it that way… when really what a student needs to hear is – Just Do This!
My point, at least for me…
1. Write ( or have written by someone else ) good quality unique content for the visitor… don’t worry about search engines.
2. Find and Contact good quality websites that would love some of the same from you ( or your stable of superior writers ) with a few links pointing back to some of that great content you have on your site.
Then just let the Google Panda and Penguin website slayers chop up the rest of the tricksters out there! What will be left is our sites at the top of the search results!
Also, please let me know when you have completed your research on the sites you’re building to bypass Google altogether for traffic. It would be valuable to know that we were not at the mercy of Google’s continuous “algorithmic mood” changes.
Thanks again…
Cheers!
JfB
I agree with you entirely John – writing unique content and finding quality sites to link to us is the key.
However, we have been doing that for ages now and we were still hit.
Also you might want to listen to the interview James from The Average Genius did with Tim Carter. It is an excellent interview. Tim Carter owns a huge site – all unique content and he has never done any backlinking – and yet he was hit by both Panda and Penguin.
http://theaveragegenius.net/google-reward-quality-original-content-interview-askthebuilder-tim-carter/
You are too good to me. :)
Thanks for mentioning the link and making the point that it’s not always enough. Frankly I’m surprised to learn you and Wanda were hit at all (even after my interview).
Would like to learn how your recovery process turns out.
Hello Paula…
I’m still in the dark about much of this.
All I’m trying to do is learn how to do things right and then implement those techniques daily.
I thought that Panda and Penguin were just going after the “shady” operators out there! I can understand that years ago, you were implementing methods that were GOOD in the eyes of Google but that over the years Google changed the rules and these older sites are now being penalized.
Will you be forced to rebuild any of your sites from scratch or will you be able to “Duct Tape” over the bullet holes?
Is it a good strategy to look at a site that has been hit hard by these changes… live with them… and just continue to publish new quality articles with good backlinks to rebuild the site pages rankings?
Thanks for reading!
JfB
I think we are all pretty much in the dark about this. I really wish I knew the answer to it all.
I do know that we are not going to start any sites from scratch….at least not at this point. We are fortunately able to take this slowly and not make too many panic changes. Maybe we are taking it too slowly…I don’t know, but we just want to make changes and then wait for the next update.
Good article. Lots of people write great articles, while some people indulge in unlawful cyber acts, and then the rules get tougher. So the person who is working hard suffers.
That’s right Ronak, Google only make the rules tougher because they have to. Of course there could be other reasons we don’t know about. Google are a big company now, and big companies are mostly about the big dollars, so they will try to increase them at every opportunity.
For a quick way to measure your website’s performance, I go to http://gtmetrix.com
You enter your URL (I usually plug in the home page), and GTmetrix analyzes your site. After a short analysis, GTmetrix will display a grade for your website (A through F) using two different measurements: Google’s Page Speed and YSlow.
Gtmetrix also has a great guide on properly implementing the W3 Total Cache plugin (for WordPress websites).
http://gtmetrix.com/wordpress-optimization-guide.html
I notice they are promoting their own plugin now, GTmetrix for WordPress. You do not need to install it (I haven’t) – they write that installing the GTmetrix plugin is optional.
That’s a pretty cool site Chris. Thanks for letting us know about it.
I agree with what you have said in this article… Google’s activities are really confusing. They keep coming out with something or other, but I don’t think the purpose is served at the end of the day. And people who are working hard are getting adversely affected.
Yes, it seems to be those that aren’t working so hard that are being rewarded. Too many scraper sites are still ranking. Google still don’t have it right yet.
“Make pages primarily for users, not for search engines.” These quality guidelines seem to get thrown to the side sometimes, or so it seems when visiting some of the sites across the web. (poor grammar, full of keywords, etc) Focusing on a site intended solely for the benefit of your readers and not the search engines is definitely the way to go.
I agree Cythia. I think ultimately our number 1 goal should be to provide quality content. Although after listening to this interview I am starting to wonder…
http://theaveragegenius.net/google-reward-quality-original-content-interview-askthebuilder-tim-carter/
Good post :).
I have some questions. Would you mind answering them please?
Paula & Wanda, I wanted to ask you: after all those Google updates, does the method revealed in your ebook still work? Do you still rank well and make money from Amazon?
For a newbie like me, what kind of advice would you give me so that I won’t get affected by any further updates?
Also, what do you think of varying anchor text, and to what extent, if someone wants to rank extremely well for select keywords? Do you still think guest blogging is one of the strongest backlinking methods?
Last, according to your experience, if someone builds backlinks for a month or two and then stops, would the page still rank well after months with no new backlinks built?
My goal is to sell at least one product a day for each of the two product reviews I’m working on. Do you think it’s doable? How many items do you usually sell per day for a specific product?
Thanks!
1. Yes, the method still works. The only reason our sites were penalized is (I think) because of the use of blogging networks and the overuse of anchor text in the past. We don’t recommend any of those things in the ebook.
2. My advice for a newbie is to simply focus on building a quality site. If you provide ‘quality’ in everything you do then you will succeed.
3. Varying anchor text is important. We do mention this in the ebook. And yes, I think guest blogging is more important than ever. It still is the most powerful backlinking method in our opinion.
4. It’s always a good idea to continue backlinking even if it is only one or two new links a month. Once you get to the point where your sites are ranking well for all sorts of keywords then you can slow down. If you have quality content then people will naturally link to you so you won’t need to backlink as often.
5. Yes, it’s definitely doable to sell 1 product a day. For our sites we can sell as many as 30 products a day for just one review. That’s the extreme of course and happens only around Christmas. But we can sell up to 15 products a day per review depending on the page. Some pages only 1 a day and some 5 a day and some 8 a day. It just depends on the page, the time of the year, the economy, the amount of traffic to the page, how well the review has been written and so on.
Thanks Paula :)
What do you mean by the overuse of anchor text?
If you are trying to rank for, say, the keyword “dog training”, then normally you would try to get as many backlinks as possible using that keyword phrase as the anchor text. That is how you rank for that keyword in Google.
However, if you overdo it and have the majority of your backlinks using only that keyword phrase, then Google will get suspicious and think you are trying to game the system. So you can still focus on one or two keywords, but vary it up with a variety of other keywords to make it look natural.
Sadly this is what worked for us in the past so the industrious among us worked our pants off. LOL
Now, we’re paying for it. :)
Yep, you just can’t win when it comes to Google. It makes this industry a tough one but only the strong will stick it out. We just have to keep adapting.
Great post, really informative. I was looking for a solid guide on what to do after the update, thanks!
Glad we could help Jon.
Yes, the Google Penguin update is a great push towards white hat strategies. So after the Penguin update, every blogger should clean up their site following your guidelines. Thanks for this helpful post.
Well, we can only hope that these Google updates force the get-rich-quick automated sites off the internet. White hat has always been the most sustainable way to go.
What a comprehensive article on the Penguin Update! Google’s Penguin actually targets keyword and link spam. While working on SEO, therefore, it’s vital not to over-optimize or get linked to by bad or spammy sites. And we shouldn’t link to bad sites either.
Thanks for all this useful information.
I couldn’t have said it better Faissal. I think you’ve summed up our post in those few sentences.
To be honest, I don’t really believe in this Penguin nonsense. Any sensible rival site can kill its competitor just by overdoing the SEO on their domain, especially in the link department.
That’s the problem I think. It is open for anyone to attack their competitors. Which is why Google are working on the disavow tool. They know that it is becoming a big concern.
Great post on recovering from the Penguin update. One of my blogs was affected very badly by it, and now I’ve had to do some hard work again to recover. It’s really irritating me.
Yes, it’s irritating us as well. Would be nice if Google could at least be a little more willing to tell us why our sites have been deranked so we can know what to do about it.
Penguin targets spam and Panda targets duplicate content. Stay in touch with social media and blogging and be ready for fast traffic increases.
I think you’re right about social media. Google are using it as part of their algorithm. How much emphasis they place on it…we’ll probably never know but I think it is something that needs to be taken into account these days.
Thanks a lot for these tips. My site being one of those hit hard by the penguin update, they will help me a great deal. I had no clue that there are sites with spammy ads ranking so high, that seems so unfair.
Unfortunately those spammy sites are still ranking. One day, Google might get it right.
There are a lot of people who were swept away by this update. That’s why it’s good to play safe. If you’re connected to a site, you’ll be in trouble too. Quality blog commenting is the best and most legitimate way to get a higher ranking.
That’s the problem with this update. You could be “connected to a site” and not know it. People link to our sites all the time but we have no control over that.
The guide and instructions that you have shown and discussed here are very useful, and I find them to be helpful and relevant. I hope many will be able to receive great benefits from this information.
Penguin has been a headache for me. It took me about a month to clean up my website.
Did your site start to rank again after you cleaned it up?
Hey Paula :)
Just wanted to know your thoughts on using bookmarking services such as SocialAdr. Isn’t it risky after the Penguin update?
Also, what percentage of the target keyword compared to varying keywords would you recommend? I mean, is there really a duplicate anchor text penalty?
Also, what about leaving a keyword instead of your name when blog commenting? Would you even recommend blog commenting to start with? I’m planning to start social bookmarking and blog commenting as a way to diversify from my main backlinking strategy (guest blogging).
Finally, can we know how much of your traffic/revenue you have lost after the Penguin update?
Thanks
I don’t think bookmarking services are an issue. I really believe you need a mix of all sorts of backlinks. Guest blog backlinks are the strongest in our opinion, but a mix of all sorts of backlinks is the way to go.
We don’t really have a specific percentage that we use for keywords when backlinking. We have just always focused on, say, two or three keywords and then varied the mix with a variety of other keywords. So our main keywords might be keyword 1 and keyword 2. They are the two we really want to rank for. But we also add other keywords like keyword 3, keyword 4, keyword 5 and keyword 6 (or even more) to vary the mix. So it might look like this when we send out our guest articles:
Article 1: Keyword 1, Keyword 2, Keyword 3
Article 2: Keyword 1, Keyword 2, Keyword 5
Article 3: Keyword 2, Keyword 4, Keyword 6
Article 4: Keyword 2, Keyword 4, Keyword 5
Article 5: Keyword 1, Keyword 2, Keyword 4
Article 6: Keyword 4, Keyword 5, Keyword 6
Article 7: Keyword 3, Keyword 5, Keyword 6
In other words, we just mix it up as much as possible.
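To see at a glance whether any single phrase is dominating your anchor text, you can tally the mix. Here is a quick Python sketch using the made-up keyword mix from the articles above; swap in your real anchor text list.

from collections import Counter

# Anchor text from the example articles above (placeholders for real phrases).
anchors = [
    "keyword 1", "keyword 2", "keyword 3",   # article 1
    "keyword 1", "keyword 2", "keyword 5",   # article 2
    "keyword 2", "keyword 4", "keyword 6",   # article 3
    "keyword 2", "keyword 4", "keyword 5",   # article 4
    "keyword 1", "keyword 2", "keyword 4",   # article 5
    "keyword 4", "keyword 5", "keyword 6",   # article 6
    "keyword 3", "keyword 5", "keyword 6",   # article 7
]

counts = Counter(anchors)
total = sum(counts.values())
for phrase, count in counts.most_common():
    print(f"{phrase}: {100 * count / total:.0f}% of anchors")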
I wouldn’t recommend leaving a keyword as your name when commenting on blogs. You will probably find your comment will be deleted. We don’t allow them on this blog unless they are in the following format, eg: Paula from Affiliate Blog Online or Paula @ Affiliate Blog Online.
Blog commenting should be part of your backlinking mix. You don’t have to go too crazy but if you comment on popular blogs in your own niche you can generate traffic.
The amount of traffic we lost is dependent on the page in question. We never look at our sites as a whole when looking at statistics. We only ever look at the pages we want to rank for. I mean who cares if we have lost traffic to our About Us page for instance. So in terms of pages, for some pages we lost no traffic, for others we lost up to 40% of our traffic.
I like the direction Google is going with the Penguin update; they want people to produce better quality content and eliminate the bad links. Really it should be a good thing for everybody, well, with the exception of those not wanting to do the work required. Quality content should be something that every site wants to put forward, I would think.
Definitely agree Kevin. I like that Google wants to only display quality content. Unfortunately I suspect that is not their only prerequisite for ranking a website these days. I think the money is starting to become more of a priority for Google.
I really don’t understand the term “branded links.” I am still new to IM. Can you discuss with examples. Thank you.
I’m not too sure what a branded link is either Romona. I would have thought it is when you use a brand name keyword as the anchor text. For instance, ‘Sony Bravia TV’ instead of ‘best big screen tv’.
I really needed those tips to guide me after the update. Thanks a lot for sharing.
No problem Ashley, glad we could help.
I built a tools review site with totally unique content, articles of 1000 or more words in length. I built no links to it, relying on good content only. 18 unique posts brought 40-60 unique visitors per day, but now it has dropped to less than 20. I was checking SEMrush for certain unique quality review sites I was tracking on the web, and most of them lost up to 80% of their search engine traffic. I myself am kind of lost now; links or no links, good content – it does not seem to make any difference.
If your site is relatively new Jemma then you will often experience a drop like this around the 3 to 4 month mark although it can be later. Is it a new site?
Hello Paula,
I’m facing the same problem as Jemma. I started a new website on May 1st and sometimes traffic seems to drop from 20 a day to zero as the rankings fluctuate in Google. During May I started ranking no. 10 for my keyword, then for 8 days in June I lost the ranking and then it came back. Now during July I’ve lost the ranking again and traffic has dropped to zero, because I was ranking only for that one particular page and keyword. Is this normal for a new site?
Yes, it’s definitely normal, which is why when you start out with a new site, it’s better not to even look at the rankings.
Hello Paula and Wanda…
Can you recommend a tool where I can see how many keywords my webpages are ranking for. It would be great if this tool created a list and told me what page as well.
Are you ladies heading to New York in August?
Thanks.
JfB
You could try SEMRush but they do have a monthly fee for the privilege. I do like it though. We’ve reviewed SEMRush on our Affiliate Reviews site here: http://www.affiliatereviewshq.com/2011/10/24/semrush-seo-software-review/
We have also used SE KeyRanker in the past but it doesn’t do exactly what you want it to do. It is a WordPress plugin and it allows you to add keywords to a list and then it will display on the dashboard where you are ranked for those keywords.
It is a penalty; pages that should rank for specific terms no longer rank for those terms, nor are they served in search results for any terms. Google is pulling other pages from my sites that are related but not the best match for the keyword term.
Obviously this hurts the quality of results, but Google is content to serve poorer quality results to get its message across to “bloggers”.
You could be right. It’s hard to know with Google but I think their ultimate goal at the moment is to make money. And whenever you focus on the money first, the quality of everything else suffers.
Wow, great insight Paula. This is the first time I’m hearing about the Lynx viewer. I have learned a great deal from your article. Well researched and written content. Keep it up!
First time I heard about it too.
Good analysis and advice, especially about on-page SEO and links from spammy sites. I only hope that manual comments with CommentLuv won’t affect this… Some say that keyword density does not matter much anymore, but I’d still advise getting it below 5%, with 2-3% as a rule of thumb.
I think it’s all just about keeping it natural. We have been on this road for some time now – we just write naturally and don’t worry about the keywords. I think that has kept us safe through most of these Google updates…except the Penguin of course.
Stumbled upon this site when I searched about the Penguin update in relation to keywords. I was hoping to get a definite answer on keywords including density allowed. Thank you very much for this post!
I’m really not sure what the right amount of keywords on a page is. We personally don’t worry about it…we just write naturally. I think that is what Google wants – keep it all as natural as possible. Mind you, Google seems to change their mind every few months so who knows what might happen next.
I have tried a 301 redirect to recover a site. It was successful, but you’ve got to use a new domain.
That’s a risky one to go with, but then again, if you have lost the traffic, what have you got to lose? Plus, wouldn’t Google just pass on the penalty from the old domain to the new domain?
A very informative post. Thanks so much for the information and research you put into it. I get nervous every time we see a Google update. There are so many SEO strategies and it’s getting hard to tell what Google considers kosher and what they don’t. This is a great starting point to adjust SEO strategy to increase the chance that your site will not be hurt by future updates. However, like your experience, I worry that our past may come back to haunt us. Thanks for your excellent post.
This is the problem Greg. What was done in the past was considered kosher and now it isn’t. Who’s to say that what we are doing now won’t fall out of favour in the future? It’s a tough business but we just have to adjust to stay with it.
Thanks for the detailed information on cleaning up a site after the Google Penguin update. By the way, mine is a new blog and it wasn’t hit by Penguin.
It shouldn’t be hit by the Penguin update if it is new. Just stay on the right side of Google and your site will go well.
Most bloggers are really frustrated by these Panda and Penguin updates, but if we follow the above tips we can easily recover from them.
Yes, it is very frustrating, but I think we just have to move past it all because if we don’t we just get stuck. As I said in one of the previous comments, this is a tough business but we need to stay focused and adapt to the changes if we want to continue to succeed.
Hey Paula and Wanda,
From everything I can tell looking at my own sites and the sites of others, it looks like Penguin was mostly about anchor text in backlinks, e.g. having too many of the same keyword pointing to your site. That seems to be the most common theme amongst the sites I’ve seen affected. I think it was more about that than it was about what sites those links actually came from. I’m sure there were several factors, but that seems to be the biggest one.
I also think Google is starting to consider things like dog training, how to train your dog, dog training tips, ways to train a dog, etc. as essentially the same keyword. So I don’t think having that kind of variation is good enough even though I see some people teaching that as the right way to “diversify” your anchor text.
I think it’s critical to have naked links, generic terms like “click here”, your name (or the name you use as the author of your website), blog post title links, and things like that if you want to stay out of trouble. The most frustrating thing is Google can change the rules at any time so what works great now might kill your site with a future update. You just never know.
It’s funny that you mention the “click here” links because I have just written about this in our latest ebook (which should be out next week). I think these types of keywords are a definite must. It’s all about making things look totally natural now and that is one way of doing it.
I always get confused after a Google Penguin update :( or, better said, it frustrates me. Your guidelines will really help me out. Thank you Paula.
I think it confuses everyone Nitya. I’m still confused.
Hi Paula, I used to write some blogs many years ago and spent days and weeks trying every SEO technique, and I recall the endless nights of backlink building with anchor text etc. to see the slow rewards come in. I just started a blog for an offline business recently and noticed, to my surprise, that within minutes posts were being indexed under my keywords with little to no links or SEO optimization, and were being ranked rather high, especially when compared to older sites which I am sure have hundreds or thousands of links. It has me a little baffled yet pleased at the same time. So naturally I’m now touring SEO blogs to see what’s changed recently.
This has happened to me with the new blind dog site (www.blinddogsupport.com) we are working on. We haven’t backlinked at all except for a press release and we are ranking for all sorts of things.
New sites tend to do that. You might find in a few months that your rankings will jump around a bit.
That’s the honeymoon period new sites tend to receive when added to Google. They push you up the rankings to see how your site performs, then after about a month (maybe less) your site will tend to drop away.
Hey,
I am really thankful to you for these instructions. I was very upset by the Google Penguin update.
Thanks,
Mohammad
One question: if we use more images than text, and use the title tag and alt tag on those images, are they as effective as text?
I honestly don’t know the answer to that one. Maybe someone might be able to chime in if they know a little bit more about it.