Interview with Marie Haynes, Search Engine Marketing Consultant
It was a real honor to be able to do this interview. Marie Haynes is an SEO expert specializing in Google penalties who, among other sites, writes for Moz.
With her we talk about penalties and SEO for ecommerce. The interview is in English; I recommend reading it closely, because there is a lot to learn from it.
1- What are the main Panda-related issues that you’ve found on ecommerce sites?
A lot of people will tell you that having stock product descriptions can cause a duplicate content problem in the eyes of Panda. But, I think it goes way deeper than this. Google knows that most eCommerce sites are going to use the manufacturer’s product descriptions. I don’t believe that they penalize for this. However, in this age of Panda, it is vitally important that your eCommerce store provides some kind of value to go along with these descriptions.
Let’s say that you sell green widgets. This one particular green widget has 1000 different stores selling it and all of them are using the same product description. How does Google decide which one to show first? To answer this, let’s look at it from the perspective of a searcher. If I’m looking to buy a green widget, why would I want to buy it from your site? In some cases, perhaps your site is recognized as the main authority on green widgets, but what if it’s not? You can make your site more valuable by including things that are genuinely helpful to users. I’m not just talking about having unique words in your product descriptions, but rather having helpful things like buying guides, awesome photos and videos, a great search function, helpful and unique reviews and so on.
If you can convince Google that users consistently prefer your site, then you’ll do well. But, if you are the same as everyone else and don’t offer much value over those other sites, then Panda could be a significant factor that affects you.
Another common problem I see with a lot of eCommerce sites is thin content. Thin content is not necessarily short content, but rather content that is not helpful to anyone. For example, if you have thousands and thousands of search results pages indexed, that’s not a good thing. Or, if you have thousands of out-of-stock products indexed, that’s going to be frustrating to users and could possibly trigger a Panda problem.
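To make that concrete, here is a minimal Python sketch of how you might flag those thin pages in a crawl export. The URL patterns and example.com addresses are hypothetical; adapt them to your own site structure:

```python
import re

# Hypothetical URL patterns that often mark thin pages on an ecommerce
# site; adjust them to match your own URL structure.
THIN_PATTERNS = [
    re.compile(r"/search\b"),          # internal search results pages
    re.compile(r"[?&](q|s|query)="),   # query-driven result pages
    re.compile(r"/out-of-stock/"),     # out-of-stock product pages
]

def noindex_candidates(urls):
    """Return URLs matching a thin-content pattern, i.e. candidates
    for a meta robots noindex tag."""
    return [u for u in urls if any(p.search(u) for p in THIN_PATTERNS)]

# Example with made-up URLs, e.g. from a crawler's export:
crawl = [
    "https://example.com/widgets/green-widget",
    "https://example.com/search?q=green+widget&page=37",
    "https://example.com/out-of-stock/blue-widget",
]
for url in noindex_candidates(crawl):
    print("noindex candidate:", url)
```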
And then there are more obvious issues. Anything that annoys users such as massive popup ads, broken navigation, etc. can be Panda problems for eCommerce sites.
2- Can you tell us a Panda recovery success story (not necessarily step by step)?
I recently worked with a news site that was hit by Panda in October of 2014. Sites that publish news stories can often be Panda fodder especially if they don’t provide exceptional value to go along with the news story. If you run a news site and you’re simply publishing the same stories as all of the major news outlets, then this can cause problems. You need to provide significant value.
We made the following changes on this site:
- Improved page load time by decreasing the size of images, removing unused plugins and utilizing caching (a quick way to check caching headers is sketched after this list)
- Removed a robots.txt block that stopped Google from crawling JavaScript and CSS (see the robots.txt check after this list)
- Changed the layout of the site so that there was more text above the fold and fewer ads
- Fixed some buggy slider issues that caused the site to look strange for readers
- Worked on improving title tags to make them more descriptive and use keywords effectively
- Most importantly, we worked really hard on improving the quality of the articles that were being produced. Previously, many articles were written by someone who was not a native English speaker, so that was improved upon. The old articles were short and lacking in substance; now, when the site publishes a new article, it has many details and photos that you can’t find in other articles on the subject.
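On the caching point from the first item, one quick sanity check is to look at the response headers your server sends. A minimal sketch using only Python's standard library (example.com is a placeholder; point it at your own images, CSS and JS):

```python
from urllib.request import Request, urlopen

def caching_headers(url):
    """Fetch a URL with a HEAD request and report the response
    headers that control browser caching."""
    req = Request(url, method="HEAD")
    with urlopen(req) as resp:
        return {h: resp.headers.get(h, "(not set)")
                for h in ("Cache-Control", "Expires", "ETag", "Last-Modified")}

# example.com is a placeholder; substitute your own asset URLs.
for header, value in caching_headers("https://example.com/").items():
    print(header + ":", value)
```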
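Similarly, the robots.txt fix can be verified programmatically. This sketch uses Python's built-in robots.txt parser to check whether Googlebot is allowed to fetch a few (hypothetical) JS and CSS assets:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical asset URLs; substitute the JS/CSS files your pages load.
ASSETS = [
    "https://example.com/assets/app.js",
    "https://example.com/assets/style.css",
]

parser = RobotFileParser("https://example.com/robots.txt")
parser.read()  # fetch and parse the live robots.txt

for asset in ASSETS:
    if parser.can_fetch("Googlebot", asset):
        print("crawlable:", asset)
    else:
        print("BLOCKED by robots.txt:", asset)
```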
3- What are the tools you’re using for your daily audits?
I use Screaming Frog a lot. When it comes to link audits I use Ahrefs, Majestic, Open Site Explorer, Google Search Console and a neat little Chrome extension called SpamFlag that helps me find where a site’s link is on a page and whether it’s followed or not. I also use my disavow blacklist (https://www.mariehaynes.com/…).
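For readers without the extension, the followed/nofollow check that SpamFlag performs can be approximated with a short standard-library script. This is a rough sketch, not the extension's actual logic, and the page and domain below are made up:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkFinder(HTMLParser):
    """Collect every <a href> pointing at a target domain, noting
    whether the link carries rel="nofollow"."""
    def __init__(self, target_domain):
        super().__init__()
        self.target = target_domain
        self.hits = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        if self.target in href:
            rel = attrs.get("rel") or ""
            status = "nofollow" if "nofollow" in rel else "followed"
            self.hits.append((status, href))

# Hypothetical URLs; replace with the linking page and your own domain.
page = "https://example.com/resources.html"
finder = LinkFinder("yoursite.com")
finder.feed(urlopen(page).read().decode("utf-8", errors="replace"))
for status, href in finder.hits:
    print(status, href)
```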
I use Google Analytics data wherever possible. Where I don’t have that data, I use SEMRush.com traffic data. I also use the Search Analytics data from Google Search Console a lot.
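If you want to pull that Search Analytics data programmatically rather than through the UI, Google exposes it via its webmasters v3 API. A hedged sketch, assuming you have google-api-python-client installed and OAuth credentials already obtained; the property URL and date range are placeholders:

```python
from googleapiclient.discovery import build

def top_queries(credentials, site="https://example.com/"):
    """Pull top search queries for a verified property via the
    webmasters v3 Search Analytics endpoint."""
    service = build("webmasters", "v3", credentials=credentials)
    response = service.searchanalytics().query(
        siteUrl=site,
        body={
            "startDate": "2015-01-01",   # placeholder date range
            "endDate": "2015-03-31",
            "dimensions": ["query"],
            "rowLimit": 25,
        },
    ).execute()
    for row in response.get("rows", []):
        print(row["keys"][0], row["clicks"], row["impressions"])
```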
For most of my audits though, I’m not using tools. I don’t rely on tools to tell me whether the keyword density is too high or how many H1 tags are on a site. Rather, I’ll spend a lot of time looking at the site with a critical eye to see if I can find issues that can be improved upon.