Common SEO misconceptions

In SEO (Search Engine Optimization), some common misconceptions continue to resurface time and again.

[Photo: Ingo – Clipping Error Lion, https://flic.kr/p/pwiA5R]

I’ll present some of them in this article:

  • Google doesn’t change over time – this is one of the most common misconceptions I’ve seen. Google does change, constantly. SEO tactics that work for a certain timeframe often stop working later. Many of the items below are tactics that used to work at some point, but no longer do.
  • You can easily rely on Google to find out whether a certain SEO practice is still valid – the key word here is “easily”. Yes, Google can be a good source of information for deciding whether a practice is still valid. But it’s not easy at all: you need to use your best judgment, and you have to have at least some idea of which web sites are trustworthy sources. If you want to prove that black is white using Google, you generally can. There are a lot of gray areas in SEO, things which are unclear. If you search quickly and don’t know whom to trust, you may reach invalid conclusions based on superficial research.
  • Commenting on blogs for links is a good solution – yes, at some point this used to work. Not anymore, and not for quite a long time now. You can still comment on blogs if all you’re after is some brand exposure, if you want to get known in your niche, or if you can add value to the discussion. But don’t post comments on blogs just to get links; it doesn’t work that way anymore.
  • Web directory submissions – for a while, they were the El Dorado of link building. Everybody was doing it; all it took was some time to fill in forms, an external freelancer to do it for you, or even a piece of software to automate the submissions. You can still add your web site to business directories, which differ from web directories in that they ask for much more data and generally provide value beyond the link itself. There are also a few top web directories you might want to consider adding your web site to. Just make sure you work from a small, curated list of directories, as opposed to a very large one.
  • Posting on forums for links – if you post on forums for traffic, for brand exposure, to get insights on your business, or to connect with others, keep doing so. If you only look for forums which allow you to add links, and post for the sole purpose of getting links, you should stop: it doesn’t bring much value, and search engines might consider the practice spammy.
  • It’s OK to get lots of links fast – this is a matter of “link velocity”: avoid doing link building only in occasional bursts, acquiring a lot of links at once. It is much better to get links slowly, at a steady pace, which is unlikely to trigger any spam signals.
  • It’s better to get 10 links from one web site than 1 link from 10 web sites – you can see this is wrong yourself by asking: how difficult is it, in general, to get 10 links from a single site (answer: typically, relatively easy), compared to getting 1 link each from 10 web sites (answer: typically, much more difficult)? The harder-to-get links are the more valuable ones.
  • It helps to spam on social networks for links – Google can’t see much of the content posted on Facebook. Twitter is more open to Google’s crawling, but even it has its limitations. Google Plus, the social network owned by Google, doesn’t have that many visitors. As you can see, Google doesn’t have much access to some social networks, so it’s relatively difficult for it to rank web sites based on social media presence. That being said, it’s almost certain that Google uses some social signals when ranking web sites; it’s just not as important as most people tend to think.
  • Links don’t matter anymore – there are a lot of new signals which Google now processes, but links still tend to be the most important factor in deciding whether a web site ranks on the first page of results for a query. The number of criteria Google uses increases at a steady pace, but links remain the most important one.
  • Meta keywords – both Google and Bing, the most important search engines, asserted long ago that they no longer value the meta keywords tag. Bing may actually use the tag as a spam signal against your web site. Sure, at some point in time search engines used the tag to determine what a specific page is about. People abused it so much that it eventually became irrelevant and search engines simply said: “Stop!”. Yet some people continue to use it.
  • Tags on non-social-network web sites – on social networks, tags are generally called “hashtags” and serve a good purpose: they help with the discovery of content. YouTube uses tags to help understand what videos are about. On general content web sites, however, tags are mostly found on WordPress-made sites, originally meant to make content easier to find. In practice they tend to create duplicate content issues, so my recommendation is either not to use them at all, or to mark the tag pages as “noindex, follow” using a meta robots tag.
  • Repeating texts – the problem with this method is that there are still some web sites with good SEO results while using this technique. What’s likely happening is that the repeated text is only the tip of the iceberg among the techniques those sites use to get good results on Google, but since it is one of the most visible methods, it tends to spread. My advice: create your content for the users, not for the engines. If you think your users are happy with a large portion of repeated text, use it. If not, don’t.
  • Keyword stuffing – this method means putting keywords pretty much everywhere: in H1 tags, the URL, the title tag, the meta description, the body text, ALT text for images, and image URLs. As you might imagine, it is likely to trip anti-spam filters at Google. I would advise against it.
  • H1 tag on the logo – this method assumes that search engines don’t know very well what H1 tags are, and that if you put the H1 tag on the logo you will get more SEO value than if you put it someplace else. The H1 tag should only be used for the title of the current page. On the homepage, you can safely have no H1 tag at all.
  • Repeating keywords in URLs – again, make your content easy to discover, share, and understand for your users. If you think repeating keywords helps with this (although I seriously doubt it), use it. Most likely it doesn’t, so don’t.
  • Optimal word count in articles – an article should be as long as necessary. For certain topics you might need more words (and you should also structure that content into sub-sections); for others, even a short paragraph will do. There is no magic answer. Make the content good for the users, no matter the length.
  • It’s better to write a meta description yourself than to leave Google the task – when dealing with large web sites, I see this a lot: site owners put the first paragraph of the page, or its first 155 characters (with spaces), into the meta description. The trouble is, Google is much better than you at extracting an automated snippet from the page, so if you don’t have the resources to write all of your meta descriptions by hand, let Google do it.
  • Meta descriptions should be a boring bit of text, unlike AdWords texts – a lot of people invest large budgets, effort, and resources in creating the optimal set of AdWords ads, but when it comes to meta descriptions they settle for boring, unoptimized text. I believe this is wrong; I would advise you to put the same effort into meta descriptions as into AdWords ads.
  • Not using the og:image tag – for a lot of pages, the value can be derived automatically from the first image on the page. But on large web sites there are certainly lots of pages where an explicit og:image tag would do better. I would advise you to create such tags.
  • Google can index JavaScript menus – while this is true to some extent, it’s still better to avoid JavaScript-only menus and keep navigation as plain HTML links.
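To make the on-page points above concrete, here is a minimal sketch of what a page following these recommendations might look like. All URLs, text, and file names are hypothetical examples, not prescriptions:

```html
<head>
  <!-- Hand-written meta description, crafted like an ad, under ~155 characters -->
  <meta name="description" content="Compare 20+ running shoes by price, weight and cushioning – updated weekly.">

  <!-- Explicit og:image, rather than letting platforms guess from the first image -->
  <meta property="og:image" content="https://example.com/images/share-card.jpg">

  <!-- No meta keywords tag: search engines ignore it, and Bing may treat it as a spam signal -->

  <!-- On a WordPress tag archive page only: keep it out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
<body>
  <!-- Plain HTML links in the menu, not a JavaScript-generated one -->
  <nav>
    <a href="/shoes/">Shoes</a>
    <a href="/reviews/">Reviews</a>
  </nav>

  <!-- A single H1 holding the page title, not wrapped around the logo -->
  <h1>Running shoe reviews</h1>
</body>
```

Note that the "noindex, follow" robots tag belongs only on the tag archive pages you want kept out of the index; regular content pages should not carry it.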

I am a Digital Marketing Manager for The KPI Institute. My expertise is in SEO (Search Engine Optimization) / UX (user experience) / WordPress. Co-founder of lumeaseoppc.ro (series of events on SEO & PPC) and cetd.ro (Book on branding for MDs). On a personal level, I like self-development - events, sports, healthy living, volunteering, reading, watching movies, listening to music.
