Tactics to avoid when refining your SEO

What not to do when working to improve your SEO

So far, in our previous article, How to optimize web pages for SEO and rank better in Google, we've covered the essential steps in optimizing your website to rank well in search engines: building a keyword list for your website, refining and prioritizing that list, grouping your keywords to optimize individual web pages, optimizing your page content, and optimizing your page metadata. In addition to all of the “do’s” we’ve discussed, there are a number of important “don’ts”.

Some of these are tempting, especially when you’re working long and hard on optimizing lots of web pages, but search engines penalize pages for these sins, so you definitely want to avoid them! Once upon a time, SEO practitioners did some of these things deliberately, because they could successfully “trick” search engines into giving their pages higher rankings. As search engine technology has advanced, however, algorithms that detect these practices and demote the pages that use them have made them obsolete.

Keyword stuffing

As mentioned earlier, this refers to repeating keywords many times throughout a page’s content and/or metadata. Search engines calculate “keyword density” for pages: the number of times a keyword appears on a page as a percentage of all the words on that page.

As a rule of thumb, make sure that none of your keywords has a keyword density above about 3% (meaning it doesn’t appear more than three times for every hundred words). Another good rule of thumb, as we’ve already mentioned, is to always write for your human audience: if repeating a keyword so many times makes the page read poorly, you know you’ve gone over the limit.
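If you want to sanity-check your own pages against that ~3% rule of thumb, the calculation is easy to script. Below is a minimal Python sketch; the sample text, the keyword, and the exact way of counting a multi-word phrase are assumptions for illustration, since search engines don’t publish a precise formula.

```python
# A minimal sketch of a keyword-density check, assuming you already have the
# plain visible text of the page (in practice, extract it from the rendered HTML).
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return how often `keyword` appears, as a percentage of all words on the page."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    keyword_words = keyword.lower().split()
    n = len(keyword_words)
    # Count occurrences of the (possibly multi-word) keyword with a sliding window.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == keyword_words)
    return 100.0 * (hits * n) / len(words)

page_text = "Fresh roasted coffee beans delivered weekly. Our coffee beans are the best coffee beans."
if keyword_density(page_text, "coffee beans") > 3.0:
    print("Warning: keyword density above the ~3% rule of thumb")
```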

Hiding keywords

One way of keyword stuffing without affecting the end-user experience is to repeat keywords many times on a page but make that text invisible (for example, by displaying it in the same color as the page background, placing it behind an image, or hiding it with CSS).

Search engines have gotten very good at detecting text that users don’t actually see, and at demoting pages that contain it, because it’s a big “red flag” that black-hat SEO is at work. There are exceptions to this rule, but in general it’s a good idea to avoid invisible text.
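To see how mechanically detectable the crudest forms of hidden text are, here is a minimal Python sketch that scans raw HTML for a few obvious hiding techniques (inline display:none, visibility:hidden, or a zero font size). It is only a rough heuristic for auditing your own pages; the regular expression and the file name are assumptions, and real search engines use far more sophisticated rendering-based checks.

```python
# Minimal heuristic scan for obviously hidden text in raw HTML.
# Illustrative only: not how search engines actually detect hidden content.
import re

HIDDEN_STYLE = re.compile(
    r'style\s*=\s*"[^"]*(display\s*:\s*none|visibility\s*:\s*hidden|font-size\s*:\s*0)[^"]*"',
    re.IGNORECASE,
)

def find_hidden_text(html: str) -> list[str]:
    """Return snippets of markup around inline styles that hide content."""
    snippets = []
    for match in HIDDEN_STYLE.finditer(html):
        start = max(match.start() - 40, 0)
        end = min(match.end() + 80, len(html))
        snippets.append(html[start:end])
    return snippets

with open("page.html", encoding="utf-8") as f:  # hypothetical file name
    for snippet in find_hidden_text(f.read()):
        print("Possible hidden text:", snippet)
```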


Duplicate content

This refers to optimizing many pages to target the same keywords, using the same (or very similar) content.

The idea behind doing this is to get as many of your pages into the top search results as possible, to increase the chances that a searcher clicks on one of them (and not on a competitor’s page).

If a search engine detects repeated content on different pages (sometimes even on pages found on different sites, which might indicate plagiarism), it will demote those pages in its search result rankings.
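To get an intuition for what “very similar content” means in practice, here is a minimal sketch that compares two pages by the overlap of their five-word sequences (shingles). The sample texts, the shingle size, and the threshold are arbitrary illustrations; real duplicate-detection systems are far more elaborate.

```python
# Minimal sketch of near-duplicate detection using word shingles and Jaccard similarity.
def shingles(text: str, size: int = 5) -> set[tuple[str, ...]]:
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if (a | b) else 0.0

page_a = "Buy fresh roasted coffee beans online with free shipping across the whole country."
page_b = "Buy fresh roasted coffee beans online with free shipping across the whole state."

similarity = jaccard(shingles(page_a), shingles(page_b))
print(f"Similarity: {similarity:.0%}")
if similarity > 0.5:  # arbitrary threshold for illustration
    print("These pages look like duplicate content")
```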

Sometimes having duplicate content is necessary: see below about how to use “canonical tags” to handle those situations.

Thin content

This is the creation of short pages containing very little actual content beyond the targeted keywords. Again, at the risk of repeating ourselves: your content needs to provide a good experience for your site visitors!

Creating short pages that contain a bunch of keywords but provide no real value to visitors will not help those pages earn good rankings in search results.
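If you want a quick first pass at spotting candidate thin pages on your own site, counting the visible words is a reasonable start. The sketch below is only an illustration; the 300-word threshold, the crude HTML stripping, and the file name are assumptions, not an official guideline.

```python
# Minimal sketch: flag pages whose visible text is suspiciously short.
import re

def visible_word_count(html: str) -> int:
    """Very rough word count: drop scripts/styles and tags, then count words."""
    html = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", html)
    return len(re.findall(r"\w+", text))

THIN_PAGE_THRESHOLD = 300  # arbitrary threshold for illustration

with open("page.html", encoding="utf-8") as f:  # hypothetical file name
    count = visible_word_count(f.read())
    if count < THIN_PAGE_THRESHOLD:
        print(f"Only {count} words of content: this page may be too thin")
```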

If you’re curious how search engines know which pages provide real value to users and which don’t, it works like this: when most users click a particular result and then quickly hit Back in their browser, returning to the search results page, the search engine learns that they didn’t find what they were looking for on that page.

It will therefore demote that page in its search results in favor of pages from which users don’t return quickly, or at all.

Meta keywords

Years ago, search engines would consider the contents of the meta-keywords tag, in which the website owner could specify the keywords that describe each page.

Since search engines ignore this tag these days, don’t bother with it. It won’t hurt you, but it won’t help you either.

Broken links

While not a keyword-related issue, broken links are something you want to avoid on your site, whether they point to other pages (or images, videos, files, etc.) on your own site or to other sites.

Search engines can penalize a page (or an entire site) for containing too many broken links.

There are many tools for scanning a website and reporting any broken links found (two free online ones are W3C Link Checker and Dr. Link Check); it’s good practice to run one every now and then and make sure that all the links on your site are in good shape.
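If you’d rather script a quick check yourself, the sketch below fetches a single page and reports links that return an error status. It relies on the third-party requests and beautifulsoup4 packages and a hypothetical start URL, and it is only a starting point compared with a full crawler like the tools mentioned above.

```python
# Minimal broken-link check for a single page
# (assumes: pip install requests beautifulsoup4).
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://example.com/"  # hypothetical starting page

html = requests.get(PAGE_URL, timeout=10).text
links = {
    urljoin(PAGE_URL, a["href"])
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True)
}

for link in sorted(links):
    if not link.startswith(("http://", "https://")):
        continue  # skip mailto:, javascript:, in-page anchors, etc.
    try:
        # Some servers reject HEAD requests; a fuller tool would fall back to GET.
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print(f"Broken link? {link} -> {status}")
```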

