Search Engine Law

The web has a General. He’s big, he’s mean and he’s quick on the draw. His name is Google. General Google keeps the internet frontier safe for law-abiding citizens and white hat content creators alike.
While white hat websites work within the law, search engines are locked in an ever-escalating shoot-out with black hat practitioners. Internet users get caught in the crossfire on a regular basis, unable to differentiate between reputable sites and those peddling harmful, spam-filled content.
1) Keyword stuffing is a bad trick. Proper keyword use is not the concern of this article, so for now we’ll focus only on the improper kind. Keyword overuse leads to synonym underuse and makes for content that’s inaccessible to the average human user. Though people might not be able to read your content, search engine robots still will, and oversaturated pages will get you penalised.
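To make the point concrete, here’s a minimal sketch in Python of the kind of density check that flags oversaturated copy. The 5% threshold is an arbitrary figure of my own; search engines publish no such number.

```python
from collections import Counter
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the fraction of words in `text` that match `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

# Hypothetical threshold -- real search engines keep their limits secret.
STUFFING_THRESHOLD = 0.05

copy = "cheap flights cheap flights book cheap flights for cheap flights now"
density = keyword_density(copy, "cheap")
if density > STUFFING_THRESHOLD:
    print(f"'cheap' makes up {density:.0%} of the copy -- likely stuffing")
```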
2) Hidden text is invisible to human eyes. Keywords or links can be camouflaged by colour-matching text to its background, leaving them unreadable to human visitors but perfectly readable to search engine bots.
More complex methods employ cascading style sheets (CSS) or layering to hide text beneath surface copy. Such text remains readable to a search engine spider but not to a human user (a rough detection sketch follows this section). Black hat operatives fill their sites with hidden content for the express purpose of achieving higher rankings in search lists, regardless of whether their pages are relevant to a user’s initial search request.
Google law basically states that you should build your website for users, not for search engines. Ignoring this advice by including hidden text or links is one of the quickest ways to get your site blacklisted.
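As a rough illustration of how colour-matched text can be spotted, here’s a toy Python check over inline style attributes. Real detection is far harder, since colours can come from external stylesheets, layering and scripts, all of which this sketch ignores.

```python
import re

# A toy page with colour-matched "hidden" text -- the classic trick.
PAGE = '''
<body style="background-color:#ffffff">
  <p>Welcome to our perfectly honest shop.</p>
  <p style="color:#ffffff; background-color:#ffffff">
    cheap pills cheap pills cheap pills
  </p>
</body>
'''

def find_colour_matched_text(page: str) -> list[str]:
    """Flag paragraphs whose inline text colour equals their background."""
    hits = []
    for style, body in re.findall(r'<p style="([^"]*)">(.*?)</p>', page, re.S):
        colour = re.search(r'(?<!-)color\s*:\s*(#\w+)', style)
        background = re.search(r'background-color\s*:\s*(#\w+)', style)
        if colour and background and colour.group(1) == background.group(1):
            hits.append(body.strip())
    return hits

for text in find_colour_matched_text(PAGE):
    print("hidden text?", text)
```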
3) Doorway/gateway/bridge/portal pages are created specifically for search engine bots. They are designed to target particular keywords or phrases and are usually extremely user-unfriendly and/or difficult to read. Because they are simple devices used to funnel people towards the actual website, they rarely contain anything useful (other than prominent “CLICK HERE” links to the real destination).
Black hat webmasters also create portal or bridge pages that bypass the need to click a link altogether, using fast meta refresh commands that whisk you off to another site (without so much as a by-your-leave). For this reason, many search engines now refuse to accept pages that use fast meta refresh.
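A fast meta refresh is just an HTML directive along the lines of `<meta http-equiv="refresh" content="0;url=...">`. Here’s a rough Python sketch of the kind of check a crawler might apply; the one-second cut-off is purely illustrative.

```python
import re

def fast_meta_refresh_target(page: str, max_delay: float = 1.0) -> str | None:
    """Return the redirect URL if the page uses a fast meta refresh."""
    match = re.search(
        r'<meta\s+http-equiv=["\']refresh["\']\s+'
        r'content=["\'](\d+(?:\.\d+)?)\s*;\s*url=([^"\']+)["\']',
        page, re.I)
    if match and float(match.group(1)) <= max_delay:
        return match.group(2)
    return None

doorway = '<meta http-equiv="refresh" content="0;url=https://example.com/real">'
print(fast_meta_refresh_target(doorway))  # https://example.com/real
```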
4) Cloaking can be achieved through either IP address delivery or user agent delivery. Like human visitors, bots identify themselves by their user agent string and their IP address. Two sets of content are created: one delivered to the Googlebot, the other to human visitors. The bot is deceived by the fake pages (usually saturated with targeted keywords) and grants the website a higher relevancy ranking. When users click on what they perceive to be a promising link, they’re promptly forwarded to a page that has nothing to do with their original search.
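To show the mechanics rather than endorse them, here is a deliberately simplified Python server that cloaks by user agent. Real operations also key on known crawler IP ranges, which this sketch omits.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Keyword-stuffed copy served only to crawlers (illustrative, not advice).
BOT_PAGE = b"<html>cheap flights cheap hotels cheap flights</html>"
HUMAN_PAGE = b"<html>Buy our unrelated miracle product!</html>"

class CloakingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        agent = self.headers.get("User-Agent", "").lower()
        # Crude agent sniffing: "googlebot" in the UA string marks a crawler.
        body = BOT_PAGE if "googlebot" in agent else HUMAN_PAGE
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CloakingHandler).serve_forever()
```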
5) Mirror websites are two or more separate sites that use identical content, but employ different keywords and meta tags to obtain rankings on alternative searches. This technique violates the rules of many search engines, and will probably get one or all of your mirrored sites banned.
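Detecting mirrors is essentially a duplicate-content problem. Here’s a minimal sketch using a simple word-shingle comparison; production systems rely on far more robust fingerprinting (MinHash and the like), so treat this as illustration only.

```python
def shingles(text: str, size: int = 3) -> set[tuple[str, ...]]:
    """Break text into overlapping word n-grams ('shingles')."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of the two pages' shingle sets."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

site_a = "we sell the finest handmade leather boots in the country"
site_b = "we sell the finest handmade leather boots in the land"
print(f"{similarity(site_a, site_b):.0%} shingle overlap")  # near-duplicates
```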