Privacy Lawyer Aaron Messing Speaks About FTC Compliance and Privacy at SES NY 2012
By Sonya Modi
This article discusses best practices and pitfalls of search engine optimization in regulated industries, including healthcare, banking, finance, pharmaceuticals, and publicly traded companies. Such companies are subject to extensive government regulation but often lack clear guidance on acceptable practices in social media, search, and targeted advertising. Messing explains that one of the ways search engines determine the relevance of a web page is link analysis: the search engine examines which websites link to that page, what the anchor text of those links says about it, and the surrounding content. He notes that the difference between appearing on the first page and the second page of search results has a huge impact on a company's profit. One issue SEO companies face, especially when working for a regulated industry, is that using paid affiliates, freelance bloggers, or other web pages under the SEO company's control can create problems with the government. Search engines penalize severely for disclosed paid placements, while the Federal Trade Commission penalizes severely for nondisclosure, and the tension has grown as the FTC increases its focus on paid links, content, and reviews.
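To make the link-analysis idea concrete, here is a minimal Python sketch that scores a page's relevance to a query from its inbound links and how well each link's anchor text matches the query. All names, data, and weights below are invented for illustration; real engines combine many more signals.

# Toy link analysis: relevance grows with inbound links whose anchor
# text mentions the query terms. All data and weights are invented.
inbound_links = {
    "example-clinic.com": [
        ("health-blog.com", "best cardiology clinic in town"),
        ("local-news.com", "hospital opens new cardiology wing"),
        ("random-site.com", "click here"),
    ],
}

def relevance_score(page: str, query: str) -> float:
    terms = set(query.lower().split())
    score = 0.0
    for _source, anchor in inbound_links.get(page, []):
        overlap = len(terms & set(anchor.lower().split()))
        score += 0.1 + overlap  # every link counts a little; anchor matches count more
    return score

print(relevance_score("example-clinic.com", "cardiology clinic"))  # 3.3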
Google's SEO goes LOCAL: The Venice Update
By Kelly Howe
Based on "Understand and Rock the Google Venice Update"
Google unveiled roughly 40 changes to its algorithm that are already affecting, or soon will affect, the search results we see. Among the many interesting updates were several to Local Search, with improvements to the ranking of local search results across the board. The update improves the triggering of Local Universal results by relying more on the ranking of the main search results as a signal. Translated directly, this may mean that Google now makes even greater use of ordinary organic rankings as a ranking signal for local listings.
Google's Direct changes:
“Improvements to ranking for local search results. [launch codename “Venice”] This improvement improves the triggering of Local Universal results by relying more on the ranking of our main search results as a signal.”
"Improved local results. We launched a new system to find results from a user’s city more reliably. Now we’re better able to detect when both queries and documents are local to the user."
What this Means:
Previously, if you searched for a local service without a city modifier ("cupcakes" rather than "San Diego cupcakes"), you would see a blended set of Maps results based on your location, but the organic listings would be national: general research sites, associations, or national sellers that happened to rank highly. You would not normally see local results in the organic listings unless you added a city to your query. That is changing with Google's new Venice update. Google has concluded that most searches use "cityless" keywords, so this is a huge opportunity that did not exist before: getting your website onto page one even if your algorithmic ranking previously was not good enough. Thanks to Venice, you now have a shot at getting into the local organic results.
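One way to picture the change is the hypothetical sketch below, in which a document's ordinary organic score is boosted when the query has local intent and the document is local to the searcher's city. This is not Google's actual algorithm; every name, weight, and data point is invented.

# Hypothetical Venice-style blending: the organic score is the base signal,
# boosted for documents local to the searcher when the query has local intent.
LOCAL_INTENT_TERMS = {"cupcakes", "plumber", "dentist", "pizza"}

def final_score(organic_score: float, query: str,
                doc_city: str | None, user_city: str) -> float:
    local_intent = bool(LOCAL_INTENT_TERMS & set(query.lower().split()))
    if local_intent and doc_city == user_city:
        return organic_score * 1.5  # invented boost for local matches
    return organic_score

# A "cityless" query: the San Diego bakery now outranks the national site.
docs = [("national-cupcake-guide.com", 0.80, None),
        ("sandiego-cupcakes.com", 0.60, "San Diego")]
ranked = sorted(docs, reverse=True,
                key=lambda d: final_score(d[1], "cupcakes", d[2], "San Diego"))
print([name for name, _, _ in ranked])  # local site first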
Is Google biased toward Wikipedia? Bing proves that all search engines are.
By Stephanie Costa
Based on "Bing, Not Google, Favors Wikipedia More Often in Search Results" by Danny Goodwin
Over the years, Wikipedia has been a prominent result in Google's organic search listings. This fact has long been used to criticize Google for having a bias in favor of Wikipedia, and complaints were filed as early as 2005. In 2007, the blog The Google Cache found that 96.6% of Wikipedia's pages appeared in Google's top ten search results. Most critics complain that Wikipedia does not deserve such strong search performance because it is unreliable. Search Engine Watch decided to compare Wikipedia's rank on Google with its rank on Bing, and the results showed that Wikipedia is actually much more prominent in Bing's search results than in Google's.
Kevin Gibbons of eConsultancy points out that no one should be surprised that Wikipedia has a strong presence in Google's search results, for the following reasons:
- Wikipedia has pages written for individual search terms, giving it a very long tail of information.
- No one can rival Wikipedia’s domain authority. It has 6.13 million links to its pages.
- Great internal linking structure.
- Excellent page authority. Wikipedia provides citations from trusted and revered sources.
The Search Engine Watch study makes it clear that Wikipedia is slightly more prominent on Bing than it is on Google. These results suggest that Google does not have a bias in favor of Wikipedia; rather, Wikipedia is simply highly successful at search engine optimization. The full comparison is available in the Search Engine Watch article.
Search Engine Optimization Mistakes
When creating a website, there are many tactics that can be used to raise its ranking, but abusing these tactics is a common mistake that can harm a site's overall rating. One common abuse is the use of misleading or excessive keywords, commonly known as keyword stuffing. To avoid stuffing, use keywords only where they fit naturally within the website's content. The content and keywords of a site should also be updated regularly to keep up with current search trends.
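One rough self-check for stuffing is keyword density: the share of a page's words taken up by a single keyword. The sketch below uses an arbitrary 3% threshold chosen purely for illustration; search engines publish no such cutoff.

# Toy keyword-density check; the 3% threshold is an arbitrary illustration.
def keyword_density(text: str, keyword: str) -> float:
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words) if words else 0.0

def looks_stuffed(text: str, keyword: str, threshold: float = 0.03) -> bool:
    return keyword_density(text, keyword) > threshold

page = "cupcakes " * 10 + "are sold at our bakery " + "cupcakes " * 5
print(looks_stuffed(page, "cupcakes"))  # True: density is far above 3%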
Backlinks are also important in search engine optimization. The quality of the websites that link to yours matters when trying to earn a high rating: if the links on your website are of poor quality, or if too many unreliable sources link to your work, Google and other search engines will not rate your website as well.
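To make "quality of linking sites" concrete, a crude sketch might weigh each linking domain by a proxy for its own authority, here simply its own inbound-link count, so that a flood of links from one weak site cannot inflate the total. All domains and counts below are invented.

import math

# Toy backlink-quality score: each linking domain counts once, log-scaled by
# its own inbound-link count (a crude stand-in for authority). Data invented.
domain_inbound_counts = {"university.edu": 50_000,
                         "major-news.com": 120_000,
                         "spammy-blog.net": 3}

def backlink_score(linking_domains: list[str]) -> float:
    return sum(math.log1p(domain_inbound_counts.get(d, 0))
               for d in set(linking_domains))  # dedupe: one vote per domain

print(backlink_score(["university.edu", "major-news.com"]))  # ~22.5
print(backlink_score(["spammy-blog.net"] * 100))             # ~1.4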
Text cloaking is another tactic used to try to trick the crawling spiders that search engines use to build their rankings. Text cloaking means placing hidden text within a website, for example, text written in the same color as the background. The spiders will detect these words even though they are invisible to a user, and this will in turn lower your website's rating.
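A crude version of such a check, assuming a white page background and looking only at inline styles, might flag text whose declared color matches the background. Real crawlers evaluate computed CSS, positioning, and font size; the regex below is only a sketch.

import re

PAGE_BACKGROUND = "#ffffff"  # assumption: white page background

def has_hidden_text(html: str) -> bool:
    # Flag inline styles whose text color matches the assumed background.
    for m in re.finditer(r'style="[^"]*(?<!-)color:\s*([^;"]+)', html):
        if m.group(1).strip().lower() in (PAGE_BACKGROUND, "white"):
            return True
    return False

print(has_hidden_text('<p style="color: #ffffff">cheap cupcakes deals</p>'))  # True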
One realm of search engine optimization that has not been conquered by Google, Bing, or any other search engine is the ability to index Flash files. Flash files may be detailed and contain informative content, but they add little value for search engine optimization because there is still no reliable way to index them.
Finally, duplicating content is a sure way to lower your website's rating. It is a common mistake that websites are penalized for, because search engines aim to provide fresh content to users. Duplicate content can lead Google and other search engines to mark your website as spam and remove it from the results page for common search queries.
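Duplicate detection is often explained with shingling: split each page into overlapping word n-grams and compare the overlap (Jaccard similarity). The sketch below is a minimal version; the shingle size of three words is an arbitrary choice.

# Minimal shingling / Jaccard similarity for near-duplicate detection.
def shingles(text: str, n: int = 3) -> set[tuple[str, ...]]:
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: str, b: str) -> float:
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa and sb else 0.0

original = "our bakery sells the best cupcakes in san diego every day"
copied = "our bakery sells the best cupcakes in san diego every single day"
print(round(jaccard(original, copied), 2))  # 0.73: likely a near-duplicate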
By Trevor Straub
Back to Search Engines Homepage